OmnipotentEntity

joined 2 years ago
[–] OmnipotentEntity 3 points 2 hours ago (1 children)

2025 - 2007 = 18, actually.

It's kind of crazy remembering 1) how horrible Bush was, and 2) how much better Bush was than the current administration in retrospect.

The point about the conspiracists now toeing the Republican party line was prescient.

Good article, thanks for sharing.

[–] OmnipotentEntity 18 points 3 days ago

Shadi Hamid wrote Monday, "I’m more critical of Democrats precisely because I expect more from them."

Ok...

On the other hand, "Democrats consistently fall short of the very ideals they profess to champion."

With you so far...

Examples of Democrats' "hypocrisy," according to Hamid, include preaching tolerance and inclusion "while marginalizing pro-life Democrats, talking down to Black and brown voters, ignoring religious conservatives and dismissing the growing ranks of Americans who felt the party had become too radical on issues such as gender identity. On policy, what was once the working-class party chose to prioritize things such as college debt relief, which disproportionately benefits the wealthy."

Who knew the problem with Dems is actually that they need to be more conservative? Really, what we need is absolutely no distinction between the parties at all, not even relatively minor ones. Instead of an ineffective resistance, we, the American public, demand that Chuck Schumer actually give Donald Trump and Elon Musk a blow job live on C-SPAN. Thanks, Hamid. Fuck you.

[–] OmnipotentEntity 38 points 1 week ago* (last edited 1 week ago)

"You mean I hitched my wagon to Grabthor The Wagon Destroyer, and he destroyed my wagon? This is an outrage!"

[–] OmnipotentEntity 7 points 2 weeks ago

Bloom County was so great.

[–] OmnipotentEntity 4 points 4 weeks ago (1 children)

I also don't know, but from context it seems like they are associated with mental institutions.

[–] OmnipotentEntity 3 points 1 month ago* (last edited 1 month ago) (1 children)

They have not actually!

Lloyd has though, and Anya was very conspicuously absent from that interaction.

Also very recently Anya has met the older Desmond brother and found it unusual that she could not read his mind. Probably related.

[–] OmnipotentEntity 73 points 1 month ago (3 children)

Or instead of targeting TikTok specifically, they could have passed a data privacy law and actually done something worthwhile instead of pointless, unpopular grandstanding. Haha, just kidding, they would never do anything that might even slightly reduce shareholder value.

[–] OmnipotentEntity 3 points 1 month ago

With Zuckerberg, Bezos, and Musk all leading Trump around with a $100 bill on a fishing line, you'd have to be profoundly naive or dishonest to actually take the stance that Republicans will do anything about Big Tech.

[–] OmnipotentEntity 19 points 1 month ago

Well, you see, it was only a problem for him when it turned against him. When he actively supported it his entire career, it was the obvious and natural order of things.

[–] OmnipotentEntity 3 points 1 month ago

6 times larger but 80 times as massive?

Per the paper, the 6(.41) is indeed referring to the radius. Volume scales as R^3, so if this planet had the same density as Earth, we would expect its mass to be about 263 times Earth's.

Neptune, for instance, is 3.8 times Earth's radius but only 17 times its mass, instead of the 54.9 times its mass you might naively expect from the radius alone.

These ratios (54.9/17 vs 263/80) are almost the same, so the new planet is about as dense as Neptune.
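The arithmetic above can be checked with a quick sketch (the 6.41/80 figures for the new planet and 3.8/17 for Neptune come from the comment; the function name is mine):

```python
def relative_density(mass_ratio, radius_ratio):
    """Density relative to Earth for a body with the given mass and radius,
    both expressed as multiples of Earth's. For uniform spheres,
    density scales as mass / radius**3."""
    return mass_ratio / radius_ratio**3

new_planet = relative_density(80, 6.41)  # roughly 0.30 of Earth's density
neptune = relative_density(17, 3.8)      # roughly 0.31 of Earth's density
```

Both come out at about 30% of Earth's density, which is the same conclusion as comparing the 54.9/17 and 263/80 ratios directly.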

[–] OmnipotentEntity 15 points 1 month ago (1 children)

The hottest year on record so far.

 

Abstract:

Hallucination has been widely recognized to be a significant drawback for large language models (LLMs). There have been many works that attempt to reduce the extent of hallucination. These efforts have mostly been empirical so far, which cannot answer the fundamental question whether it can be completely eliminated. In this paper, we formalize the problem and show that it is impossible to eliminate hallucination in LLMs. Specifically, we define a formal world where hallucination is defined as inconsistencies between a computable LLM and a computable ground truth function. By employing results from learning theory, we show that LLMs cannot learn all of the computable functions and will therefore always hallucinate. Since the formal world is a part of the real world which is much more complicated, hallucinations are also inevitable for real world LLMs. Furthermore, for real world LLMs constrained by provable time complexity, we describe the hallucination-prone tasks and empirically validate our claims. Finally, using the formal world framework, we discuss the possible mechanisms and efficacies of existing hallucination mitigators as well as the practical implications on the safe deployment of LLMs.

 

You might know the game under the name Star Control 2. It's a wonderful game that involves wandering around deep space, meeting aliens, and navigating a sprawling galaxy while trying to save the people of Earth, who are being kept under a planetary shield.

 

Subverting Betteridge's law of headlines. Yes.

 

Sometimes, because I am ancient, I automatically type www. before I type beehaw.org into my address bar. It would be nice and comfy to have that resolve via a CNAME record instead of just completely failing to resolve in DNS.
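For what it's worth, this would be a one-line change in a BIND-style zone file (the record and TTL below are illustrative, not beehaw.org's actual zone):

```zone
; Map www onto the apex name so "www.beehaw.org" resolves.
; The apex itself can't be a CNAME, but a subdomain like www can point at it.
www   3600   IN   CNAME   beehaw.org.
```

Strictly speaking a CNAME is an alias at the DNS level, not an HTTP redirect, so the web server would also need to accept (or 301-redirect) requests for the www hostname.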

28
submitted 2 years ago* (last edited 2 years ago) by OmnipotentEntity to c/science
41
submitted 2 years ago* (last edited 2 years ago) by OmnipotentEntity to c/gaming
 

the Logitech F710 is a solid controller to get if you’re on a tight budget, but perhaps not exactly the type of equipment you want to stake your life on. [...] Reviewers on sites like Amazon frequently mention issues with the wireless device's connection.

The reporter, who followed an expedition of the Titan from the launch ship, wrote that “it seems like this submersible has elements of MacGyver jerry-riggedness.”
