Dominic

joined 1 year ago
[–] Dominic 11 points 1 year ago (5 children)

Nintendo’s exclusives are where the Switch really shines. Unfortunately, they’re expensive. I’ll echo the DekuDeals recommendation for finding sales.

Other Nintendo titles that are worthwhile, aside from the obvious Breath of the Wild and Tears of the Kingdom and depending on your tastes:

  • Super Mario Odyssey
  • Mario Kart 8 Deluxe
  • Super Smash Bros. Ultimate
  • Animal Crossing: New Horizons
  • Splatoon 3 (2 is good too, but 3 is an improvement and more active)
  • Donkey Kong Country: Tropical Freeze
  • Pikmin (the whole series)
  • Metroid Dread
  • Metroid Prime Remastered
  • Fire Emblem: Three Houses
  • Pokémon Legends: Arceus

There are also tons of great indie games that play well on Switch (especially handheld):

  • Hades
  • Dead Cells
  • Hollow Knight
  • Slay the Spire
  • Into the Breach
  • Shovel Knight
[–] Dominic 7 points 1 year ago (2 children)

Firstly, the term “globalists” is an anti-Semitic dogwhistle. Beyond that usage, it’s meaningless.

Secondly, YouTube is riddled with disinformation. This is primarily due to the recommendation algorithm, which drives receptive users toward extremist videos (and drives away the skeptical users who might refute them). It’s also because spoken language is a lot more difficult to fact-check than written language.

[–] Dominic 12 points 1 year ago

To my knowledge, Reddit is owned by private companies and investors. BlackRock and Vanguard have either no ownership stake or a very small, very indirect one.

For what it’s worth, a significant percentage of every (reasonably liquid) public company on Earth is owned by Vanguard and BlackRock, because those firms manage trillions of dollars in assets (much of it middle-class people’s retirement savings). They aren’t a conspiracy. They’re asset managers, and mostly passive managers at that.

[–] Dominic 6 points 1 year ago

Into the Breach’s soundtrack is also outstanding, by the same composer for the same developer.

[–] Dominic 7 points 1 year ago* (last edited 1 year ago)

I’m extrinsically motivated, but my definition of “extrinsic” is pretty loose. I’ll do things that aren’t necessary to beat the game (I don’t even need the game to be “beatable”). As long as I’m finishing something and getting a reward for it, I’m content.

I’m having a great time doing side content in Tears of the Kingdom: completing as many shrines and side quests as I can, hoarding materials for armor upgrades, etc. Those are optional objectives that you can truly complete. However, I don’t spend much time experimenting with Ultrahand.

Similarly in Minecraft, I liked accumulating resources in survival mode, but I bounced off of creative mode.

EDIT: apparently my Lemmy app went haywire and posted this about 8 times. Very sorry.

[–] Dominic 5 points 1 year ago

His whole channel is delightful. I found out about it a few weeks ago and binged a few dozen of his videos.

[–] Dominic 6 points 1 year ago

For now, we're special.

LLMs require far more training data, hardware, and energy than a human brain does. They’re still very much a brute-force method of getting computers to work with language.

[–] Dominic 3 points 1 year ago

For what it’s worth, it may just be a related bacterium.

[–] Dominic 13 points 1 year ago (1 children)

Didn’t know he mentioned smallpox; that’s much worse considering how extremely well-documented it has been over the past few centuries. There was already a vaccine for it in the 18th century!

The HIV claim is pretty nuts considering that AIDS was already widespread when it was first recognized in 1981, and HIV itself wasn’t identified until 1983. There were no known human retroviruses until 1980, so vaccine researchers had no reason to touch retroviruses in the 70s.

Doubly nuts that RFK Jr. has previously denied that AIDS is caused by HIV.

[–] Dominic 6 points 1 year ago

Also, how do you know it read the book, and not a summary of it, of which there are loads on the internet?

In the case of ChatGPT, it's hard to tell. OpenAI won't even reveal what their training dataset was.

Researchers have done some tests to tease this out, and they’re pretty confident that it has read quite a few books and memorized large parts of them verbatim. See one of my favorite papers in a while: “Speak, Memory: An Archaeology of Books Known to ChatGPT/GPT-4.”
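
To give a flavor of the kind of probe that paper uses (a “name cloze” test: mask a character’s name in a passage and see whether the model can fill it back in, which it can only do reliably if it has effectively memorized the text), here’s a minimal sketch. The helper names and exact-match scoring are my own simplifications, and the model being queried is omitted entirely:

```python
def make_name_cloze(passage: str, name: str) -> str:
    """Replace one occurrence of a character name with [MASK] to build a probe."""
    return passage.replace(name, "[MASK]", 1)

def score_guess(guess: str, name: str) -> bool:
    """Exact-match scoring, a stand-in for the paper's accuracy metric."""
    return guess.strip().lower() == name.lower()

probe = make_name_cloze("Call me Ishmael.", "Ishmael")
print(probe)  # Call me [MASK].
```

A model that answers “Ishmael” here almost certainly saw Moby-Dick (or extensive quotations of it) during training; averaging this over many passages per book gives a memorization score.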

[–] Dominic 5 points 1 year ago

AIs are trained on the equivalent of thousands of human lifetimes of text (if not more). There’s no precedent for anything like this.

[–] Dominic 4 points 1 year ago* (last edited 1 year ago) (1 children)

There are a few reasons why music models haven’t exploded the way that large language models and generative image models have. Maybe the strength of the copyright holders is part of it, but I think the technical issues are a bigger obstacle right now.

  • Generative models are extremely data-inefficient. The Internet is loaded with text and images, but there isn't as much music.

  • Language and vision are the two problems that machine learning researchers have been obsessed with for decades. They built up "good" datasets for these problems and "good" benchmarks for models. They also did a lot of work on figuring out how to encode these types of data to make them easier for machine learning models. (I'm particularly thinking of all of the research done on word embeddings, which are still pivotal to large language models.)
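
As a toy illustration of the embedding idea: words become vectors, and related words end up pointing in similar directions, which cosine similarity measures. The vectors below are made up for illustration; real embeddings are learned from data and have hundreds of dimensions:

```python
import math

# Made-up 4-dimensional "embeddings" (real ones are learned, not hand-written).
embeddings = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.1, 0.8, 0.2],
    "apple": [0.1, 0.2, 0.1, 0.9],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "king" is closer to "queen" than to "apple" in this toy space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```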

Even so, there are already some fairly impressive generative music models.
