this post was submitted on 02 Oct 2024
28 points (100.0% liked)
Technology
It is undeniably satisfying, though, to turn every setting in a game up to maximum without performance tanking - but you and I (same card, but a 1440p screen) are not the target audience. This is for people who want (and can afford) at least 4K with ray tracing in the latest games, all at triple-digit frame rates - or who are using the card for non-gaming applications: even our old 2080 is a beast for tasks like offline rendering, scientific calculations, machine learning, etc. - and a 4090 is of course several times faster still.
I know this is going way off-topic, but I love providing a bit of perspective: the fastest supercomputer in 1996 was the Hitachi CP-PACS/2048 at 368.20 GFLOPS; in 1997 it was the Intel ASCI Red/9152 at 1.338 TFLOPS. An RTX 2080 achieves 314.6 GFLOPS at 64-bit precision (the precision used by the TOP500 list of supercomputers), and an RTX 4090 reaches 1.290 TFLOPS. Granted, despite similar processing power on paper (and FLOPS being hardly an objective measure for comparing vastly different architectures and systems), even those ancient supercomputers still have modern GPUs beat in terms of memory capacity alone (although their latency is of course far worse): 128 GB (2,048 × 64 ~~GB~~ MB) in the case of the Hitachi system, for example.
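The comparison above can be sanity-checked with a few lines of arithmetic. This is just a sketch using the figures quoted in the comment (TOP500 Rmax values for the supercomputers; the GPUs' advertised FP64 rates, which on these consumer cards are a small fraction of their FP32 throughput):

```python
# FP64 throughput in GFLOPS, as quoted in the comment above.
systems = {
    "Hitachi CP-PACS/2048 (1996)": 368.20,
    "Intel ASCI Red/9152 (1997)": 1338.0,
    "RTX 2080 (FP64)": 314.6,
    "RTX 4090 (FP64)": 1290.0,
}

for name, gflops in systems.items():
    print(f"{name}: {gflops:,.1f} GFLOPS")

# Memory of the Hitachi system: 2,048 nodes x 64 MB per node.
cp_pacs_memory_gb = 2048 * 64 / 1024
print(f"CP-PACS total memory: {cp_pacs_memory_gb:.0f} GB")  # prints 128 GB
```

Run it and the 2080 lands just under the 1996 machine while the 4090 lands just under the 1997 one - which is exactly the point being made.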
Sure, but are there really enough people in that category to justify these cards? Judging by 4080-series sales, it seems not - yet they keep coming out with even beefier, more expensive cards anyway.
In a similar vein, I grew up around IT because my mom worked on mainframes. I remember lots of nights sitting under her desk at 3 a.m. because she got called in as production support when jobs would ABEND. When I was in high school, wanting to learn more about mainframes, I set up Hercules, an emulator for IBM's System/370 family of mainframes (it looks like it supports z/OS now as well), and managed to find boot media for System/370 and MVS. The desktop computer I ran the emulator on (a Gateway, showing my age) was more powerful than the original mainframe hardware.