

cross-posted from: https://lemmit.online/post/2823044

This is an automated archive made by the Lemmit Bot.

The original was posted on /r/nostalgia by /u/Character-Emotion237 on 2024-05-02 22:39:26.

[–] MystikIncarnate@lemmy.ca 26 points 6 months ago (2 children)

The Aero interface was a really good addition. In its early days it sucked, because it required alpha blending, which wasn't well optimized in the graphics cards of the era. So even if it was supported, it ran like shit and ate performance.
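
For anyone curious what that "alpha blending" actually means: here's a rough sketch of my own (not anything pulled from Aero's code) showing the textbook source-over blend a compositor has to run for every translucent pixel, plus the stock Vista-era `DwmIsCompositionEnabled` call that reports whether composition is on.

```c
/* My own illustration, not Aero's actual code: the textbook source-over
 * alpha blend a desktop compositor performs per pixel, plus the standard
 * DWM query for whether composition (Aero) is enabled.
 * Build on Windows with: cl aero_check.c dwmapi.lib
 */
#include <stdio.h>
#include <windows.h>
#include <dwmapi.h>

/* result = src*a + dst*(1 - a); doing this for millions of pixels per
 * frame is what mid-2000s GPUs handled poorly. */
static unsigned char blend_channel(unsigned char src, unsigned char dst, float a)
{
    return (unsigned char)(src * a + dst * (1.0f - a) + 0.5f);
}

int main(void)
{
    BOOL enabled = FALSE;
    if (SUCCEEDED(DwmIsCompositionEnabled(&enabled)))
        printf("Desktop composition (Aero): %s\n", enabled ? "on" : "off");

    /* Example: a 50%-opaque white pixel over a black background -> 128. */
    printf("blended channel value: %d\n", blend_channel(255, 0, 0.5f));
    return 0;
}
```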

I liked Vista. I can still point to examples of things in Windows 7 that were broken but worked perfectly fine in Vista. AFAIK, that stuff was never fixed because it was niche and most people didn't bother with it.

I've been up, down, inside and out of all of these OSes throughout my time in IT, and I can see all the problems. The sidebar/widgets in Vista were a mistake: poorly implemented fluff that consumed too many resources for what they could realistically do. If they had been a lot lighter in terms of performance demands, they would have been fine. Beyond that, 7 was basically a reskinned Windows Vista with some "updates". You could get the same updates for Vista by the time 7 came out, and they made Vista quite reasonable.

IMO, the biggest problem Vista faced wasn't that it was bloated or slow or buggy (though it was very buggy at the beginning - again, mostly fixed with patches by the time 7 landed). It was that Vista was built for the best computers of the time, with the idea that everything would improve to the point where the best computers of "today" would become the minimum standard tomorrow. They were right, of course, but most people were buying the cheapest HP, Dell, Acer, etc. computers they could find, with Intel Celeron processors and basically no graphics hardware worth a damn... They shipped with Vista, and it sucked because you bought a shit PC. The industry was going through a bargain-basement phase where a lot of "discount" CPUs were coming out. Before the Celeron, you bought an Intel Pentium something-or-other, and they were all the same: you couldn't get a premium Pentium processor or a discount Pentium processor. The Celeron was the first foray into what would become the Intel Atom, or at most the Core i3. Intel was looking to expand into budget households at the same time Microsoft put out its most demanding version of Windows. So people snapped up these Celeron shit boxes pre-installed with Vista, and to nobody's surprise, it sucked.

So Microsoft cobbled together a new, somewhat less demanding UI, threw away the widgets, and released the same thing as Windows 7. By the time it hit shelves, most people (specifically OEMs) had figured out that the Celeron wasn't just a cheaper Intel CPU, it was handicapped. So the focus turned to the Core series of CPUs and away from the hobbled Celeron line.

Look, the Celeron had a place, just like the Intel Atom. Consumer desktops were not that place. Something like a point of sale? Yeah, that's fine. A glorified web browser or kiosk? Sure, all good. A multitasking desktop? Not so much. But OEMs put that shit in everything, and they stopped doing that pretty quickly.

I very successfully ran a Core 2 Duo system on Vista for a long time in my younger years. 4 GB of RAM, an Nvidia GPU... it worked really well, and I used it for many years. It was even a laptop. I still have it, and it still works, but I have more powerful systems now, so it's been sitting in a protective case, untouched, for years at this point. That system was my daily driver for pretty much the entirety of college. I also had a Core 2 Duo at home, but I spent most of my days on campus with my laptop.

It was excellent.

Yet everyone praises Windows 7, despite it having features that were non-functional for its entire lifespan; people either didn't notice or didn't care that those things didn't work. I don't hate 7. I still think its UI and everything else was superior to 8/8.1, and that it was a really good OS for average use. As an administrator and a power user, I'm happy on 10. I also used 8/8.1 for a while, and though it was functional, it was pretty painful overall, especially compared to 10.

Don't get me started on W11.

[–] ColonelPanic@lemm.ee 8 points 6 months ago (1 children)

I agree, and I think the main issue with Vista, as you alluded to, was that Microsoft set the minimum specs far too low, which gave companies an excuse to use the absolute minimum bargain-basement components and then blame Vista for being slow.

However, if they'd increased the minimum requirements, those same companies would have had a fit and refused to ship Vista at all.

[–] MystikIncarnate@lemmy.ca 3 points 6 months ago

Microsoft has always had ridiculously low system requirements. Even if they seemed high at the time, they were rapidly outpaced by the technology.

After XP, if someone fell below the minimum spec, it was more a question of "why do you still own something so shit?" than "why are the requirements so high?"

Even with W11, which I'll only touch on lightly, the only concerning requirement is the TPM. Until Microsoft required one to run 11, nobody really knew WTF a TPM was, outside of IT security circles. Most still don't know, and only understand that it's a requirement for Windows 11.

It's fine; for most people the role of a TPM is entirely technical mumbo jumbo. What hurts is that a lot of motherboard OEMs put little to no thought into even offering an option for a TPM, so anyone who built their own machine essentially got screwed by the requirement.

Beyond the TPM, 11 requires a 64-bit processor at 1 GHz+, 4 GB of RAM, a 64 GB disk, UEFI, a GPU with DirectX 12 support, a 720p display with at least 8 bits per color channel, and an internet connection.
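
Just to show how low that bar is, here's a rough sketch of my own (ordinary Win32 calls, illustrative only, not how Microsoft's own upgrade checker works) that probes a few of those minimums:

```c
/* Illustrative only: probing a few of the Windows 11 minimums with
 * ordinary Win32 calls. Build on Windows with: cl reqcheck.c
 */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* RAM: the stated minimum is 4 GB installed. */
    ULONGLONG totalKb = 0;
    if (GetPhysicallyInstalledSystemMemory(&totalKb))
        printf("Installed RAM: %llu MB (minimum 4096)\n", totalKb / 1024);

    /* 64-bit capability: a binary built for x64 only runs on a 64-bit CPU/OS. */
#if defined(_WIN64)
    puts("Running as a 64-bit process, so the CPU and OS are x64.");
#else
    BOOL wow64 = FALSE;
    if (IsWow64Process(GetCurrentProcess(), &wow64))
        printf("32-bit process under WOW64: %s\n", wow64 ? "yes (64-bit OS)" : "no");
#endif

    /* System disk: the stated minimum is a 64 GB drive. */
    ULARGE_INTEGER total;
    if (GetDiskFreeSpaceExA("C:\\", NULL, &total, NULL))
        printf("C: capacity: %llu GB (minimum 64)\n",
               (unsigned long long)(total.QuadPart >> 30));

    return 0;
}
```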

I'm not even sure anyone still makes single-core or sub-1 GHz processors, and who tf doesn't have a 64-bit capable CPU (Intel added that with the Core 2 series... no, not 2nd-gen Core i*, the Core 2, two generations prior)? If you have less than 8 GB of RAM for anything now, WTF can you even use your computer for? 64 GB of storage is a joke; I don't think they even manufacture SSDs smaller than 64 GB anymore, and HDDs are pretty much exclusively 500 GB+. UEFI replaced BIOS booting a while ago, like 5-10 years ago, and it had been at least a boot option on systems for nearly a decade or more. DX12 is kinda new, if you consider 2015 (DX12's launch date) "new". 720p is tiny; are you using a netbook? I don't think you can even buy a standalone monitor smaller than 720p anymore, and 8 bits per channel of color is 24-bit, which has been the standard to strive for since XP.

The only pinch was the TPM. Most people didn't have one, many couldn't even get one, and those who could, couldn't find them anywhere. Most custom builds didn't include one, and the mainboard OEM didn't make one available even if the board had a header to plug one into. People on prebuilt systems were a mixed bag: some laptops (and some desktops too, I suppose) shipped with one whether you asked for it or not, notably business systems, and if yours didn't, you were SOL. Buy a new computer.
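
If you're wondering how software even tells whether a TPM is there, this is a rough sketch of my own using the TPM Base Services API that ships with Windows (illustrative, not Microsoft's actual checker):

```c
/* Illustrative sketch: ask TPM Base Services whether a TPM exists and
 * which spec version it implements. Build with: cl tpmcheck.c tbs.lib
 */
#include <stdio.h>
#include <windows.h>
#include <tbs.h>

int main(void)
{
    TPM_DEVICE_INFO info = {0};
    TBS_RESULT r = Tbsi_GetDeviceInfo((UINT32)sizeof(info), &info);

    if (r != TBS_SUCCESS) {
        /* No device (or TBS unavailable): the "SOL, buy a new computer"
         * case described above. */
        printf("No usable TPM found (TBS result 0x%08lX)\n", (unsigned long)r);
        return 1;
    }

    printf("TPM present, spec version: %s\n",
           info.tpmVersion == TPM_VERSION_20 ? "2.0 (meets the W11 bar)" :
           info.tpmVersion == TPM_VERSION_12 ? "1.2 (below the W11 bar)" :
           "unknown");
    return 0;
}
```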

For CPUs, anything before the Intel Core i-series 8th gen was generally "unsupported" according to Microsoft's supported-CPU list for W11. 8th gen came out in, what, 2017? Making those chips 4 years old by the time W11 landed, at which point you'd probably want to upgrade to something newer before upgrading to W11 anyway, since an unsupported CPU would be at least 5+ years old at that point.

The required specs are so far below what I would have recommended for a build when W11 launched. Most people only failed on the TPM: a product they had never heard of before, didn't know what it did, and didn't see any reason to want at the time.

I'm not saying older hardware isn't useful, but most people upgrade more often than once every 5 years. Enthusiasts like myself are the ones running systems from 2010, and we're happy about it. The TPM was, and is, the issue.

[–] SimplyTadpole@lemmy.dbzer0.com 2 points 6 months ago

I get what you mean. I used Vista from sometime around 2011 to 2013, and even though it was slow, I always thought it was really beautiful, and I still have fond memories of it.