Who said anything about fully validating hardware? "Hardware vendors should solve their own problems" is not the same as "hardware vendors should fully validate their products".
Is there supposed to be a link?
My comment game has gotten far better since I started doing live code reviews. Essentially I ask myself, “Would I feel the need to explain this to someone during a code review?” and if the answer is yes I add a comment.
That’s a hot take. If you want your code to be maintainable at all, it needs comments. If you’re part of a team, write comments for them. If someone else may take over your project after you move on, leave comments for them. And have you ever tried to read uncommented code you wrote a year ago? Leave comments for yourself.
The con is that it’s not very powerful. I haven’t attempted to code on a gaming handheld, but I’ve had issues with a midrange laptop being underpowered. RAM is probably the biggest issue. My life improved noticeably when I upgraded my main machine to 64 GB. Granted, I was doing particularly heavy work. It really depends on what you’re doing. You could get away with it for some work, but it’s going to be painfully slow for other stuff.
The key difference is that compilers don’t fuck up, outside of the very rare compiler bug. LLMs do fuck up, quite often.
Copilot frequently produces results that need to be fixed. Compilers don’t do that. Anyone who uses copilot to generate code without understanding how that code works is a shit developer. The same is true of anyone who copies from stack overflow/etc without understanding what they’re copying.
I'd create my own macro or function for that. I have enough ADD that I cannot stand boring shit like that and I will almost immediately write a pile of code to avoid having to do boring crap like that, even with copilot.
Using git reset --keep would just make more work, since I'll have to throw away uncommitted changes anyway. Removing uncommitted changes is kind of the whole point; it is called 'reset' after all. If I want to preserve uncommitted changes, I'll either stash them or commit them to a temporary branch. That has the added benefit of adding those changes to the reflog, so if I screw up later I'll be able to recover them.
If you’re using reset with uncommitted changes and you’re not intentionally throwing them away, you’re doing something wrong. git reset --hard means “fuck everything, set the state to X”. I only ever use it when I want to throw away the current state.
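A minimal sketch of the workflow described in these two comments: preserve work via a stash or a temporary branch, and use git reset --hard only to deliberately discard state. This assumes git >= 2.28 (for git init -b); the repo path, branch names, and messages are illustrative.

```shell
set -e
repo=$(mktemp -d)               # throwaway repo for the demo
cd "$repo"
git init -q -b main
git config user.email you@example.com
git config user.name "You"

echo one > file.txt
git add file.txt
git commit -q -m "initial"

echo uncommitted > file.txt     # an uncommitted change worth keeping

# Option 1: stash it (recoverable from the stash list)
git stash push -q -m "wip"
git stash pop -q

# Option 2: park it on a temporary branch (recoverable via the reflog)
git checkout -q -b wip-backup
git commit -q -a -m "wip backup"
git checkout -q main
git branch -q -D wip-backup     # tip stays reachable through the reflog

# Deliberately discard everything: "set the state to X"
echo scratch > file.txt
git reset -q --hard HEAD
cat file.txt                    # prints the committed content: one
```

Either recovery option leaves a reflog entry, so even after deleting the temporary branch the commit can be found with git reflog and restored.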
I have not and will not ever use AI-generated code that I don’t thoroughly understand. If you properly understand the code you’re committing, there shouldn’t be any damage. And beyond AI, you should never commit code that you don’t properly understand unless it’s a throwaway project.
But that's not the question. There are two questions: Who should be responsible for patching hardware vulnerabilities? And if the answer is "the kernel", then should speculative but never demonstrated vulnerabilities be patched? Linus' answers are "the hardware manufacturer" and "no".
Maybe we're running into the ambiguity of language. If you mean to say, "Who does it cause a problem for? The consumer." then sure. On the other hand what I mean, and what I think Linus means, is "Who's responsible for the vulnerability existing? Hardware vendors. Who should fix it? Hardware vendors."
Depends on what you/we/they mean by "speculative". IMO, we need to do something (microcode, kernel patches, whatever) to patch Spectre and Meltdown. Those have been demonstrated to be real vulnerabilities, even if no one has exploited them yet. But "speculative" can mean something else. I'm not going to read all the LKML emails so maybe they're talking about something else. But I've seen plenty of, "Well if X, Y, and Z happen then that could be a vulnerability." For that kind of speculative vulnerability, one that has not been demonstrated to be a real vulnerability, I am sympathetic to Linus' position.