this post was submitted on 19 Jul 2023
648 points (100.0% liked)

Programmer Humor

852 readers

Post funny things about programming here! (Or just rant about your favourite programming language.)

founded 5 years ago
top 36 comments
[–] Lakso@ttrpg.network 81 points 1 year ago (2 children)

...then don't study computer science. I study CS and it's annoying when someone in a more math/logic-oriented course is like "If I get a job at a tech company I won't need this". All that IS computer science. If you just wanna code, learn to code.

[–] Zetaphor@zemmy.cc 31 points 1 year ago (1 children)

The problem is that a lot of people who want to learn to code, and are conditioned to want the college route of education, don't actually know that there's a difference, or that you can be completely self-taught in this field without ever setting foot in a university.

[–] oce@jlai.lu 14 points 1 year ago

We're not closing schools despite having libraries and the internet; having (good) teachers helps you learn faster and get pushed further. There are some good programming schools that can make the process more efficient for you. I think the main problem is rather the insane cost of higher education in the USA, which creates anxiety about whether the future it opens up will be enough to repay it. It is sad.

[–] Neato@kbin.social 11 points 1 year ago (5 children)

Can you get well-paying coding jobs with upward mobility without at least a BA in CS?

[–] AnarchoYeasty 15 points 1 year ago

It's harder to break into, but I make 150k and barely graduated high school. Software engineering is largely a field that cares about ability rather than degrees. It's harder to break into the field these days than it was 10 years ago when I did, but it's absolutely still possible.

[–] fred@lemmy.ml 8 points 1 year ago

I have a fine arts degree and I'm a lead dev 🤷‍♂️

[–] oce@jlai.lu 5 points 1 year ago

Maybe not what you're asking but people with a non-CS M.Sc or PhD commonly switch to coding, especially in the data fields.

[–] Zetaphor@zemmy.cc 2 points 1 year ago* (last edited 1 year ago)

I've never been to college, and my job title today is Software Architect; I've been doing this for nearly 20 years.

It was extremely hard at first to get a job because everyone wanted a BA, but that was also 20 years ago. Once I had some experience and could clearly demonstrate my capabilities, they were more open to hiring me. What a degree shows is some level of experience and commitment, but the reality is that a BA in CompSci doesn't actually prepare you for the reality of 99% of software development.

I think most companies these days have come to realize this. Unless you're trying to apply to one of the FANG corps (or whatever the acronym is now) you'll be just fine if you have a decent portfolio and can demonstrate an understanding of the fundamentals.

[–] sheepyowl@lemmy.sdf.org 1 points 1 year ago

If you entered the field 10 years ago, sure. If you're trying to enter the field now, I have bad news...

[–] HairHeel@programming.dev 59 points 1 year ago (2 children)

4 years later: "this button is the wrong color. fix it ASAP"

[–] Zetaphor@zemmy.cc 7 points 1 year ago

I was interviewed with complex logic problems and rigorous testing of my domain knowledge.

Most of what I do is updating copy and images.

[–] AkumaFoxwell@feddit.de 3 points 1 year ago

This hurts so much because it's my life :(

[–] magic_lobster_party@kbin.social 25 points 1 year ago (2 children)

I wonder how many in that class will need to think about multitape Turing machines ever again.

[–] jakoma02@czech-lemmy.eu 28 points 1 year ago (1 children)

The point of these lectures is mostly not to teach how to work with Turing machines; it is to understand the theoretical limits of computers. The Turing machine is just a simple-to-describe, well-studied tool used to explore that.

For example, are there things that cannot be computed on a computer, no matter how long it computes? And what if the computer were able to make guesses along the way? Could it compute more? According to this comic, no; it would only be a lot faster.

Arguably, many programmers can do their job even without knowing any of that. But it certainly helps with seeing the big picture.
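
To make the "simple to describe" point concrete, here's a toy single-tape Turing machine simulator in a few lines of Python (a hypothetical sketch; the machine, state names, and `run_tm` helper are made up for illustration):

```python
def run_tm(transitions, tape, state="q0", blank="_", max_steps=1000):
    """transitions: (state, symbol) -> (new_state, write_symbol, move),
    where move is -1 (left) or +1 (right). Halts when no rule applies."""
    tape = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in transitions:
            break  # no matching rule: the machine halts
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

# A one-state machine that flips every bit, then halts on the first blank.
flipper = {
    ("q0", "0"): ("q0", "1", +1),
    ("q0", "1"): ("q0", "0", +1),
}

print(run_tm(flipper, "10110"))  # -> 01001
```

The whole model is just a lookup table and a head position, which is exactly why it's such a convenient object for proofs about what computers can and cannot do.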

[–] riskable@programming.dev 7 points 1 year ago (2 children)

Arguably, a much more important thing for the students to learn is the limits of humans. The limits of the computer will never be a problem for 99% of these students; they'll just learn on the job which types of problems they're good at solving and which they aren't.

[–] SkyeStarfall@lemmy.blahaj.zone 6 points 1 year ago (1 children)

The limits of computers would be the same as the limits for humans. We have no reason to think the human brain has greater computational power than a Turing machine.

So, in a way, learning about the limits of computers is the exact same as learning the limits of humans.

But also, learning what the limits of computers are is absolutely relevant. You get asked to create an algorithm for a problem, and it's useful to be able to figure out whether it's actually solvable, or how fast it can theoretically be. That avoids wasting everyone's time trying to build an infinite loop detector.
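
Worth noting: the general infinite-loop detector is undecidable, but the restricted version over a finite state space is very solvable, which is exactly the kind of boundary this theory teaches you to spot. A hypothetical sketch using Floyd's tortoise-and-hare cycle detection (the `step`/`loops_forever` names are made up here):

```python
def loops_forever(step, start):
    """For an iterated function over a FINITE state space, decide whether
    iteration from `start` loops forever. `step(s) is None` means 'halt'.
    Floyd's cycle detection: a slow pointer and a double-speed fast pointer
    must eventually meet if the trajectory cycles."""
    slow = fast = start
    while True:
        slow = step(slow)
        fast = step(fast)
        if fast is not None:
            fast = step(fast)  # fast moves two steps per iteration
        if slow is None or fast is None:
            return False  # a halt state was reached
        if slow == fast:
            return True   # a state repeated: guaranteed infinite loop
```

For example, `loops_forever(lambda s: (s + 3) % 10, 0)` is `True`, while `loops_forever(lambda s: s + 1 if s < 5 else None, 0)` is `False`. The undecidability only bites once the state space is unbounded.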

[–] riskable@programming.dev 5 points 1 year ago (1 children)

The "limits of humans" I was referring to were things like:

  • How long can you push a deadline before someone starts to get really mad
  • How many dark patterns you can cram into an app before the users stop using it
  • The extremes of human stupidity

👍

[–] SkyeStarfall@lemmy.blahaj.zone 5 points 1 year ago* (last edited 1 year ago)

...none of which is relevant for most people working in back-end, which is most of the people who take compsci.

I would hate to enroll in a compsci program and be taught management instead. It's not what I signed up for.

University also shouldn't just be a job training program.

[–] bh11235@infosec.pub 3 points 1 year ago

Two government spooks are hunting a dangerous fugitive who is also a humanities graduate. He escapes into a sprawling maze of tunnels. "It's hopeless," one of the spooks says. But the other simply says, "Watch," then proclaims loudly, "Studying linear algebra is important because of its use in stochastic processes and image manipulation." Before he finishes the sentence, the fugitive emerges from the tunnel and shouts, "But what's even more important--" and is immediately knocked unconscious and taken in for questioning.

[–] z500@startrek.website 2 points 1 year ago (1 children)

Never used a Turing machine, but I have a project that generates NFAs and converts them to DFAs so they run faster.
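
For anyone curious what that conversion looks like, the classic subset construction fits in a few lines of Python (a hypothetical sketch, not the parent commenter's actual project; the NFA encoding and function name are made up):

```python
from collections import deque

def nfa_to_dfa(nfa, start, accepts, alphabet):
    """Subset construction: each DFA state is a frozenset of NFA states.
    `nfa` maps (state, symbol) -> set of successor states.
    (Epsilon transitions omitted for brevity.)"""
    start_set = frozenset([start])
    dfa, dfa_accepts = {}, set()
    queue, seen = deque([start_set]), {start_set}
    while queue:
        current = queue.popleft()
        if current & accepts:
            dfa_accepts.add(current)  # accepting if any member accepts
        for sym in alphabet:
            nxt = frozenset(s for q in current for s in nfa.get((q, sym), ()))
            dfa[(current, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return dfa, start_set, dfa_accepts

# NFA for binary strings ending in "01"
nfa = {
    ("a", "0"): {"a", "b"},
    ("a", "1"): {"a"},
    ("b", "1"): {"c"},
}
dfa, start, acc = nfa_to_dfa(nfa, "a", {"c"}, "01")
```

The resulting DFA can be run with a plain loop over the input, one dictionary lookup per character, which is the "runs faster" part.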

[–] riskable@programming.dev 3 points 1 year ago (1 children)

How does one convert a No Fear Article into a Definitely Fear Article?

[–] CallumWells@lemmy.ml 2 points 1 year ago

I thought it was Non-Fungible Articles and Decentralised Federated Articles

[–] GTG3000@programming.dev 17 points 1 year ago

But you can make games that are much more interesting if your algorithms are on point.

Otherwise it's all "well, I don't know why it generated a map that's insane", or "well, the AI has this weird bug but I don't understand where it's coming from".

[–] klemptor@lemmy.ml 11 points 1 year ago* (last edited 1 year ago) (1 children)
[–] blivet@kbin.social 5 points 1 year ago

I’m grateful to this strip because reading it caused me to learn the correct spelling of “abstruse”. I’ve never heard anyone say the word, and for some reason I had always read it as “abtruse”, without the first S.

[–] linuxduck@nerdly.dev 7 points 1 year ago (1 children)

I loved learning lambda calculus (though for me it was super hard)

[–] Gork@lemm.ee 2 points 1 year ago (3 children)

I never really understood the point of lambda calculus. Why have an anonymous function? I thought it was good practice to meticulously segment code into functions and subroutines and call them as needed, rather than have some pseudo-function embedded somewhere.

[–] linuxduck@nerdly.dev 2 points 1 year ago

I suppose it has to do with being stateless.

I just loved learning about lambda calculus.

I think the idea is to remove complexity by never dealing with state, so you just have one long reduction till you get to the final state...

But someone who's more into lambdas etc should speak about this and not me (a weirdo)

[–] rabirabirara@programming.dev 2 points 1 year ago

I think you're confusing lambdas with lambda calculus. Lambda calculus is more than just anonymous functions.

To put it extremely simply, let's just say functional programming (a realization of lambda calculus) is code with functions as data and without shared mutable state (or side effects).

The first increases expressiveness tremendously; the second increases safety and enables optimization. Of course, you don't need to write anonymous functions in a functional language if you don't want to.

As for why those "pseudo-functions" are useful, you're probably thinking of closures, which capture state from the context they are defined in. That is pretty useful. But it's not the whole reason lambda calculus exists.
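
Both ideas ("functions as data" and closures capturing context) fit in a few lines of Python, to make the distinction concrete (a hypothetical sketch; `compose` and `make_counter` are made-up illustration names):

```python
def compose(f, g):
    # functions as data: f and g are passed around like any other value
    return lambda x: f(g(x))

def make_counter():
    count = 0
    def bump():
        # closure: captures `count` from the context it was defined in
        nonlocal count
        count += 1
        return count
    return bump

inc = lambda x: x + 1
double = lambda x: 2 * x
print(compose(double, inc)(3))  # double(inc(3)) -> 8

tick = make_counter()
print(tick(), tick(), tick())   # 1 2 3
```

`compose` is pure lambda-calculus flavor (no state anywhere), while `make_counter` shows the closure case, which is the "pseudo-function with captured state" the question was likely about.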

[–] Zangoose@lemmy.one 1 points 1 year ago

See the other comments about lambdas vs. lambda calculus, but lambdas are supposed to be for incredibly simple tasks that don't need a full function definition, things that could be done in a line or two, like simple comparisons or calling another function. This is most useful for abstractions like list filtering, mapping, folding/reducing, etc. where you usually don't need a very advanced function call.

I was always taught in classes that if your lambda needs more than just the return statement, it should probably be its own function.
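
The bread-and-butter cases described above look something like this in Python (a hypothetical example with made-up data):

```python
from functools import reduce

nums = [3, 1, 4, 1, 5, 9, 2, 6]

# One-expression lambdas are fine for filtering, mapping, and folding:
evens = list(filter(lambda n: n % 2 == 0, nums))  # keep even numbers
squares = list(map(lambda n: n * n, nums))        # square everything
total = reduce(lambda acc, n: acc + n, nums, 0)   # fold into a sum

print(evens, total)  # [4, 2, 6] 31
```

Anything that needs more than a single return expression is, as the comment says, usually better off as a named function.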

[–] Lmaydev@programming.dev 7 points 1 year ago

I did games technology at university. We had a module that was just playing board games and eventually making one. We also did an Unreal Engine module that ended with making a game and a cinematic.

It was awesome.

[–] static_motion@programming.dev 6 points 1 year ago (1 children)

Sipser is an absolute banger of a book though.

[–] christian@lemmy.ml 1 points 1 year ago

I read it cover-to-cover like fifteen years ago. I've lost most of that knowledge since I haven't touched it in so long, but I remember I really enjoyed it.

[–] argv_minus_one 3 points 1 year ago

Since when were Turing machines ever nondeterministic?

[–] Bibez@lemmy.ml 3 points 1 year ago
[–] Saigonauticon@voltage.vn 1 points 1 year ago

Hm, I wonder if I could make these students more miserable by introducing a CPU that permits static operation, then clocking that with a true random number generator?

So now it has output that is deterministic from the standpoint of the CPU but nondeterministic to an outside observer. Probably wouldn't affect the big-O complexity, though, come to think of it. It would be funny, though.