Took me 2 hours to find out why the final output of a neural network was a bunch of NaNs. This is always very annoying, but I can't really complain, it makes sense. Just sucks.
Programmer Humor
I guess you can always just add an assert not data.isna().any() in strategic locations
That could be a nice way. Sadly it was in a C++ codebase (using TensorFlow), so no such nice things (it would be slow too). I skill-issued myself by thinking a struct would be 0-initialized, but MyStruct input; is not, while MyStruct input{}; is (that was the fix). Long story.
If you use the GNU libc, the feenableexcept function, which you can use to enable certain floating point exceptions, could be useful to catch unexpected/unwanted NaNs.
Oof. This makes me appreciate the abstractions in Go. It's a small thing but initializing structs with zero values by default is nice.
Oof. C++ really is a harsh mistress.
Fucking over-dramatic divisions by 0, sigh.
Also applies to nulls in SQL queries.
It's not fun tracing where nulls are coming from when dealing with a 1500-line data warehouse pipeline query that aggregates 20 different tables.
Nanananana! Batman!
"Bounds checking, motherf--ker! Do you speak it?"
As I was coding my own engine in C++ with OpenGL, I forgot to do something. Maybe I forgot to assign a pointer, or to pass a variable. In the end I had copied a NaN value into the vertices of my Model, which was meant to be a wrapper for data I wanted to read and visualize.
Printing the entire Model to the terminal, I was confused why everything was suddenly NaN when it had started out so nicely.
Thanks. This is great
NaN is such a fun floating point virus. Some really wonky gameplay after we hit NaN in a few spots.