Just going to leave this xkcd comic here.
Yes, you already know what it is.
Open Document Standard (.odt) for all documents. In all public institutions (it's already a NATO standard for documents).
Because the Microsoft Word ones (.doc, .docx) are unusable outside the Microsoft Office ecosystem. I feel outraged every time I need to edit a .docx file, because it breaks the layout so easily. And some older .doc files don't even open properly in Microsoft Word.
Actually, IMHO, there should be some better alternative to .odt as well. Something more in a declarative/scripted fashion like LaTeX, but still WYSIWYG. LaTeX (and XeTeX, for my use cases) is too messy for me to work with, especially when a package is Byzantine. And it can be non-reproducible if I share/reuse the same document somewhere else.
Something has to be done about document files.
Markdown, AsciiDoc, and reStructuredText are kinda like simple alternatives to LaTeX.
It is unbelievable we do not have a standard document format.
What's messed up is that, technically, we do. Originally, OpenDocument was the ISO standard document format. But then, baffling everyone, Microsoft got ISO to also accept .docx as an ISO standard. So now we have two competing document standards, the second of which is simply worse.
I was too young to use it in any serious context, but I kinda dig how WordPerfect does formatting: the formatting codes are hidden by default, but you can show them and manipulate them as needed.
It might already be a thing, but I imagine a LaTeX-based standard for document formatting would do well with a WYSIWYG editor that hides the complexity by default but keeps it available for those who need to manipulate it.
There are programs (LyX, TeXmacs) that implement WYSIWYG for LaTeX; TeXmacs is exceptionally good. I don't know about the standards, though.
Another problem with LaTeX and most of the other document formats is that they are so bloated and depend on so many other pieces that it is hardly possible to embed the tooling into a larger application. That's a bit of a criticism of the UNIX design philosophy as well. And LaTeX code is especially hard to make portable.
There used to be a similar situation with PDFs: it was really hard to display a PDF embedded in an application. Finally, Firefox's pdf.js came along and solved that issue.
The only embeddable and easy-to-implement standard that describes a 'document' is HTML, for now (with JavaScript for scripting). Only it's not aware of page layout. If only there were an extension standard that could turn an HTML page into a paged document...
Bro, trying to set padding in MS Word, when you know... YOU KNOOOOW... they can convert to HTML. It drives me up the wall.
And don't get me started on excel.
Kill em all, I say.
zip or 7z for compressed archives. I hate that, for some reason, rar has become the de facto standard for piracy. It's just so bad.
The other day I saw a tar.gz containing a multipart-rar which contained an iso which contained a compressed bin file with an exe to decompress it. Soooo unnecessary.
Edit: And the decompressed game of course has all of its compressed assets in renamed zip files.
This is the kind of thing I think about all the time, so I have a few.

- .tar.zst for compressed archives. Zstandard (.zst) achieves a better compression ratio than .zip and gzip (.gz) and does so faster. By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well.". .tar.xz is also very good and seems more popular (probably since it was released 6 years earlier, in 2009), but, when tuned to its maximum compression level, .tar.zst can achieve a compression ratio pretty close to LZMA (used by .tar.xz and .7z) and do it faster[1] (see the sketch below the list):

> zstd and xz trade blows in their compression ratio. Recompressing all packages to zstd with our options yields a total ~0.8% increase in package size on all of our packages combined, but the decompression time for all packages saw a ~1300% speedup.

- JPEG XL / .jxl for images. Better compression than the formats it replaces (.jpeg, .png, .gif).

- AV1 for videos (codec). Much more efficient than x264 (used by .mp4) and VP9[3].

- OpenDocument / ODF / .odt for documents. It's already a NATO standard for documents, and the Microsoft Word formats (.doc, .docx) are unusable outside the Microsoft Office ecosystem. .odt is simply a better standard than .docx.
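For anyone who wants to see that ratio-versus-speed tradeoff on their own data, here's a rough sketch in Python. It uses the stdlib lzma module (the LZMA algorithm behind .xz) and the third-party zstandard package; the input file path is just a stand-in, and this is an illustration rather than a proper benchmark.

```python
import lzma
import time

import zstandard  # third-party: pip install zstandard

data = open("/usr/share/dict/words", "rb").read()  # any large-ish file will do

candidates = [
    # lzma.compress uses preset 6 by default, not the maximum
    ("xz/lzma", lzma.compress, lzma.decompress),
    ("zstd-19",
     zstandard.ZstdCompressor(level=19).compress,
     zstandard.ZstdDecompressor().decompress),
]

for name, compress, decompress in candidates:
    t0 = time.perf_counter()
    blob = compress(data)
    t1 = time.perf_counter()
    decompress(blob)
    t2 = time.perf_counter()
    print(f"{name}: ratio {len(data) / len(blob):.2f}, "
          f"compress {t1 - t0:.2f}s, decompress {t2 - t1:.2f}s")
```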
> By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well."
wait so does it do all of those things?
So there's a tool called tar that creates an archive (a .tar file). Then there's a tool called zstd that can be used to compress files, including .tar files, which then become .tar.zst files. And then you can encrypt your .tar.zst file using a tool called gpg, which would leave you with an encrypted, compressed .tar.zst.gpg archive.
Now, most people aren't doing everything in the terminal, so the process for most people would be pretty much the same as creating a ZIP archive.
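To make that concrete, here's a minimal sketch of the archive-then-compress steps in Python, using the stdlib tarfile module and the third-party zstandard package (the file and folder names are made up). The gpg step stays a separate tool, which is exactly the point of the Unix-philosophy argument above.

```python
import tarfile

import zstandard  # third-party: pip install zstandard

# Steps 1 + 2: archive the folder with tar and stream it through a zstd
# compressor, producing a .tar.zst in one pass.
with open("backup.tar.zst", "wb") as raw:
    cctx = zstandard.ZstdCompressor(level=19)  # 19 is near the maximum level
    with cctx.stream_writer(raw) as compressed:
        with tarfile.open(fileobj=compressed, mode="w|") as tar:
            tar.add("my_folder")

# Step 3 (optional): encrypt the result with gpg, e.g.
#   gpg --symmetric backup.tar.zst
# which yields backup.tar.zst.gpg.
```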
> By separating the jobs of archiving (.tar), compressing (.zst), and (if you so choose) encrypting (.gpg), .tar.zst follows the Unix philosophy of "Make each program do one thing well."
The problem here being that GnuPG does nothing really well.
> Videos (Codec): AV1 - Much more efficient than x264 (used by .mp4) and VP9[3].
AV1 is also much younger than H.264 (AV1 is a specification; x264 is an implementation), and only recently have software encoders become somewhat viable; a more apt comparison would have been AV1 to HEVC, though the latter is also somewhat old nowadays, but still a competitive codec. Unfortunately, there currently aren't many options to use AV1 in a very meaningful way: you can encode your own media with it, but that's about it; you can stream to YouTube, but YouTube will recode it to another codec.
> .odt is simply a better standard than .docx.
No surprise, since OOXML is barely even a standard.
Ogg Opus for all lossy audio compression (mp3 needs to die)
7z or tar.zst for general purpose compression (zip and rar need to die)
why does mp3 need to die
It's a 30-year-old format, and large amounts of research and innovation in lossy audio compression have occurred since then. Opus can achieve better quality at like 40% of the bitrate. Also, much like zip, the format was a mess of partially broken implementations in the early days (although now everyone uses LAME, so it's not as big of a deal). Its container/stream format is very messy too. And it has no native tag format, so it needs ID3 tags, which don't enforce any standardized text encoding.
Not the original poster, but there are newer audio codecs that are more efficient at storing data than mp3, I believe. And there are also lossless standards, compared to mp3's lossy compression.
SQLite for all “I’m going to write my own binary format because I is haxor” jobs.
There are some specific cases where SQLite isn’t appropriate (streaming). But broadly it fits in 99% of cases.
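To illustrate the point, here's a minimal sketch (the database name and table are invented): it shows how little code it takes to get a real, queryable, transactional file format out of Python's stdlib instead of hand-rolling a binary layout.

```python
import sqlite3

# One file on disk is the whole "format": no custom parser needed.
con = sqlite3.connect("notes.db")
con.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
con.execute("INSERT INTO notes (body) VALUES (?)", ("remember the milk",))
con.commit()

# Any other program (or the sqlite3 CLI) can read it back with plain SQL.
for note_id, body in con.execute("SELECT id, body FROM notes"):
    print(note_id, body)
con.close()
```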
I don't know what to pick, but something other than PDF for the task of transferring documents between multiple systems. And yes, I know PDF has its strengths and there's a reason why it's so widely used, but that doesn't mean I have to like it.
Additionally, all proprietary formats, especially ones that have gained enough users that they're treated like a standard or a requirement if you want to work with X.
oh it's x, not x... i hate our timeline
I would be fine with PDFs exactly the same except Adobe doesn't exist and neither does Acrobat.
Resume information. There have been several attempts, but none have become an accepted standard.
When I was a consultant, this was the one standard I longed for the most. A data file where I could put all of my information, and then filter and format it for each application. But ultimately, I wanted to be able to submit the information in a standardised format - without having to re-enter it endlessly into crappy web forms.
I think things have gotten better today, but at the cost of reliance on a monopoly (LinkedIn). And I'm no longer in that sort of job market. But that desire was so strong it'll last me until I'm in my grave.
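For what it's worth, the shape of what's being wished for is easy to sketch. The schema below is entirely made up (projects like JSON Resume have tried to standardise something similar), but it shows the "one master record, filtered per application" workflow:

```python
import json

# Hypothetical master record; every field name here is invented.
resume = {
    "name": "Jane Doe",
    "experience": [
        {"role": "Consultant", "org": "Acme Corp", "tags": ["sql", "etl"]},
        {"role": "Developer", "org": "Initech", "tags": ["python", "web"]},
    ],
}

# Filter the record down to what one particular application cares about...
relevant = [job for job in resume["experience"] if "python" in job["tags"]]

# ...then emit it in whatever format the target wants (here, just JSON).
print(json.dumps({"name": resume["name"], "experience": relevant}, indent=2))
```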
JPEG XL for images because it compresses better than JPEG, PNG and WEBP most of the time.
XZ because it theoretically offers the highest compression ratio in most circumstances, and long decompression time isn't really an issue when the alternative is downloading a larger file over a slow connection.
Config files stored as serialized data structures instead of in plain text. This speeds up read times and removes the possibility of syntax or type errors. Also, fuck JSON.
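As a minimal sketch of the idea, using Python's stdlib pickle purely for illustration (any binary serializer such as CBOR or MessagePack would do the same job, and pickle in particular should never be used on untrusted files):

```python
import pickle

config = {"theme": "dark", "volume": 0.8, "recent_files": ["a.txt"]}

# Write the config as a serialized data structure, not as text.
with open("app.conf", "wb") as f:
    pickle.dump(config, f)

# Reading it back needs no parser and cannot hit a syntax error;
# types survive the round trip (0.8 comes back as a float, not a string).
with open("app.conf", "rb") as f:
    loaded = pickle.load(f)

assert loaded == config
```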
I wish there were a good format for typesetting. Docx is closed and inflexible. LaTeX is unreadable, inefficient to type and hard to learn due to the inconsistencies that arise from its reliance on third-party packages and its lack of guidelines for their design.
TeX / LaTeX documentation is infuriating. It's either "use your university's package to make a document that looks like this:" or "program in an alien assembly language."
I like postscript for graphic design, but not so much for typesetting. For a flyer or poster, PS is great.
Matroska for media; we already have MKA for audio and MKV for video. An image container would be good too.
mp4 is more prone to data loss and slower to parse, while also being less flexible; despite this, it seems to be a sort of pseudo-standard (MP4, M4A, and HEIF formats like HEIC and AVIF).
Markdown for all rich text that doesn't need super fancy shit like LaTeX.
UTF-8 for plain text; trying to figure out the encoding, especially with older files/equipment/software, is super annoying.
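A tiny illustration of why the guessing game is painful: the same bytes "successfully" decode under a legacy encoding, just as different (wrong) text, so there's no error to catch.

```python
data = "naïve café".encode("utf-8")

print(data.decode("utf-8"))    # naïve café  (correct)
print(data.decode("latin-1"))  # naÃ¯ve cafÃ©  (no error, silently wrong)
```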
I'd like there to be a way to standardise MIDI info in plugins for music.
192 kHz for music.
The CD was the worst thing to happen in the history of audio. 44 (or 48) kHz is awful, and it is still prevalent. It would be better to wait a few more years and have better quality.
Why? What reason could there possibly be to store frequencies as high as 96 kHz? The limit of human hearing is 20 kHz, which is why 44.1 and 48 kHz sample rates are used.
On top of that, 20 kHz is quite the theoretical upper limit.
Most people, be it due to aging (it affects all of us) or due to behaviour (some much more than others), can't hear that far up anyway. Most people would be surprised how high up even, say, 17 kHz is. It sounds a lot closer to very high-pitched "hissing" or "shimmer", not something that's considered "tonal".
So yeah, saying "oh no, let me have my precious 30 kHz" really is questionable.
At least when it comes to listening to finished music files. The validity of higher sampling frequencies during various stages of the audio production process is a different, far less questionable topic.
I assume you're gonna back that up with a double blind ABX test?
44.1 kHz wasn't chosen randomly. It is based on the range of frequencies that humans can hear (20 Hz to 20 kHz) and the fact that a periodic waveform can be exactly reconstructed (in terms of frequency content) when the sampling rate is at least twice the bandwidth. So, if it is sampled at 44.1 kHz, you can capture all components up to 22.05 kHz, which is more than we can hear.
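In symbols, that's the Nyquist–Shannon sampling criterion; plugging in the CD rate:

```latex
f_s \ge 2 f_{\max}, \qquad
f_{\max} = \frac{f_s}{2} = \frac{44.1\,\mathrm{kHz}}{2}
         = 22.05\,\mathrm{kHz} > 20\,\mathrm{kHz}.
```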