`jq` for parsing/formatting/manipulating JSON, and its `yq` wrapper for YAML. Holy shit you can do powerful queries with them.

Or the even faster successor, `gojq`.
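For a rough taste of the kind of queries they handle (file names here are just examples, and `gojq` accepts mostly the same filters):

```bash
# pull the names of failing checks out of a hypothetical status.json
jq -r '.checks[] | select(.status == "fail") | .name' status.json

# the yq wrapper runs the same filter language over YAML;
# -y re-emits the result as YAML instead of JSON
yq -y '.services | keys' docker-compose.yml
```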
Not sure how big the JSON files you're messing with are, but I've never had any noticeable delay using `jq`.

When you're dealing with log files that are on the order of 100 MB or 1+ GB in size, `jq` can, indeed, be a bit slow. Often I use `grep` as a first-pass filter, which speeds things up tremendously. I'll have to give `gojq` a try and see if it makes the initial `grep` unneeded. The downside is that `jq` is often already installed everywhere I need it (VMs, base docker images, etc.), but `gojq` definitely is not (yet).
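Roughly the kind of two-stage pipeline I mean, with a made-up log file and field names:

```bash
# grep cheaply narrows a huge JSON-lines log down to candidate lines,
# so jq only has to parse a small fraction of the file
grep 'connection timeout' app.log \
  | jq -r 'select(.level == "error") | "\(.ts) \(.msg)"'
```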
I use `jq` for decoding base64 😂

```bash
pbpaste | jq -R 'split(".") | .[0],.[1] | @base64d | fromjson'
```
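That looks like decoding a JWT from the clipboard: split on `.`, then base64-decode the header and payload. `pbpaste` is macOS-only; on Linux, something like `xclip` can stand in for it:

```bash
# same pipeline, reading the X clipboard with xclip instead of pbpaste
xclip -selection clipboard -o | jq -R 'split(".") | .[0],.[1] | @base64d | fromjson'
```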
Big fan of `gron` and `grep` myself. Way easier if you're not doing anything crazy.
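Something like this, say, against a hypothetical `response.json`:

```bash
# gron flattens JSON into one greppable assignment per path,
# with output along the lines of: json.commits[0].commit.author.name = "alice";
gron response.json | grep 'author.name'

# and -u (--ungron) turns the surviving lines back into JSON
gron response.json | grep 'author.name' | gron -u
```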
For sure. Often I use `grep` as a first pass to find relevant entries in JSON-lines formatted log files, and then pass that through `jq` (or `yq -y` if I want YAML output) for further filtering, processing, and formatting.
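End to end, that might look something like this (log file and field names are invented):

```bash
# 1) cheap grep pass over JSON-lines logs, 2) real filtering in jq,
# 3) yq -y only when YAML output is wanted
grep 'payment-service' app.log \
  | jq -s 'map(select(.status_code >= 500) | {ts, path, status_code})' \
  | yq -y '.'   # -s slurps the matches into one array so yq sees a single document
```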