A little bit of shell magic around a little Python3 helper will do the job quickly:
for i in $(seq 1 10); do
    grablinks.py \
        --fix-links 'https://forums.aida64.com/topic/667-share-your-sensorpanel/page/'"${i}" \
        --search 'file/attachment.php?' \
        -f 'wget -c -O '\''%text%'\'' '\''%url%'\' |
        fgrep '.sensorpanel'
done | tee fetchscript.sh
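If nothing goes sideways, every link the fgrep lets through should land in fetchscript.sh as one wget line, filled in from the %text% and %url% placeholders of the -f template. Something along these lines (filename, id, and exact attachment path invented for illustration):

wget -c -O 'SomePanel.sensorpanel' 'https://forums.aida64.com/file/attachment.php?id=12345'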
Then verify the generated shell script, and finally:

sh fetchscript.sh
Happy waiting! :)
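(And if you'd rather skip the helper entirely, a plain curl + grep version of the same idea should get you most of the way. Untested sketch: the URL pattern is a guess at the forum's markup, and you lose the tidy %text% filenames:)

for i in $(seq 1 10); do
    # fetch one thread page, pull out anything that looks like an
    # attachment URL, and turn each one into a quoted wget line
    curl -s "https://forums.aida64.com/topic/667-share-your-sensorpanel/page/${i}" |
        grep -o 'https://[^"]*file/attachment\.php?[^"]*' |
        sed "s/&amp;/\&/g; s/.*/wget -c '&'/"
done | sort -u | tee fetchscript.sh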
You can grab my grablinks.py Python3 script from here: https://github.com/the-real-tokai/grablinks

Wow, thanks! I'll try this out tonight. This was just a last-ditch effort before I went and manually did 50 pages a day or something. Thanks again.