WindowlessBasement

joined 1 year ago
[–] WindowlessBasement@alien.top 1 points 11 months ago

> I have tried using the in-built Pagination API to retrieve all relevant domain entries by splitting them into blocks but, due to the way the filters are applied, this only tells me if the entry is in the current block and I have to search each one manually. I have basically no coding knowledge.

Short answer: what you're asking for would keep a program requesting data (the whole Internet Archive?) running non-stop for a month or more. You're going to need to learn to code if you want to work with that much data.

> I definitely don't have the ability to automate the search process for the paginated data.

You're going to need to automate it. A rate-limiter is going to kick in very quickly if you are just spamming the API.
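If you do end up scripting it, the pattern is roughly this: request one page, collect the results, wait, request the next page, and stop when a page comes back empty. Here's a minimal sketch in Python, assuming a generic JSON pagination endpoint; the URL and parameter names are placeholders, not the archive's actual API.

```python
import time
import requests

BASE_URL = "https://example.org/api/search"  # placeholder endpoint, not the archive's real API


def fetch_all(query, page_size=100, delay=1.0):
    """Walk every page of a paginated API, pausing between requests
    so a rate limiter doesn't cut the session off."""
    results = []
    page = 1
    while True:
        resp = requests.get(
            BASE_URL,
            params={"q": query, "page": page, "per_page": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            break  # no more pages
        results.extend(batch)
        page += 1
        time.sleep(delay)  # be polite; hammering the API is how you get throttled or banned
    return results
```

Even with a one-second delay between pages, walking millions of entries takes days, which is exactly why this isn't something you can do by hand.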

> explain to me like I'm 5

You need to learn that for yourself if this is a project you're tackling. You'll also need to familiarize yourself with the archive's terms of service, because most services consider scraping every piece of data they have to be abusive and/or malicious behavior.

[–] WindowlessBasement@alien.top 1 points 11 months ago (1 children)

The error message tells you what to do.

If this is just a random error you're not concerned about (recent power outage, flaky cables, etc.), you can clear the errors. If you believe the drive is failing, you can replace it. The array will remain in a degraded state until you make a decision.

[–] WindowlessBasement@alien.top 1 points 11 months ago

Lossless compression doesn't exist for video. Like mathematically impossible.

[–] WindowlessBasement@alien.top 1 points 11 months ago

How long is a piece of string?

[–] WindowlessBasement@alien.top 1 points 11 months ago

Terminal.

With that many files, you should be using CLI tools.

[–] WindowlessBasement@alien.top 1 points 11 months ago

Some of the responses here are making me reconsider that statement.

[–] WindowlessBasement@alien.top 1 points 11 months ago

How is that different from feeding one big MKV into HandBrake?

 

Over the last couple of weeks there have been a few threads of people insisting on using DVD Decrypter, and I'm wondering why people are still using it. Datahoarding tends to attract relatively technical people, so there must be some reason to keep using software that hasn't been updated since Windows XP was modern.

MakeMKV seems like the better option in every use case except full backups. However, a full DVD image can be made with any imaging software, or even just dd. Any player that can handle the DVD menus from an ISO is going to be able to decrypt during playback.
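The dd route really is just a raw sector-by-sector copy of the disc device. Here's the same idea as a minimal Python sketch, assuming a Linux box where the drive shows up as /dev/sr0; the device path and output filename are examples, adjust them for your system.

```python
# Raw-copy a DVD to an ISO, chunk by chunk (same idea as `dd if=/dev/sr0 of=movie.iso`).
# Assumes a Linux system where the optical drive is /dev/sr0 and you have permission
# to read the block device. CSS-encrypted sectors are copied as-is, so playing the ISO
# still relies on the player (e.g. via libdvdcss) to decrypt.

CHUNK = 2048 * 1024  # read in multiples of the 2048-byte DVD sector size

with open("/dev/sr0", "rb") as disc, open("movie.iso", "wb") as image:
    while True:
        chunk = disc.read(CHUNK)
        if not chunk:
            break  # end of disc
        image.write(chunk)
```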

[–] WindowlessBasement@alien.top 1 points 11 months ago

I have an array of Exos. During high usage, they can be heard in the next room.
