You should be able to take the binlogs and upload them. Then, in a restore situation, you'd restore your last full DB snapshot and replay the binlogs up to the point you lost the server.
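Roughly, that flow could look like this. A minimal Python sketch, assuming boto3 and the stock `mysqlbinlog`/`mysql` tools; the bucket name, binlog directory, and credentials are placeholders:

```python
import glob
import subprocess

import boto3

S3_BUCKET = "my-db-backups"    # hypothetical bucket name
BINLOG_DIR = "/var/lib/mysql"  # adjust to wherever your binlogs live

s3 = boto3.client("s3")

def upload_binlogs():
    """Ship closed binlog files to S3."""
    files = sorted(glob.glob(f"{BINLOG_DIR}/mysql-bin.[0-9]*"))
    for path in files[:-1]:  # skip the last file, which MySQL is still writing to
        key = f"binlogs/{path.rsplit('/', 1)[-1]}"
        s3.upload_file(path, S3_BUCKET, key)

def replay_binlogs(downloaded_files, stop_datetime):
    """After restoring the last full dump, replay binlogs up to the failure point."""
    # mysqlbinlog can pipe the recorded statements straight into mysql
    binlog_cmd = ["mysqlbinlog", f"--stop-datetime={stop_datetime}", *downloaded_files]
    mysql_cmd = ["mysql", "-u", "root", "-p"]
    binlog = subprocess.Popen(binlog_cmd, stdout=subprocess.PIPE)
    subprocess.run(mysql_cmd, stdin=binlog.stdout, check=True)
```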
I'm a bit late to the show, but I personally feel like you are heading down the wrong path. Unless you are trying to host completely locally but for some reason want your backups in the cloud rather than on a separate local server, you are mixing your design for seemingly no reason. If you are hosting locally, you should back up to a separate local instance.
If you are indeed cloud based, you SHOULD NOT be hosting a DB separately. Since you specified S3, you are on AWS, so you should use RDS managed MySQL and its built-in snapshot feature. ref
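If you do go the RDS route, a manual snapshot is one API call (a boto3 sketch; the instance and snapshot identifiers are made up, and RDS can also take automated daily snapshots if a backup retention period is configured on the instance):

```python
import datetime

import boto3

rds = boto3.client("rds")

instance_id = "my-mysql-instance"  # hypothetical RDS instance name
snapshot_id = f"{instance_id}-{datetime.datetime.utcnow():%Y-%m-%d-%H-%M}"

# Kick off a manual DB snapshot; restores create a new instance from it.
rds.create_db_snapshot(
    DBSnapshotIdentifier=snapshot_id,
    DBInstanceIdentifier=instance_id,
)
```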
I know that we specifically don't use the snapshot feature for a reason. I think it has to do with how snapshots are restored. But I would need to ask my colleague why exactly we're not doing it.
We do full dumps and data-only dumps in regular intervals.
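For reference, the difference between the two dump types is essentially one mysqldump flag. A small sketch; the database name and output paths are placeholders:

```python
import subprocess

def dump(database, outfile, data_only=False):
    """Run mysqldump; --no-create-info omits CREATE TABLE statements (data only)."""
    cmd = ["mysqldump", "--single-transaction", database]
    if data_only:
        cmd.append("--no-create-info")
    with open(outfile, "w") as f:
        subprocess.run(cmd, stdout=f, check=True)

dump("appdb", "full-dump.sql")                       # schema + data
dump("appdb", "data-only-dump.sql", data_only=True)  # data only
```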
Thanks, it's very helpful 🙌🙌
I bet you could make a Lambda function that periodically backs up your database. That's probably the route I'd go down because it's more cost-effective than the alternatives. The only thing I'd be concerned about is configuring permissions for the Lambda function and the S3 bucket. Take this with a grain of salt, I've only recently started getting into cloud stuff.
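As a rough sketch of that idea (not a drop-in solution): the handler below assumes a `mysqldump` binary packaged in a Lambda layer, environment variables for the bucket and DB details, an execution role allowing `s3:PutObject` on the bucket, and an EventBridge schedule to trigger it periodically.

```python
import datetime
import os
import subprocess

import boto3

s3 = boto3.client("s3")

# Placeholders, expected as Lambda environment variables.
BUCKET = os.environ["BACKUP_BUCKET"]
DB_HOST = os.environ["DB_HOST"]
DB_NAME = os.environ["DB_NAME"]

def handler(event, context):
    """Dump the database to /tmp and upload the file to S3."""
    outfile = f"/tmp/{DB_NAME}.sql"
    with open(outfile, "w") as f:
        subprocess.run(
            ["mysqldump", "-h", DB_HOST, "--single-transaction", DB_NAME],
            stdout=f,
            check=True,
        )
    key = f"backups/{DB_NAME}-{datetime.datetime.utcnow():%Y-%m-%dT%H-%M}.sql"
    s3.upload_file(outfile, BUCKET, key)
    return {"uploaded": key}
```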