Backups with Timeshift are incremental, so most of the time a backup completes within seconds; it only stores what changed between snapshots, somewhat like git.
I used to do exactly that for this use case; backups were quick because there usually aren't many changes outside the home directory.
I kept daily and monthly backups, around 20 different dates, in a relatively small space.
For example, if my system is 30 GB, a 50 GB backup partition can hold months of daily snapshots.
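If you're curious how that works under the hood, it's basically rsync with hardlinks. A minimal sketch of the idea in Python (paths and excludes are made up; this is not how Timeshift actually invokes it):

```python
import subprocess
from datetime import datetime
from pathlib import Path

# Minimal sketch of the hardlink trick behind Timeshift's rsync mode.
# Paths and excludes are illustrative only; the real tool also excludes
# /dev, /proc, /sys and the backup destination itself.
BACKUP_ROOT = Path("/mnt/backup/snapshots")

def take_snapshot(source="/"):
    BACKUP_ROOT.mkdir(parents=True, exist_ok=True)
    previous = sorted(BACKUP_ROOT.iterdir())
    dest = BACKUP_ROOT / datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
    cmd = ["rsync", "-aAX", "--exclude=/home", source, str(dest)]
    if previous:
        # Files unchanged since the last snapshot become hardlinks to it,
        # so each new snapshot only costs the space of what actually changed.
        cmd.insert(2, f"--link-dest={previous[-1]}")
    subprocess.run(cmd, check=True)
```

That hardlink trick is why 20 snapshots of a 30 GB system fit in 50 GB: unchanged files exist on disk only once.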
Off topic, but since you mentioned you're an ML engineer:
How hard is it to train a GPT-style model at home with limited resources?
For example, I have a custom use case and limited data. I'm a software developer proficient in Python, but my background is in REST frameworks and web development.
It would be great if you could guide me on training at a small scale locally.
Any guides or resources would be really helpful.
I'm basically planning hobby projects where I can train on my own data, such as my chats with others, and then have the model perform actions. For instance, I own a small business and we take a lot of orders on WhatsApp: around 100 active chats per month, each with 50-500 messages. That might be a small dataset for an LLM, but I want to explore the capabilities.
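To make it concrete, this is roughly the shape of training data I imagine building from the chat exports (the file name and fields are made up, just to show the idea of pairing raw chat text with a structured order):

```python
import json

# Hypothetical instruction-tuning pairs built from exported WhatsApp chats.
# "orders.jsonl" and the field names are invented for illustration.
chats = [
    {
        "messages": ("Customer: Hi, can I get 2 kg sugar and one rice bag?\n"
                     "Me: Sure, delivery tomorrow ok?"),
        "order": {"sugar_kg": 2, "rice_bags": 1, "delivery": "tomorrow"},
    },
]

with open("orders.jsonl", "w") as f:
    for chat in chats:
        # One prompt/completion pair per chat: raw chat in, order JSON out.
        f.write(json.dumps({
            "prompt": "Extract the order from this chat:\n" + chat["messages"],
            "completion": json.dumps(chat["order"]),
        }) + "\n")
```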
I saw there are many approaches, like fine-tuning, one-shot/few-shot prompting, etc., but I didn't find a good resource that actually explains how to do things.
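For instance, as far as I understand, few-shot just means putting a couple of solved examples in the prompt, with no training at all. Something like this (the model is an arbitrary small instruct model, and the prompt format is my guess):

```python
from transformers import pipeline

# Few-shot prompting: show the model solved examples inline, no training.
# Model choice is arbitrary; any small instruct model should do for a test.
generator = pipeline("text-generation", model="Qwen/Qwen2.5-0.5B-Instruct")

prompt = (
    "Extract the order from the chat.\n\n"
    "Chat: Hi, 2 kg sugar and 1 rice bag please\n"
    'Order: {"sugar_kg": 2, "rice_bags": 1}\n\n'
    "Chat: Send 3 packets of tea tomorrow\n"
    "Order:"
)
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```

But when few-shot stops being enough and fine-tuning becomes worth it is exactly the kind of thing I can't find explained anywhere.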