Chris Musty's picture

What happens when a large backup takes longer than 24 hours and the cron job is set for monthly backups? Will it attempt an incremental backup while the full backup is still running?

Chris Musty's picture

Take the following example:

A file server has 100GB and takes days to back up to Amazon.

If cron is set for daily incremental and monthly full backups, does TKLBAM try to back up again after 24 hours, or is it smart enough to know a backup is already in progress and wait until it's finished?

Chris Musty

Director

Specialised Technologies

Liraz Siri's picture

Yes, TKLBAM is smart enough not to create multiple backups in parallel. There's a simple locking mechanism. If you try running a backup while one is already in progress you'll get this error:

A previous backup is still in progress
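
For anyone curious, this kind of lock can be sketched in a few lines of Python. To be clear, this is just an illustration of the general PID-lockfile technique, not TKLBAM's actual code, and the lockfile path is made up:

```python
import errno
import os

LOCKFILE = "/tmp/backup.lock"  # hypothetical path, not TKLBAM's real one

def acquire_lock(path=LOCKFILE):
    """Create the lockfile atomically; fail if another backup holds it."""
    try:
        # O_EXCL makes creation fail if the file already exists
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    except OSError as e:
        if e.errno == errno.EEXIST:
            raise RuntimeError("A previous backup is still in progress")
        raise
    os.write(fd, str(os.getpid()).encode())  # record who holds the lock
    os.close(fd)

def release_lock(path=LOCKFILE):
    """Remove the lockfile so the next backup can run."""
    os.remove(path)
```

The atomic create-with-`O_EXCL` is what prevents two backups racing each other: only one process can win the creation, and the loser gets the "still in progress" error.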

PS: sorry for the late reply.

Chris Musty's picture

No issue with the reply, if you're busy you're busy!

If you guys find a way to implement "real-time" backups, I will be your best friend and buy you a beer (or sixty).


Jeremy Davis's picture

I don't know anywhere near enough about how cron does things, or about TKLBAM, to say. Perhaps it's worth doing a test and seeing what it does?

Chris Musty's picture

Right now I am backing up a 100GB file server. This thing could easily blow out to 300GB, and if that happens there is literally not enough pipe to send it all out within a month.

So far I can transfer 7GB per day (providing no network issues); that's two full weeks of uploading - WTF! Don't even consider daily backups!
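
The back-of-the-envelope arithmetic behind those figures (assuming a steady 7GB/day and decimal GB):

```python
def upload_days(size_gb, rate_gb_per_day=7):
    """Rough upload duration for a full backup at a steady daily rate."""
    return size_gb / rate_gb_per_day

print(round(upload_days(100), 1))  # 14.3 days: the "two full weeks" above
print(round(upload_days(300), 1))  # 42.9 days: more than a month
```

At 300GB the full backup genuinely cannot fit inside a monthly cycle at that rate, which is the point being made.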

I considered Dropbox for Teams, but that has a fundamental issue too: the data has to be downloaded if anything goes KAPUT! At least with TKLBAM I can fire up a server instance from the backup and have an online file server available to my client while I rebuild after the disaster.

The only option I have is to enforce archiving to minimise the amount of data uploaded when a full backup is performed.

What would be REALLY NICE (if anyone gets the hint) is a Dropbox-like server that backs up file changes on the fly and allows unlimited restore of individual files (Dropbox calls it Packrat).

I would seriously consider funding this if anyone is interested!


Liraz Siri's picture

We've been thinking about how to go about implementing continuous "real-time" uploads. Though it might not seem that different functionality-wise, behind the scenes this would have to use a completely separate backup mechanism, as Duplicity isn't built for this sort of thing. I don't yet have a design in my head that I feel comfortable with, but on the other hand I haven't really thought about it enough.
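
The usual starting point for this sort of thing is watching for file changes and queuing the changed files for upload. A minimal polling sketch of that idea, just for illustration (the function names and the approach are mine, not a TKLBAM design; a production version would use inotify rather than polling):

```python
import os
import time

def snapshot(root):
    """Map each file path under root to its last-modified time."""
    mtimes = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            mtimes[path] = os.path.getmtime(path)
    return mtimes

def changed_files(before, after):
    """Files that are new or modified between two snapshots."""
    return [p for p, m in after.items() if before.get(p) != m]

def watch(root, upload, interval=5.0):
    """Poll forever, handing each changed file to the upload callback."""
    before = snapshot(root)
    while True:
        time.sleep(interval)
        after = snapshot(root)
        for path in changed_files(before, after):
            upload(path)
        before = after
```

Even this toy version shows why it's a different beast from Duplicity's chain of full and incremental archives: each change is shipped individually as it happens, rather than batched into a dated backup set.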

Chris Musty's picture

Do you mean run their own servers as a business model or run their own servers for home/small business use?

In my shallow search I have not found what I am looking for.

I don't want it for advanced users but more for everyday users who can manage a server the way they want, i.e. full backup at a set frequency (the TKLBAM method) or individual files (what's the techy term for this, full vs. sequential or something?)

TKLBAM is perfect for smaller databases, probably up to about 5GB, considering the upload constraints in Oz. (So who thinks we don't need the NBN, anyone?)

At the end of the day I want to limit the amount of traffic being sent out. If I just hosted in the cloud it would be sweet, but at the cost of waiting to download your files when you want to edit them. Imagine saving a 100MB PDF!


Jeremy Davis's picture

But perhaps a start. iFolder is an open source Dropbox-like setup, but you host the server yourself. The only catch is that there are no Ubuntu/Debian compatible binaries floating about; there are rpms and exes, but no debs. It has been reported that it builds OK for Ubuntu 10.04 (i.e. TKL v11.x), so it could be done. I've been meaning to do it because I'd have some uses for it, but haven't got around to it. I originally spent a fair bit of time trying to get it to work on the previous TKL version but gave up.

There is a wiki page for it in the dev wiki here. If you scroll down, a user has posted a comment detailing how they built it from source and had it running on TKL v11.x, so it should work. The Ubuntu wiki page has a bit of info, and for interest the project seems to now be hosted on SourceForge. There is also a PPA that has it, although I know nothing about the PPA owner; it's certainly not official. It would probably be an easy way to test it out, but not really adequate for production IMO.

You could host an iFolder server onsite, as well as one in the cloud, and have local and remote copies of files. Not sure if they are really backups in the true sense of the word though, as I don't know what happens if you, say, accidentally delete a file.

Anyway, we just need someone to create a TKLPatch for it and hopefully it would become an official appliance.

Chris Musty's picture

Last Friday (24th June 2011), just before 9PM, I started backing up a 100GB file server with TKLBAM.

I calculated it would take 14-15 days. Happy to announce it is rocketing along at about 340KB per second (up from an estimated 225KB per second) and I have hit the halfway mark.

This thing has endured network outages, daily business usage and has not dropped a beat.

Once I get my files in the cloud I will feel a bit safer but it looks good so far!

TKLBAM is quite resilient, and although I cannot use this system for daily backups (more a pipe issue than a TKLBAM one), a one-off snapshot is perfect.

TKLBAM rocks!


Jeremy Davis's picture

Nice work Chris. Sounds like you're giving TKLBAM a pretty good workout! :)

Chris Musty's picture

Successfully backed up my file server. The total backup as reported in the TKLBAM Hub is 49.5GB from 92.1GB on my file server; that's a 1.86:1 ratio of file server size to backup size. Quite impressive. It took 206 hours, 5 minutes and 4.86 seconds, averaging out at 240.19MB/hour or 66.72KB/second. That was a far cry from the 340KB/second I mentioned above, which was only an instantaneous value and obviously incorrect.
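
Those figures cross-check with a few lines of arithmetic (assuming decimal units, i.e. 1GB = 1000MB, which is what the quoted numbers use):

```python
backup_gb, source_gb = 49.5, 92.1
hours = 206 + 5 / 60 + 4.86 / 3600  # 206h 5m 4.86s as decimal hours

ratio = source_gb / backup_gb              # compression ratio
mb_per_hour = backup_gb * 1000 / hours     # decimal MB per hour
kb_per_sec = backup_gb * 1e6 / (hours * 3600)  # decimal KB per second

print(round(ratio, 2))        # 1.86
print(round(mb_per_hour, 2))  # 240.19
print(round(kb_per_sec, 2))   # 66.72
```

All three quoted values fall straight out, so the 1.86:1 figure is the source-to-backup size ratio after compression and deduplication.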

Kudos to the devs!

Now I just need a fatter pipe, so to speak...


Jeremy Davis's picture

Nice to see that TKLBAM got a good workout and, despite a few teething issues, ended up coming through with the goods!

Out of interest, when you use TKLBAM on behalf of your customers, do you set them up with their own AWS/S3 account linked to your Hub, or some other configuration?

And yes, bring on the NBN! I wish they'd just hurry up with it!

Chris Musty's picture

In using TKLBAM for backups and restoration of a failed server, I have one last test to perform, and that's the time it takes to get a working file server in the cloud from the backup.

I am aware of the speed issues and network latency associated with such a setup, but the idea is to get a fully functioning system available no matter what (in some places I even have to consider redundant ISPs).

So to answer your question, I am maintaining control of all the backups through my account. Clients typically either forget or use bad passwords, and it is just a nightmare. Not only that, if I maintain control I can even get a server running on Amazon from my mobile phone! Imagine how handy that is!

The backup I just did costs a bit over $7, so my monthly support fees can easily accommodate that.

Funny thing about the NBN: there are workers on my street digging up the footpath. I asked one of the lollypop girls what they were doing and she said installing conduit for the NBN! Cool! Makes me wonder if they will be digging up all footpaths nationwide and filling them in with bumpy asphalt! Might be a boon for footpath installers.

[EDIT] I might review my experiences in a formal doc and post it. Just need a bit of spare time.


TomW's picture

Do you mean run their own servers as a business model or run their own servers for home/small business use?

Hi Chris... sounds like you are trying to address the same issues and users I am. Hope we can share what we are learning to better serve the smallest of shops with virtualization tech.

Is there a site/page where this is all addressed from that perspective that you use, or can we start something like that here or elsewhere? It muddles the learning/helping, I feel. Sorry for the off-topic post. If there were a specific SMB/home area here at TKL I'd post there. This site and its forums aren't geared towards any particular audience yet; there are only two forum choices. Not a knock, guys, just a fact. I can deal with it.


Jeremy Davis's picture

I initially made a similar comment/suggestion re the forums, but Liraz convinced me otherwise. It makes it easier for newbs to work out where they should post, and it also means fewer forums to monitor. You can use tags to create pseudo sub-forums, but that's not fully realised or utilised yet; it's getting there.

Anyway, the dev wiki is a fine place to do some documentation if you're keen. It doesn't get anywhere near as much traffic as the forums, but it's a useful place.
