RAUL BENITEZ's picture

I am a newbie to using TurnKey within the Amazon environment, but I have made a new TKL micro-instance server and want to make backups of it and move those backups to another Amazon micro-instance.

Is that possible? If it is, is there a tutorial somewhere I can take a look at to learn how to do this?

Thanks

Jeremy Davis's picture

So you have a TKL (TurnKey Linux) Micro instance, which I assume you launched via the Hub?

Assuming that's the case, you have 2 ways to create a new AWS instance based on the first.

  1. You could use TKLBAM (TKL Backup And Migration).
    • Run a backup of your server using TKLBAM, either from the command line or from Webmin (see the sketch after this list). Have a look here and here.
    • Then from the Hub Launch a new server from the TKLBAM backup. See here.
  2. You could use the new feature that allows 'snapshots' of AWS instances. See here.
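For the command-line route, here is a minimal sketch, assuming TKLBAM is already installed on the instance (it is on Hub-launched images) and that YOUR_HUB_APIKEY stands in for the API key shown on your Hub account page:

# link TKLBAM to your Hub account (only needed once; Hub-launched instances may already be linked)
tklbam-init YOUR_HUB_APIKEY

# run a backup; once it finishes it will show up in the Hub, ready to launch a new server from
tklbam-backup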
RAUL BENITEZ's picture

Thank you. Yes, I created the instance in the Hub, so I'll take a look at those options.

RAUL BENITEZ's picture

Just another question: one of my instances is just a PostgreSQL server, and I know there isn't a solution yet within TurnKey to back up a PostgreSQL database, but could I use one of these two solutions to back up that whole instance instead of just parts of it?

Thanks again.

L. Arnold's picture

Snapshots should (I hope anyway) work for this.

Alternatively, you could (and should) set an exclusion for the PostgreSQL files (this is documented or commented on somewhere in the forums), then manually move the files from within PostgreSQL or with a separate data management tool like Navicat.
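For what it's worth, here's a hedged sketch of what such an exclusion might look like, assuming the data lives in the default /var/lib/postgresql location (adjust the path to wherever your cluster actually is):

# /etc/tklbam/overrides -- paths prefixed with '-' are excluded from TKLBAM backups
-/var/lib/postgresql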

I have not done what you are trying but I would like to get the protocol down as well.

RAUL BENITEZ's picture

Well, right now I have a script that backs up the database to another directory. I want to include those backups, which are gzipped, in a backup, and thought maybe I could just take a snapshot of that directory and back it up? Is that possible?


Thanks again. 

Jeremy Davis's picture

TKLBAM doesn't currently support the PostgreSQL appliance out of the box, so that may not suit your purpose exactly.

But seeing as you already have a script to dump the DB, you could use TKLBAM with the hooks mechanism to launch your script. The only catch is that you'll have to trick the Hub/TKLBAM into thinking that the appliance is something else (perhaps Core?), since the PostgreSQL appliance is not currently supported by TKLBAM. For info about TKLBAM hooks, see the docs. It's actually just a copy/paste of the blog post announcement (make sure you read the comments on how to trick TKLBAM into allowing the backup).
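To make the hooks idea concrete, here's a hedged sketch of what such a hook could look like. The file name and the path to the dump script are placeholders, and it assumes the standard hook convention of scripts in /etc/tklbam/hooks.d being called with the operation ($1: backup or restore) and state ($2: pre, post or inspect):

#!/bin/bash
# /etc/tklbam/hooks.d/pgdump (hypothetical name; remember to make it executable)
op=$1
state=$2

# run your existing dump script just before a backup starts
if [ "$op" = "backup" ] && [ "$state" = "pre" ]; then
    /usr/local/bin/pg-dump-script.sh   # placeholder for your own dump script
fi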

RAUL BENITEZ's picture

Thanks for your response. In reading that blog post and the comments, I have to say I was a little lost. From what I got from it, I would have to run this piece of code to "trick" it into saving my LAPP instance?

echo turnkey-core-11.2-lucid-x86 > /etc/turnkey_version

So are there more steps involved in this that I didn't pick up on? Is there a how-to I can take a look at, or is this something that somebody with more experience should be doing?

Or do I use that fixclock hook script from the blog post along with the backup script to run the backup?

This might be a tad over my head ;)


Thanks!

Jeremy Davis's picture

That's pretty much it. Although TBH I haven't tried it (I don't have any appliances with PostgreSQL). You may also need to set directories to include (such as /var/www in a LAPP appliance).

You don't need to use the fixclock script, because it is already included in TKL appliances; it is just given as an example. The point I was making is that you could call your script as a TKLBAM hook so it will automatically dump your DB when doing a backup.

Unfortunately there isn't a really clear 'how-to' written so it will involve a bit of trial and error.

The only other thing to consider is that you may need to delay the running of the TKLBAM backup while your dump runs. Someone else trying to achieve the same ends as you posted about having issues with the backup not including the latest DB dump, and I'm assuming that is because the backup starts before the dump has finished. Unfortunately they never posted back on how (or if) they solved this issue. One easy workaround I suggested was running the dump from a separate cron job prior to the backup running.
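If you go the cron route, a minimal sketch of the idea might look like the following (the script path, the tklbam-backup location and the times are all just example assumptions; leave enough of a gap for the dump to finish):

# hypothetical /etc/cron.d/db-backup
# 01:00 - dump the database into the directory that TKLBAM will back up
0 1 * * * root /usr/local/bin/pg-dump-script.sh
# 02:00 - run the TKLBAM backup after the dump has had time to complete
0 2 * * * root /usr/bin/tklbam-backup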

Sorry I can't give more step-by-step help on how to do all this, but as I said, I haven't actually done any of it myself, just read about it. If you get it sorted and have the time, it'd be great if you could share your findings.

RAUL BENITEZ's picture

Well, as for running it after the database dump, there really is no need, since I'm not even using the PostgreSQL that's on this server; I have it shut down. All I'm doing is running Apache, and I have installed ColdFusion, as I'm building and testing a web application. So I don't have to worry about running a PostgreSQL database dump. I actually have a separate instance that's just a PostgreSQL server that the web app is using. So I should just be able to run that script.

So how does this work then? Do I run the script and then run tklbam-backup?

Thanks for all your help so far!

Jeremy Davis's picture

It depends on what you want to achieve, and how you installed ColdFusion.

If you simply want the backup to include your data, then it's just a case of including the directory(s) that you want TKLBAM to back up. See this page in the docs. If you want this to be a one-off, then you can use the switch to run the backup manually at the command line. But if you want this to be an ongoing inclusion, then from what I can gather from that page, you need to add your path to /var/lib/tklbam/profile/dirindex.conf (but I haven't done it before so I can't 100% confirm).
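To give a rough idea, here's a sketch of how including an extra directory might look, assuming /var/www/myapp is a placeholder for your own path and using TKLBAM's overrides mechanism (which I believe can also be set persistently in /etc/tklbam/overrides, though as above I haven't verified it end to end):

# one-off: pass the extra directory as an override when running the backup manually
tklbam-backup /var/www/myapp

# ongoing: add the path to the overrides file so every scheduled backup includes it
echo "/var/www/myapp" >> /etc/tklbam/overrides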

If you want it to recreate your server as it is, then it will depend on how you installed ColdFusion. Any dependencies (ie anything that it requires) that come from the standard repos will be auto installed in your new instance. Many other things that you set up (such as a separate user account for ColdFusion) should also be auto handled, but the ColdFusion install itself and any other bits of config may need to be scripted (and can be handled by TKLBAM hooks). OTOH if you include the directory(s) where all the ColdFusion files, config, etc are, then that should work too. It may take a bit of trial and error to get it all working.
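For example (and again this is just an untested sketch; the script name is a placeholder for whatever steps you used to install ColdFusion), a hook that re-runs your install steps after a restore could look like:

#!/bin/bash
# /etc/tklbam/hooks.d/coldfusion (hypothetical)
if [ "$1" = "restore" ] && [ "$2" = "post" ]; then
    /usr/local/bin/install-coldfusion.sh   # placeholder for your own install/config script
fi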

L. Arnold's picture

I would assume you have some database that ColdFusion is working with... However, if you just want to back up some files, I would recommend going into Webmin (yourIP:12321), then finding "File Manager" under Tools, then finding the files you want to back up and issuing a "Compress" or related command, then finding the compressed file and downloading it to your workstation.

I would tell you the exact command in File Manager, but evidently Java is not working in my web browser just now, and File Manager requires Java.
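If the Java-based File Manager is a hassle, the same thing can be done from the command line; a rough sketch, where /var/www/myapp and the server IP are placeholders:

# compress the directory you want to keep
tar czf /root/webapp-backup.tar.gz /var/www/myapp

# then pull the archive down to your workstation, e.g. with scp
scp root@your.server.ip:/root/webapp-backup.tar.gz .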

Once you get this moved out of there, go and install the "Core" or "LAMP" or some other server, but not the PostgreSQL server, as you can't use TKLBAM on it "out of the box" (in my understanding of this thread anyway). I would think you could install TKLBAM on the PostgreSQL server and then just exclude the PostgreSQL data directory (as mentioned above), then do some "dumps" or "backups" from within PostgreSQL, and you would at least have transportable data snapshots.

I expect, but am not sure, that native PostgreSQL support is coming soon to TKLBAM.
