Geekzone: technology news, blogs, forums
15352 posts

Uber Geek

Trusted
Subscriber

# 215441 27-Jun-2017 17:24
Send private message

I have an Amazon Linux server (similar to CentOS) and a Windows machine at home. I'd like to get files from the server to the PC as easily as possible, automatically and regularly, i.e. I want a sync, at least daily.

 

As background, this is to get backups of my data (database dumps, web root, etc) off the server to my PC.

 

Here's what I've considered:

 

  • Right now I use the Dropbox-Uploader script. It works well for uploads, but if a file is deleted on the Linux side it doesn't remove it on Dropbox. That wastes disk space on Dropbox and on my PC.
  • The official Dropbox client for Linux doesn't seem to let you sync specific directories, at least not easily.
  • rsync isn't well supported under Windows. You have to run Cygwin, and I wonder about security as well.
  • Bittorrent Sync is a PITA between Windows and Linux. I had problems with time sync between platforms (even though both were fine), sync database problems, and it was just generally a real pain to get working. I did get a test directory to sync, but I gave up. I got so annoyed at it I completely nuked it from the server.
  • Unison has been suggested, but it hasn't been updated for a good while, and it would be new software on each machine as well.

Is there an easy way that I've missed?


Mr Snotty
8920 posts

Uber Geek

Moderator
Trusted
Lifetime subscriber

  # 1807807 27-Jun-2017 17:28
Send private message

How about just connecting something like an oDroid to your network to do a pull via rsync, ditching the Windows PC altogether?







15352 posts

Uber Geek

Trusted
Subscriber

  # 1807814 27-Jun-2017 17:57
Send private message

I'd have to run another computer and integrate it with my backup system. Windows would be much easier.

 
 
 
 


Mr Snotty
8920 posts

Uber Geek

Moderator
Trusted
Lifetime subscriber

  # 1807819 27-Jun-2017 18:15
Send private message

timmmay: I'd have to run another computer and integrate it with my backup system. Windows would be much easier.

 

With Windows, another way of doing it is to schedule a bash script to run under Cygwin and do an rsync pull from the host. Linux to Linux is far easier still, as you can have something like an oDroid or RPi on the same network do the pull and make the backups available over SMB for you to do what you like with.
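A minimal sketch of that Cygwin approach, assuming key-based SSH auth is already set up; the host, user, and paths below are hypothetical placeholders:

```shell
#!/usr/bin/env bash
# Pull backups from the server over SSH via rsync (Cygwin on Windows).
# Host, user, key, and paths are hypothetical placeholders.
set -euo pipefail

pull_backups() {
    # -a preserves permissions/timestamps, -z compresses in transit,
    # --delete mirrors deletions so the local copy stays in sync
    rsync -az --delete \
        -e "ssh -i /home/user/.ssh/backup_key" \
        backup@server.example.com:/var/backups/ \
        /cygdrive/c/storage/backups/
}

# Only pull when invoked with "run", so the script is safe to source or test.
if [ "${1:-}" = "run" ]; then
    pull_backups
fi
```

A Windows Task Scheduler job can then invoke Cygwin's bash with this script and the "run" argument on a daily schedule.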







15352 posts

Uber Geek

Trusted
Subscriber

  # 1807864 27-Jun-2017 19:39
Send private message

The server is locked down pretty tight, though it's completely under my control, and I would rather not open any incoming ports if I can help it. I really don't want the added complication of more hardware, Cygwin, etc. I use BT Sync across the world for family on Windows and Android and it works fine there; it's just the Linux client that p155ed me off. I think my problem was at least partly permissions and restricted time.

 

Sounds like I might just have to keep trying with BTSync. Either that or find a way to delete the files on Dropbox, which would be better still.


544 posts

Ultimate Geek

Subscriber

  # 1807867 27-Jun-2017 19:46
Send private message

How about using Amazon's AWS command line tools to sync files from the Linux server to S3? Then the Windows box can sync the files down from S3.

 

AWS Sync documentation 

 

Edit: added link to sync command documentation. 




15352 posts

Uber Geek

Trusted
Subscriber

  # 1807877 27-Jun-2017 20:17
Send private message

Sure, I could script a sync from Linux up to S3; that's easy and something I might have done at some point anyway. I have software that does a one-off sync from S3 to Windows, but I don't think I have anything that does a scheduled sync from S3 down to Windows. I guess I could script it using Task Scheduler and the AWS command line on Windows, but is there a better way than that?


723 posts

Ultimate Geek


  # 1807880 27-Jun-2017 20:29
Send private message

WinSCP has an option that might do what you want: https://winscp.net/eng/docs/scriptcommand_synchronize
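For reference, a sketch of what such a WinSCP script file might look like; the host, host key, and paths are hypothetical placeholders (see the linked documentation for the exact `synchronize` options):

```
# sync.txt - run with: winscp.com /script=sync.txt
open sftp://backup@server.example.com/ -hostkey="ssh-ed25519 255 ..."
# Mirror the remote backup directory to the local folder,
# deleting local files that no longer exist on the server
synchronize local -delete C:\storage\backups /var/backups
exit
```

Scheduling winscp.com with that script from Task Scheduler would give a daily pull without Cygwin.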


 
 
 
 




15352 posts

Uber Geek

Trusted
Subscriber

  # 1807927 27-Jun-2017 21:01
Send private message

WinSCP is a decent option. Using the AWS command line tools looks pretty easy too though, so maybe I'll give that a go.

 

Thanks for the idea @djtOtago :) I probably should have thought of it myself, given I already use S3 and Glacier for my backups. I was thinking zero cost, but this will only cost me $0.02 per month plus maybe another cent or two for bandwidth.


2459 posts

Uber Geek


  # 1807971 28-Jun-2017 00:09
Send private message

There's also https://syncthing.net/

 

Works OK for syncing between OSes, and it can use relays (with end-to-end TLS between clients, so the relays can't see what you're transferring), so you don't need to open ports.

 




15352 posts

Uber Geek

Trusted
Subscriber

  # 1808472 28-Jun-2017 19:46
Send private message

Going via S3 worked great, though even for me, and I'm well certified in AWS, it took a bit of doing. I first tried to get EC2 roles working so I didn't need IAM users, but that didn't work and I'm not sure why, so I gave up and used users.

 

     

  1. Define a policy for a user to write to an S3 bucket, and another policy for a user to read from it - note that I keep all my users / buckets separate for security
  2. Create an AWS IAM user for each policy, with API access
  3. Run "aws configure" on the server and on my PC to set up the CLI credentials
  4. Write a script for cron to call on the server and for Task Scheduler to call on the PC. When I ran it manually the data came in at near line speed - 95Mbps from Oregon. Impressive.

 

 

 

Server

 

aws s3 sync /var/backups/borg s3://bucket-name/var/backup/borg --storage-class STANDARD_IA --delete --exclude ".sync/*"

 

Windows PC

 

aws s3 sync s3://bucket-name/folder c:\storage\folder --delete
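Step 4 above mentions cron and Task Scheduler; sketches of the scheduling entries follow (the times, task name, and log handling are hypothetical, the sync commands are the ones shown above):

```
# Server: crontab entry (edit with 'crontab -e') running the upload nightly at 03:15
15 3 * * * /usr/bin/aws s3 sync /var/backups/borg s3://bucket-name/var/backup/borg --storage-class STANDARD_IA --delete --exclude ".sync/*"

# Windows: create a daily scheduled task for the pull
schtasks /Create /SC DAILY /ST 06:00 /TN "S3BackupPull" /TR "aws s3 sync s3://bucket-name/folder c:\storage\folder --delete"
```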

 

 

 

Server Write Policy (probably has a couple of unnecessary permissions)

 

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Sid": "Stmt1498596902000",
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:DeleteObject",
        "s3:DeleteObjectVersion",
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListBucketVersions",
        "s3:ListMultipartUploadParts",
        "s3:PutObject",
        "s3:PutObjectVersionAcl"
      ],
      "Resource": [
        "arn:aws:s3:::bucket-name",
        "arn:aws:s3:::bucket-name/*"
      ]
    }
  ]
}

 

 

 

Policy for Client Read (no writes, probably has a couple of unnecessary permissions)

 

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "arn:aws:s3:::*"
    },
    {
      "Sid": "Stmt1498596902000",
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListBucketVersions",
        "s3:ListMultipartUploadParts"
      ],
      "Resource": [
        "arn:aws:s3:::bucket-name",
        "arn:aws:s3:::bucket-name/*"
      ]
    }
  ]
}


2998 posts

Uber Geek

Lifetime subscriber

  # 1808571 29-Jun-2017 07:10
Send private message

I use FreeFileSync... it's a cross-platform syncing/backup tool.

 

You can use this across a network if you set up an SMB share on your Linux machine, or else sync to some kind of USB drive if you prefer sneakernet.
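A minimal sketch of such a share in Samba's /etc/samba/smb.conf (the share name, path, and user are hypothetical; restrict access as appropriate):

```
[backups]
   comment = Read-only backup share for FreeFileSync
   path = /var/backups
   read only = yes
   valid users = backupuser
   browseable = yes
```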

 

 


544 posts

Ultimate Geek

Subscriber

  # 1808639 29-Jun-2017 09:16
Send private message

timmmay:

 

Going via S3 worked great, though even for me, and I'm well certified in AWS, it took a bit of doing. I first tried to get EC2 roles working so I didn't need IAM users, but that didn't work and I'm not sure why, so I gave up and used users.

 

     

  1. Define a policy for a user to write to an S3 bucket, and another policy for a user to read from it - note that I keep all my users / buckets separate for security
  2. Create an AWS IAM user for each policy, with API access
  3. Run "aws configure" on the server and on my PC to set up the CLI credentials
  4. Write a script for cron to call on the server and for Task Scheduler to call on the PC. When I ran it manually the data came in at near line speed - 95Mbps from Oregon. Impressive.

 

 

That's basically the same as what I ended up doing, just syncing the other way: my Windows machine syncs to S3 late evening, and several EC2 servers in different locations pull from S3 early morning.

 

I've never tried using roles; I just have a few users set up with permissions to read, write, or read and write, which machines or scripts use depending on what they need to do. I've often found it a bit odd which permissions you need to give a user for a simple task.
IIRC, for a user to be able to delete an object, they also need GetObject, ListBucket, and PutObject permissions.




15352 posts

Uber Geek

Trusted
Subscriber

  # 1808648 29-Jun-2017 09:22
Send private message

I guess delete internally calls some of the other calls, or there's some dependency.

 

I back up my PC to S3 using CloudBerry Backup, but that's encrypted, deduplicated, etc, not a straight mirror.

 

I appreciate the suggestions from @frankv and @kyhwana2, but this way is cheap, easy, and I learned something :)


gsr

111 posts

Master Geek


  # 1808656 29-Jun-2017 09:50
Send private message

I use Google Drive + Insync. Works really well. You can choose folders you want to sync.




15352 posts

Uber Geek

Trusted
Subscriber

  # 1808683 29-Jun-2017 09:57
Send private message

gsr:

 

I use Google Drive + Insync. Works really well. You can choose folders you want to sync.

 

 

For Linux and Windows?

