Google photos backup with rclone

I've been using Google Photos to store photos from my phone for over 10 years, and it's great to have them sync automatically when I set up a new phone.

But I'm paranoid and want another backup, just in case I somehow lose access to the account through a hack or some other issue.

An easy backup solution for this on Linux/Ubuntu is rclone.

Install with:

sudo apt install rclone  

Once installed, we can configure it via:

rclone config  

This will ask you questions about the setup.
Choose "n" for a new remote.
Select Google Photos as the cloud service.
Follow the instructions to authenticate with your Google account.
It'll ask about setting a client_id. While it's possible to leave this blank and use the defaults, it's better to create your own, both so you can see your own quota usage and potentially for privacy.

rclone has docs on creating your own client ID at
https://rclone.org/drive/#making-your-own-client-id
which you'll set up in the Google Cloud console at https://console.cloud.google.com/apis/api/
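Once configured, rclone stores the remote in ~/.config/rclone/rclone.conf. With your own client it looks roughly like the sketch below (the client_id and client_secret values are placeholders, and rclone fills in the token for you after authentication):

```ini
[googlephotos]
type = google photos
client_id = 1234567890-example.apps.googleusercontent.com
client_secret = your-client-secret
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}
```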

I've created the following script, backup-google-photos.sh, which automates backing up a few key folders plus all photos split out by year and month for a little organisation. I've saved it in ~/Pictures/GooglePhotosBackup/:

#!/bin/bash

echo "Backing up Google Photos!"

echo "=> removing previous log file"  
rm -f ~/Pictures/google-photos-backup.log

# backup the smaller album, feature and shared-album folders first
echo "=> backup albums:"  
rclone copy googlephotos:album ~/Pictures/GooglePhotosBackup/album --progress --transfers 8 --bwlimit 50M --log-file ~/Pictures/google-photos-backup.log --log-level INFO  
echo  
echo "=> backup featured:"  
rclone copy googlephotos:feature ~/Pictures/GooglePhotosBackup/feature --progress --transfers 8 --bwlimit 50M --log-file ~/Pictures/google-photos-backup.log --log-level INFO  
echo  
echo "=> backup shared-albums:"  
rclone copy googlephotos:shared-album ~/Pictures/GooglePhotosBackup/shared-album --progress --transfers 8 --bwlimit 50M --log-file ~/Pictures/google-photos-backup.log --log-level INFO  
echo

# media contains sub-folders for all, by-day, by-month and by-year;
# photos repeat across these, so rather than download 4 copies of each image/video
# I'm currently just backing up the by-month ones, which gives a little folder structure
echo "=> backup all photos by month:"  
rclone copy googlephotos:media/by-month ~/Pictures/GooglePhotosBackup/media/by-month --progress --transfers 8 --bwlimit 50M --log-file ~/Pictures/google-photos-backup.log --log-level INFO  
echo

echo "Complete! View ~/Pictures/google-photos-backup.log for full log detail."  

This downloads everything under ~/Pictures/GooglePhotosBackup/. Adjust the paths, the --transfers 8 concurrency limit, and the --bwlimit bandwidth cap to suit your setup. It also logs to a file so you can review what happened later if needed.
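As a quick sanity check after a run, you can grep the log for errors. Here's a sketch that uses a tiny sample log so it's self-contained; in practice point LOG at ~/Pictures/google-photos-backup.log (the sample lines only approximate rclone's log format):

```shell
#!/bin/sh
# Illustrative check: count ERROR lines in an rclone log.
# A throwaway sample log stands in for ~/Pictures/google-photos-backup.log here.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
2024/01/05 14:00:01 INFO  : album/Holiday/img1.jpg: Copied (new)
2024/01/05 14:00:02 ERROR : media/by-month/2024/2024-01/img2.jpg: Failed to copy
EOF
errors=$(grep -c "ERROR" "$LOG")
echo "errors: $errors"
rm -f "$LOG"
```

A non-zero count is a hint to re-run the script or dig into the log lines themselves.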

With this in place, make the script executable:

chmod +x ./backup-google-photos.sh  

and then going forward we can run it manually once a week or so with:

./backup-google-photos.sh

Or set up a cron job to run it, if you know your machine will generally be running at a particular time. Edit your crontab with:

crontab -e  

and add a line to run the script, e.g. on Fridays at 2pm:

0 14 * * 5 ~/Pictures/GooglePhotosBackup/backup-google-photos.sh  
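Since cron runs non-interactively, the --progress output goes nowhere. One option (illustrative; adjust the path to taste) is to redirect the script's stdout and stderr to a file in the crontab entry itself:

```
0 14 * * 5 ~/Pictures/GooglePhotosBackup/backup-google-photos.sh >> ~/Pictures/cron-backup.log 2>&1
```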

You may need to refresh the access token from time to time.
To do this, run:

rclone config reconnect googlephotos: --auto-confirm  
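To confirm the remote is still authenticated without kicking off a full backup, a quick listing works (this assumes the remote is named googlephotos as above):

```
rclone lsd googlephotos:
```

If the token has expired this will fail with an auth error, which is your cue to run the reconnect command above.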