
Since I do not trust any cloud service with the master/only copy of my data (and I do not trust my phone -- the origin of my photos -- as a permanent storage device), I developed a tool called Timeliner which downloads all your content from Google Photos and other online services to your own computer: https://github.com/mholt/timeliner

I run this on a cronjob and it adds all my photos to a timeline in conjunction with my tweets and Facebook posts. It's got a few rough edges but should (mostly) work as advertised.

I would love to have help maintaining it, especially now given this announcement (because Timeliner does not require Google Drive).

One major limitation with the Google Photos API is that it's impossible to get the location data from the photos through it. Timeliner tries to compensate for this by allowing you to import your Google Location History, but this is not ideal either. Edited to add (since the same question comes up a lot in the replies): no, the EXIF metadata does not contain the location, because the Google Photos API strips it out (it leaves most other metadata intact).
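For anyone curious how that compensation works in principle, here's a rough stdlib-only sketch (not Timeliner's actual code): parse the Takeout Location History JSON (the classic schema with timestampMs/latitudeE7/longitudeE7 fields) and match each photo's timestamp to the nearest recorded fix.

```python
import bisect
import json

def load_location_history(path):
    # Parse a Google Takeout Location History export into a sorted list
    # of (unix_seconds, lat, lon) tuples.  Field names follow the
    # classic Takeout JSON schema.
    with open(path) as f:
        raw = json.load(f)["locations"]
    points = [(int(p["timestampMs"]) / 1000,
               p["latitudeE7"] / 1e7,
               p["longitudeE7"] / 1e7) for p in raw]
    points.sort()
    return points

def nearest_location(points, photo_ts, max_gap=3600):
    # Return the (lat, lon) fix recorded closest in time to photo_ts,
    # or None if the nearest fix is more than max_gap seconds away.
    times = [t for t, _, _ in points]
    i = bisect.bisect_left(times, photo_ts)
    candidates = [points[j] for j in (i - 1, i) if 0 <= j < len(points)]
    if not candidates:
        return None
    t, lat, lon = min(candidates, key=lambda p: abs(p[0] - photo_ts))
    return (lat, lon) if abs(t - photo_ts) <= max_gap else None
```

The obvious weakness, and why it's "not ideal": Location History samples are sparse, so the nearest fix in time can still be far from where the photo was actually taken.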

My use case is backing up photos taken on my phone, and of course finding a data organization strategy that I'm not forced to change every year due to whatever bad news comes out about such-and-such a vendor, like this one.

I just saw the EXIF detail, which is kind of a dealbreaker, in conjunction with the fact that Google Photos is yet another API they will probably keep messing with and changing in the future. So I instead upgraded my Dropbox account to 2TB (which wasn't really an option when I first started with Google Photos) and am using Autosync https://play.google.com/store/apps/details?id=com.ttxapps.dr... to push my photos out as-is (Dropbox's photo sync wants to rename the photos for no good reason). Just pushing the raw data from the phone directly to where I can grab it on my Synology on the backend seems like the best approach. But it's only been about 45 minutes and there's plenty of time for something to blow up, so maybe I'll be back.

>One major limitation with the Google Photos API is that it's impossible to get the location data from the photos through it.

Isn't this embedded with the photo itself, the EXIF data? Or is that what the Google API strips out of the photo?

Also: Thanks for the link to timeliner, looks useful.

Google strips it out. As a workaround, people have been using the "sync to Drive" feature, which allows downloading photos with the GPS data intact. But now that's going away...

There are two possibilities I can think of. (1) Scrape the location from the web app, or (2) maybe Google Takeout for Google Photos leaves the location metadata intact? Feel free to discuss here: https://github.com/mholt/timeliner/issues/38

Why not use Nextcloud's auto-upload feature and get a copy of the image without any intermediary mangling the file? Seems best if you're looking to have a backup of your data...

I've been using Google Photos as an easy way to make sure my images are safely "in the cloud" for several years. I've only recently gotten around to trying to sync my library down onto my computer. Google Photos has several limitations I dislike, the lack of GPS data being only the latest one I'm learning about, but it does the job of syncing whenever internet is available pretty well. I'm not looking to move to another system I have to maintain.

I had just stumbled on Timeliner a couple days ago. Looks awesome. I've been thinking a lot lately about a) how to take control of one's data without getting overwhelmed with server configuration, backups, file formats, mobile apps etc etc and b) how to summarize a life's worth of data in a way that will be digestible by descendants 100 years from now who will never have the time to trawl through all the data available from their ancestors. Great to see others thinking about the same problems.

I love the idea of that (and your tool), and I 100% agree with that approach. I now have almost 15 years of photos in iCloud, with things that would be particularly devastating to lose: memorable vacations, lost pets and relatives. It really would be like losing a large chunk of my history.

Looks like you don't support iCloud. Not surprising; I'm sure there's no public API, and any unofficial API would be quite fragile and hostile to use.

I do use google photos on my phone, so everything syncs from iCloud -> Google Photos anyway.

Why don't you use the Apple Photos option that keeps (or syncs back if you don't have it locally) the original file on your Mac and then just do a regular TimeMachine, Arq, rsync backup of the directory from your Mac? I think that's what this feature is for, only downside is that you need the disk space for it but shouldn't be a problem with an external drive.

This is exactly what I do. I have an old headless Mac mini that has a 2 TB drive and syncs all photos via iCloud/Photos, then a Time Machine backup of that volume.

I can take a picture on any device, or copy in a picture or video, or import from my camera SD card on any device, and within a few minutes I have at least 2 backup copies available.

I just wish Nikon, Canon, Sony, or anyone really would be able to integrate a DSLR/mirrorless camera into Dropbox or iCloud Photos so I could take out the manual import process :P

> I just wish Nikon, Canon, Sony, or anyone really would be able to integrate a DSLR/mirrorless camera into Dropbox or iCloud Photos so I could take out the manual import process :P

Given that eyefi exists, this is mostly a client-side software issue. We already have the technology, which is awesome.

The other minor problem with this approach is it requires a Mac. Is there a decent solution for iOS photo backup to Linux?

No. I’ve successfully used an OSX virtual machine to broker between Apple Photos and other object/file management systems though.

I move my photos to Linux then back them up to several places. One day I'll publish my scripts, but they aren't strictly necessary. Gthumb, exiftool, and ffmpeg will get you far.
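To give a flavor of how simple such scripts can be (this is a stdlib-only sketch, not the scripts mentioned above): file a flat camera dump into YYYY/MM folders by modification time. exiftool's DateTimeOriginal would be more accurate, but mtime keeps it dependency-free.

```python
import shutil
import time
from pathlib import Path

def organize_by_month(src_dir, dest_dir):
    # Move files from src_dir into dest_dir/YYYY/MM/ based on each
    # file's modification time, returning the new paths.
    dest = Path(dest_dir)
    moved = []
    for f in sorted(Path(src_dir).iterdir()):
        if not f.is_file():
            continue
        t = time.localtime(f.stat().st_mtime)
        target = dest / f"{t.tm_year:04d}" / f"{t.tm_mon:02d}"
        target.mkdir(parents=True, exist_ok=True)
        shutil.move(str(f), str(target / f.name))
        moved.append(target / f.name)
    return moved
```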

That looks great! I’d like to have support for iCloud. Have you tried it already? Alternatively, I’d consider adding it as data source myself.

I haven't (I don't use iCloud) - go for it! Would love to see more data sources. See https://github.com/mholt/timeliner/issues/18

I hadn’t thought of using the privacy export! Thanks for pointing it out. I’ll try adding the data source, so we’ll have incremental updates.

If you have a macOS device available, you can set the Photos app to download and keep full-quality copies of all photos in your library.

I have a Synology NAS and use CloudSync to maintain a copy of my Google Drive/Photos in my local NAS. The problem with this announcement for me is that the recommended method to backup Google Photos to my NAS is to enable the Google Photos folder in Google drive and use cloudsync to backup Google Drive. See https://www.synology.com/en-us/knowledgebase/DSM/help/CloudS...

This announcement breaks that and destroys the easiest option for downloading all photos from Google.

Awesome. Putting the hacker into Hacker News.

Absolutely. Fix the problem, instead of complaining about it or explaining in detail why it can't or shouldn't be fixed.

This looks great. Thanks.

There is also Perkeep which is very interesting as well. https://perkeep.org/

> One major limitation with the Google Photos API is that it's impossible to get the location data from the photos through it.

Shouldn't JPEGs have this data embedded if geolocation was on at the time of shooting? Am I right in understanding that this metadata is somehow disconnected from the file on Google Photos?

That understanding is correct, according to my interpretation of this comment [0].

[0]: https://news.ycombinator.com/item?id=20166771

Matt, long time no talk!

This is awwwwwwesome!!!

I was just the other night telling somebody I wish I had something like this.

Another feature (do you know anything that does this, or if it could be done in timeliner?) is if it could republish my photos to Facebook if it got them from Google Photos, and vice-versa?

Hi Mark :) Been a while! I think I want to keep Timeliner read-only on the data sources, but I'm sure there are already tools out there that can sync between your online accounts like that.

> Location data is stripped. Photos are downloaded with their full EXIF metadata except for latitude and longitude. This is unfortunate. We recommend augmenting your timeline with Google Location History to add location data.

This is indeed extremely sad.

I have similar mistrusts, and my only acceptable solution has been to copy the actual files off my phone onto my computer (largely manual; some scripts), and to treat cloud hosts as backups of last resort. Even Dropbox (which is otherwise great) doesn’t upload Live Photos (or preserve the photo's file name).

I’m writing from an iPhone/Ubuntu perspective.

I do it too, but I use Syncthing to copy to my home server. It works really well on Android. Pretty much just set it up and forget about it.

> the EXIF metadata does not contain the location because Google Photos API strips it out

There's a setting:

"Remove geo location in items shared by link Affects items shared by link but not by other means"

Edit: I just tried it, I downloaded a photo and double checked, EXIF location data was included.

Did you download the photo via the web interface or the Google Photos API? If via the API, I'd be very surprised; if via the web interface, then that's different from the API and isn't really feasible at scale (I have a test script where I tried using a headless browser buuuut it is very very slow and flaky.)
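For anyone who wants to check what a given download actually contains without installing exiftool: the GPS coordinates live behind a "GPS IFD pointer" tag (0x8825) in the EXIF header, and a short stdlib script can detect whether that pointer is present. This is a deliberately simplified parser (it assumes a well-formed header and only walks the leading JPEG segments), so treat it as a sanity check, not a validator:

```python
import struct

GPS_IFD_TAG = 0x8825  # EXIF tag pointing to the GPS sub-IFD

def has_gps_ifd(data):
    # True if the JPEG's EXIF IFD0 contains a GPS IFD pointer.
    if data[:2] != b"\xff\xd8":        # SOI marker
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:            # ran into entropy-coded data
            return False
        marker = data[i + 1]
        seglen = struct.unpack(">H", data[i + 2:i + 4])[0]
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return _tiff_has_gps(data[i + 10:i + 2 + seglen])
        i += 2 + seglen
    return False

def _tiff_has_gps(tiff):
    # Walk IFD0 of the TIFF structure inside the APP1 segment.
    endian = {b"II": "<", b"MM": ">"}.get(tiff[:2])
    if endian is None:
        return False
    ifd0 = struct.unpack(endian + "I", tiff[4:8])[0]
    count = struct.unpack(endian + "H", tiff[ifd0:ifd0 + 2])[0]
    for n in range(count):
        entry = tiff[ifd0 + 2 + 12 * n:ifd0 + 14 + 12 * n]
        if struct.unpack(endian + "H", entry[:2])[0] == GPS_IFD_TAG:
            return True
    return False
```

Usage would be something like `has_gps_ifd(open("photo.jpg", "rb").read())` on files fetched via each path, to compare the web interface against the API.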

You know what's a good idea: for every image that it downloads and stores, you could add a new field in the EXIF metadata and name it Timeliner-Sponsors. :)

That should make all the sponsors happy, right?

Joking aside, looks like a good project.

Great tool - do you have a docker image pre-made for this? This would be perfect to run directly off a NAS such as a QNAP so it is set and forget :)

Also check out Perkeep

This is amazing as this is something I've wanted to work on forever.

Sucks that the EXIF location data is stripped by Google.

Why not use an open-source app like Syncopoli (an rsync client, available on F-Droid) to back up your pictures (including location and other metadata) and other files on your phone to your own rsync server, which could be your own computer or a VPS?

Just add WireGuard

I sync from my phone to Google Photos, OneDrive, and iCloud. OneDrive and iCloud syncs to my computer. Then I have BackBlaze backing up from my computer.

I think I have enough backups.

I have a similar workflow. Although I do like it when Google makes those AI highlight movies.

Perhaps it could read the location from exif data?

Location metadata is stripped from EXIF when downloaded through Google Photos API.

Only alternative I can figure is to scrape the actual Google Photos web app.

How can you trust your computer? What if your hard drive fails? How would you recover your photos then?

Given his involvement, I'd imagine the answer would be https://relicabackup.com/

> "Since I do not trust any cloud service with the master/only copy"

All photos still exist in the cloud, backed up on a hard drive. Multiple copies are unlikely to fail at the same time.

Cloud providers are notorious for automated systems that will lock you out of your account, with no recourse and no ability to talk to a human about it.

My guess would be RAID 1?

