Archive all of my Google Reader.

mingsoup

Golden Member
May 17, 2006
1,295
2
81
I need a program to locally back up all items in my Google Reader. I have several picture blogs which I would like to save to my local computer. Each post contains a single picture, which I'd like to save.

How could I do this? Is there a program that would do it? Most programs do not save articles and/or their images to the hard drive. I suppose I need an offline mode? Something as open-format as possible.

Thanks

edit: http://webapps.stackexchange.com/qu...blog-entries-from-a-rss-feed-in-google-reader

This method (the atom n=1000 trick) seems to pop out a large XML file. It contains what look like the links to the images and any text in the posts themselves. Would it be possible to turn that XML into one giant web page for saving?
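To answer my own question: yes, the export XML is just an Atom feed, so a short script can flatten it into one big HTML page. A minimal sketch (assumes the export uses the standard Atom namespace and that picture posts carry their `<img>` tags inside each entry's `content`/`summary`):

```python
import xml.etree.ElementTree as ET
import html

ATOM = "{http://www.w3.org/2005/Atom}"

def atom_to_html(xml_text):
    """Flatten an Atom export into a single HTML page.

    Each <entry>'s title and content (which already holds the <img>
    tags for picture blogs) is copied into one section of the page.
    """
    root = ET.fromstring(xml_text)
    parts = ["<html><body>"]
    for entry in root.iter(ATOM + "entry"):
        title = entry.findtext(ATOM + "title", default="(untitled)")
        content = entry.findtext(ATOM + "content", default="")
        if not content:  # some feeds only provide a summary
            content = entry.findtext(ATOM + "summary", default="")
        parts.append("<h2>%s</h2>" % html.escape(title))
        parts.append(content)  # already HTML, image links included
    parts.append("</body></html>")
    return "\n".join(parts)
```

Write the result to a file, then "Save Page As... (complete)" in a browser, or point wget at it, to pull down the images themselves.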
 
Last edited:

mingsoup

Golden Member
May 17, 2006
1,295
2
81
Actually I want to suck out ALL of Reader: tens of thousands of feed items with all their content, text and links to images.

I found 3 ways to do this:

1. https://github.com/samastur/GReader-hoover — creates JSON files of EVERYTHING.
2. https://github.com/kerchen/export_gr2evernote — creates HTML files of all your starred items.
3. Make a folder public, then feed that page into HTTrack to download all of it. This is a bit simpler, I suppose, since it gets around the authentication that #1 and #2 need.

I have a couple of picture blogs I've been subscribed to that I am archiving in their entirety via option 3.
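For anyone going the atom-export route from the first post instead: n=1000 only returns one page at a time, and Reader puts a continuation token in each response that you pass back as c= to get the next page. A sketch of the plumbing (the URL shape and the gr: namespace are from memory of the old Reader API, so double-check them against the Stack Exchange thread above):

```python
import urllib.parse
import xml.etree.ElementTree as ET

# Google Reader's extension namespace (assumed; verify against a real response)
GR = "{http://www.google.com/schemas/reader/atom/}"

def export_url(feed_url, n=1000, continuation=None):
    """Build the Reader atom-export URL for one page of a feed's items."""
    base = ("http://www.google.com/reader/atom/feed/"
            + urllib.parse.quote(feed_url, safe=""))
    params = {"n": str(n)}
    if continuation:
        params["c"] = continuation  # token from the previous page
    return base + "?" + urllib.parse.urlencode(params)

def next_continuation(xml_text):
    """Pull the gr:continuation token from a page; None means last page."""
    root = ET.fromstring(xml_text)
    node = root.find(GR + "continuation")
    return node.text if node is not None else None
```

Loop: fetch `export_url(feed)`, save the page, call `next_continuation` on it, and refetch with that token until it comes back None.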
 

SMOGZINN

Lifer
Jun 17, 2005
14,359
4,640
136
Have you considered messing with IFTTT? It should be able to do what you want.