This post was submitted on 15 Jun 2023
56 points (100.0% liked)

Technology

Reddit has a form where you can request a copy of your data. The process can take up to 30 days, after which you will get a private message on your Reddit account with a download link. The data comes in the form of CSV files that you can open using Microsoft Excel or any text editor.
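Those CSV files can be inspected with any spreadsheet app, but they are also easy to process programmatically. A minimal sketch using Python's standard `csv` module; the column names below are assumptions for illustration, since the exact layout of Reddit's export may differ:

```python
import csv
import io

# Hypothetical sample mimicking one row of a "comments.csv" from a
# Reddit data export; the real export's column names may differ.
sample = io.StringIO(
    "id,permalink,date,subreddit,body\n"
    "abc123,https://old.reddit.com/r/example/comments/x/y/abc123,2023-06-15,example,Hello world\n"
)

# DictReader maps each row to a dict keyed by the header line.
rows = list(csv.DictReader(sample))
for row in rows:
    print(row["subreddit"], row["body"])
```

To process a real export, replace the `StringIO` sample with `open("comments.csv", newline="")`.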

If you’d rather not wait for Reddit to deliver your data, or would prefer to keep your data in a searchable archive, you can use Brownman’s tool, reddit-user-to-sqlite. This command line application can download the complete public archive of any Reddit user and compile it into a SQLite database file. Just keep in mind that this method will stop working on July 1, 2023, when the API change takes effect (because you don't actually own that content you created on Reddit?).
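Once the tool has produced a database file, you can query it with any SQLite client. A hedged sketch in Python, using an in-memory stand-in for the database; the table and column names here are assumptions, so check the schema of the actual file (e.g. with `.schema` in the `sqlite3` shell) before querying:

```python
import sqlite3

# In-memory stand-in for the database reddit-user-to-sqlite produces;
# a real file would be opened the same way with sqlite3.connect("reddit.db").
# Table/column names are assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id TEXT, subreddit TEXT, text TEXT, score INTEGER)")
conn.executemany(
    "INSERT INTO comments VALUES (?, ?, ?, ?)",
    [("c1", "python", "nice tool", 42), ("c2", "linux", "works for me", 7)],
)

# Find your highest-scoring comments first.
top = conn.execute(
    "SELECT subreddit, text, score FROM comments ORDER BY score DESC"
).fetchall()
print(top)
```

Because it's plain SQLite, full-text search, joins, and export to other formats all work with standard tooling.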

See https://www.wired.com/story/how-to-download-your-reddit-data/

#technology #deletereddit #Reddit

top 9 comments
[–] princessofcute@kbin.social 4 points 1 year ago (3 children)

I've heard other similar tools only grab the first 1000 comments/posts. Does this grab everything, or does it have a similar limitation?

[–] roofuskit@kbin.social 4 points 1 year ago (1 children)

If you are requesting directly from Reddit, it will be everything. Just keep in mind they are going to be flooded with requests right now, so 30 days is probably optimistic.

[–] gratux@lemmy.blahaj.zone 4 points 1 year ago

Theoretically, they are required by the GDPR to respond within one calendar month from the day they receive the request. Let's see if they can keep up.

[–] gratux@lemmy.blahaj.zone 3 points 1 year ago (1 children)

This is the official GDPR data request form, so it includes everything.

[–] daitya@kbin.social 1 points 1 year ago

Thank you for the direct link.

[–] Saturnlks@lemm.ee 1 points 1 year ago* (last edited 1 year ago)

I had a three-pronged approach to getting my stuff:

The Python tool mentioned in the Wired article, the newly added export feature in "reddit is fun", and redditmanager.com.

Between those three, I was able to get my public stuff (comments and posts), my starred/saved stuff, and all my rif clicked links (the amount of data is quite shocking/surprising, but not that large MB-wise).

I used my clicked links from the rif export to help build my RSS feed list
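One way to turn a list of recovered links into something a feed reader can import is OPML. A short sketch with Python's standard library; the site names and feed URLs below are hypothetical placeholders, and in practice you would still need to find each site's actual feed URL:

```python
from xml.etree import ElementTree as ET

# Hypothetical feeds recovered from a click-history export;
# titles and URLs are placeholders for illustration.
feeds = [
    ("Example Blog", "https://example.com/feed.xml"),
    ("Another Site", "https://another.example/rss"),
]

# Build a minimal OPML 2.0 document, the de-facto feed-list exchange format.
opml = ET.Element("opml", version="2.0")
body = ET.SubElement(opml, "body")
for title, url in feeds:
    ET.SubElement(body, "outline", type="rss", text=title, xmlUrl=url)

xml = ET.tostring(opml, encoding="unicode")
print(xml)
```

Most feed readers (e.g. those supporting OPML import) will accept the resulting file directly.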

The Python tool is limited to 1000 comments and a separate 1000 posts.
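That cap comes from how Reddit's listing API paginates: clients walk a listing with an "after" cursor, and the listing stops yielding items past roughly 1000 entries, no matter how much history the account has. A local simulation of that behavior (no network calls; `fetch_page` is a hypothetical stand-in for the real API):

```python
# Simulated listing cap, mirroring Reddit's ~1000-entry listing limit.
LISTING_CAP = 1000

def fetch_page(all_items, after=None, limit=100):
    """Pretend API: return up to `limit` items following cursor `after`."""
    start = 0 if after is None else all_items.index(after) + 1
    page = all_items[start:start + limit]
    # The real API simply stops returning results past the cap.
    page = [item for item in page if all_items.index(item) < LISTING_CAP]
    next_cursor = page[-1] if len(page) == limit else None
    return page, next_cursor

history = [f"comment_{n}" for n in range(2500)]  # an account with 2500 comments
fetched, cursor = [], None
while True:
    page, cursor = fetch_page(history, cursor)
    fetched.extend(page)
    if cursor is None:
        break

print(len(fetched))  # only 1000 of the 2500 comments are reachable
```

This is why API-based tools top out while the official GDPR export, which reads Reddit's own database rather than the public listings, does not.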

[–] jherazob@beehaw.org 3 points 1 year ago* (last edited 1 year ago) (1 children)

I have 17 years of posts and comments and have been active. I don't have much trust in those kinds of tools, especially given that the API has hard limits on what it can reach, but I'll still check.

Edit: Welp, for starters we have issues: it requires the absolute latest Python version. As a sysadmin, I hate when things demand the latest and greatest just to be installed; I'm on a version that still has full support...

Oh well, we'll see...

[–] deadcade@lemmy.deadca.de 1 points 1 year ago

The official GDPR data request form returns "everything" (assuming you can trust Reddit to provide everything), and there's no hard limit on the number of comments/posts.

[–] Fubarberry@lemmy.fmhy.ml 2 points 1 year ago

Thanks for sharing. I put a request in and I'll see how comprehensive the data is.
