Arctic Shift Reddit Archive

Every Reddit comment and submission since 2005, organized as monthly Parquet shards

What is it?

The full Reddit archive from Arctic Shift, converted to Parquet and hosted here for easy access. Covers every public subreddit from 2005-12 through 2026-02.

Right now the archive has 2.6B items (1.3B comments, 1.3B submissions) in 270.4 GB of compressed Parquet. Comments and submissions are stored as separate datasets, split into monthly shards you can load individually or stream together.

Reddit has been around since 2005. Millions of people use it to talk about everything - programming, sports, cooking, politics, niche hobbies. That makes it one of the best sources of natural conversation data for language model training, sentiment analysis, community research, and information retrieval. Most Reddit datasets only cover specific subreddits or time windows. This one covers all of it.

What is being released?

Monthly Parquet files, split by type (comments vs submissions). Small months fit in one shard. Large months (post-2015 or so) get split into multiple ~200 MB shards.

data/
  comments/
    2005/12/000.parquet       earliest month with data
    2006/01/000.parquet
    ...
    2023/06/000.parquet
            001.parquet       large months get multiple shards
            002.parquet
  submissions/
    2005/12/000.parquet
    2006/01/000.parquet
    ...
stats.csv                     one row per committed (month, type) pair
zst_sizes.json                .zst file sizes for all months (from torrent metadata)
states.json                   live pipeline state (updated every ~5 min)
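
The layout is regular enough to construct shard paths directly. A small helper (hypothetical, nothing in the repo defines it) mirroring the data/&lt;type&gt;/YYYY/MM/NNN.parquet convention:

```python
def shard_path(kind: str, year: int, month: int, shard: int = 0) -> str:
    """Build the repo-relative path for one Parquet shard.

    kind is "comments" or "submissions"; shard is the zero-based
    shard index within the month (000, 001, ...).
    """
    if kind not in ("comments", "submissions"):
        raise ValueError(f"unknown type: {kind}")
    return f"data/{kind}/{year:04d}/{month:02d}/{shard:03d}.parquet"

# The earliest month with data:
# shard_path("comments", 2005, 12) -> "data/comments/2005/12/000.parquet"
```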

stats.csv tracks every committed (month, type) pair with row count, shard count, Parquet size, original .zst size, processing time, and commit timestamp. zst_sizes.json maps every (type, month) pair to its .zst archive size - useful for estimating remaining work.
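
Together the two files give a cheap progress estimate: subtract the committed (type, month) pairs in stats.csv from zst_sizes.json and sum what is left. A sketch in Python - the nested-dict shape assumed for zst_sizes.json here is illustrative, adjust to the real key format:

```python
import csv
import io


def remaining_bytes(zst_sizes: dict, stats_csv_text: str) -> int:
    """Sum .zst bytes for (type, month) pairs not yet committed.

    zst_sizes shape assumed: {"comments": {"2005-12": 12345, ...}, ...}
    stats_csv_text: contents of stats.csv with year, month, type columns.
    """
    done = set()
    for row in csv.DictReader(io.StringIO(stats_csv_text)):
        done.add((row["type"], f"{int(row['year']):04d}-{int(row['month']):02d}"))
    return sum(
        size
        for kind, months in zst_sizes.items()
        for month, size in months.items()
        if (kind, month) not in done
    )
```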

Breakdown by type and year

Comments

  2005  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  1.1K
  2006  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  417.2K
  2007  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  2.5M
  2008  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  7.2M
  2009  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  18.9M
  2010  ███░░░░░░░░░░░░░░░░░░░░░░░░░░░  48.5M
  2011  ███████░░░░░░░░░░░░░░░░░░░░░░░  123.3M
  2012  ████████████████░░░░░░░░░░░░░░  260.3M
  2013  ████████████████████████░░░░░░  402.2M
  2014  ██████████████████████████████  483.0M

Submissions

  2005  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  5.4K
  2006  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  304.7K
  2007  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  908.2K
  2008  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  2.5M
  2009  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  4.8M
  2010  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  6.9M
  2011  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  15.0M
  2012  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  29.3M
  2013  ██░░░░░░░░░░░░░░░░░░░░░░░░░░░░  39.7M
  2014  ██░░░░░░░░░░░░░░░░░░░░░░░░░░░░  47.9M
  2017  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  10.4M
  2018  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  11.3M
  2019  █░░░░░░░░░░░░░░░░░░░░░░░░░░░░░  21.2M
  2023  ██░░░░░░░░░░░░░░░░░░░░░░░░░░░░  39.2M
  2024  ████████████████████████████░░  477.4M
  2025  ██████████████████████████████  494.3M
  2026  █████░░░░░░░░░░░░░░░░░░░░░░░░░  87.5M

How to download and use this dataset

Load comments or submissions separately, filter by year or month, or stream the whole thing. The layout is standard Hugging Face Parquet, so it works with DuckDB, datasets, pandas, and huggingface_hub out of the box.

Using DuckDB

DuckDB reads Parquet directly from Hugging Face - no download step needed.

-- Top 20 subreddits by comment volume (all time)
SELECT subreddit, count(*) AS comments
FROM read_parquet('hf://datasets/open-index/arctic/data/comments/**/*.parquet')
GROUP BY subreddit
ORDER BY comments DESC
LIMIT 20;

-- Monthly submission volume for 2023
SELECT
    strftime(created_at, '%Y-%m') AS month,
    count(*) AS submissions,
    sum(num_comments) AS total_comments
FROM read_parquet('hf://datasets/open-index/arctic/data/submissions/2023/**/*.parquet')
GROUP BY month
ORDER BY month;

-- Most active authors across all comments
SELECT author, count(*) AS comments, avg(score) AS avg_score
FROM read_parquet('hf://datasets/open-index/arctic/data/comments/**/*.parquet')
WHERE author != '[deleted]'
GROUP BY author
ORDER BY comments DESC
LIMIT 20;

-- Average comment length by year
SELECT
    extract(year FROM created_at) AS year,
    avg(body_length) AS avg_length,
    count(*) AS comments
FROM read_parquet('hf://datasets/open-index/arctic/data/comments/**/*.parquet')
GROUP BY year
ORDER BY year;

-- Top linked domains in submissions
SELECT
    regexp_extract(url, 'https?://([^/]+)', 1) AS domain,
    count(*) AS posts
FROM read_parquet('hf://datasets/open-index/arctic/data/submissions/**/*.parquet')
WHERE url IS NOT NULL AND url != ''
GROUP BY domain
ORDER BY posts DESC
LIMIT 20;
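
The first query above also translates directly to plain Python once rows arrive as dicts, for example from a streamed datasets split. A minimal stdlib sketch:

```python
from collections import Counter
from itertools import islice


def top_subreddits(rows, n=20, limit=None):
    """Count rows per subreddit over any iterable of row dicts.

    Pass limit to stop after that many rows, e.g. when sampling
    from an endless streaming iterator.
    """
    counts = Counter(row["subreddit"] for row in islice(rows, limit))
    return counts.most_common(n)


rows = [
    {"subreddit": "AskReddit"},
    {"subreddit": "programming"},
    {"subreddit": "AskReddit"},
]
# top_subreddits(rows, n=2) -> [('AskReddit', 2), ('programming', 1)]
```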

Using datasets

from datasets import load_dataset

# Stream all comments without downloading everything
comments = load_dataset("open-index/arctic", "comments", split="train", streaming=True)
for item in comments:
    print(item["author"], item["subreddit"], item["body"][:80])

# Load submissions for a specific year
subs = load_dataset(
    "open-index/arctic", "submissions",
    data_files="data/submissions/2023/**/*.parquet",
    split="train",
)
print(f"{len(subs):,} submissions in 2023")

Using huggingface_hub

from huggingface_hub import snapshot_download

# Download only 2023 comments
snapshot_download(
    "open-index/arctic",
    repo_type="dataset",
    local_dir="./arctic/",
    allow_patterns="data/comments/2023/**/*",
)

For faster downloads, run pip install "huggingface_hub[hf_transfer]" and set HF_HUB_ENABLE_HF_TRANSFER=1.

Using the CLI

# Download a single month of submissions
huggingface-cli download open-index/arctic \
    --include "data/submissions/2024/01/*" \
    --repo-type dataset --local-dir ./arctic/

Dataset statistics

Type Months Rows Parquet Size
comments 108 1.3B 111.3 GB
submissions 138 1.3B 159.2 GB
Total 246 2.6B 270.4 GB

Monthly breakdown

Click to expand full monthly table (108 comment months + 138 submission months)
Month Type .zst Size Download Process Upload Shards Rows Parquet
2005-12 comments - 36.7s 1.0s 17.0s 1 1,075 138.4 KB
2005-12 submissions - 1m04s 1.4s 8.7s 1 5,356 304.3 KB
2006-01 comments - 30.3s 0.6s 6.0s 1 3,666 388.6 KB
2006-01 submissions - 22.1s 1.2s 1m05s 1 8,048 458.3 KB
2006-02 comments - 24.3s 6.2s 21.0s 1 9,095 1021.6 KB
2006-02 submissions - 26.1s 2.0s 10.2s 1 9,501 560.3 KB
2006-03 comments - 17.6s 1.1s 7.4s 1 13,859 1.4 MB
2006-03 submissions - 29.4s 11.9s 12.0s 1 12,525 724.7 KB
2006-04 comments - 30.7s 2.8s 15.1s 1 19,090 2.1 MB
2006-04 submissions - 25.6s 3.0s 17.1s 1 12,556 725.4 KB
2006-05 comments - 24.1s 5.2s 6.0s 1 26,859 2.9 MB
2006-05 submissions - 23.9s 2.5s 5.7s 1 14,701 850.5 KB
2006-06 comments - 23.4s 9.1s 12.1s 1 29,163 3.1 MB
2006-06 submissions - 21.9s 13.3s 25.1s 1 16,942 958.7 KB
2006-07 comments - 29.2s 10.1s 17.6s 1 37,031 3.8 MB
2006-07 submissions - 25.4s 8.8s 9.3s 1 24,026 1.3 MB
2006-08 comments - 32.0s 13.3s 6.4s 1 50,559 5.3 MB
2006-08 submissions - 25.4s 9.3s 10.7s 1 40,750 2.2 MB
2006-09 comments - 19.1s 4.8s 9.7s 1 50,675 5.3 MB
2006-09 submissions - 29.3s 9.5s 8.9s 1 54,043 2.8 MB
2006-10 comments - 26.5s 10.9s 14.9s 1 54,148 5.3 MB
2006-10 submissions - 28.3s 15.0s 25.4s 1 38,333 2.2 MB
2006-11 comments - 30.2s 3.0s 20.9s 1 62,021 6.1 MB
2006-11 submissions - 25.4s 3.4s 6.1s 1 36,824 2.1 MB
2006-12 comments - 21.7s 8.1s 6.7s 1 61,018 6.1 MB
2006-12 submissions - 21.3s 8.6s 6.5s 1 36,434 2.1 MB
2007-01 comments - 24.1s 7.4s 29.9s 1 81,341 8.3 MB
2007-01 submissions - 23.4s 8.1s 10.0s 1 43,725 2.5 MB
2007-02 comments - 24.7s 7.3s 12.5s 1 95,634 9.5 MB
2007-02 submissions - 28.1s 15.8s 9.0s 1 47,317 2.8 MB
2007-03 comments - 18.1s 8.5s 28.0s 1 112,444 10.5 MB
2007-03 submissions - 24.8s 6.5s 13.6s 1 58,642 3.4 MB
2007-04 comments - 21.5s 9.3s 17.1s 1 126,773 11.6 MB
2007-04 submissions - 22.4s 8.2s 17.7s 1 61,544 3.6 MB
2007-05 comments - 39.0s 4.8s 11.8s 1 170,097 15.7 MB
2007-05 submissions - 30.4s 5.9s 11.5s 1 65,098 3.9 MB
2007-06 comments - 28.7s 8.5s 24.6s 1 178,800 16.1 MB
2007-06 submissions - 23.1s 7.8s 11.1s 1 62,693 3.8 MB
2007-07 comments - 23.2s 14.1s 15.4s 1 203,319 18.6 MB
2007-07 submissions - 23.9s 12.6s 12.8s 1 73,248 4.4 MB
2007-08 comments - 30.7s 12.2s 1m18s 1 225,111 19.9 MB
2007-08 submissions - 29.3s 6.5s 13.3s 1 84,952 5.1 MB
2007-09 comments - 32.2s 7.5s 7.7s 1 259,497 23.1 MB
2007-09 submissions - 30.2s 7.3s 15.8s 1 91,294 5.5 MB
2007-10 comments - 25.5s 8.4s 17.1s 1 274,170 24.5 MB
2007-10 submissions - 29.4s 16.7s 15.8s 1 101,633 5.9 MB
2007-11 comments - 42.0s 14.9s 1m43s 1 372,983 27.4 MB
2007-11 submissions - 30.6s 7.1s 10.6s 1 106,868 5.9 MB
2007-12 comments - 29.9s 19.0s 24.9s 1 363,390 30.2 MB
2007-12 submissions - 21.6s 8.6s 17.0s 1 111,193 6.1 MB
2008-01 comments - 34.6s 12.5s 38.0s 1 452,990 36.7 MB
2008-01 submissions - 26.2s 7.6s 19.9s 1 142,310 7.8 MB
2008-02 comments - 40.4s 19.2s 1m09s 1 441,768 35.8 MB
2008-02 submissions - 26.4s 8.1s 22.7s 1 147,834 8.3 MB
2008-03 comments - 25.0s 15.1s 1m13s 1 463,728 37.3 MB
2008-03 submissions - 24.9s 13.6s 20.6s 1 168,227 9.4 MB
2008-04 comments - 32.7s 16.6s 1m32s 1 468,317 38.5 MB
2008-04 submissions - 28.3s 7.8s 25.3s 1 167,472 9.4 MB
2008-05 comments - 29.9s 13.4s 11.9s 2 536,380 43.5 MB
2008-05 submissions - 17.6s 9.2s 7.3s 1 177,022 10.0 MB
2008-06 comments - - 17.9s 7.8s 2 577,684 47.4 MB
2008-06 submissions - 34.5s 9.0s 11.4s 1 190,682 10.8 MB
2008-07 comments - 29.3s 16.7s 11.1s 2 592,610 49.3 MB
2008-07 submissions - 20.2s 10.9s 7.1s 1 218,092 12.2 MB
2008-08 comments - - 23.5s 7.2s 2 595,959 49.1 MB
2008-08 submissions - 32.2s 11.4s 26.2s 1 212,552 12.0 MB
2008-09 comments - 1m09s 11.3s 16.6s 2 680,892 56.1 MB
2008-09 submissions - 42.8s 13.1s 35.5s 1 256,268 14.6 MB
2008-10 comments - - 20.4s 28.9s 2 789,874 64.2 MB
2008-10 submissions - 34.6s 11.2s 6.0s 1 282,974 16.3 MB
2008-11 comments - 1m24s 19.9s 58.3s 2 792,310 63.0 MB
2008-11 submissions - 27.7s 15.6s 20.1s 1 272,505 15.5 MB
2008-12 comments - - 33.1s 41.5s 2 850,359 69.8 MB
2008-12 submissions - 22.1s 13.6s 15.2s 1 283,915 16.0 MB
2009-01 comments - 1m29s 13.0s 10.3s 3 1,051,649 86.1 MB
2009-01 submissions - 26.3s 11.6s 9.3s 1 331,060 18.9 MB
2009-02 comments - - 21.1s 21.8s 2 944,711 78.7 MB
2009-02 submissions - 23.1s 13.2s 11.8s 1 329,042 19.1 MB
2009-03 comments - 35.6s 32.8s 17.9s 3 1,048,643 89.8 MB
2009-03 submissions - 24.1s 17.7s 17.3s 1 362,805 21.4 MB
2009-04 comments - - 47.6s 8.7s 3 1,094,599 93.2 MB
2009-04 submissions - 39.6s 17.2s 7.0s 1 357,107 21.3 MB
2009-05 comments - 1m24s 12.9s 30.6s 3 1,201,257 105.3 MB
2009-05 submissions - 30.5s 15.2s 53.4s 1 355,193 21.4 MB
2009-06 comments - - 21.9s 11.7s 3 1,258,750 111.7 MB
2009-06 submissions - 38.8s 17.0s 14.8s 1 383,969 23.9 MB
2009-07 comments - 53.2s 36.0s 16.2s 3 1,470,290 129.9 MB
2009-07 submissions - 22.7s 20.5s 15.2s 1 427,135 28.4 MB
2009-08 comments - - 23.4s 15.4s 4 1,750,688 153.3 MB
2009-08 submissions - 1m01s 21.4s 17.1s 1 435,860 30.3 MB
2009-09 comments - 1m15s 55.7s 3m02s 5 2,032,276 172.6 MB
2009-09 submissions - 24.9s 21.3s 14.7s 1 443,037 31.7 MB
2009-10 comments - 1m18s 29.3s 1m49s 5 2,242,017 199.1 MB
2009-10 submissions - 1m17s 24.4s 39.7s 1 461,702 33.7 MB
2009-11 comments - - 1m03s 57.1s 5 2,207,444 193.5 MB
2009-11 submissions - 33.3s 20.9s 31.1s 1 452,320 32.8 MB
2009-12 comments - 1m38s 1m26s 25.9s 6 2,560,510 220.7 MB
2009-12 submissions - 26.7s 12.1s 25.4s 1 492,225 36.3 MB
2010-01 comments - - 49.6s 29.6s 6 2,884,096 249.4 MB
2010-01 submissions - 1m02s 34.1s 6.9s 2 549,007 40.9 MB
2010-02 comments - 1m35s 46.4s 43.4s 6 2,687,779 237.5 MB
2010-02 submissions - - 58.9s 9.7s 2 506,868 38.2 MB
2010-03 comments - - 48.8s 1m37s 7 3,228,254 281.1 MB
2010-03 submissions - 1m16s 31.6s 13.2s 2 602,586 46.2 MB
2010-04 comments - 1m40s 50.7s 1m02s 7 3,209,898 274.8 MB
2010-04 submissions - - 1m11s 14.2s 2 617,302 46.8 MB
2010-05 comments - - 1m14s 29.6s 7 3,267,363 278.5 MB
2010-05 submissions - 56.2s 1m08s 10.2s 2 515,637 41.6 MB
2010-06 comments - 2m06s 41.9s 1m09s 8 3,532,867 298.9 MB
2010-06 submissions - 26.9s 19.7s 1m34s 1 478,396 40.0 MB
2010-07 comments - - 2m51s 31.1s 9 4,032,737 346.6 MB
2010-07 submissions - 1m15s 1m58s 6.9s 2 504,098 44.2 MB
2010-08 comments - 4m02s 3m19s 43.9s 9 4,247,982 364.2 MB
2010-08 submissions - - 3m48s 15.2s 2 537,480 46.9 MB
2010-09 comments - - 5m44s 3m40s 10 4,704,069 400.0 MB
2010-09 submissions - 1m55s 48.3s 17.7s 2 600,209 53.7 MB
2010-10 comments - 5m59s 5m52s 12.3s 11 5,032,368 433.8 MB
2010-10 submissions - - 52.3s 11.6s 2 630,298 57.6 MB
2010-11 comments - - 14m01s 3m50s 12 5,689,002 487.7 MB
2010-11 submissions - - 8m41s 32.5s 2 674,615 63.6 MB
2010-12 comments - 4m22s 11m24s 6m30s 12 5,972,642 513.1 MB
2010-12 submissions - 59.3s 15.1s 8m13s 2 729,840 69.3 MB
2011-01 comments 592.8 MB - 12m14s 3m28s 14 6,603,329 570.4 MB
2011-01 submissions 117.9 MB - 12m30s 1m51s 2 837,996 91.1 MB
2011-02 comments 574.2 MB - 13m35s 1m57s 13 6,363,114 551.3 MB
2011-02 submissions 114.1 MB - 13m58s 33.6s 2 822,302 88.3 MB
2011-03 comments 679.7 MB 6m36s 18m04s 2m59s 16 7,556,165 652.4 MB
2011-03 submissions 135.5 MB 2m44s 18m30s 8m51s 2 976,817 104.4 MB
2011-04 comments 665.1 MB 2m35s 13m05s - 16 7,571,398 633.9 MB
2011-04 submissions 132.9 MB 1m29s 21.3s 2m41s 2 971,371 101.6 MB
2011-05 comments 772.0 MB - 17m50s 4m29s 18 8,803,949 733.7 MB
2011-05 submissions 144.9 MB - 18m18s 2m55s 3 1,081,578 111.9 MB
2011-06 comments 854.3 MB - 22m19s 11m36s 20 9,766,511 816.4 MB
2011-06 submissions 155.2 MB - 22m30s - 3 1,153,048 121.0 MB
2011-07 comments 912.7 MB 4m14s 32m23s 8m00s 22 10,557,466 870.1 MB
2011-07 submissions 172.1 MB 1m31s 20.4s 2m49s 3 1,264,991 134.2 MB
2011-08 comments 1.0 GB 6m51s 85m08s 7m46s 25 12,316,144 1012.3 MB
2011-08 submissions 199.9 MB - 44.2s 1m08s 3 1,448,347 156.5 MB
2011-09 comments 1.0 GB 5m08s 42m25s 7m56s 25 12,150,412 1004.9 MB
2011-09 submissions 204.2 MB 2m13s 1m34s 37.2s 3 1,482,575 160.2 MB
2011-10 comments 1.2 GB 7m17s 48m26s 4m00s 27 13,470,278 1.1 GB
2011-10 submissions 226.9 MB - 54.4s 1m17s 4 1,590,673 179.1 MB
2011-11 comments 1.2 GB - 31m10s 2m08s 28 13,621,533 1.1 GB
2011-11 submissions 235.5 MB 2m18s 57.4s 1m06s 4 1,634,431 181.8 MB
2011-12 comments 1.2 GB 8m48s 59m53s 2m23s 30 14,509,469 1.2 GB
2011-12 submissions 252.0 MB - 2m56s 1m21s 4 1,772,219 194.2 MB
2012-01 comments 1.4 GB 16m44s 38m15s 2m34s 33 16,350,205 1.4 GB
2012-01 submissions 285.4 MB 2m04s 5m37s 2m09s 4 1,981,577 217.7 MB
2012-02 comments 1.4 GB 7m56s 88m17s 6m19s 33 16,015,695 1.3 GB
2012-02 submissions 288.4 MB 1m14s 9m08s 1m10s 4 1,961,817 221.3 MB
2012-03 comments 1.6 GB 2m47s 33m06s 3m58s 36 17,881,943 1.5 GB
2012-03 submissions 315.1 MB 1m51s 14m38s 44.8s 5 2,158,965 240.0 MB
2012-04 comments 1.7 GB 12m02s 82m21s 8m23s 39 19,044,534 1.6 GB
2012-04 submissions 333.9 MB 1m31s 7m36s 1m35s 5 2,279,491 253.9 MB
2012-05 comments 1.8 GB 17m27s 166m35s 9m40s 41 20,388,260 1.7 GB
2012-05 submissions 348.5 MB 1m41s 9m32s 1m22s 5 2,293,901 273.1 MB
2012-06 comments 1.9 GB 17m41s 96m36s 8m49s 44 21,897,913 1.8 GB
2012-06 submissions 365.1 MB - 8m13s 1m37s 5 2,393,973 285.7 MB
2012-07 comments 2.1 GB 31m15s 190m27s 7m58s 49 24,087,517 2.0 GB
2012-07 submissions 402.5 MB 1m34s 25m03s 1m48s 6 2,663,529 313.2 MB
2012-08 comments 2.2 GB 23m39s 115m40s 13m16s 52 25,703,326 2.1 GB
2012-08 submissions 425.4 MB 1m09s 13m10s 1m26s 6 2,782,752 332.2 MB
2012-09 comments 2.0 GB 25m24s 224m32s 8m21s 47 23,419,524 1.9 GB
2012-09 submissions 419.9 MB 2m48s 11m14s 1m38s 6 2,568,109 305.8 MB
2012-10 comments 2.1 GB 14m24s 117m20s 9m47s 50 24,788,236 2.0 GB
2012-10 submissions 446.3 MB 3m10s 24m04s 1m05s 6 2,776,156 321.5 MB
2012-11 comments 2.1 GB 43m46s 203m55s 28m57s 50 24,648,302 2.0 GB
2012-11 submissions 441.6 MB 4m37s 24m13s 2m12s 6 2,706,118 321.5 MB
2012-12 comments 2.2 GB - 135m16s 6m37s 53 26,080,276 2.1 GB
2012-12 submissions 444.0 MB 1m17s 13m35s 1m27s 6 2,735,743 324.8 MB
2013-01 comments 2.6 GB 34m20s 246m41s 7m59s 61 30,365,867 2.5 GB
2013-01 submissions 522.7 MB 8m49s 29m00s 2m57s 7 3,134,862 371.4 MB
2013-02 comments 2.4 GB - 279m03s 10m20s 55 27,213,960 2.2 GB
2013-02 submissions 476.8 MB 3m38s 16m45s 1m13s 6 2,886,631 346.7 MB
2013-03 comments 2.7 GB 14m04s 173m12s - 62 30,771,274 2.5 GB
2013-03 submissions 534.8 MB 2m12s 19m46s 2m06s 7 3,201,525 392.5 MB
2013-04 comments 2.9 GB 35m31s 303m11s 10m13s 67 33,259,557 2.7 GB
2013-04 submissions 563.0 MB 3m41s 38m37s 3m16s 7 3,376,188 412.2 MB
2013-05 comments 2.9 GB 30m17s 302m19s 9m34s 67 33,126,225 2.7 GB
2013-05 submissions 551.8 MB 5m18s 48m28s 2m29s 7 3,261,976 411.0 MB
2013-06 comments 2.8 GB 28m56s 155m23s 8m08s 66 32,648,247 2.7 GB
2013-06 submissions 550.7 MB 2m44s 23m23s 2m04s 7 3,189,490 413.1 MB
2013-07 comments 3.1 GB 23m32s 341m28s 10m14s 70 34,922,133 2.9 GB
2013-07 submissions 590.7 MB 3m32s 46m19s 3m05s 7 3,374,406 442.8 MB
2013-08 comments 3.1 GB 14m57s 169m56s 12m54s 70 34,766,579 2.9 GB
2013-08 submissions 586.6 MB 1m40s 26m37s 3m26s 7 3,315,090 443.7 MB
2013-09 comments 2.8 GB 18m24s 152m34s 7m57s 64 31,990,369 2.6 GB
2013-09 submissions 561.6 MB 6m10s 50m41s 2m27s 7 3,121,163 416.4 MB
2013-10 comments 3.1 GB 26m34s 317m17s 10m08s 72 35,940,040 2.9 GB
2013-10 submissions 620.1 MB 2m10s 29m42s 2m24s 7 3,467,225 457.2 MB
2013-11 comments 3.2 GB 43m31s 359m50s 15m57s 75 37,396,497 3.0 GB
2013-11 submissions 642.3 MB 8m21s 42m51s 2m12s 8 3,522,378 476.1 MB
2013-12 comments 3.4 GB 47m50s 190m02s 12m43s 80 39,810,216 3.2 GB
2013-12 submissions - 12m37s 23m54s 1m03s 8 3,801,410 512.7 MB
2014-01 comments - 37m42s 102m37s 5m46s 85 42,420,655 3.5 GB
2014-01 submissions - 13m51s 21m45s 58.4s 9 4,145,541 572.4 MB
2014-02 comments - 22m36s 94m02s 5m21s 78 38,703,362 3.2 GB
2014-02 submissions - 2m51s 21m23s 1m01s 8 3,946,491 557.1 MB
2014-03 comments - 33m04s 103m16s 6m18s 85 42,459,956 3.6 GB
2014-03 submissions - 9m18s 19m14s 49.6s 9 4,209,109 586.7 MB
2014-04 comments - 12m57s 96m04s 5m07s 85 42,440,735 3.5 GB
2014-04 submissions - 5m27s 17m56s 56.9s 9 4,115,035 571.6 MB
2014-05 comments - 15m09s 104m42s 5m39s 86 42,514,094 3.6 GB
2014-05 submissions - 5m42s 16m24s 44.3s 9 4,162,070 590.8 MB
2014-06 comments - 7m47s 103m44s 7m38s 84 41,990,650 3.5 GB
2014-06 submissions - 2m24s 16m42s 51.5s 9 4,092,661 588.8 MB
2014-07 comments - 33m55s 103m19s 6m37s 94 46,868,899 3.9 GB
2014-07 submissions - 2m07s 19m47s 51.2s 10 4,502,441 659.5 MB
2014-08 comments - 7m29s 113m00s 7m06s 94 46,990,813 3.9 GB
2014-08 submissions - 2m30s 21m14s 1m03s 10 4,596,551 672.6 MB
2014-09 comments - 12m04s 102m58s - 90 44,992,201 3.8 GB
2014-09 submissions - - 23m54s 1m14s 10 4,567,136 662.5 MB
2014-10 comments - 31m14s 107m55s 7m44s 95 47,497,520 4.0 GB
2014-10 submissions - 4m51s 21m41s 1m02s 10 4,831,580 704.9 MB
2014-11 comments - 6m10s 105m02s 7m55s 93 46,118,074 3.9 GB
2014-11 submissions - 3m02s 25m10s 1m03s 10 4,735,396 699.8 MB
2017-11 submissions 2.2 GB 8m23s 79m11s 2m55s 21 10,377,379 1.3 GB
2018-01 submissions 2.4 GB 8m56s 94m49s 3m14s 23 11,306,843 1.4 GB
2019-11 submissions 5.5 GB 9m48s 26m01s 5m27s 43 21,243,315 2.2 GB
2023-12 submissions 15.2 GB 30m19s 76m33s 26m24s 79 39,245,797 4.4 GB
2024-01 submissions 16.1 GB 183m48s 126m12s 21m25s 83 41,263,034 4.7 GB
2024-02 submissions 15.2 GB 111m14s 119m06s 23m35s 79 39,030,731 4.4 GB
2024-03 submissions 16.0 GB 65m44s 116m50s 34m36s 84 41,926,097 4.8 GB
2024-04 submissions 17.4 GB 23m13s 25m48s 31m31s 82 40,701,692 4.7 GB
2024-05 submissions 15.3 GB 20m14s 104m42s 29m22s 80 39,877,545 4.7 GB
2024-06 submissions 15.0 GB 32m22s 92m02s 19m44s 77 38,423,968 4.6 GB
2024-07 submissions 15.9 GB 24m49s 116m01s 13m17s 82 40,924,291 4.8 GB
2024-08 submissions 16.2 GB 163m07s 104m37s 70m51s 83 41,477,291 4.9 GB
2024-09 submissions 15.0 GB 33m40s 91m13s 28m16s 78 38,596,941 4.7 GB
2024-10 submissions 15.5 GB 31m32s 100m29s 32m47s 78 38,986,522 4.8 GB
2024-11 submissions 15.1 GB 46m44s 108m24s 33m44s 77 38,022,413 4.7 GB
2024-12 submissions 15.4 GB 25m12s 82m18s 25m24s 77 38,188,278 4.8 GB
2025-01 submissions 16.2 GB 59m52s 73m59s 23m14s 80 39,905,721 5.1 GB
2025-02 submissions 14.8 GB 35m36s 64m07s 14m40s 73 36,173,238 4.7 GB
2025-03 submissions 16.3 GB 56m39s 77m30s 15m47s 80 39,637,989 5.2 GB
2025-04 submissions 16.1 GB 58m07s 79m49s 31m40s 78 38,776,405 5.2 GB
2025-05 submissions 17.2 GB 86m13s 91m27s 47m45s 82 40,563,699 5.4 GB
2025-06 submissions 17.2 GB 45m47s 82m33s 50m14s 81 40,417,850 5.4 GB
2025-07 submissions 17.3 GB 46m29s 82m56s 53m20s 86 42,900,690 5.8 GB
2025-08 submissions 17.5 GB 32m24s 91m24s 29m09s 89 44,170,194 5.8 GB
2025-09 submissions 16.7 GB 35m21s 87m53s 37m53s 85 42,368,178 5.4 GB
2025-10 submissions 17.0 GB 27m11s 80m00s 12m49s 88 43,919,437 5.5 GB
2025-11 submissions 16.7 GB 36m19s 84m30s 13m47s 86 42,567,338 5.3 GB
2025-12 submissions 17.0 GB 33m58s 90m19s 25m17s 86 42,861,083 5.5 GB
2026-01 submissions 18.3 GB 48m04s 98m58s 49m17s 93 46,094,846 5.9 GB
2026-02 submissions 16.6 GB 35m02s 84m21s 17m25s 83 41,434,105 5.3 GB

Query per-month stats directly:

SELECT year, month, type, shards, count, size_bytes
FROM read_csv_auto('hf://datasets/open-index/arctic/stats.csv')
ORDER BY year, month, type;

stats.csv columns:

Column Description
year, month Calendar month
type comments or submissions
shards Number of Parquet files for this (month, type)
count Total rows across all shards
size_bytes Total Parquet size across all shards
zst_bytes Original .zst source file size (from torrent metadata)
dur_download_s Seconds to download the .zst source
dur_process_s Seconds to decompress and convert to Parquet
dur_commit_s Seconds to commit to Hugging Face
committed_at ISO 8601 timestamp when this pair was committed

Pipeline Status

Pipelined ingestion running on doge-01 (4 cores, 6 GB RAM, 219 GB free). Auto-updated every ~5 minutes.

Started: 2026-04-01 02:51 UTC / Elapsed: 5h 20m / Committed this session: 5

Active Workers

Stage Month Type Progress
Processing 2014-12 comments shard 22 · 11.0M rows · 148.7K rows/s

Throughput

Metric Value
Download 32 Mbps avg
Processing 4.9K rows/s avg
Upload 228.0s per commit avg
ETA 2026-04-11 23:41 UTC

Progress

███████████████░░░░░░░░░░░░░░░ 246 / 490 (50.2%)

Metric This Session
Months committed 5
Rows processed 107.7M
Data committed 9.9 GB

Last update: 2026-04-01 08:10 UTC

Dataset card for Arctic Shift Reddit Archive

Dataset summary

A repackaging of the Arctic Shift monthly Reddit dumps into Parquet. Arctic Shift re-processes the PushShift Reddit archive, which captured most public Reddit content from the early days through the 2023 API changes.

Covers every public subreddit, every month, both comments and submissions. Built for research, analysis, and training. People use it for:

  • Language model pretraining and fine-tuning - one of the largest sources of natural conversation on the internet
  • Sentiment and trend analysis - two decades of public opinion on just about everything
  • Community research - thousands of subreddits, each with its own culture and moderation norms
  • Information retrieval - real questions and answers from r/AskReddit, r/explainlikeimfive, and others
  • Content moderation research - moderation signals are preserved in the data

Dataset structure

Data instances

Example comment:

{
  "id": "c0001",
  "author": "spez",
  "subreddit": "reddit.com",
  "body": "Welcome to Reddit!",
  "score": 42,
  "created_utc": 1134028003,
  "created_at": "2005-12-08T10:06:43",
  "body_length": 19,
  "link_id": "t3_17",
  "parent_id": "t3_17",
  "distinguished": null,
  "author_flair_text": null
}

Example submission:

{
  "id": "abc123",
  "author": "kn0thing",
  "subreddit": "reddit.com",
  "title": "The Downing Street Memo",
  "selftext": "",
  "score": 15,
  "created_utc": 1118895720,
  "created_at": "2005-06-16T01:02:00",
  "title_length": 23,
  "num_comments": 3,
  "url": "http://www.timesonline.co.uk/...",
  "over_18": false,
  "link_flair_text": null,
  "author_flair_text": null
}

Data fields

Comments (data/comments/YYYY/MM/NNN.parquet)

Column Type Description
id VARCHAR Reddit's base-36 comment ID
author VARCHAR Username. [deleted] if account was removed
subreddit VARCHAR Subreddit name (no r/ prefix)
body VARCHAR Comment text in Markdown
score BIGINT Net upvotes at time of archival
created_utc BIGINT Unix timestamp
created_at TIMESTAMP Derived from created_utc
body_length BIGINT Character count of body
link_id VARCHAR Parent submission ID (t3_... format)
parent_id VARCHAR Parent comment or submission ID
distinguished VARCHAR moderator, admin, or null
author_flair_text VARCHAR Author's flair in this subreddit

Submissions (data/submissions/YYYY/MM/NNN.parquet)

Column Type Description
id VARCHAR Reddit's base-36 submission ID
author VARCHAR Username of the poster
subreddit VARCHAR Subreddit name
title VARCHAR Post title
selftext VARCHAR Post body for text posts (empty for link posts)
score BIGINT Net upvotes at time of archival
created_utc BIGINT Unix timestamp
created_at TIMESTAMP Derived from created_utc
title_length BIGINT Character count of title
num_comments BIGINT Comment count on this post
url VARCHAR External URL for link posts, permalink for text posts
over_18 BOOLEAN NSFW flag
link_flair_text VARCHAR Post flair text
author_flair_text VARCHAR Author's flair

Data splits

Two named configs: comments and submissions. Each loads all monthly shards as a single train split.

You can also load specific years or months with data_files:

# Load just January 2020 comments
ds = load_dataset("open-index/arctic", data_files="data/comments/2020/01/*.parquet", split="train")

# Load all 2023 submissions
ds = load_dataset("open-index/arctic", data_files="data/submissions/2023/**/*.parquet", split="train")
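
For longer ranges it is easier to generate the data_files globs than to type them. A tiny helper (hypothetical) that emits one pattern per month over a half-open range:

```python
def month_patterns(kind, start, end):
    """Glob patterns for each month in [start, end).

    start and end are (year, month) tuples; kind is "comments"
    or "submissions".
    """
    y, m = start
    patterns = []
    while (y, m) < end:
        patterns.append(f"data/{kind}/{y:04d}/{m:02d}/*.parquet")
        y, m = (y + 1, 1) if m == 12 else (y, m + 1)
    return patterns


# Q4 2023 comments -> three patterns, 2023/10 through 2023/12
patterns = month_patterns("comments", (2023, 10), (2024, 1))
```

The resulting list can be passed straight to data_files in load_dataset.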

Dataset creation

Why we built this

Reddit is one of the best sources of real human conversation on the internet, but getting at the full archive got a lot harder after Reddit locked down API access in 2023. The Arctic Shift project preserves the data as monthly .zst JSONL dumps. We convert those dumps to Parquet on Hugging Face so you can query with DuckDB, stream with datasets, or bulk download without any special tooling.

Source data

Everything comes from Arctic Shift torrent archives, which re-process the PushShift Reddit dumps. Source format is .zst-compressed JSONL, one JSON object per line.

  • 2005-12 through 2023-12: From the Arctic Shift bundle torrent
  • 2024-01 onward: Individual monthly torrents from Arctic Shift

Processing steps

The pipeline is written in Go and uses DuckDB for the Parquet conversion. For each (month, type) pair:

  1. Download the .zst via BitTorrent with selective file priority (only the needed file from the bundle, not the whole archive)
  2. Stream through a klauspost/compress zstd decoder with a 2 GB window
  3. Chunk the JSONL into ~2 million line batches, writing each to a temp file
  4. Convert each chunk to Parquet with DuckDB read_json_auto, explicit column selection, TRY_CAST, Zstandard compression, 131K-row row groups
  5. Delete each temp chunk right after the shard is written (disk is tight)
  6. Commit all shards plus updated stats.csv and README.md to Hugging Face
  7. Clean up local shards after the commit goes through
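
The chunking in step 3 is what keeps disk usage bounded. A rough stdlib sketch of that step alone (the real pipeline is Go; this is illustrative, not the actual implementation):

```python
import itertools
import tempfile


def write_chunks(lines, lines_per_chunk=2_000_000, out_dir=None):
    """Split an iterator of JSONL lines into temp files, yielding paths.

    The caller deletes each chunk once its Parquet shard is written,
    so at most one chunk sits on disk at a time.
    """
    it = iter(lines)
    while True:
        batch = list(itertools.islice(it, lines_per_chunk))
        if not batch:
            return
        with tempfile.NamedTemporaryFile(
            "w", suffix=".jsonl", dir=out_dir, delete=False
        ) as f:
            f.writelines(
                line if line.endswith("\n") else line + "\n" for line in batch
            )
            yield f.name
```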

The pipeline picks up where it left off - stats.csv tracks what has been committed, and those pairs get skipped on restart. Disk usage stays minimal: at most one .zst, one JSONL chunk, and the current month's shards on disk at a time.
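
The resume check is just a set lookup against stats.csv. A sketch of the skip logic, assuming the column names documented above:

```python
import csv
import io


def committed_pairs(stats_csv_text):
    """Return the set of (year, month, type) pairs already committed."""
    return {
        (int(r["year"]), int(r["month"]), r["type"])
        for r in csv.DictReader(io.StringIO(stats_csv_text))
    }


def pending(all_pairs, stats_csv_text):
    """Filter out pairs stats.csv says are done, preserving order."""
    done = committed_pairs(stats_csv_text)
    return [p for p in all_pairs if p not in done]
```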

No filtering, deduplication, or content changes. The data matches the Arctic Shift dumps exactly. All Parquet files use Zstandard compression.

Personal and sensitive information

Usernames and user-generated text are included as they appeared publicly on Reddit. Deleted accounts show as [deleted], deleted content as [removed].

No PII scrubbing has been done. At this scale, the dataset almost certainly contains personal information that people posted publicly. If you find something that should be removed, open a discussion on the Community tab.

Considerations for using the data

Social impact

Making the full Reddit archive accessible in a standard format should help researchers study how online communities work, how language changes over time, and how one of the internet's biggest platforms has shaped public discourse.

Biases

Reddit skews young, male, English-speaking, and North American/European. Subreddits vary wildly in culture, moderation, and toxicity. The voting system amplifies what each community already agrees with.

We did not filter, score, or assess the data in any way. Controversial, toxic, and NSFW content is all in there. Apply your own filtering for your use case.

Known limitations

  • Completeness depends on PushShift. PushShift missed some content, especially in the earliest months and during ingestion outages.
  • Scores are snapshots. The score field is whatever PushShift captured at the time, not the final score.
  • Deleted content. Posts deleted before PushShift got to them are gone. Posts deleted after capture may still have the original text.
  • No user profiles. Just posts and comments. No karma, no account metadata.
  • Markdown and HTML. Comment bodies use Reddit's Markdown variant. Some old content has raw HTML.
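
Given these caveats, most downstream uses start with a small filter. A minimal sketch of a comment predicate (field names follow the schema in this card; what counts as usable is your call):

```python
def is_usable_comment(row):
    """Drop deleted/removed comments and empty bodies before training."""
    body = row.get("body") or ""
    return (
        row.get("author") != "[deleted]"
        and body not in ("", "[deleted]", "[removed]")
    )


rows = [
    {"author": "spez", "body": "Welcome to Reddit!"},
    {"author": "[deleted]", "body": "[removed]"},
]
kept = [r for r in rows if is_usable_comment(r)]  # keeps only the first row
```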

Additional information

Licensing

Reddit content is subject to Reddit's Terms of Service. Arctic Shift distributes the archive under permissive research terms. This repackaging is provided as-is for research and education.

Not affiliated with or endorsed by Reddit, Inc. or Arctic Shift.

Thanks

All the data here comes from Arctic Shift, which preserves and distributes the PushShift Reddit archive through Academic Torrents. None of this would be practical without their work.

Contact

Questions, feedback, or issues - open a discussion on the Community tab.

Last updated: 2026-04-01 08:11 UTC
