[–] QualifiedKitten@lemmy.world 32 points 1 day ago (3 children)

I know of at least one US government web page that still references "Gulf of Mexico", but I don't want to link it, because I'm very curious to see how long it can fly under the radar. I have a thing set up that checks the page regularly and will alert me whenever it changes.
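A minimal version of that kind of monitor, in Python (the URL and the print()-based "alert" are placeholders, not my real target or notifier):

```python
# Minimal page-change monitor using only the Python standard library.
# The URL and the print()-based "alert" are placeholders.
import hashlib
import time
import urllib.request

URL = "https://example.gov/some-page"  # placeholder target
CHECK_INTERVAL = 60 * 60  # seconds between checks (hourly)

def fetch_hash(url: str) -> str:
    """Download the page and return a SHA-256 digest of the body."""
    with urllib.request.urlopen(url) as resp:
        return hashlib.sha256(resp.read()).hexdigest()

last = fetch_hash(URL)
while True:
    time.sleep(CHECK_INTERVAL)
    current = fetch_hash(URL)
    if current != last:
        print("Page changed!")  # swap in a real notification (email, push, etc.)
        last = current
```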

Is there a way to set up archive.org or something like that to save regular snapshots without risking drawing more attention to it?

[–] dan@upvote.au 4 points 17 hours ago* (last edited 17 hours ago)

Install ArchiveBox. Even if you don't have a home server or VPS, you can run it on your regular PC - it's just a Docker container, or if you don't like Docker, you can run the Python code directly. http://archivebox.io/

That way, it's under your full control, and you keep all the data.
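If you want the snapshots to happen automatically, a small wrapper loop is enough. A minimal Python sketch, assuming `archivebox` is on your PATH and you've already run `archivebox init` in the archive directory (the URL is a placeholder):

```python
# Re-archive a page on a schedule by shelling out to the ArchiveBox CLI.
# Assumes `archivebox` is on PATH and this runs inside a directory where
# `archivebox init` was already executed. The URL is a placeholder.
import subprocess
import time

URL = "https://example.gov/some-page"  # placeholder target

while True:
    # `archivebox add` saves a fresh snapshot of the URL into the archive
    subprocess.run(["archivebox", "add", URL], check=True)
    time.sleep(6 * 60 * 60)  # wait six hours between snapshots
```

IIRC ArchiveBox also ships a built-in `schedule` subcommand that can re-archive a URL on a timer without any wrapper script.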

For tracking changes to sites, changedetection.io is free and open-source if you self-host it; only their remotely hosted version costs money.

[–] IDKWhatUsernametoPutHereLolol@lemmy.dbzer0.com 7 points 1 day ago* (last edited 1 day ago)

Use archive.is and manually save it every hour?

archive.org is subject to takedown requests, so it's pointless.

[–] Duamerthrax@lemmy.world 1 point 21 hours ago

Set up a script that uses wget to grab a copy of the page every six hours or so?
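Something like this, run from cron, would do it. A Python stand-in for the wget approach, so it's one self-contained script (the URL is a placeholder):

```python
# Python stand-in for `wget` in a cron job: saves one timestamped copy
# of the page per run. The URL is a placeholder.
import urllib.request
from datetime import datetime, timezone

URL = "https://example.gov/some-page"  # placeholder target

stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
with urllib.request.urlopen(URL) as resp:
    body = resp.read()

with open(f"snapshot-{stamp}.html", "wb") as f:
    f.write(body)

# crontab entry to run it every six hours:
# 0 */6 * * * /usr/bin/python3 /path/to/snapshot.py
```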