Modern software is junk. Almost every program uses vastly more resources than it needs, and does its main task worse than older, more focused programs.

I don't think I have a single "new" program that's as good as the thing it replaced, nothing as good and light as the stuff we had 30 years ago. So where possible I use 30-40 year old software, and I resent the complex stuff I have to deal with. It's polluting the planet, boiling the oceans.

Case 1

This blog is in WordPress, which is PHP on a giant tower of shitty software, plus like 20 "plugins" to fix things that are inadequate and wrong in it. I've done what I can to lighten it some and streamline the layout, but that's lipstick & yoga pants on a pig. 25 years ago I had a simple blog (uh, actually also in PHP, tho I had another one in Perl, so that's not any better). But that was hardly any code: it just needed a tiny local database, and really could've just used flat files. And before the blog, I had just my hierarchical web site, and before that I had Gopher.

Gopher was basically perfect. Just a structured tree of documents, accessed by raw socket connections or manually by telnet. If you wanted to make a journal ("web log" -> "blog" was a decade away), you put links to plain text entries on a Gopher menu.
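Each menu line is one item-type character (i for info text, 0 for a plain text file, 1 for a submenu, g for a GIF), then the display string, selector, host, and port, separated by tabs, with a lone dot ending the listing: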

    iMark's Gopher Hole	_	_	0
    gMugshot	/images/mark.gif	example.com	70
    1Games	/games	example.com	70
    iJournal	_	_	0
    01990-09-01	/journal/1990-09-01.txt	example.com	70
    01990-08-25	/journal/1990-08-25.txt	example.com	70
    .

etc. Actually at the time I probably would've done chronological order, not reverse.
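And the protocol is shallow enough that a usable client fits on one page. A minimal fetch sketch in C (example.com and the selector are placeholders from the fake menu above): connect, send the selector plus CRLF, print whatever comes back.

    /* Minimal Gopher fetch, a sketch: connect, send selector + CRLF,
       print the reply. example.com and the selector are placeholders. */
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <sys/socket.h>

    int main(void) {
        struct addrinfo hints = {0}, *res;
        hints.ai_socktype = SOCK_STREAM;
        if (getaddrinfo("example.com", "70", &hints, &res) != 0) return 1;
        int fd = socket(res->ai_family, res->ai_socktype, res->ai_protocol);
        if (fd < 0 || connect(fd, res->ai_addr, res->ai_addrlen) < 0) return 1;
        /* The whole "request" is one selector line; an empty selector
           fetches the root menu. */
        const char *req = "/journal/1990-09-01.txt\r\n";
        write(fd, req, strlen(req));
        char buf[4096];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0)
            fwrite(buf, 1, (size_t)n, stdout);
        close(fd);
        freeaddrinfo(res);
        return 0;
    }

That's the entire protocol; Gemini needs a TLS library before it can even say hello.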

We have Gemini now trying to be like Gopher, but it has TLS, a more complex connection protocol, and error messages (Gopher just responded "3" if something went wrong, possibly followed by a message). And the page you get back is presentation, not a menu: it doesn't tell you the content type of any link, and it tries to style content in-line, like a lower-resource WWW. But to run Gemini you need a server that keeps its TLS certificates updated; it won't stay up without constant maintenance, and it uses more resources than just serving a web page.

Case 2

Mastodon is a giant database that constantly messages other databases to tell them about posts… and it still sometimes takes a while to propagate messages, or fails utterly. There's no markup except URLs, and you can attach either a poll or images (not both, and they aren't inline). The only control you have over your experience is blocking people and crude text-match filters.

30 years ago, we had USENET, email, and IRC/ICB chat. USENET was often slow: some servers would only connect once a day, others every hour, some every 15 minutes or so, and you might need a couple of hops to get to someone. But your message length was unlimited, and most clients handled some markup with *bold*, /italic/, _underline_, and <URLs and FTP hostnames>. Images had to be uuencoded, but most clients could insert them easily; graphical ones could display them inline and download them, while I used text-only strn, so I'd download and run xv to see images. But the power we had in those clients was so much better. strn did scoring: I had thousands of lines of regular expressions and header matches, scoring up or down. I'd go into a newsgroup, and the best stuff would be at the top, mediocre stuff below it if I cared, junk and spam and assholes deleted.
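The mechanism is trivial to sketch: match regexes against an article's header lines, sum the weights, sort best-first. A sketch of the idea (the patterns and weights here are invented, and this is not strn's actual scorefile syntax):

    /* Header scoring sketch: sum the weight of every rule whose regex
       matches a header line. Rules are invented examples, not strn's
       scorefile format. */
    #include <regex.h>
    #include <stdio.h>

    struct rule { const char *pattern; int weight; };

    static const struct rule rules[] = {
        { "^From:.*@spam\\.example",    -9999 },  /* killfile */
        { "^Subject:.*\\[ANN\\]",         +20 },  /* announcements float up */
        { "^Subject:.*MAKE MONEY FAST",  -500 },  /* spam sinks */
    };

    int score_article(const char **headers, int nheaders) {
        int score = 0;
        for (size_t r = 0; r < sizeof rules / sizeof rules[0]; r++) {
            regex_t re;
            if (regcomp(&re, rules[r].pattern, REG_EXTENDED | REG_NOSUB) != 0)
                continue;
            for (int h = 0; h < nheaders; h++) {
                if (regexec(&re, headers[h], 0, NULL, 0) == 0) {
                    score += rules[r].weight;
                    break;  /* each rule counts once per article */
                }
            }
            regfree(&re);
        }
        return score;
    }

    int main(void) {
        const char *headers[] = { "From: x@spam.example",
                                  "Subject: MAKE MONEY FAST" };
        printf("%d\n", score_article(headers, 2));  /* -10499, straight to the bin */
        return 0;
    }

Sort articles by descending score and the good stuff is at the top, exactly as described.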

If you wanted to immediately contact someone, email or chat existed. There's an experimental chat system on Pleroma, but not on Mastodon yet/ever. Or you can use the modern equivalent and burn 1 GB of RAM and a CPU core running Slack or Discord. Madness.

Case 3

Emacs. Eight-hundred Megs And Constantly Swapping. Is emacs the original sin, or were there flotilla-of-shit programs before it? Back in the day, you could start micro-emacs ("me" on Atari ST, later uemacs) in milliseconds, while emacs took many tens of seconds or even minutes. The emacs people would just leave this giant blob of an interpreter, editor, and half an operating system (but not really) running all day, eating most of the available RAM and CPU, and load files into it. The me and vi people would instantly open a file, edit, and close, barely a blip on the system's resources. 30 years later, uemacs starts in nanoseconds, and emacs starts in seconds, but it's just as obnoxious.

Today I use BBEdit, which is svelte for an IDE, but it's a giant pig compared to what "a text editor" needs to be; I keep trying other IDE-types like Sublime Text or Atom, and they're too heavy for me to tolerate. And in console, I run Vim, which isn't as bloated as emacs, but it's fat. None of these make me happy. STeVIe was much lighter, and I've repeatedly considered going back to it if I can recompile it. I did manage to compile Linus' build of uemacs and it's nice, but I can't get used to it again after 25-ish years off it; my console habits are vi, it seems.

Resolved

The end goal of software is not to put everything in it: a flight simulator in your spreadsheet (fucking Excel!); a computer in your fridge for playing ads; a web server, email client, and text editor in your math program's "notebook"; a fucking NFT miner in your MS Paint clone.

The end goal of good software is to do ONE THING. To do it fast, efficiently, and correctly, with the fewest resources it can.

Re-evaluate your use of flotilla-of-shit software, and dump it.