I was in school in 1993, and I remember it well. Our school was one of the few in my hometown (north-east India) that got computers.
Unfortunately, we had too many students per computer during classes. I started a revolt, arguing that “Computers are wasting our study time, as our upcoming board exams are more important.” The whole class signed the petition, and the School Head had to schedule a class-wide talk. He agreed to make the classes entirely optional, to the point of, “If you really want, you can be part of it. But yes, studying for the exams is more important.”
So the computer classes ended up with just me (the traitor), a friend from Kerala, and the school head’s daughter. We ended up with about three computers each at our disposal. I wrote a game-ish QBasic program to impress my first girlfriend — she used the arrow keys to launch dots at areas on a heart-shaped thingy on the screen, and it printed her name. I remember using physical graph paper to work out the screen pixels (I think), or co-ordinates, to calculate the strike areas.
Oh, and yes, almost all of my classmates remember me for being that traitor.
I've been half-joking lately that if I wrote an OS, I'd call it Nostalgia OS. I'd aim for a UI reminiscent of Windows 98 / Windows 2000 / Snow Leopard, with human-interface guidelines and a rich, clear, cohesive set of UI widgets to build applications with. I think that was the peak computing user interface - at least as I experienced it.
Of course, the kernel would be capability-based (probably seL4). Applications would probably ship as WASM bundles. And instead of a file system, I'd have a built-in, local-first user database built around CRDTs and the like, kinda like a modern Lotus Notes. But for the UI? That era was great.
My favourite OS X era in terms of visual design was Panther/Tiger, personally. Leopard looked good, but there was something really cheerful and friendly about Tiger. The iPods of those days were also really well thought out, in my opinion.
Definitely a world apart from the utilitarian Windows 98 UI.
To some extent (not capabilities), Haiku fits the bill here (https://en.wikipedia.org/wiki/Haiku_%28operating_system%29). Applications are bundles (but not WASM of course). The UI is very clean. The whole OS is also elegant and very fast on modern hardware.
Yes, because BeOS was way ahead of its time. A completely new OS, doing all the system-level things so much more efficiently that it could afford to spend CPU time on high-level flourishes, like moving windows in real time while they were playing video.
On 1993 hardware, that was impossible with Windows or OS/2.
At that time, there was almost no spam, because we could report it to abuse@domain...
There was no firewall in front of our campus, and we were using rlogin to connect outside. I used export DISPLAY from Brest to Paris (to run xdvi, which wasn't installed locally).
IMO, security (ssh, and the killing of rlogin) is the one change since then that is really useful progress.
Pre-Wikipedia days means you would likely have found me hunched over a 14" VGA monitor as a kid, reading Microsoft Encarta articles about giant devil rays and Mansa Musa.
I adored Encarta. My aunt had the multi-CD set; I still vividly remember the animated landing page, clicking on cool pictures to navigate, and having to switch to CD #2 because the article you wanted wasn't on the first one.
It genuinely felt magical, and I totally credit it with my fascination with STEM; I would spend hours browsing around.
I don't think I was ever happier as a programmer than I was in the early 90s, well before university, with no thought of the internet, writing games on my Amiga.
Incidentally, I recently replayed Loom, from a bit before that era. It's still a lovely, wonderful game! Such a shame the fan-made sequel (Forge) seems to have died.
Same, but it was the late '90s, and I had found a QuickBASIC compiler on our Windows 95 laptop. It had the ability to compile EXEs, and I felt like I'd discovered some old magic. Like, this thing was just sitting here? All I had to do was double-click qb45.exe?? And it opened a fully integrated terminal IDE???
The built-in syntax help, which was incredible, and some example programs (no idea why our laptop had these) let me teach myself.
Now I sit all day and write bare metal firmware, but it feels so empty.
"Talk to me about Loom!" That was my introduction to it.
I believe it was reasonably popular back in the day - it made money, at least. Now it would be almost avant-garde in its slowness, I guess. You can't even double-click to run. The popular art of yesteryear becomes the high art of today...
I had two faster (Pentium 3) machines, but firing up my dad's old Pentium 90 running Red Hat, which I'd installed so I could use KDE to write C, was just a beautiful feeling, and it really got me coding for a few years.
Even games back then were at least partially open (Quake) or easily moddable (Command & Conquer, Tribes, etc.), so I went down that rabbit hole soon after.
OP has a good point, but for my part, I'd rather we'd skipped the 90s and picked up again much later. I like to think of the 8-bit era as an early bronze age of computing: lots of things went right and were done right.
16-bit, to me, is the dark ages. Lots of confusion; not much good came out of it, technologically or aesthetically. God, everything was ugly. Maybe all the trials and tribulations were necessary for what was about to come, but I like to believe they weren't.
32-bit to me is the golden age, and 64-bit is platinum.
If you offered me a time machine to go back, I'd surely say, "No, thank you!" There hasn't been a better time than now - but if you forced me at gunpoint, I'd pick the 80s over the 90s any time.
I agree that 32-bit is the golden age, but 64-bit is enterprise bloat. I personally would go for 1995 over 2005, but I think 2005 was a lot better than 2015 in terms of interfaces.
I instead think the mistake was the internet. If I were a Time Lord, I'd find a way to cripple internet tech (Three-Body Problem style) so it never managed better than 56k.
There was plenty of amazing stuff going on with computing in the '90s. You just had to know where to look. Do you consider the 68000 CPU to be 16-bit, 32-bit, or both?
The 68000 was introduced in 1979; to me it is part of the 80s.
But you are making a good point. Maybe the 8/16/32/64-bit distinction isn't really helpful. I loved the Amiga, but I loathed all the XT/AT segmented-memory, bitplaned-VGA 16-bit stuff. That, to me, is the deep dark ages.
> I loved the Amiga but I loathed all the XT/AT segmented memory bitplaned VGA 16-bit stuff. That to me is the deep dark ages.
I get the sentiment, but I have to nitpick the details ;)
VGA isn't bitplaned. It's chunky -- to put a pixel on the screen in VGA's 320x200, 256-colour mode 13h, you literally just write a single byte into VGA memory. The Amiga, on the other hand, does use planar graphics.
(Maybe you're thinking of EGA, which is planar and a pain to program for)
I do admit to suffering from nostalgia, but I also think it is the case that everything useful you could do with a computer was doable in 1993, and anything that you couldn't is probably a bad idea.
Well, people showed up randomly at your door, completely unannounced. And as for the expectation of being available: at least where I grew up, mobility was much lower and transportation was difficult. You bet there was an expectation of availability. Was it better or worse? I couldn't say; it was surely different.
The GSM spec is some years older than that, because you need time to build networks and phones based on it.
In Germany, GSM started in 1992, and we already had a decent nationwide analog mobile phone network (mostly car phones, because of the hardware requirements) called the "C net". It had been popular since the '80s and was only shut down in 2000, when GSM was widely established with three private networks (which were also cheaper by then).
Gotta love the German creativity in naming. It's called "C net" because it came after "B net", which came after "A net" - where "A" apparently stood for "Autotelefon" (car phone).
And when D-Net came, T-Mobile called their service D1, and Vodafone called theirs... D2.
Close enough. I had a copy of Apple Smalltalk for the Mac from a 1985 training course I took, and had started a job using Objectworks. Squeak 1.0 is quite close to Smalltalk-80.
Well the games of my childhood, Total Annihilation and Seven Kingdoms, came out in '97. So for my own selfish reasons I would argue that the Pentium II is where progress should have stopped.
On the other hand, at least I'll get to play Castle of the Winds.
(I also loved Z: Steel Soldiers, but despite the '01 release date, I'm sure it too would have run on a Pentium II).
I’ve often had similar thoughts, but in my mind it tends to be the Windows XP / Mac OS X 10.3 era. Those systems were super capable and responsive, but the web moved at a slower pace. A good balance between computing power and humanity.
I'm old enough to remember the 1990s. Many of us who do consider it the last good decade. Living was cheap. The previously ever-present threat of nuclear annihilation had seemingly abated. This was before the dot-com crash and, obviously, the War on Terror that has dominated the 21st century thus far.
I have fond memories of the 486 era, which was really the early 1990s. I'm kinda surprised the PC component of this isn't mentioned here. It was also peak Borland.
It does mention Windows NT but honestly nobody really cared about that until NT 3.0/3.5 and it soon thereafter became Windows XP and laid the foundation for modern Windows.
1993, IIRC, had pre-1.0 Linux. I downloaded a distribution (SLS) onto ~30 5.25" floppy disks around that time.
But I really wonder whether the tech was sufficiently good at that time, or whether it's simply the tech we had when life was sufficiently good. 1993 was before the dot-com bubble started; that's true. And I guess with more computing power came a lot of the things many people dislike now: ads, news feeds, social media, micro-transactions, etc.
But we also have YouTube, video streaming, digital maps and navigation, search engines, and a host of other things that are genuinely good.
This stuff was also fantastically expensive (in inflation-adjusted dollars). We shouldn't forget that either.
> It does mention Windows NT but honestly nobody really cared about that until NT 3.0/3.5 and it soon thereafter became Windows XP and laid the foundation for modern Windows.
Fun fact: NT 3.1 was the first version of NT, released in 1993. It was versioned like that to match Windows 3.1 which had been released the previous year.
And NT really took off with Windows 2000. Not just business people but more ordinary people were using it as a more stable alternative to Windows 95/98 (albeit lacking some compatibility, especially with games).
For me, YouTube is only nice because of the decades-old content that people have put on it. But that is because no such quality content is made in the world anymore, and that is partially because of the enshittification brought on by the internet.
If that were not the case, YouTube wouldn't be that big of a deal. Let me disclose here that I am not a big fan of "on-demand" content.
I'm always amazed how YouTube can be so many different things for different people... It's true that it used to be better a few years back, but people still upload great content, even if it's harder to find nowadays.
Also, music... back in the '90s, if you were drawn to the obscure side of music, you'd read about it and could, at best, imagine what it was like, because your local record store didn't have it, the bigger store the next town over didn't have it, and if anyone could order it at all, it was with a non-refundable down payment.
Nowadays you can probably find it on YT, and that's great IMHO. My musical horizon would be so much more limited without it.
I recall sitting around all afternoon to tape Layla off the radio during a repeat countdown, after hearing it for the first time the day before. The DJ cut in during the fade-out with "Indeed..." and forty years later I still can't listen to that song without hearing him at the end.
My musical discoveries exploded with the internet, I can't imagine what I would have missed without it.
> I still can't listen to that song without hearing him at the end.
Something similar happened with me and "Another Day in Paradise". The first time I heard it was from a cassette my friend recorded from Dubai radio, accidentally prefixed with an intro by the radio host. And that intro still comes to mind whenever I hear the song.
>My musical discoveries exploded with the internet
I don't really have nostalgia for that; honestly, I prefer the immediacy.
Nowadays people are captured by music differently, as they were captured by music differently before music could be mechanically or digitally reproduced.
For me in the 90s, it was the satellite dish and VHS that opened up the world in terms of content: music channels, movies, channels like Cartoon Network, MTV, Viva & Viva Zwei, and so on. And then the internet came, for me, in '97 or '98.
1993 was before the west entered the last stage of capitalism. It was a time when companies still competed on products rather than using monopolistic force to squeeze ever more revenue out of the same people by turning every life necessity into a subscription. Similarly, it was a time when you could mail-order a house and build it yourself. Rental prices were low because there was no regulatory capture on housing construction yet.
Where I disagree with you is video streaming. In my opinion, YouTube and the commercialisation of holiday memories (which later became Instagram influencers) were the beginning of widespread depression. Seemingly regular people sharing their exceptional lives somehow force everyone else to compare themselves to the dreams presented on YouTube; most people will come up short, and then most people will feel insufficient. I believe that's why early YouTube ads were so powerful: your ad for exotic goods would play immediately after the viewer became painfully aware of how boring they are when measured against the top 0.1% on a global scale.
I never understand why people want to label such eras of capitalism as "late" or the last era of capitalism. The "late" stage was late only relative to its own death, and this isn't the last stage either; there is plenty more room to grow. Capitalism is more akin to an indestructible, rapidly mutating organism than an ideology.
Two years before I moved across into IT. This was when I was a graphic designer, making magazines and nursing journals with tools like Aldus PageMaker and QuarkXPress. Those were fantastic times. It felt like we could do anything with computers.
I was eyeing a career in IT and moved across soon after, and was dumped into Novell NetWare 3.12 land, which was an eye-opener (Fire Phasers, anyone?).
2009, please, sorry. 1993 is fun and all - I can go relive the dreams of Microsoft Encarta - but 2009 has Mac OS X 10.6, gigabit Ethernet everywhere, and USB.
I'd be OK with it if we could use all the insights and software from after that; we'd just be constrained by the hardware - the Amiga 4000, an Apple Mac Quadra 840AV with 128 MB, a 66 MHz Pentium, or a 486 DX2-66 with 64 MB of RAM.
On the Amiga 4000 and the Mac you could run their native OSes, or a modern-day NetBSD or a BeOS version (Haiku wouldn't run).
On the Pentium 66, modern FreeDOS would run, as would Windows 3.11 + Win32s + Calmira, Windows 98 SE, Windows 2000 SP5, XPlite, NetBSD, and perhaps even an absolutely stripped-down Linux 6 kernel with all the handy features.
You could program in modern i586 builds of FreePascal, Zig, Lua, Juniper, MicroPython/Shed Skin, mruby/Natalie, and PicoRuby - perhaps a low-memory JVM could run too (OpenJDK 8 or JamVM).
It'd be possible to use SQLite, raylib, and r3d-freepascal for efficient 3D games/apps, next to Quake/DarkPlaces, Doom 2/GZDoom, Duke3D, Counter-Strike 1.5, Half-Life, perhaps Unreal, Irrlicht, and FTEQW.
A modern LambdaMOO (ToastStunt/Moor) could be made to run, along with the Inform 6 and TADS 3 interactive-fiction compilers.
Fairy-Stockfish could be compiled - so, enough creativity for gaming.
I am just worried about getting Vassal, the Java board-game engine, to run comfortably... and the resolution would be pretty low for playing many board games comfortably: the highest ATI card could drive 1280x1024. I would really like to use it so I can skip the real-world board-game setup time. Keldon's Race for the Galaxy AI would compile and run, though.
For internet, Dillo+ supports HTTPS, Gemini, and Gopher:
https://github.com/crossbowerbt/dillo-plus
It can also be used to browse ZIM offline Wikipedia files with kiwix-serve.
Now my only real problem is that we wouldn't have GenAI - probably ever. Would that be a blessing or a curse?
---
Handy Linux kernel 6 tweaks for low memory situations if you don't want to run NetBSD (which perhaps would be the best choice):
* zram + zstd
zram creates a compressed block device in your RAM. To the OS it looks like a regular swap partition, but it lives entirely in memory.
When your 64 MB fills up, the kernel sends data to /dev/zram0. The data is compressed (usually around a 3:1 ratio) and stored back in a small slice of your RAM.
Version 6.19 includes better compression ratios and rebalancing, which keeps the CPU from overworking itself on decompression.
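For reference, a zram swap device of this sort is configured at runtime through sysfs; a minimal sketch (assuming a kernel built with zram support and root privileges; the 32M size is just an illustration for a 64 MB machine):

```shell
# Load the zram module and create one device.
modprobe zram num_devices=1

# Pick zstd as the compression algorithm, then size the device.
echo zstd > /sys/block/zram0/comp_algorithm
echo 32M  > /sys/block/zram0/disksize    # illustrative size

# Format it as swap and enable it with high priority, so the kernel
# prefers compressed RAM over any disk-backed swap.
mkswap /dev/zram0
swapon -p 100 /dev/zram0
```

The high `-p` priority matters when a disk swap partition also exists: the kernel fills the zram device first.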
* zswap + zstd
zswap is a compressed front-end cache for a physical swap file on your hard drive.
It intercepts pages headed for the disk, compresses them, and keeps them in a RAM pool. If that pool gets too full, it evicts the oldest compressed data to the actual disk. Starting in 6.18, zswap uses the zsmalloc allocator by default, which reduces internal fragmentation - it packs the compressed bytes tighter.
* frontswap
API that allows the kernel to intercept swap-outs and store them in a transient memory pool; it works with Zswap to keep the system responsive during high load.
* Maple tree
Replaces old "Red-Black trees" for memory management; it reduces the CPU cycles needed to find data in RAM.
* SLUB sheaves
A modern memory allocator optimization that packs small objects into "sheaves" to reduce fragmentation.
* CONFIG_SLOB_BERBER
A specialized 2025 backport of the old "SLOB" allocator; a memory-efficient way to handle kernel objects, saving roughly 1-2MB of overhead compared to the standard SLUB used in modern PCs.
* Ext4 without journaling
Disable the "Journal" to save RAM and disk writes; it provides the best file-allocation speed without the memory overhead of Btrfs.
* Reiser4 patch
An efficient file system for small files; it packs them directly into the tree nodes, which saves disk space and reduces I/O.
* KSM (Kernel Shared Memory)
Scans RAM for identical pages (like duplicate library code) and merges them into one; it’s a "free" RAM upgrade if you run multiple instances of the same program.
* HZ tuning
Manually setting CONFIG_HZ to 100 (instead of the modern default of 1000); this reduces the number of times the CPU "wakes up" per second, saving precious cycles for actual work.
* DevTmpfs
Automates device node creation entirely within the kernel; it saves you from running a heavy udev or mdev daemon in userland, freeing up roughly 2–5MB of RAM.
* LZ4 Compression for Kernel/Initramfs
Using LZ4 instead of xz, gzip or zstd for the kernel image.
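Several of the tweaks above map directly to standard Kconfig symbols; a sketch of the relevant .config fragment (these are the mainline symbol names, but exact availability depends on the kernel version):

```shell
CONFIG_KERNEL_LZ4=y       # LZ4-compressed kernel image
CONFIG_RD_LZ4=y           # LZ4 initramfs support
CONFIG_HZ_100=y           # 100 Hz timer tick
CONFIG_HZ=100
CONFIG_ZRAM=y             # compressed RAM swap device
CONFIG_ZSWAP=y            # compressed cache in front of disk swap
CONFIG_ZSMALLOC=y         # allocator used by zram/zswap
CONFIG_KSM=y              # merge identical pages across processes
CONFIG_DEVTMPFS=y         # kernel-managed /dev, no udev needed
CONFIG_DEVTMPFS_MOUNT=y
```

Note that KSM only merges pages for processes that opt in via madvise(MADV_MERGEABLE), so enabling the option alone costs little.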
> I'd be ok with it if it's possible to use all the insights and software from after that, we'd just be constrained by the hardware - the Amiga 4000, Apple Mac Quadra 840av 128 MB or a Pentium 66 MHz or the 486 DX2-66 with 64 MB ram.
Can't we achieve something like this now with microcontrollers like ESP32 or RP2040?
This project runs a ca. 1990 scientific workstation (not just a PC) on an RP2040:
HTML/HTTP/URL (1993-ish) was a massive win over Gopher.
It later turned bad, with way too much complexity introduced in a symbiosis between Google and a relatively small group of prolific key web-standards people.
It was a variation on embrace, extend, extinguish, but executed through a combination of open standards and sheer complexity.
What this is really saying is that we abandoned well-engineered, thought-out platforms for piles of hacks, and superior, clean languages for inferior but popular ones.
The problem with a lot of those beautiful systems is that they were neither free nor easy to use. The things that won were one or both of those.
For hardware the things that won, like the PC platform, had scale, and therefore won on price performance. A lot of the hardware mentioned here was priced for enterprise. Platforms that were only available in those price brackets and never fell in price either died or stayed extremely niche (s390x).
A second problem with the nice software is that a lot of it was not ported to the cheap hardware.
For software, free isn’t just about money. It’s also about virality. A free OS or language implementation can just be copied. You don’t have to ask permission.
The only non-free software that won was easy to use. Free software is usually still unable to achieve that, and people will pay for ease of use. The nice, well-engineered stuff, though, was usually still arcane.
I think “worse is better” can be explained by these things. Worse is not better for some non-obvious systemic or evolutionary reason. Worse is cheaper, or free, and therefore gets scale and can spread virally.
Linux was free. The web was free. C compilers were free and Java and JS were free. Windows and macOS and later phones were easy to use and still cheaper than those enterprise grade things.
I enjoy having a computer that allows me to create all kinds of things that weren't possible in 1993... mash together all kinds of audio, video, and text... put it in a backpack, bring it somewhere, and perform on stage, with an $800 laptop. Amazing.
I'm one of those "Encarta kids" who dug through Encarta for nights on end while the parents were out, and I still spend slow Sundays reading random Wikipedia articles.
Having the archives created since 1993 - Wikipedia, YouTube (to me still one of the most amazing music-discovery tools I've ever encountered), Archive.org, Google Scholar, Zenodo - at my fingertips has probably widened my personal horizon beyond imagination. Not sure who I'd be without it.
So it's even sadder to see it all drown now in AI slop...
I feel like you missed out on the best part of Napster: finding someone's stash of music you like, surrounded by things you've never heard of, and then exploring it. My memory swears you could leave someone a message, but that's a lifetime ago. I know I connected with a few people who helped me get into metal, and that has changed my life for the good, forever.
Other than that, you'd go to a LAN party and find someone's file share of goodies, find the things you were into, and now you had a new friend who probably liked things you'd never heard of, and now you two were sharing new things with each other on top of that.
It was really an age of connecting people and exploring the world for me, even as a young kid.
Oh, I do remember Napster, but that was well after 1993 as well ;)
Either way, it still took ages to download anything, so it was less immediate. And somehow, I remember it more as a source for stuff that was already well known...
https://brajeshwar.com/2025/fixing-a-dos-computer-for-the-ar...
I remember using graph paper as a kid in England to design my Space Invaders clone for the class BBC Micro.
I loved your story - the ingenuity and the cheekiness of youth!
> Windows 98 / Windows 2000 / Snow Leopard
I don't know if I would group those together. Windows 98/2000 were visually similar, but OS X by that point was looking quite refined.
It had toned down the blue, but still had some visual flair.
95/98/2000 had a very utilitarian appearance. Just compare the recycle bins in each to see how much more effort went into the OS X look and feel.
And personally I preferred the XP/Vista era, having search was so much better than not.
However, that just shows how far behind Windows was. Spotlight was much better.
This is sort of what SerenityOS was going for. Alas, it has slowed down a bit since Andreas now works on the Ladybird browser.
To be fair Ladybird is probably needed more urgently now.
Check out the Chicago95 theme for XFCE.
PTSD and my pining for OS/2 spikes once again...OMG...
Yes, before https://en.wikipedia.org/wiki/Eternal_September
https://en.wikipedia.org/wiki/Encarta
Loom is the best-looking EGA game ever made. Sadly, the VGA remake was hit-and-miss with the visuals.
Impossible to think about Loom without thinking of the first room in Monkey Island and the guy with the badge.
Also, man, I really miss BeOS.
I personally think that "peak computing" was around 2005. Maybe 2006.
I think it's the "always on" nature as opposed to the bandwidth. But maybe one begets the other.
The trouble I have with this proposal is that Jedi Knight: Dark Forces II and GoldenEye weren't released until 1997. So that's a pass from me :)
1993. No phone. No expectation of being contactable. If you are out you are out.
No SSDs though :(
Well, people showed up randomly at your door completely unannounced. And for the expectation of being available: At least where I grew up mobility was much lower, transportation difficult. You bet there was an expectation of availability. Was it better or worse? I could not say, it was surely different.
There were SSDs since the '80s, both flash (rare) and DRAM-with-battery (more common). You just didn't have the $$$ for them.
CF cards were introduced in '94 and use a PATA interface. Before them there were memory cards in PCMCIA (ISA bus) format.
Yeah...it was pretty good except that last part...
Actually the GSM spec existed in 1993 :)
The GSM spec is some years older, because you need time to build networks and phones based on it. In Germany GSM started in 1992, and we already had a decent state-wide analog mobile phone network (mostly car phones, because of the hardware requirements) called the "C net". It had been popular since the '80s and was only shut down in 2000, when GSM was widely established with three private networks (and also cheaper by then).
Gotta love the German creativity in naming. It's called "C Net" because that came after "B Net", which came after "A Net". While "A" apparently stood for "Autotelefon" (car phone).
And when D-Net came, T-Mobile called their service D1, and Vodafone called theirs... D2.
This naming scheme made things very obvious. While it wasn't the most creative, it was objectively the best ;-)
In the UK, the first affordable phone plans I remember were in '96. I think T-Mobile did free evening calls.
Every so often I fire up an old Squeak Smalltalk image, put it in full screen mode, and pretend that much of the intervening years never happened.
But Squeak is 1996 ;)
Close enough. I had a copy of Apple Smalltalk for the Mac from the 1985 training course I took, and had started a job using Objectworks. Squeak 1.0 is quite close to ST-80.
Well the games of my childhood, Total Annihilation and Seven Kingdoms, came out in '97. So for my own selfish reasons I would argue that the Pentium II is where progress should have stopped.
On the other hand, at least I'll get to play Castle of the Winds.
(I also loved Z: Steel Soldiers, but despite the '01 release date, I'm sure it too would have run on a Pentium II).
I was doing Modula-2 coding of assignments around then on 1-bit X terminals. [1] I don't think we had that new-fangled Modula-3.
[1] possibly NCD-16, https://groups.google.com/g/comp.windows.x/c/yGBvXhuTL0Y
I’ve often had similar thoughts but in my mind it tends to be windows XP/Mac OS 10.3 ish. Those systems were super capable and responsive, but the web moved at a slower pace. A good balance between computing power and humanity.
Windows XP really was fantastic…
I'm old enough to remember the 1990s. Many of us who do consider it the last good decade. Living was cheap. The previously ever-present threat of nuclear annihilation had seemingly abated. This was before the dot-com crash and obviously the War on Terror that has dominated the 21st century thus far.
I have fond memories of the 486 era, which was really the early 1990s. I'm kinda surprised the PC component of this isn't mentioned here. It was also peak Borland.
It does mention Windows NT but honestly nobody really cared about that until NT 3.0/3.5 and it soon thereafter became Windows XP and laid the foundation for modern Windows.
1993 IIRC had pre-1.0 Linux. I downloaded a distribution (SLS) onto ~30 5.25" floppy disks around that time.
But I really wonder if it was that the tech was sufficiently good at that time or it's simply the tech we had when life was sufficiently good. 1993 was before the dot-com bubble started. That's true. And I guess with more computing power came a lot of the things that many people dislike now. Ads, news feeds, social media, micro-transactions, etc.
But we also have Youtube, video streaming, digital maps and navigation, search engines and a host of other things that are genuinely good.
This stuff was also fantastically expensive (in inflation-adjusted dollars). We shouldn't forget that too.
> It does mention Windows NT but honestly nobody really cared about that until NT 3.0/3.5 and it soon thereafter became Windows XP and laid the foundation for modern Windows.
Fun fact: NT 3.1 was the first version of NT, released in 1993. It was versioned like that to match Windows 3.1 which had been released the previous year.
And NT really took off with Windows 2000. Not just business people but more ordinary people were using it as a more stable alternative to Windows 95/98 (albeit lacking some compatibility, especially with games).
> YouTube
For me, YouTube is only nice because of the decades-old content that people have put on it. But that is because no such quality content is made in the world anymore, and that is partially because of the enshittification brought on by the internet.
If that were not the case, YouTube wouldn't be that big of a deal. Let me disclose here that I am not a big fan of "on-demand" content.
I'm always amazed how YouTube can be so many different things for different people ... It's true that it used to be better a few years back, but people still upload great content, even if it's harder to find nowadays.
Also, music ... back in the '90s, if you were drawn to the obscure side of music, you'd read about it and could, at best, imagine what it was like, because your local record store didn't have it, the bigger store the next town over didn't have it, and IF anyone could order it, it was with a non-refundable down payment.
Nowadays, you can probably find it on YT, and that's great IMHO. My musical horizon would be so much more limited without it.
Also I've learned a lot about guitar repair ...
Yea..
I have stayed up all night waiting for RHCP's "Otherside" to come on MTV so I could record it on tape.
Will kids today even understand something like that? Is anyone captured by music like that these days?
I recall sitting around all afternoon to tape Layla off the radio, during a repeat countdown, after hearing it the day before for the first time. The DJ cut in during the fade out with "Indeed..." and forty years later I still can't listen to that song without hearing him at the end.
My musical discoveries exploded with the internet, I can't imagine what I would have missed without it.
> I still can't listen to that song without hearing him at the end.
Something similar with me and "Another Day In Paradise". The first time I heard it was from a cassette my friend recorded from Dubai radio, accidentally prefixed with an intro by the radio host. And that intro still comes to mind whenever I hear the song.
> My musical discoveries exploded with the internet
What, MTV didn't work for you for some reason?
I don't really have nostalgia for that, I prefer the immediacy honestly.
Nowadays people are captured by music differently, as they were captured by music differently before music could be mechanically or digitally reproduced.
For me in the 90s it was the satellite dish and VHS that opened up the world in terms of content, music channels, movies, etc, channels like Cartoon Network, MTV, Viva & Viva Zwei, and so on. And then the internet for me came in '97 or '98.
1993 was before the west entered the last stage of capitalism. It was a time when companies still competed on products rather than using monopolistic force to squeeze ever more revenue out of the same people by turning every life necessity into a subscription. Similarly, it was a time when you could mail-order a house and build it yourself. Rental prices were low because there was no regulatory capture on housing construction yet.
Where I disagree with you is video streaming. In my opinion, YouTube and the commercialisation of holiday memories (which later became Instagram influencers) were the beginning of widespread depression. Seemingly regular people sharing their exceptional life somehow forces everyone else to compare themselves to the dreams presented on YouTube and most people will come up short and then most people will feel insufficient. I believe that’s why early YouTube ads were so powerful. Your ad for exotic goods would play immediately after the viewer became painfully aware of how boring they are, when measured against the top 0.1% on a global scale.
I never understand why people want to label such eras of capitalism as “late” or the last era of capitalism. The late stage was late only to its own death. This isn’t the last stage either. Plenty more to grow. Capitalism is more akin to an indestructible and rapidly mutating organism than an ideology.
Two years before I moved across into IT. This was when I was a graphic designer making magazines and nursing journals using tools like Aldus PageMaker and QuarkXpress. Those were fantastic times. It felt like we could do anything with computers.
I was eyeing a career in IT and moved across soon after, and was dumped into Novell NetWare 3.12 land, which was an eye opener (Fire Phasers, anyone?).
2009, please, sorry. 1993 is fun and all, I can go relive the dreams of Microsoft Encarta, but 2009 has Mac OS 10.6, gigabit ethernet everywhere, and USB.
I'd be ok with it if it's possible to use all the insights and software from after that, we'd just be constrained by the hardware - the Amiga 4000, Apple Mac Quadra 840av 128 MB or a Pentium 66 MHz or the 486 DX2-66 with 64 MB ram.
On the Amiga 4000 and the Apple you could run their 'native' OS, or you could run a modern-day NetBSD or a BeOS version (HaikuOS wouldn't run).
On the Pentium 66, modern FreeDOS would run, Windows 3.11+Win32s+Calmira, Windows 98 SE, Windows 2000 SP5, XPlite, NetBSD, and perhaps even an absolutely stripped-down Linux 6 kernel with all the features that would be handy.
You could program in a modern i586 build of FreePascal, Zig, Lua, Juniper, MicroPython/Shed Skin, mruby/Natalie, or PicoRuby; perhaps a low-memory JVM could run (OpenJDK 8 or JamVM).
It'd be possible to use SQLite, raylib and r3d-freepascal for efficient 3D games/apps, next to Quake/DarkPlaces, Doom 2/GZDoom, Duke3D, Counter-Strike 1.5, Half-Life, perhaps Unreal 1, Irrlicht and FTEQW.
Modern LambdaMOO's ToastStunt/Moor could be made to run, plus the Inform 6 and TADS 3 interactive fiction compilers. Fairy-Stockfish could be compiled, so enough creativity for gaming. I am just worried about getting Vassal, the Java board-game engine, to run comfortably... and the resolution would be pretty low for many board games: the highest ATI card could drive 1280x1024. I would really like to use it so I don't have the real-world board-game setup time. Keldon's Race for the Galaxy would compile and run, though.
For internet, Dillo+ supports https, gemini and gopher. https://github.com/crossbowerbt/dillo-plus This can also be used to browse zim offline wikipedia files with kiwix-serve.
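The kiwix-serve route above is simple enough to sketch; this is a hedged example (the .zim filename is illustrative, and the flags shown are the basic kiwix-serve usage):

```shell
# Serve an offline Wikipedia dump (a .zim file) over HTTP on localhost
kiwix-serve --port=8080 wikipedia_en_all_mini.zim

# Then point a lightweight browser at it, e.g. Dillo:
dillo http://localhost:8080/
```

Since the content is served as plain HTML over local HTTP, even a minimal browser on period hardware could cope with it.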
Now my only real problem is that we wouldn't have GenAI - probably EVER - would that be a blessing or a curse?
--- Handy Linux kernel 6 tweaks for low-memory situations, if you don't want to run NetBSD (which would perhaps be the best choice):
* zram + zstd: zram creates a compressed block device in your RAM. To the OS it looks like a regular swap partition, but it lives entirely in memory. When your 64 MB fills up, the kernel sends data to /dev/zram0; the data is compressed (usually a 3:1 ratio) and stored back in a small slice of your RAM. Version 6.19 includes better compression ratios and rebalancing, which keeps the CPU from over-working itself on decompression.
* zswap + zstd: zswap is a front-end for a physical swap file on your hard drive. It intercepts pages headed for the disk, compresses them, and keeps them in a RAM pool; if that pool gets too full, it evicts the oldest compressed data to the actual disk. Starting in 6.18, zswap uses the zsmalloc allocator by default, which reduces internal fragmentation, i.e. it packs the compressed bytes tighter.
* frontswap: an API that lets the kernel intercept swap-outs and store them in a transient memory pool; it works with zswap to keep the system responsive under high load.
* Maple tree: replaces the old red-black trees for memory management; it reduces the CPU cycles needed to find data in RAM.
* SLUB sheaves: a modern memory-allocator optimization that packs small objects into "sheaves" to reduce fragmentation.
* CONFIG_SLOB_BERBER: a specialized 2025 backport of the old SLOB allocator; a memory-efficient way to handle kernel objects, saving roughly 1-2 MB of overhead compared to the standard SLUB used in modern PCs.
* Ext4 without journaling: disable the journal to save RAM and disk writes; it gives the best file-allocation speed without the memory overhead of Btrfs.
* Reiser4 patch: an efficient file system for small files; it packs them directly into the tree nodes, which saves disk space and reduces I/O.
* KSM (Kernel Samepage Merging): scans RAM for identical pages (like duplicate library code) and merges them into one; it's a "free" RAM upgrade if you run multiple instances of the same program.
* Low HZ tuning: manually setting CONFIG_HZ to 100 (instead of the modern 1000); this reduces the number of times the CPU wakes up per second, saving precious cycles for actual work.
* devtmpfs: automates device-node creation entirely within the kernel; it saves you from running a heavy udev or mdev daemon in userland, freeing up roughly 2-5 MB of RAM.
* LZ4 compression for kernel/initramfs: using LZ4 instead of xz, gzip or zstd for the kernel image trades a slightly larger image for much faster, lower-memory decompression at boot.
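For the zram item in the list above, the runtime setup is only a handful of commands. A minimal sketch, assuming a kernel built with CONFIG_ZRAM and zstd support (the 32M size is illustrative for a 64 MB machine; needs root):

```shell
# Load the zram module with a single device
modprobe zram num_devices=1

# Select zstd compression (list the available choices first with:
#   cat /sys/block/zram0/comp_algorithm)
echo zstd > /sys/block/zram0/comp_algorithm

# Give the device 32 MB of uncompressed capacity; the compressed
# data typically occupies roughly a third of that in real RAM
echo 32M > /sys/block/zram0/disksize

# Format it as swap and enable it with higher priority than any
# disk-backed swap, so the kernel prefers it
mkswap /dev/zram0
swapon -p 100 /dev/zram0
```

`swapon -s` afterwards should show /dev/zram0 at priority 100, ahead of any on-disk swap partition.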
> I'd be ok with it if it's possible to use all the insights and software from after that, we'd just be constrained by the hardware - the Amiga 4000, Apple Mac Quadra 840av 128 MB or a Pentium 66 MHz or the 486 DX2-66 with 64 MB ram.
Can't we achieve something like this now with microcontrollers like ESP32 or RP2040?
This project runs a ca. 1990 scientific workstation (not just a PC) on an RP2040:
https://github.com/rscott2049/DECstation2040
In 1993 I was using a Mac IIvx running (I think) System 7.5. I'd be OK with this.
Time lords can’t do anything, they’re all dead (twice now, thanks RTD)
Maybe the doctor could take you back there with a TARDIS, but time paradoxes and fixed points in time and all that
Given how many times they've been gone forever, never to come back, and then come back again, I wouldn't count on them being dead for long
Man, we're old...
Yep, old men yelling at "the cloud"!
HTML/HTTP/URL (1993-ish) was a massive win over Gopher.
It later turned bad with way too much complexity introduced in a symbiosis between Google and a relatively small group of prolific key web standards people.
It was a variation of embrace, extend, extinguish, but with a combination of open standards and sheer complexity.
There would be no SQLite though.
Yes, but there would be dBase, Berkeley DB, the Borland Database Engine and MDB, though. Good enough.
What this is really talking about is that we abandoned well engineered thought out platforms for piles of hacks, and superior clean languages for inferior but popular ones.
The problem with a lot of those beautiful systems is that they were neither free nor easy to use. The things that won were either or both of these.
For hardware the things that won, like the PC platform, had scale, and therefore won on price performance. A lot of the hardware mentioned here was priced for enterprise. Platforms that were only available in those price brackets and never fell in price either died or stayed extremely niche (s390x).
A second problem with the nice software is that a lot of it was not ported to the cheap hardware.
For software, free isn’t just about money. It’s also about virality. A free OS or language implementation can just be copied. You don’t have to ask permission.
The only non-free software that won was easy to use. Free is usually still unable to achieve that. People will pay for ease of use. The nice well engineered stuff, though, was usually still arcane.
I think “worse is better” can be explained by these things. Worse is not better for some non obvious systemic or evolutionary theoretic reason. Worse is cheaper, or free, and therefore has scale and can spread virally.
Linux was free. The web was free. C compilers were free and Java and JS were free. Windows and macOS and later phones were easy to use and still cheaper than those enterprise grade things.
In Gallifrey? In Gallifrey.
I honestly don't share the nostalgia.
I enjoy having a computer that allows me to create all kinds of things that weren't possible in 1993 ... mash together all kinds of audio, video, text ... put it in a backpack, bring it somewhere, perform on stage, with an $800 laptop. Amazing.
I'm one of those "Encarta kids" who dug through Encarta for nights on end while the parents were out, and still spend slow Sundays reading random Wikipedia articles.
Having the archives that have been created since 1993, whether Wikipedia, Youtube (to me still one of the most amazing music discovery tools I've ever encountered), Archive.org, Google Scholar, Zenodo, at my fingertips has probably widened my personal horizon beyond imagination. Not sure who I'd be without it.
So even sadder to see it all drown now in AI slop ...
I feel like you missed out on the best part of Napster - finding someone's stash of music you like surrounded by things you've never heard of and then exploring it. My memory swears you could leave someone a message but that's a lifetime ago, but I know I connected with a few people who helped me absolutely get into metal music and that's changed my life for the good forever.
Other than that you'd go to a LAN party and find someone's file share of goodies, find again the things you were into and now you had a new friend who probably liked things you never knew of and now you two are sharing new things to each other on top of that.
It was really an age of connecting people and exploring the world for me, even as a young kid.
Oh I do remember Napster, but that was way after 1993 as well ;)
Either way, that still took ages to download, etc, so, it was less immediate. And somehow, I remember it more as a source for stuff that's already well-known ...
Never was too much into LAN parties though ...
I've been decompiling SH-2A code for the past half-decade and, you know, it's fine. It's not Great. But, like, it's fine!