There's a surprisingly large Windows XP community; everything from security patches to browsers[0] to third party Discord clients[1].
[0] https://www.mypal-browser.org/ [1] https://github.com/DiscordMessenger/dm
What I don't understand is... why? I understand keeping alive software for the sake of hardware compatibility, but browsing the web and running Discord? Is it all really just to save a few hundred dollars over... 24 years?
Perhaps because the level of respect that Windows has for its users has dropped with each successive version?
Not to mention bloat: I have a keyboard with a dedicated calculator button. On a machine with Core i5 something or other and SSD it takes about 2 seconds for the calculator to appear the first time I push that button. On the Core 2 Duo machine that preceded it, running XP from spinning rust, the calculator would appear instantly - certainly before I can release the button.
But also WinXP was the OS a lot of people used during their formative years - don't underestimate the power of nostalgia.
Also, for some people the very fact that Microsoft don't want you to would be reason enough!
Personally if I were into preserving old Windows versions I'd be putting my effort into Win2k SP4, since it's the last version that doesn't need activating. (I did have to activate a Vista install recently - just a VM used to keep alive some legacy software whose own activation servers are but a distant memory. It's still possible, but you can't do it over the phone any more, and I couldn't find any way to do it without registering a Microsoft account.)
Win2003 Enterprise does NOT need activation either. It runs smoothly offline.
There are tools out there (like UMSKT) that can activate MS software from that era fully offline too. They cracked the cryptography used by the activation system and reimplemented the tool used for phone activation, so you can “activate by phone” using UMSKT instead of calling MS.
[flagged]
Your comment reminds me of that rule from baseball that says something about batters and hats, or maybe it was about helmets or something, it doesn't really matter though because the only point of this sports ball rambling is to distract you from noticing that my "nuh uh" has no substance. Did it work?
This is more than a bit out of place on HN in my experience; please try to engage politely.
I’m not sure what I can say that will qualify as more than “nuh uh” to you, shy of getting a Core 2 Duo running with XP and the same keyboard as OP. That isn’t possible at the moment, is there anything else I could do?
I admit you got me mildly annoyed with the sports nonsense, sorry about that.
Anyway, you're talking about reaction time, which isn't actually relevant. The time between an action (pressing a button, or flipping a switch) and seeing the result happen isn't the same as the time it takes you to react to it. Flip a light switch: does the light turn off instantly, or does it take a full third of a second? I guarantee you can tell the difference. 300ms of latency is actually huge and easily perceptible, even if it's faster than you can react.
300ms is a lot of time, especially if calc.exe was already in the disk cache.
300 ms is a long time on a computer, definitely. Just, the autistic side of me has to speak up when it’s wildly unrealistic glorification of the past.
Keypress duration is likely much less than 300 ms, top Google result claims 77 ms on average. And that’s down and up.
I see it being in cache already as sort of game playing, i.e. we can say anything is instant if we throw a cache in front of it. Am I missing something about caching that makes it reasonable? (I’m 37, so only 18 around that time and wouldn’t have had the technical chops to understand it was normal for things to be in disk cache after a cold boot)
Okay, let's say the cache is cold and you're on an old clunky spinning rust 5400 RPM hard drive. Do the math. How long will it take, worst case, for the platter to spin to where calc.exe is stored?
For a 5400 RPM drive, worst-case rotational latency is one full rotation: 5400/60 = 90 rev/sec, so ~11ms. Average is half that (~5.5ms). If you also need to seek (yes, we'll definitely need to move on both axes in the worst case scenario requested, likely all the time), 2006-era datasheets show average seek around 11-12ms, with full-stroke seeks around 21-22ms. So worst case total access: ~33ms.
Seagate Momentus 5400.3 manual (2005): https://www.seagate.com/support/disc/manuals/ata/100398876a....
Hitachi Travelstar 5K120 (2006): http://www.ggsdata.se/PC/Bilder/hd/5K120.pdf
WD Scorpio (October 2007): https://theretroweb.com/storage/documentation/2879-001121-a1...
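Writing that arithmetic out as a throwaway C snippet, with the rotational math exact and the seek figures above treated as representative rather than precise:

    #include <stdio.h>

    int main(void)
    {
        const double rpm = 5400.0;
        const double rev_per_sec = rpm / 60.0;                  /* 90 revs/second       */
        const double full_rotation_ms = 1000.0 / rev_per_sec;   /* ~11.1 ms, worst case */
        const double avg_rotation_ms = full_rotation_ms / 2.0;  /* ~5.6 ms, average     */

        /* Seek figures from the 2.5" datasheets linked above */
        const double avg_seek_ms = 12.0;
        const double full_stroke_seek_ms = 22.0;

        printf("rotational latency: avg %.1f ms, worst %.1f ms\n",
               avg_rotation_ms, full_rotation_ms);
        printf("typical access (avg seek + avg rotation): %.1f ms\n",
               avg_seek_ms + avg_rotation_ms);
        printf("worst case (full-stroke seek + full rotation): %.1f ms\n",
               full_stroke_seek_ms + full_rotation_ms);
        return 0;
    }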
If you had used the calculator earlier in that uptime, it wouldn't be crazy. It's a small exe.
Why is it impossible?
Tl;dr: reaction time. 300 ms is the golden rule for reaction speed, and apparently there was actually a sports medicine study that came to that number. I was surprised to see that; 300 ms comes up a lot in UX as the “threshold of perceptible delay”, but it was still surprising to see.
I'm not sure why human reaction time is relevant here, since what I'm talking about isn't the time it takes me to respond to a stimulus but the time it takes the computer to respond to a stimulus.
I do still have both computers set up side-by-side (legacy data from an old business), and the keyboard in question was a Microsoft Comfort Curve 2000 (the calculator button wasn't a proper key, it was one of those squidgy extra keys so beloved of multimedia keyboards, so not as fast to operate as a proper key.)
Anyhow, the point (arguably hyperbolic as it may have been) wasn't about reaction time per se, it was about the older calculator app - and by extension much of the rest of the OS - being a much simpler and less bloated piece of software, and running it on faster-than-contemporaneous hardware makes for a sense of immediacy which is sorely lacking in today's world of web apps.
I'd be very interested to know to what that 300ms "threshold of perceptible delay" applies. You might not notice a window taking 300ms to open - but I'd be willing to bet that when you're highlighting text with the mouse or dragging a slider, you'd be very aware of the UI lagging by nearly 1/3 of a second.
This is a lot of words that say "yeah, I was hyperbolic, but it was directionally correct." I do appreciate the candor but it's a bit late, as you see by the text color of my comments. Many people do the same thing as you, no worries, I appreciate you validating my quixotic self-destructive work.
I'm sorry you're being downvoted - for the record I've upvoted since it's interesting, even if we disagree in some aspects.
Since I still have the machine in question here, and I'm now interested enough to try and get some rough measurements, I've just videoed it with my phone (30fps video) and done some frame counting, both from a cold boot with nothing cached, and also a repeated launch.
Firstly from a cold boot:
It's hard to tell exactly when the keypress registers, but I believe what I'm seeing is the key being pressed, two frames later the hourglass appears, two frames after that the calculator appears. (The TFT screen will likely be adding at least one frame lag, but let's ignore that for now.) So that's somewhere between 166 and 200ms for a cold launch.
If I close the app and repeat, there's now just one frame between keypress and hourglass, and just one more frame between hourglass and the app appearing, so now nearer 100ms.
Looking at the videos my finger is off the key the first time the app appears, but not the second time - though if I made a special effort to release the key as quickly as possible I now think I could probably just about beat it.
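For anyone repeating the exercise, the only arithmetic is converting counted frames to milliseconds; a trivial sketch assuming the same 30 fps video:

    #include <stdio.h>

    /* Milliseconds spanned by a given number of video frames. */
    static double frames_to_ms(int frames, double fps)
    {
        return frames * 1000.0 / fps;
    }

    int main(void)
    {
        const double fps = 30.0;   /* phone video frame rate */
        for (int frames = 1; frames <= 6; frames++)
            printf("%d frame(s) at %.0f fps = %.0f ms\n",
                   frames, fps, frames_to_ms(frames, fps));
        return 0;
    }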
I was curious, so did a quick web search, which claims that 300ms is the average reaction time and plenty of people run faster than that.
But I think the question was the other way: Why couldn't calc.exe launch in 300ms?
300 ms is way longer than they budgeted. Separately, I was alive then, and it's a ridiculous claim: it takes a general bias we all have towards seeing the past through rose-colored glasses and pushes it farcically far.
Don't want to clutter too much, I'm already eating downvotes, so I'll link:
https://news.ycombinator.com/item?id=46642003
I have Windows 95 on a Pentium 120 MHz and calc.exe is instantaneous enough that it's probably much less than 300ms to launch.
XP's calculator is hardly any different than 95. It's easy to believe that launching it on a Core 2 Duo of all things is also instant.
You’re both kind of right.
On the average consumer hardware at launch, 95 and XP were slow, memory-hungry bloat. In fact, everything that people say about Windows 11 now was even more true of Windows back then.
By the end of the life of Windows 95 and XP, hardware had overtaken them and Windows felt snappier.
There was a reason I stuck with Windows 2000 for years after the release of XP, and it wasn't because I was too cheap to buy XP.
The Doherty threshold is 400 ms. That’s the threshold which you start impacting users focus, and flow.
Back in the day, we actually used to aim for that as a user experience metric.
yeah no. Ask musicians using computers - 50 milliseconds of latency between sound and movement is generally considered unplayable, 20 milliseconds is tough, below 10ms usually is where people start being unable to tell.
You’ve fallen into the common trap of conflating reaction time with observable alignment time.
Reactions are about responding to one off events.
Whereas what you’re describing is about perception of events aligned to a regular interval.
For example, I wouldn't react to a game of whack-a-mole at 50ms, nor that quickly to a hazard while driving either. But I absolutely can tell you if a synth isn't quantised correctly by as little as 50ms.
That's because the latter isn't a reaction. It's a similar but different perception.
Pressing a key to trigger an action that you will then send additional input to is an entirely different sequence of events than whack-a-mole, where you are definitionally not triggering the events you need to respond to.
I'm not talking about latency (though I don't fully agree with your statement but I've covered that elsewhere). I'm talking about the GP's comparison of reactions vs musicians listening to unquantised pieces.
You simply cannot use musicians as proof that people have these superhuman reaction times.
But here we're talking about not being able to notice whether calc.exe opens in less than 300 milliseconds, not how fast we can react to it opening? It's the same thing with audio latency (and extremely infuriating when you're used to fast software where you can just start typing directly just after opening it without having to insert a pause to cater to slowness)
No, it's not the same thing with music latency. For one thing, music is an audio event whereas UI is a visual event. We know that visual and audio stimuli operate differently.
And for the music latency, you can hear where the latency happens in relation to the rest of the music piece (be it rock, techno, or whatever style of music). You have a point of reference. This makes latency less of a reaction event and more of a placement event, i.e. you're not just reacting to the latency, you're noticing the offset with the rest of the music. And that adds significant context to perception.
This also ignores the point that musicians have to train themselves to hear this offset. It's like any advanced skill, from a golf swing to writing code: it takes practice to get good at it.
So it's not the same. I can understand why people think it might be. But when you actually investigate this properly, you can see why DJs and musicians appear to have supernatural senses vs regular reaction times. It's because they're not actually all that equivalent.
Familiarity, I suppose.
I'm not a part of the Windows XP community, but I've gotten close. I love that I can make it look just like Windows 2000 and that I know where all the little knobs and dials are. I can get a Windows XP installation configured to be exactly as I want it to be very quickly and I know it won't suddenly change on me.
The high point is a toss up between XP and 7 for me, but imo Windows UX peaked then (although the 98 visual style is peak for nostalgia) and has either stayed the same or gotten worse ever since. Personally I just switched to using Linux full time as soon as gaming compatibility became basically the same as Windows but I totally understand why you'd want to maintain the ability to use older Windows versions.
In my case I have an old Thinkpad that is chugging on reliably into its third decade soon, that ran XP when it left the factory. Newer linuxes don't work too well (accelerated graphics components such as mesa have accelerated beyond the hardware's capabilities) and the BSDs are a little spartan (very little software for i386 now for understandable reasons).
So I'm thinking of putting an XP install back on the thing with my licenced MS Office 2000 and a few other bits and pieces of software just for retro fun, and a reminder about how things were 20 years or so ago to avoid the rose tinted glasses effect.
Why not? XP was formative for many people who are in the profession now.
For me, it's knowing what I know now, what could I've done back 25 (wow!) years ago. It's a fun exercise.
Well, I can give you millions of reasons in the form of industrial hardware that’s running Windows XP and will continue to run for another 10 years at least.
I'm pretty sure it has nothing to do with money and plenty to do with the same reasons as people who preserve Commodore 64s, Amigas and DOS and Win9x PCs.
It may be more savings than that, if you count all the hours wasted with fixing things that broke in a newer version or finding workarounds that will never be as efficient.
A lot of medical devices still run XP as well unfortunately, because of old proprietary software for expensive equipment that doesn't receive updates anymore.
Why not? I strongly prefer Linux, but if somebody wants to use Windows, why not use XP? It was definitely a better user experience.
I stopped using Windows at work as Win7 was rolling out but got another job using it again as Win11 started rolling out. Having missed out on the slow decline, it's very obvious to me how much better the older Windows were. The new ones have a dumpster fire UI with built-in advertising that shoves Microsoft web crap and AI down your throat at every chance.
It is actually incredibly fast and already does basically everything you want from an operating system I guess.
It's fun and interesting. Most people don't actually daily drive it.
Wait till everything you do is exfiltrated by copilot…
Another browser alternative is Supermium[0], an up-to-date Chromium-based browser that also works great on Windows XP.
[0] https://win32subsystem.live/supermium/
My browser builds[1] still target XP, and I develop them on Win7.
[1] https://msfn.org/board/topic/185966-my-browser-builds-part-5...
There's a Windows 9x community too, although maybe not as large.
If you ever wanted to use a modern C and C++ compiler on Windows XP, the 32-bit version of w64devkit[1] does target it and provides a recent GCC version.
[1] https://github.com/skeeto/w64devkit
Coincidentally, just a few days ago, I tried to run Nim[0] on Windows XP as an experiment.
And to my surprise, the latest 32-bit release of Nim simply works out of the box. But Nim compiles to C, so I also needed a C compiler. Every version of MinGW I could find online failed to launch.
After some time I managed to find a very old MinGW (GCC 4.7.1) that finally worked [1].
[0] - https://nim-lang.org/
[1] - https://ibb.co/TBdvZPVt
Try MinC - MinC is not Cygwin. Install both the base and the compiler from their page.
https://minc.commandlinerevolution.nl/english/home.html
TCL/Tk 8.6 for XP, and as a bonus you get a gopher and gemini browser from gopher://hoi.st called BFG:
gopher://texto-plano.xyz:70/1/~anthk/bfgxp
Unzip the file and launch "lanzar.bat" in order to test it. I think I added tcllib and tklib just in case, so you can do a lot with that interpreter.
From what I remember doing it several years ago, it was not too hard to patch MSVC 2019 to run on (not just generate binaries for) XP.
I used CoLinux with pubuntu, and from there installed GCC.
Anything for Win9x?
I found out the other day you can use modern clang-cl with the MSVC6 headers and it just works. You can download them from here https://github.com/itsmattkc/MSVC600 or just copy them from an install if you have one handy.
Then run (something like) this:
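A sketch of the general shape of such an invocation (the paths, the -m32 choice, and the little test program are placeholder assumptions, not the original poster's exact setup):

    /* hello32.c - trivial smoke test for the clang-cl + MSVC6-headers idea.
     * Indicative invocation only; clang-cl reads the INCLUDE and LIB
     * environment variables the same way cl.exe does, and the paths here
     * are placeholders:
     *
     *   set INCLUDE=C:\path\to\MSVC600\VC98\INCLUDE
     *   set LIB=C:\path\to\MSVC600\VC98\LIB
     *   clang-cl -m32 hello32.c user32.lib
     */
    #include <windows.h>

    int WINAPI WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpCmdLine, int nCmdShow)
    {
        (void)hInstance; (void)hPrevInstance; (void)lpCmdLine; (void)nCmdShow;
        MessageBoxA(NULL, "Built with clang-cl against the MSVC6 headers",
                    "hello32", MB_OK);
        return 0;
    }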
I don't know if it's any better or worse than MinGW practically, but it is definitely cursed.
I haven't tried it, but I saw this a few days ago: https://github.com/crazii/MINGW-toolchains-w9x
Thank you!!!
I'm guessing this no longer qualifies as "modern," since the last update was in 2018 and it's no longer in active development, but I'd like to say that the 32-bit version of the Tiny C Compiler by Fabrice Bellard works on Windows 98 SE.
https://www.bellard.org/tcc/
I wonder if some of those things can be solved via a shim DLL that provides the necessary missing WinAPI functions instead of modifying the source code. Although the number of changes required seems vanishingly small anyway, so either approach could work just fine.
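As a purely hypothetical illustration of the shim idea (GetTickCount64 is just an example of a post-XP API picked for illustration; actually redirecting a binary's imports to such a DLL is the genuinely hard part and isn't shown):

    /* xpshim.c - minimal sketch of a DLL that supplies one Vista-era API
     * on Windows XP.  Build with MinGW, for example:
     *   gcc -shared -o xpshim.dll xpshim.c
     */
    #define _WIN32_WINNT 0x0501   /* keep the headers at XP level so the */
    #include <windows.h>          /* real GetTickCount64 isn't declared  */

    __declspec(dllexport) ULONGLONG WINAPI GetTickCount64(void)
    {
        /* Widen the 32-bit counter.  Unlike the real API this still wraps
         * around every ~49.7 days, which is the usual caveat with shims. */
        return (ULONGLONG)GetTickCount();
    }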
Since hardware prices have skyrocketed, it is very important to run software on low-end hardware and use a suitable operating system, such as Windows 7, whose support, amusingly, has been dropped by nearly every project recently. Backporting software to Windows 7 is something we must do, for our freedom and our wallets.
VxKex works really well - give it a try.
Literally just use Linux. If you're fed dogshit, don't go back to dog vomit; use an actual operating system like an actual adult.
Going back to Win7 is just a neurological-pathogen level of Stockholm syndrome. Give your head a shake.
Modern Linux is bad, nothing works completely, and on top of that, fragmentation between X and Wayland makes it worse. I just want to use my favorite software on a convenient operating system that doesn't eat up a lot of resources. And yes, that’s Windows 7. Just give me back the ability to use Steam or SourceTree and stop forcing bloatware like Windows 10 or Windows 11 on me.
> nothing works completely
Such as? Everything has been working for me the past >20 years.
As for X vs. Wayland: you do not have to care about it. Just install a distro with X.
Steam works on Linux, and lots of games run under Linux now thanks to Proton.
I have 8 GB RAM and an HDD. Windows 7 works perfectly, Windows 10 works acceptably. Linux works badly: horrible browser and IDE performance, especially on distros with snap (browser performance near zero). On the Linux desktop there's a huge problem with shortcuts. I don't know what the DE developers are doing, but I've never been able to get the key combinations I need and find convenient to work without conflicts with the system ones. Or it's just impossible to set them.
Starting with the very basics: hibernation & track pads.
Dotnet 10 for Windows XP
https://github.com/kalnod/Win32Dotnet10Starter
Strangely, AutoHotKey v1.1 scripts, when compiled in ANSI mode, run perfectly on Windows XP.
AHK came in very handy when we needed a quick tool to track mill operators: roughly 20-30 lines of code and we had a working GUI app.
> Added back 5ms sleep on Windows 7/8 in (*Process).Wait (reverted f0894a0)
This was interesting!
What applications are based on this? I mean it sounds super charming and nostalgic to drop a line or two which runs on WinXP, but is this actually useful?
Mostly legacy industrial machines that need some additional software for telemetry, scheduling, automation etc.
These machines are likely to live at least another 10-15 years, and even the brand new ones being sold today use Windows 7.
Modern languages and frameworks move on and leave these old systems behind, but everything from our infrastructure to our manufacturing capacity runs on legacy systems, not modern computers. The cost of replacing the computers is usually more than the machine itself.
Is it hard to write software that compiles and can run on Windows XP now? What about Rust and Python?
It depends on what you want. If you want to install an old copy of Visual Studio from 20 years ago then you should be able to write a program and compile it and have that work on XP. But that comes with limitations. You're not going to be able to use even C++11 and will be stuck with C++03, or maybe even C++98. If that's acceptable to you then it can work. But if you want to compile something that somebody else wrote or want to use some library that somebody else wrote, it probably won't work in that environment.
Or you could install an old copy of Cygwin or MinGW.
Do you want to run a modern Visual Studio and target XP? Maybe you can make that work if you install an old platform SDK, set WINVER and _WIN32_WINNT, and work around all the warnings and compatibility problems that you'll run into. It is fighting an uphill battle and it will continue to get worse with each new version of VS that you want to use.
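For reference, the macro side of that is just two defines ahead of any Windows header; a minimal sketch (the linker side, e.g. the deprecated XP toolsets dropping the PE minimum subsystem version to 5.01, isn't shown):

    /* Pin the headers to the XP API surface before including anything. */
    #define WINVER       0x0501   /* Windows XP */
    #define _WIN32_WINNT 0x0501

    #include <windows.h>
    #include <stdio.h>

    #pragma comment(lib, "user32.lib")   /* for MessageBoxA under MSVC */

    int main(void)
    {
        /* Stick to calls that already existed in XP. */
        printf("Uptime: %lu ms\n", (unsigned long)GetTickCount());
        MessageBoxA(NULL, "Modern compiler, XP-era API surface only",
                    "xp target demo", MB_OK);
        return 0;
    }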
For Rust there is Rust9x https://seri.tools/blog/announcing-rust9x/. But I think this is the effort of a handful of people. It is behind upstream Rust and it could go away at any time. If you want to write a toy program in Rust then it is fine, but if you want something that's going to be supported long-term you're rolling the dice.
Python 3.4.4 is the last version of Python that will run on Windows XP. That's 10 years old and many things on PyPI now require newer versions of Python so you'd be stuck with old, unsupported versions of those modules, possibly containing security issues.
> Python 3.4.4 is the last version of Python that will run on Windows XP.
Pity. You can compile and run pretty much any version of Lua on XP with VS 2010 or lcc and it works just fine https://ibb.co/d0pMK7Jk
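Part of why that works: the Lua core is plain ANSI C with a very small embedding API, so a minimal host like the sketch below builds the same with a 2010-era MSVC as with anything newer (the build line is only indicative):

    /* minlua.c - minimal Lua host; builds against any Lua 5.x.
     * Indicative build:  cl minlua.c lua5x.lib   (or compile the Lua
     * sources alongside it)
     */
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>
    #include <stdio.h>

    int main(void)
    {
        lua_State *L = luaL_newstate();   /* create an interpreter state */
        if (!L) return 1;
        luaL_openlibs(L);                 /* load the standard libraries */

        if (luaL_dostring(L, "print('Hello from ' .. _VERSION)") != 0)
            fprintf(stderr, "error: %s\n", lua_tostring(L, -1));

        lua_close(L);
        return 0;
    }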
IronTCL with the BFG gemini/gopher browser from gopher://hoi.st:
gopher://texto-plano.xyz:70/1/~anthk/bfgxp
Also, from some http proxy:
http://portal.mozz.us/gopher/texto-plano.xyz:70/1/~anthk/bfg...
MinC, a tiny C SDK plus Git and goodies from OpenBSD's base (ncurses is included too):
https://minc.commandlinerevolution.nl/english/home.html
As far as I'm aware, so long as you limit yourself to APIs that were available in XP, you don't actually need an older SDK to develop for it with modern MSVC. The early Windows platform layer stuff in the Handmade Hero series demonstrates doing so without anything like Cygwin or MinGW.
Most new APIs introduced since Vista are COM based, and after Windows 8, WinRT based (basically COM with IInspectable, application identity, and .NET metadata instead of type libraries).
The plain old Win32 C API is basically frozen at the Windows XP view of the world, although there are a couple of new ...Ex()/...2()-style suffixed functions for things like high DPI or various IO improvements; the userspace driver framework (UMDF) was initially COM, but reverted back to plain C structs with function pointers in version 2.0.
The only officially (at least partially) supported way from Microsoft is to add into Visual Studio the toolchain named "C++ Windows XP Support for VS 2017 (v141) tools". It is still there in the "individual components" of Visual Studio Installer for the latest VS but it is marked as [Deprecated]. It is a safe bet that MS will never fix any existing bugs in it or update it so at this point your best bet might be with the open source tools.
All other currently supported toolchains rely on runtimes that are explicitly not compatible with Win XP.
Sadly doesn't work on WinXP x86 - just tried. Not a valid Win32 application.