More than 30 years later, you can still run winquake.exe on Windows 11. Fullscreen does not support widescreen but the windowed mode still works flawlessly. As much as Microsoft has been questionable lately, their commitment to backward compatibility is impressive.
I love this about Windows so much it's hard to explain to somebody who doesn't understand why it matters. :-)
I don't get it, I am sincerely sorry :(.
Why do we need Windows 11 to support old software when we can run an older version of Windows, in an emulator no less? Playing Quake doesn't require a secure, patched box, and if a secure environment is the point of extreme backwards compatibility, then endless backwards compatibility seems like the wrong way to achieve that goal (sandboxing an old, emulated OS, for example, comes to mind as more reasonable).
Letting Microsoft play this backwards compatibility card feels unhealthy for the evolution of software and the diversification of the industry.
Breaking backwards compatibility is bad for diversity, because it "culls" a whole load of otherwise working software that is not being maintained. You can see the reverse of this on the app stores, which have mandatory update policies.
Regularly doing it basically forces developers into a limited-term license, subscription, or SaaS model, in order to pay for the upgrade churn required by the platform.
And a lot of it is just churn. Not evolution, not better, just... different.
> it "culls" a whole load of otherwise working software
It doesn't cull it, you can still run Windows 3.11 or 98SE as well under emulation as on contemporary original hardware.
If anything, breaking backwards compatibility forces you to run your old software in an "authentic" environment rather than, say, on some hardware/software combination tens of generations removed. Why would you want to run SkiFree on Windows 11? It feels like an abomination to me, almost disrespectful to the game. I don't want to see my old programs in Windows 11...
Because it's not limited to games: forcing updates cuts off a lot of apps whose developers can't invest enough in updating.
Also, the barrier to use for the alternative you're suggesting (a separate install or emulator) is pretty high for an average user. It also breaks integration with everything else (e.g., a simple Alt-Tab will show the VM instead of the two apps running inside it).
Also, because a lot of progress is regression, having an old way to opt back into is nice.
Integration is the biggest thing. While some desktop VM hosts provide various integration features like file sharing and rootless window support, the experience is rarely seamless.
Drawing a few examples from an old Raymond Chen blog post[1], the integrations required for seamless operation include:
• Host files must be accessible in guest applications using host paths and vice versa. Obviously this can't apply to all files, but users will at least expect their document files to be accessible, including documents located on (possibly drive-letter-mapped) network shares.
• Cut-and-paste and drag-and-drop need to work between host and guest applications.
• Taskbar notification icons created by guest applications must appear on the host's taskbar.
• Keyboard layout changes must be synchronized between host and guest.
These are, at least to a useful degree, possible. Integrations that are effectively impossible in the general case:
• Using local IPC mechanisms between host and guest applications. Chen's examples are OLE, DDE, and SendMessage, but this extends to other mechanisms like named pipes, TCP/IP via the loopback adapter, and shared memory.
• Using plug-ins running in the guest OS in host applications and vice versa. At best, these could be implemented through some sort of shim mechanism on a case-by-case basis, assuming the plug-in mechanism isn't too heavily sandboxed, and that the shim mechanism doesn't introduce unacceptable overhead (e.g., latency in real-time A/V applications).
Finally, implementing these integrations without complicated (to implement and configure) safeguards would effectively eliminate most of the security benefits of virtualization.
[1] https://web.archive.org/web/20051223213509/http://blogs.msdn...
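Loopback IPC makes the "effectively impossible" point concrete: both endpoints must share the same localhost, which a default host/guest VM boundary breaks. A minimal sketch in Python (not from Chen's post, just an illustration of the mechanism):

```python
import socket
import threading

# Minimal loopback IPC: client and server only find each other because
# they share one network namespace. Put one of them in a guest VM and
# 127.0.0.1 no longer refers to the same machine, so this breaks
# without explicit port forwarding.
def serve_once(sock: socket.socket) -> None:
    conn, _ = sock.accept()
    with conn:
        conn.sendall(b"pong:" + conn.recv(16))

srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # ephemeral port, loopback only
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=serve_once, args=(srv,), daemon=True).start()

with socket.create_connection(("127.0.0.1", port)) as cli:
    cli.sendall(b"ping")
    reply = cli.recv(32)

print(reply)  # b'pong:ping'
```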
> forcing updates cuts of a lot of apps that can't invest enough in updating.
What about emulation?
You don't necessarily "need" it, but what feature of Win11 or OS X is worth making all existing software inoperable? I can't say I've seen one, apart from continuing to get security updates.
I don't know, you could do something totally wild like re-imagining the filesystem... I, for one, would love a flat blob store organized in some other way than folders or filenames. I think there's tons of interesting things that could and would be explored without backwards compatibility holding us back. That's how the original OS X came to be.
But what I really don't get is why we need backwards compat when computers can run computers, and old operating systems hardly tax a modern machine.
And the entire Quake series runs very well on Linux+Proton as well. In other words, I’m not sure why this is impressive on Microsoft’s part.
The online games have depressingly (to me) small communities. But they’re still kicking.
There are games I have that don't run on even Windows 10 but work flawlessly in Wine/Proton.
The amazing work the Wine team and Valve have done can't be overstated.
> And the entire Quake series runs very well on Linux+Proton as well. In other words, I’m not sure why this is impressive on Microsoft’s part.
Something funny about this statement considering what Proton is.
I know what you’re trying to say. That Proton IS Windows at some level. And so MS gets some credit for that. But they don’t.
A lot of actual work went into Proton and into making games work therein.
MS is a slow, lumbering, monoculture that has lacked innovation and creativity for a very long time. I don’t see how freezing APIs or keeping old APIs around (mostly through versioned DLL hell) as some grand accomplishment.
Building a base stable enough that porting to another foundation just works DOES deserve credit, and pretending otherwise is extremely silly.
The page was blank when MS wrote upon it.
It only matters if you don't have the source to your programs. So yes, there is a huge corpus of programs where this matters. But there is also a large library of programs where the source is available and backwards compatibility does not matter nearly as much.
As a concrete example, the source to Quake is available, which has allowed Quake to run on so many platforms. Windows' famous backwards compatibility has little effect in keeping Quake running: Windows could have broken backwards compatibility and Quake would still run on it.
...if you have the source to those programs, and are willing to (sometimes significantly) rewrite parts of them and recompile (see: Wayland, for example).
The amazing part is that you don't need to do this on Windows, whether you have the source or not. I am a Linux user, but for all their faults, Microsoft got backwards compatibility right. That's something the OSS world, on average, still needs to be convinced is desirable.
If it uses SDL (99% of the libre games), you don't have to rewrite anything.
> 99% of the libre games
i.e. much less than 1% of all existing games.
Unreal engine uses SDL, so more than 1% of games.
SDL was born at Loki Software to run commercial games without issues on X/GL. So it's actually the opposite of your claim: more like 99% of existing graphical games, modulo some oddities with Ogre3D and friends.
At least in order to be playable under Linux. That said, 99% of the games from that era will run perfectly fine with osspd→PipeWire (install osspd, then just run the game) and 32-bit SDL1 libraries.
Quake is a rare exception. Source availability is rare on Windows, falling to almost zero for commercial applications (for obvious reasons). There are also plenty of corporate internal applications where the one company using it is also the only one with the source... and they've lost it.
Quite a lot of game source is lost entirely even by the original authors.
Not to mention that even if you do have the source, changing which API you use can be a really expensive software modification project. Even Microsoft hasn't been entirely systematic: you can still easily find Win32-era control panel dialogs in Win11.
Some embedded Windows apps exist in this space as well: oscilloscopes and other expensive scientific instruments that run Windows XP.
True. It's amazing that you can play Quake even on the Oculus Quest 3 these days.
If you are like me and like to toy around with ancient game engines, for the sake of simply modding or trying things out, I made this tiny clean fork: https://github.com/klaussilveira/clean-quake
The idea is that it builds on 64-bit Linux with a very simple Makefile and SDL2, so you can start from there as your ground truth, and then have fun. It also removes a lot of cruft, like all the DOS and Windows 95 stuff mentioned in the article.
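For context, the "very simple Makefile" approach usually boils down to something like the following. This is a hypothetical sketch of the usual SDL2 build shape, not the actual build file from the clean-quake repo; compiler flags and file names are illustrative:

```makefile
# Hypothetical minimal SDL2 build sketch (not clean-quake's real Makefile).
CC      = cc
CFLAGS  = -O2 -Wall $(shell sdl2-config --cflags)
LDLIBS  = $(shell sdl2-config --libs) -lm
SRCS    = $(wildcard *.c)
OBJS    = $(SRCS:.c=.o)

quake: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS) $(LDLIBS)

clean:
	rm -f quake $(OBJS)
```

The appeal is exactly what the comment describes: no configure step, no platform detection beyond `sdl2-config`, so the whole build is readable at a glance.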
It is WinQuake or QuakeWorld?
WinQuake. Or, well, NetQuake as it is known.
“NetQuake” was primarily used to distinguish the original network code from the predictive model of QuakeWorld, which came out a little while later.
Get qengine too for Quake2. Similar to Chocolate Doom/Hexen and clean Quake.
qengine is one of my things too! :)
software renderers are so much fun.
This is a great write-up for those of us who were into Quake when it was released. Tuning your performance was a huge undertaking back in the days when you were trying to run Quake on top of Windows 95. I got into Quake because of all the available map tools you could use with it, and the multiplayer aspect, which previously had been very difficult to get working without a LAN.
The detail that -wavonly (falling back to the older WinMM API instead of DirectSound) actually gave the highest frame rate is a perfect example of a lesson that keeps reappearing in systems programming: "more direct" doesn't always mean faster when you're CPU-bound. DirectSound's lower latency came at the cost of more CPU cycles that could otherwise go to rendering.
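The tradeoff is easy to model with a toy calculation. All numbers below are hypothetical, chosen purely to illustrate the CPU-bound argument; nothing here is measured from Quake:

```python
# Toy model of a CPU-bound frame budget: one core does both rendering
# and audio mixing, so every millisecond the audio path burns is a
# millisecond the renderer loses. Numbers are illustrative only.

def fps(render_ms: float, audio_ms: float) -> float:
    """Frames per second when render and audio share one CPU."""
    return 1000.0 / (render_ms + audio_ms)

RENDER_MS = 18.0   # hypothetical software-renderer cost per frame
DSOUND_MS = 3.0    # hypothetical per-frame CPU cost of the low-latency path
WAVEOUT_MS = 1.0   # hypothetical per-frame cost of batched WinMM writes

print(f"DirectSound-style path: {fps(RENDER_MS, DSOUND_MS):.1f} fps")
print(f"-wavonly-style path:    {fps(RENDER_MS, WAVEOUT_MS):.1f} fps")
```

The cheaper audio path wins on frame rate even though its latency is worse, which is the whole point: "more direct" buys latency, not throughput.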
Link to MGL v4 programmer guide seems to be broken. I’m super curious about this technique — can we do the same nowadays for modern Windows and video cards?
Fixed it. Sorry about that.
Thanks! Great write-up as always.
> Last but not least, id Software really wanted Quake to work on Windows NT.
Why?
Probably at least in part for the same reason they used NeXTSTEP: it wasn't prone to taking down the entire OS due to stray pointer dereferences during development, as was possible with Win95.
At the time, Windows 95/98/Me lacked certain features, so it's possible they wanted to experiment with things like multi-threading and multi-processing. NT also supported Intel's Physical Address Extension (PAE), which uses an extra 4 address bits to address up to 64GB of RAM (16 banks of 4GB). That might have helped during development, as map compiling took a lot of resources back then. It's also possible they saw that NT was the future of Windows, as Win2k married the multimedia stack of Win9x with the more capable NT kernel. That led to XP, which finally killed the Win9x family.
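The PAE arithmetic checks out: 4 extra physical address bits multiply the 32-bit 4GB limit by 16.

```python
# PAE widens physical addresses from 32 bits to 36 bits.
plain_32bit = 2 ** 32            # bytes addressable without PAE (4 GiB)
pae_36bit = 2 ** 36              # bytes addressable with PAE (64 GiB)

GIB = 2 ** 30
print(plain_32bit // GIB)        # 4
print(pae_36bit // GIB)          # 64
print(pae_36bit // plain_32bit)  # 16 "banks" of 4 GiB
```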
I tried running Quake 2 on Windows NT 4 before 2k came out, so around '98/'99, but hit an issue as NT lacked DirectX. My memory has faded and I don't remember whether the installer failed or the game failed to run. I think it was the former, as I have a recollection of something complaining about missing DirectX.
I do know that multi-processing was implemented in Quake 3 and I specifically ran Windows 2000 for that.
NT4 lacked DirectX. But in 1999 I got temporary access to some machine and copied a few DX files from Win98.
As a result, I was able to play some windowed games without 3D acceleration.
Why not? id was always into cross-platform support; Quake had Linux and Solaris ports weeks after launch, and a Mac port a year or so later.
That's some of the same stuff that SDL is meant to abstract over, right? Although I guess SDL was more targeting Windows / Linux differences than Windows / Windows differences.
Also Linux/Linux differences – Xlib, SVGAlib, DirectFB, DRI, GGI, DGA and who knows how many other ways to draw stuff on the screen existed for Linux back then.
SDL1 could use any of those to render.
Yeah, also SDL didn't exist until a year after WinQuake's release.