How to allow continued use of obsolete versions of MS Windows

One thread of the topic ‘Why do people even bother using Wine?’ was about being able to continue using old versions of Windows, given the absence of updates and the dangers of using them online.

This advice corresponds well with the way that thread was going at first - many of us need Wine only to run software that requires an older OS. I’d add current but long-running and regularly updated software that just wasn’t developed for Linux; a case I know about is Toweb (website creation) which is specifically stated to be OK on Wine and the Mac equivalent.

I proposed as part of the discussion that Microsoft should be persuaded to enable licence-free use of limited versions of their obsolete OSes on, perhaps, a virtual machine. This would help combat the damage technical progress is doing to data and to perfectly good old software (that requires, for example, obsolete platforms like Qt3, some Java stuff, or Macromedia). Someone has asked me if I can rescue a very expensive encyclopedia from about 2000, which uses Macromedia; I can’t even see any of the content.

I should have added that MS could offer the suitably-frozen OS as a free but commercially interesting service from their store, simply requiring that the virtual machine is running under the current version of Windows.

While this isn’t fundamentally a FOSS topic, it does relate to reasons for promoting Linux, and might interest developers of open source virtual machines.

The only obsolete MS platform that I know of that has been released to the public is Windows 95/98. Windows XP through Windows 11 still need a licence and activation to run properly. All I do is install SPs (service packs) as needed, mostly for graphics-card support, especially for Nvidia.

The real problem with making this work is the lack of motivation. Using obsolete Windows versions is such an extremely niche use case that it just does not make much sense to invest time, effort and perhaps even money into such a project.

For 99.99% of the things you would need to run on an obsolete version, you can find a superior alternative. For example, there are newer encyclopedias that are completely available online as well, actually run on current operating systems, and have up-to-date information.

There are also tons of website creation tools. I doubt there is a need for the one you mentioned.

Is it really perfectly good, if it does not run on your current setup and is not maintained anymore?

I thought open source was the solution to this.
I still run a couple of Fortran programs that were written 50 years ago, and they still run on any setup with a Fortran compiler.
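A quick sketch of that point, for anyone who wants to try it: a small fixed-form program in the FORTRAN style of the 1970s still compiles unchanged with a modern compiler. The program, the file name `mean.f` and the `-std=legacy` choice here are my own illustration, not the poster's actual code, and the compile step only runs if gfortran happens to be installed.

```shell
# Write a small FORTRAN 77-style (fixed-form) program to a file.
cat > mean.f <<'EOF'
      PROGRAM MEAN
      REAL X(3), S
      DATA X /1.0, 2.0, 3.0/
      S = 0.0
      DO 10 I = 1, 3
        S = S + X(I)
   10 CONTINUE
      WRITE(*,*) 'MEAN =', S/3.0
      END
EOF

# Compile and run it with a current compiler, if one is available.
if command -v gfortran >/dev/null 2>&1; then
    gfortran -std=legacy -o mean mean.f && ./mean
else
    echo "gfortran not installed; skipping the compile step"
fi
```

The fixed-form layout (labels in columns 1–5, statements from column 7) is exactly what punched-card-era code looked like, which is the longevity being claimed above.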


There are specific cases when someone wants to use software that is only compatible with a certain OS version. A couple of years ago I installed Windows 2000 on an AMD Athlon XP machine for a small manufacturing facility owner, because he didn’t want to invest in new software and the software he owned only worked on Windows 2000.


That is a good example case.
I have to make the general point… open source is the answer. If the manufacturer had open source software, he would have the source code and be able to port it to any OS or any computer.

That of course does not help this particular case.

In the example provided, the guy was skimping on new software to avoid replacing software from over 20 years ago. This man has lost his way and probably wouldn’t port the software even if it cost only a penny.

Some years ago - for my eldest daughter’s birthday - I made her a VirtualBox VM of Windows XP with ALL the “Living Books” titles, these are interactive books on CD-ROM for toddlers and young children, they had separate disks for Mac users and Windows users… We only ever had the Windows versions, on Windows 3.0/3.1 and 3.11, but also 95, but I first saw them demo’d on a Mac…

Anyway - Windows XP was the most recent version of Windows you could still run them on, and you kinda had to downtune the launcher for each one, e.g. 640x480 and 256 colours, and a few other things… It worked quite well - you didn’t even need to install anything, just copy the EXE files and subfolders off each CD-ROM… Never tried to make them run on Windows Vista / 7 / 8 or later; I think it’s not possible anyway… In the end I managed to get an OVA file that was ONLY 700 MB in size… the VM runs perfectly on Linux VirtualBox…
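For anyone wanting to reproduce that packaging step, the OVA export can be done from the command line with VBoxManage. A minimal sketch, assuming VirtualBox is installed; the VM name `WinXP-LivingBooks` and the output filename are hypothetical, so substitute your own:

```shell
# Hypothetical VM name and output file; substitute your own.
VM="WinXP-LivingBooks"
OVA="winxp-livingbooks.ova"

if command -v VBoxManage >/dev/null 2>&1; then
    # Export the VM and its virtual disk into a single portable .ova file...
    VBoxManage export "$VM" --output "$OVA" || echo "export failed: is there a VM named $VM?"
    # ...which can be re-imported on any other VirtualBox host with:
    #   VBoxManage import "$OVA"
else
    echo "VBoxManage not found; install VirtualBox first"
fi
```

The single-file OVA is what makes the appliance easy to hand around or archive, as described above.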

But now I’m trying to port that VM to a Mac, and I have to use UTM and QEMU… Just about to throw in the towel - no idea how to force XP in QEMU/UTM to go 256 colour… it worked a treat in VirtualBox…

All that effort just for the nostalgias?


I didn’t know that; thanks for making the situation clear, and also for the reminder to install SPs. I installed W98 and was immediately struck by the screen resolutions we were used to a couple of decades ago. We always reviewed mission-critical pharmaceutical regulatory documents from laser printouts, because the eyestrain caused by reading from the screen led to too many errors being missed.



I deliberately mentioned encyclopedias because the major ones at least are a distinct literary form, with a distinguished editorial board curating the contributions of equally distinguished writers. You don’t throw out and replace a literary work of whatever standing just because the medium has become inconvenient. This is a big worry with ebooks (file formats and DRM), but I don’t know the current situation there.

They are all different, but probably not many provide as much as the one I mentioned for 50 euros.

I think these statements call for a detailed article, perhaps with contributions from a historian interested in the longevity of software and records. I presented elsewhere the case of someone who bought the popular game Scrabble in 2009 and has now reached an age at which she can’t adapt to a new version, though she now really needs it. In research and industry, hugely expensive equipment is typically kept in service for 10 or 20 years, much longer than the supposedly normal lifetime of the computers that control it and acquire and process the data. In one case, running from the mid-1970s until the early 1990s, the design team ceased to exist, but there was third-party support for the PDP-8 computer and its specific disk drives. In another, the supplier Agilent could periodically update (1988-2002) from UCSD Pascal, through Unix and Windows 3.11 to Windows 2000, because they used the same platforms for numerous instruments.

However, a huge number of industrial, public, military, aerospace, etc. computer applications require extensive validation, not only in obvious cases such as manufacturing an injectable pharmaceutical. That also applies to the associated office work. Every operating detail has to be specified, for reasons I won’t go into here; the cost of change is enormous. The latest changes to the desktop and File Explorer that came with Windows 11 may look like a trivial (and in a way intrusive) contribution by a bunch of fidgety geeks at M$, but many written procedures will have to be revised.

Finally, I don’t think open source is always relevant here. Major scientific and financial applications, as well as LaTeX, have kept to the same languages for decades, but that isn’t always the case. Java and frameworks like Qt have evolved, together with their runtimes. In practice, applications have had to be replaced because of the number of libraries, dependencies and so on that are not upwards compatible. What a waste of coding effort, and of retraining highly skilled users.


There is an argument for not writing those dependencies into code in the first place.
However, that is not always easy to achieve. Self-contained code tends to grow huge, like Flatpak.

I wasn’t talking about Wikipedia, I was talking about actual encyclopedias with the same quality of information, except up to date and available online. Additionally, encyclopedias becoming outdated is nothing new. You, for example, wouldn’t want to read encyclopedias from the 50s in the 80s.

Did you check? Did you already count in the effort of setting up an entire OS just to get it running? I guess it would cost less effort and time to work around the missing features in newer software than to use the outdated 50-euro software in its own dedicated environment, which requires a lot of essentially unnecessary effort.

I don’t think so. If you are looking at it from a historical perspective, that’s a different story. What you have said so far has nothing to do with a historical perspective. You actually want to use the outdated software as if it could still be a daily driver, when it cannot.

Obviously, looking historically at anything might make sense. However, using the software does not.
A good example is the first moon landing. It’s very interesting to look at the technology used from a historical perspective, but nobody would use that technology today, because it is utterly outdated and it wouldn’t make any sense to actually use it.

I’d say that’s an extreme edge case requiring effort specific to the situation, which cannot be applied to the topic as a general rule.

That’s correct, but you are mixing apples and oranges here. You cannot apply industrial reasoning to personal home-computing use cases. One is based on HUGE amounts of money and the fact that any change might require the company to re-train all its trained professionals to adapt to the new software, and many other problems can arise, too.
However, you don’t have 99.99% of those issues when changing such things in a personal home-computing environment. You don’t have staff. You don’t need to make money off your usage; at least, you won’t make millions out of it - or, turning it around, lose millions if it does not work for a day.
Therefore, comparing those two does not yield productive results. It’s mixing two different things.

Yes, I think that’s just natural with such absolutely critical applications. However, again, it’s not comparable. You don’t run pharmaceutical equipment that keeps people alive in your own personal computing environment (I hope).

It’s not that those languages and applications were or are that great. It’s about cost efficiency and convenience. If companies all had infinite money, you would see constant change, constant improvement and constant replacement.

That’s a false assessment, in my opinion. It’s not a waste. It’s progress. New things are built on top of the knowledge of the previous things. Just because there was something before and it is being replaced now, does not mean the previous thing is wasted now or was meaningless. It has meaning.
It’s also usually better to replace legacy software with software completely re-built from scratch, because at some point you cannot fix the old code anymore and it wouldn’t make sense anyway. It’s the same reason why so many buildings are not renovated but torn down, to be replaced by a completely new building. Same with software.
And that’s good.
I personally have already had the pleasure of trying to fit legacy software, which started being developed more than 20 years ago, into a fresh and up-to-date infrastructure, and all in all I can say this: it’s a huge pain in the ass.
The only way to “fix” that legacy software would be to replace it with something different that does everything better. You can’t always take the old stuff and just “improve” or “fix” it. Some software eventually becomes so outdated and fundamentally wrong that no fixing is possible. It’s totaled. You have to replace it.
And, again, that’s good.
Keeping ancient software around forever is a huge pain for everyone, except for managers trying to go the easy, short-term-profitable way instead of the right way - slightly less profitable in the short term, but more profitable in the long term.


I don’t think someone is a “highly skilled user” if he is stuck with the knowledge and technology of decades ago. That’s not a “highly skilled user”; it’s a highly skilled laggard.
Especially when talking about software, the user (not the end-user) has to adapt. Always. He always has to stay up to date. If anyone calls himself a “professional” and at the same time does not evolve with the software and the new paradigms in the software world, that person is, by all means, not a “professional”. He’s a sleeper, or at best a slowpoke talking about stuff from decades ago which has no relevance whatsoever today, except in very specific niche environments.

Well, I still have an HP-65 programmable calculator - the model flown on the Apollo-Soyuz mission - and it is still useful.
But I agree, old technology (at least in computing) is largely useless but can be educational and interesting.


Uncanny - I was just using something similar as an example… I have a full Encyclopaedia Britannica my grandpa gave us in the 1960s…

WHOLE swathes of it are irrelevant / redundant / superseded, or just plain wrong!

But also - VAST reams of data in it are STILL relevant, correct and useful.

In the 1960s, Alexander the Great was born in 356 BC. In the 2020s, yeah, he was still born in 356 BC!

The anatomical transparencies at the back of one volume still hold true, 100%. E.g. there were some organs we didn’t yet know the purpose of, but the illustrators still included them!

Sure - many of the political maps are mostly irrelevant, but they’re still a valuable “snapshot in time” of what the world looked like at the height of the Cold War! When my copy was published, the Vietnam War was still a “US training exercise”, i.e. thousands of US servicemen “training” the South Vietnamese army, but also Australian and other allied nations getting involved (I was terrified the government was going to come and grab my dad and take him off there!).


Yes I do. As another correspondent has remarked, it’s history: a major subject of academic and recreational study.


Did you try it? You can use it for free but get a few adverts. Toweb has been around since 2005, is regularly updated, has all the modern facilities and has a major following, particularly for e-commerce. Where did you get the idea it’s outdated? While it is built for Windows, it installs easily on Linux with Wine (and likewise on the Mac equivalent); it’s just a little bit difficult for inexperienced users when a site is constructed from a large number of files which stay on the host computer.

Nowadays much of public and private history is on computers. Sometimes you want to read files a quarter of a century old that weren’t ported to the latest format every few years. I recall, during the last couple of years, tweaking some MathCAD code (from about 1995) to save a lot of work, and recovering to DXF some elaborate drawings made in GraphicWorks, where only the native files could be found - the latter was bought with the groceries, but at the time it was a very popular serious application that was presumably sold cheap because it didn’t quite make it against the competition. I used Wine for those typical recovery and maintenance jobs because I didn’t know that W98 could be run on a virtual machine.

I say it’s typical of the sort of work we do on the technical side of the voluntary social sector.

I didn’t know FOSS was exclusively concerned with home computing.

I said that it doesn’t apply only to the obviously mission-critical cases. Every business needs to get its IT right and can’t afford to re-tool every time an OS changes.

It isn’t money; it’s the expertise of thousands of expert contributors, accumulated over many decades, which would take decades of futile effort to replace with the same functionality. That’s why each sector (science, finance…) has kept to a few old but excellent high-level languages.

Unfortunately that isn’t always the case. Sometimes all you get from change is something like that awful MS Office ribbon.

The skill is in the job, for which the software is just a lowly tool that the fidgety geeks keep changing even when it works perfectly. It’s true that when I was in pharma, the IT people seemed to have a grip even on higher management. It’s often the little tail wagging the dog that does the work. Our laboratory applications for instrument control and data acquisition were seen as hostile competition. Modern professional software developers need to think ahead: how will their work evolve without having to be re-written, debugged and revalidated every couple of years? Of course nobody should object to relative amateurs using an ephemeral development environment.

The unfortunate exception to this is scientific literature.
Anything before about 1980 is rarely available online, with some exceptions for particular journals.
Anything subject to an unexpired copyright is only available if you pay an exorbitant fee.

The situation is not much better with literature in general.
Publishers have the whole thing tied up.

Much of the literature going back a couple of centuries seems to have been scanned now. I wanted the two papers (1828 & 1829) by Robert Brown and found them - behind the paywall of Taylor & Francis (47 euros). The DOIs were provided, so I could get them from Sci-Hub. Some old books, like the one on gout by Garrod, are available free from Google (who say it’s in the public interest), and the French National Library’s archive is available free online. I wouldn’t mind paying a small fee for that kind of service, a bit like paying postage in earlier times.

There remains the question of preserving both world patrimony and family photos for future centuries. Ilford has a colour micro-photographic system that they claim is good for 500 years and is human-readable (and therefore not subject to ephemeral DRM). You can get a lot of data onto a spool of film or tape, which seems a better solution for this purpose than the proposed elaborate but less bulky 3D recordings made in solid media. You don’t want archive media to be small enough to fall between the floorboards like a microSD card 🙂. As a chemist I can accept that modern photographic dyes can be stable, but I still see the advantage of silver, or any other element that’s unlikely to be there by accident or to get smudged by diffusion.


Yes, it is improving. I don’t mind paying a fee, but some of the modern journals that are not ‘open’ charge rather large fees to access just one paper. In biology, the older literature is still rather slow to be digitised.

For preserving stuff, there are lots of options today. If you put things on GitHub, they end up in the Arctic Code Vault.