When I was still working at Arecibo Observatory, there were definitely a couple of systems running Win95 that were considered data-critical. And they were still running on ~486 hardware, IIRC.
They hadn't been replaced as they were running some custom ISA interface boards developed by a research group in the 90s, and the community was still using the data output by the machine. And since it had been trucking along for ~20 years, convincing them they needed to develop a replacement was hard.
When I left, it was still there, chugging along. And once a day, a tech would wander in with a floppy and copy some data off and wander back to the control room where a USB floppy drive was attached to a workstation specifically to read this floppy and let the tech copy the data files off to a network location.
Sounds like their system works. If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.
It works, yes, but they were running out of spare parts when I left. That system's parts were on frequent eBay searches, as it was becoming quite fragile. I think it was on its third power supply and a second mobo or proc, can't quite remember.
It was on a redundant online UPS, as the last time it went completely off and was cold-booted, it ended up needing parts. So no, I'd say it's not likely to outlast modern replacements. It's just had lots of support and organ transplants.
Systems like that ultimately end up being a resource drain: unlike with modern hardware, it's not simply a matter of wandering over to a retailer and buying standard replacement parts, but of sourcing used or NOS parts and hoping they work, often requiring far more time and effort from someone who could be doing some other important task.
The problem is that in open use observatories like Arecibo, a lot of equipment like that is put in place by third parties, but becomes our responsibility to maintain. But the science is important, so we maintain it as best we can.
> If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.
Unlikely. Good-quality 'modern stuff' is built far better than what was available back then. Power supplies have gotten much better (and have more protections), as have motherboards with solid-state capacitors. Older motherboards loved their electrolytic capacitors.
Processors had fewer protections. Thermal monitoring was in its infancy.
Memory sticks back then were very unreliable. As were hard drives.
Then there's just a matter of age. If it has lasted 20 years, chances are it won't last much longer.
It has probably lasted 20 years because it wasn't being powered on and off constantly. It most likely had stable power sources and - most important - a climate controlled environment with low amounts of dust.
A 486 system was built before the capacitor plague. Quality electrolytics may not last forever, but can last a long time. Also, built before RoHS and the reduced longevity of poorly formulated or poorly applied lead-free solders that were common for many years.
Processors had fewer protections, and thermal monitoring was in its infancy, because almost all 486s wouldn't use enough power to cook themselves to death, even with passive heatsinks.
I don't know about memory sticks and hard drives from that period. It's quite possible they've replaced the hard drive with compact flash or something too, which should be pretty reliable given they apparently only make a floppy disk's worth of data every day.
This! Got a used AMD-K5 PR133 in a big-tower a long time ago, integrated it into my home lab, and used it to experiment.
About a year later it underwent my cleaning routine: vacuuming dust out, cleaning the fans, etc. I took the CPU cooler off to check for bad thermal grease and discovered it had none at all! No pad either; someone had just clipped the thing on, and instead of direct contact it had a small air gap you could look through from the sides.
Unbelievable.
When I got it, it must have been in use for years already!
Um... all electrolytics eventually go bad as the electrolyte solution dries out. Re-capping retro computers and radio gear from the 70s through the 90s is very much a thing.
That being said, electrolytics were somewhat less common than they are now. But the Capacitor Plague is not the only reason electronics are recapped, especially older/vintage/retro electronics.
Modern components are run much closer to their limits, simply due to engineering optimisations.
A 486 doesn't even need a heatsink; it's passively cooled and perfectly fine with that, even though software of the time (mainly DOS) didn't even "idle" the CPU: it was drawing full power all the time.
That also means if you manage to boot DOS or similar era software on a modern machine (especially a laptop) that was designed to have the CPU idled for most of its life, it is likely to go into thermal throttling immediately.
Yes and no. The 8086 had a HLT instruction that was meant to replace the busy loop waiting for an interrupt. It was soon understood that you could also save power by turning off the clock. I'm not sure if the "SL" versions of the 486 turned off the clock with the HLT instruction, but later versions did. Use of it was apparently introduced in MS-DOS 6.0, so FreeDOS, Windows 95 and OS/2 2.1(?) would have probably used the instruction to effectively idle the processor. Not sure if Windows 3.1 made use of it, but it would make sense as the basis of the system's event loop.
Besides what the sibling reply mentioned about capacitors and processors, WRT memory that machine has survivorship bias. The fact it's been running for thirty years tells you the RAM in the machine is good and is unlikely to have a problem barring damage. The same is true for all the other still working components.
I worked in an office with a similar setup that "worked", but it was very fragile. When the HDD started failing on the old 486 machine, we scrambled to migrate as much as possible (including a few rewrites and ports of utility programs). We were lucky that our problem was the software not running on modern systems (for reasons), not a dependency on the hardware. Mostly we communicated with hardware systems via serial, so any computer with a serial port (or a USB-to-serial adapter, though those are often unreliable) could be made to work if the software existed.
> Sounds like their system works. If that 486 hardware lasted 20 years already, seems like there's a pretty good chance it'll outlast most of the modern stuff it would be replaced with.
Sure, it makes some sense, but think about the hardware suddenly breaking down and not being able to find a replacement for it. Even with spare parts it could be a pain as the old generation retires.
I was actually surprised to see these are still made, as I considered getting one for a few projects. The ones I saw were pretty expensive, with some being resold on eBay for just under a thousand dollars. Reading some reviews, though, it seems you have to be careful, as not all the boards may support what you want. Apparently a good many don't support DMA.
It can be tricky. I had to work with something like this a few years ago and managed. As I remember, I had to get SciTech Display Doctor/UniVBE from some abandonware sites to make use of the weird onboard VGA under Windows 95. The ISA card for the CNC machine worked; it used DMA, among other things. It all worked, and was even more fluid to operate/program.
The cost was just over 400EUR. (for the board)
Anyways, it's industrial; the form factor often doesn't fit existing cases and backpanels, but it's doable. More so than hunting eBay or similar for spares of dubious provenance.
The only exception would be if the "App" and the board/card were designed so sloppily that they really only run on exactly that same hardware.
IIRC, the machine was on an isolated equipment network with no external access for some time, but the network drivers would often cause some kind of conflict with the ISA interface card (and custom driver) so the network interface was disabled. So yeah, should be safe (and is likely no longer hardwired to the network).
Win95 is running as root by default. Any process can open another's memory. It has the same registry entry as Windows NT for running a program on startup. The keylogging API is basically the same. The IShellFolder and IWebBrowser interfaces are identical, so as long as the malware didn't request specific classes or features that are outside the realm of Windows 95, it can probably do pretty well. Luckily, Internet Explorer 3 doesn't support TLS.
That floppy disk controller is the only way into the box, and it isn't nearly as vulnerable as a modern USB controller, since there are no smarts! For their use case, the Windows 95 system is going to be safer than a patched Windows 10 on the network.
For the common-case en masse compromises, not a concern. For targeted attacks it can still be an issue. Stuxnet is an example of this where they compromised an air-gapped system. Stuxnet was obviously extremely sophisticated, technically and for the overall operation, but the principles still apply. Also people frequently misjudge how big of a target they may be & what their risk profile really is (not saying for this use-case specifically, just in general).
>Do you suspect there are any 0day/worms still targeting Windows 95 systems?
You'd be surprised how much malware is "legacy" 32-bit and works without issue on WinXP. A lot of malware even shies away from the Unicode APIs and specifically calls the *A Win32 API functions.
Considering the Win32 API is backwards compatible to a fault, I don't see why modern malware couldn't absolutely wreck a Win9x box.
Yes, but Windows XP was in widespread use until recently and is still in use in a lot of places. I don't know where "netmarketshare.com" gets its user statistics (Google surfaced them when I searched for that), but it reports 1.26% for Windows XP, which I think is still a large number. There are still so many users of Windows XP that Microsoft released a critical security patch last year despite XP having been officially EOL for years.
Loaded from the boot sector of any floppy that happens to be present when the system is restarted, unless the boot order has been changed from factory defaults and the (possibly very old) coin cell powering the NVRAM holds sufficient charge to retain those changes when (if) the system is ever powered off.
In general, attacks (e.g. most ransomware campaigns) are not automated malware spreading on its own, they involve people. If it's accessible on a network (which is the big 'if'), then it's obvious on any scan that it's a Win95 machine, and then you can try to look up a vulnerability for any exposed service. For example, if file sharing is running, then all the SMB vulnerabilities (e.g. EternalBlue and friends) apply also to Win95 systems, and I'm not sure if Windows 95 had a patch made, backported and issued for them.
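As a sketch of the first step of such a scan, checking whether an SMB port is even reachable on a given host (the host value and the choice of ports here are illustrative placeholders; this only tests reachability and says nothing about which vulnerabilities apply):

```python
import socket

def smb_exposed(host, port=139, timeout=2.0):
    """Return True if `host` accepts a TCP connection on `port`.

    Port 139 (NetBIOS session service) is where Win9x-era file
    sharing listens; port 445 (direct-hosted SMB) came later.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, timed out, unreachable: treat all as "not exposed".
        return False
```

Anything that answers on 139 from a machine fingerprinted as Win95 is what an attacker would then start matching against known SMB vulnerabilities.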
Eh, no. Not a chance. I have some old-as-heck games which don't work even under W2k with the w9x compat mode (it's hidden; you need a command to enable it), so a total nope under XP.
Windows XP (NT kernel) and 9x are totally different OSes, using totally different kernels. They share some API, but not really the interesting parts.
I work on software that still supports XP with only a little pain. Visual Studio 2019 will still install the necessary toolchain if you ask nicely. Targeting Windows 95/98 in 2020 sounds like a nightmare.
I wonder if it would be just as easy to run some of that on VirtualBox or some other hypervisor running Windows 95; that way they'd have access to modern replacement hardware.
The entire disk (all 10s of GBs of it) was cloned and backed up as a disk image, and can be re-imaged whenever necessary.
The data is copied off daily via floppy (don't get me started...) and is backed up on local network shares. The specifics of that weren't in my need-to-know and was managed by the IT group.
The system was essentially stateless, so spinning it up from the image wasn't an issue, but the PC itself is essentially part of the "equipment", as it ran the custom drivers and it was specifically designed for that PC and a one-of-a-kind ISA card.
Reading this... my first thought was "But newer x86 can.... oh right ISA..." then I googled for USB ISA slots... and they are apparently a thing. That doesn't mean someone wouldn't need to write a new driver but at least they could still use the cards.
Many years ago I was using an MCA (multi channel analyzer) ISA card connected to a surface barrier detector to measure beta decay spectra. This stuff basically counts the number of pulses from the detector binning them according to the height of the signal. It had its own acquisition software meant for DOS (the manufacturer provided the sources as well).
The thing is, you could operate it under Windows (98) in protected mode, but then you lost statistics. How come? Well, Windows scheduling meant that the MCA only took data while it was allotted time slices; as a result, the dead time of the whole setup increased, and you couldn't tell by how much.
Under real-mode DOS it took advantage of DOS being almost an RTOS, with the program running as a single task, so you could be pretty confident about your data.
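The dead-time effect above is just arithmetic on the duty cycle; a toy model (all the numbers are invented) makes the problem concrete:

```python
def observed_counts(true_rate_hz, wall_time_s, duty_cycle):
    """Expected counts when acquisition only runs during the scheduler
    time slices it is given. Live time is wall time times the duty
    cycle, so the apparent rate is biased low by exactly that factor;
    the catch under Windows was that the duty cycle was unknown."""
    live_time_s = wall_time_s * duty_cycle
    return true_rate_hz * live_time_s

# Real-mode DOS: the program owns the machine, duty cycle ~ 1.0.
print(observed_counts(1000.0, 60.0, 1.0))   # 60000.0
# Multitasking OS: some unknown fraction of the minute is lost.
print(observed_counts(1000.0, 60.0, 0.85))  # ~51000
```

With a known duty cycle you could correct for the bias; the whole complaint is that Windows gave you no way to measure it.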
I don't think using adapters with this sort of card is a good idea. It's not a matter of drivers; sometimes the old stuff simply doesn't work well when plugged into newer systems, because everything has changed and you'd need to redesign everything. If the old stuff does the job, the OS being unsupported only becomes an insurmountable problem for folks who consider using an unsupported OS some sort of crime in itself. Well, there's usually more than just an abstract computer being used in such circumstances.
Exactly. I believe someone tried investigating a solution to use a modern system and either perform some kind of pass through to a Win95 VM or some such magic, and it ultimately came down to needing to re-implement either the driver or the software, and that is hard to do without funding and time. Or any good documentation on what the original function and behavior was.
Yeah, my first thought was just running a 16-bit OS on modern hardware (it will do it if you're careful and do the right things). But then the issue of it being an ISA card popped up... that's not really a solvable problem with modern hardware unless you can convince an IHV to make you a custom motherboard. Although apparently some exist ( https://www.ibase.com.tw/english/ProductDetail/EmbeddedCompu... ), probably for industrial applications. While I'm very sure you could get DOS running on that, I'm less sure about Windows 95. The Google search did show other options that are plausible, but I doubt many of them would arrive working as-is without needing capacitor replacement at a minimum.
There are all sorts of USB-to-Foo interfaces, the vast majority of which implement whatever Foo is just barely well enough to say it works with a straight face. Like the USB-serial adapters: most of them work OK at 9600,8,N, but try 5-bit with 1.5 stop bits (something a 'real' PC serial port can be persuaded to do) and good luck.
So, yeah, I'm sure that USB-ISA adapter probably works with that old 3Com ethernet card or that Soundblaster card that 'definitely' sounds better than anything else, but a custom card with custom software and probably unknown tolerance to variations from 'real' ISA? Good luck with that.
Oh, probably, and even if you did find one that works "well", you'd have to buy a ton of them to ensure you have backups. I'm well aware of the issues. But without an alternative or a grant to update the hardware, what do you do? At some point the availability of period hardware with ISA slots is going to be prohibitively expensive.
You can still buy new manufacture motherboards with ISA slots and modern processors. Intel still (or until recently) made ISA bridge chips for modern Southbridge chips. And it's something that a smallish FPGA could handle. There is a whole ecosystem of companies that recreate old-style motherboards for exactly this scenario.
My brother's company does IT stuff for the manufacturing sector. Very conservative. I've worked with a number of his customers to source exactly this sort of thing. Years ago they asked my thoughts on an Italian-made cloth dyeing machine the size of a bus, with a DOS PC controller and a minimally documented ISA controller. A mid-7-figure US$ setup; the company that built it is long gone. We found a shop that supplied new motherboards, modern (at the time... P3/P4 era, I think) with ISA slots; the vendor guaranteed it would be every bit as slow and weird as a PC/AT. We migrated everything over, the company bought a stack of spare motherboards new, and I think they're still using them.
Now...you want a challenge? Migrating the proprietary, mostly undocumented software, all in Italian (we aren't native speakers...though my high school Latin occasionally helped) from an old ESDI disk to something, anything, else. That was much more interesting.
Currently I have a related task: a telescope is being operated using a set of old hardware (late 90s to early 00s) and software of the same age. Due to the current situation, observations can't be performed during lockdowns, so the idea is to migrate everything into virtual machines with PCI forwarding, so that VNC or other remote desktop software can allow remote operations. Does anyone have such experience? What are the pitfalls? I plan to use Xen or KVM.
I did something similar for an organisation that had a strange music licensing setup for pub jukebox machines. A significant portion of their business hinged on this one ancient DOS pc with custom hardware and a licensing dongle.
Ultimately it comes down to whether you can get the inputs and outputs you need replicated/attached to the virtual machine, and an exact copy of the machine itself into virtual form. The latter challenge can include having to do risky stuff like physically removing the source HDD to plug it into various interface converters, then taking an image from it using an intermediate machine that can read it.
Depending on how your setup works, you may run into issues with things like time precision within a VM; if high time accuracy is important to the work it's doing, you'll need to pay attention to ensuring VM time = real-world time.
Some top of my head suggestions based on our experience:
- Use Proxmox. Even the free version can go a long way (because backing up VMs periodically is a good thing, and Proxmox does it fine).
- If you're going to run Linux, run XFCE desktops with an X2Go server. X2Go can work in almost any network condition.
- Test PCI forwarding extremely well before taking the plunge. Especially if you're going to virtualize specially designed equipment. If it works, it's magic. If it doesn't then you're in big trouble.
- If the PCI cards will be migrated from older equipment to newer, test to see whether they work well with spread spectrum. If they don't, make sure the server you use allows disabling it.
If you're really curious, hit my profile and email me. Always happy to chat about AO and the fun stuff there.
I was there for about 3 years and left around 2016. It was one of the best and most interesting places I've worked at so far.
The people were great, the work was great, and I personally felt the work came with a huge feeling of accomplishment and made me feel like I was contributing to a better world by facilitating scientific progress.
I was part of the Electronics Dept. which was ultimately responsible for the scientific hardware, the signal paths, and everything between. There were some people in Physical that were more responsible for the electro-mechanical stuff like motors and such. I was primarily responsible for assisting with the analog signal path and was considered a Receiver Specialist, but the latter half of my time there was spent writing a lot of digital control stuff for newer projects as I was the only one on staff at the time who had the skills and time to take those on.
I help to maintain an isolated network of desktops running Windows 95. It's a complex EHR system with multiple clients.
It's a family business, and the software works really well (a medical niche), so the investment to update the stack doesn't make any sense. (And there's no need to bother clinicians with things that don't add value.) I have a VM for tweaking and working with it, and multiple ETL scripts that extract the data to a modern ERP.
I don't have access to the source code, but I have tried to reverse engineer it from multiple angles. Essentially it's a 16-bit client written in Delphi, with a database (dBase) that doesn't support multiple readers or writers, shared via a network drive. Clients read files from the shared drive and create locks to avoid multiple clients working on the same data. That's the main source of pain, but the clinicians adapt to it in a week or so.
I haven't yet seen a modern system as complete as this one. There are multiple clinical centers in the area (25ish) using the same software, and the 2-3 that moved to newer systems regretted it greatly.
The market is too small for any new developments. But the guy who built it in the 90's maintains it and earns enough to live happily.
I have industrial machines that run the user interface on Windows 3.1.
There was a fad for a while of not using PLCs, and instead using 'regular computers', at least for a portion of the system (usually a PLC still handled the most performance sensitive parts).
When you have a production line that's 100 feet of steel, motors, ovens, and other equipment, and the cost of redoing all the control systems is quoted at hundreds of thousands of dollars, then why change it? It still works.
I guess it comes down to a combination of the company's risk tolerance and the cost of the worst-case scenario. Eventually the system is going to blow without possibility of repair, and then you have the expense of downtime plus replacement. So one argument in favor of replacing "prematurely" is to prevent just that. It's like how you might sell a car toward the end of its life instead of waiting until it's EOL, since otherwise you get less money for the car and suddenly need to buy a new one. Granted, a replacement system could be worse than the original.
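That trade-off can be put into a toy expected-cost model. Every number below is a placeholder, not an estimate for any real system; the point is only that run-to-failure stops looking cheap once the downtime cost and failure probability get large:

```python
def expected_cost(p_fail, replacement, downtime, salvage):
    """Toy comparison over some planning horizon.
    Run-to-failure: with probability p_fail the system dies and you
    pay for both the replacement and the downtime.
    Planned: you pay for the replacement now, but avoid the downtime
    and recover some salvage/resale value from the old system."""
    run_to_failure = p_fail * (replacement + downtime)
    planned = replacement - salvage
    return {"run_to_failure": run_to_failure, "planned": planned}

costs = expected_cost(p_fail=0.5, replacement=50_000,
                      downtime=200_000, salvage=5_000)
print(costs)  # {'run_to_failure': 125000.0, 'planned': 45000}
```

A fuller model would discount future costs and treat p_fail as growing with age, but even this crude version captures the car-selling intuition in the comment.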
There are literally millions of old computers sitting around in warehouses and storage closets all over the place. That's besides mounds of new old stock sitting around.
If you're looking for replacement parts for old PC equipment, you don't need a million units, you just need a couple units. Finding a handful of units for replacement parts is pretty easy.
I use WordPerfect 6.2 for DOS, not for any nostalgia or legacy reasons, just because it's a full-featured and highly configurable word processor that I can use in a terminal. I only use it for writing letters and so on, nothing too serious, but I prefer to stay in the terminal if I can.
It works beautifully under dosemu2, which has a terminal mode that can translate various VGA modes into S-Lang calls (S-Lang is like ncurses, so no X11 required). I find this technically impressive; it makes a lot of old DOS software indistinguishable from native Linux software: stdin/stdout, parameters, host filesystem access, etc. all work transparently.
It can import TTF fonts and print to PostScript, which I just pipe into ps2pdf and then handle on the host.
I'm not aware of any other full-featured modern word processor that can run in an xterm. I know about wordgrinder but it's very very basic. You could use a text editor, but it's not ideal for layout because it doesn't understand things like proportional font geometries - you need that to know how lines/glyphs will fit on the physical page when it's printed. You could write it in some form of markup, html, TeX, markdown, whatever, but if I'm just trying to format a document I prefer a word processor.
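The core calculation a word processor does, and a plain text editor can't, is fitting glyphs of varying width onto a physical line. A minimal sketch (the advance widths below are invented, not real font metrics):

```python
def chars_that_fit(text, advance_widths, line_width, default_width):
    """Count how many leading characters of `text` fit on a line of
    `line_width` (same units as the widths, e.g. points).
    With a proportional font each glyph has its own advance width,
    so character count alone doesn't determine where a line breaks."""
    used = 0.0
    for i, ch in enumerate(text):
        w = advance_widths.get(ch, default_width)
        if used + w > line_width:
            return i  # first character that no longer fits
        used += w
    return len(text)

# Invented metrics: a narrow 'i', a wide 'm'.
widths = {"i": 3.0, "m": 9.0, " ": 4.0}
# On a 10-unit line, three 'i's fit but only one 'm' does:
print(chars_that_fit("iiii", widths, 10.0, 6.0))  # 3
print(chars_that_fit("mmmm", widths, 10.0, 6.0))  # 1
```

A monospace editor effectively assumes every width is equal, which is exactly why it can't predict how the printed page will break.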
(Note: dosemu2 doesn't require virtual 8086 mode, so it works fine on x86-64)
Brilliant - I've long observed that almost everything we are doing with our GHz computers, we were doing previously with old MHz systems. We have more pixels (but use them for the same result), more memory (but everything has grown to fill it), more bandwidth (but websites packed with junk bytes).
The only "new" thing enabled by our bigger systems is "big data", which is largely a process of finding patterns that will fool the user/purchaser/customer.
> I've long observed that almost everything we are doing with our GHz computers, we were doing previously with old MHz systems.
I do also, but then I have to pinch myself and remember some of the cutting-edge software that really does make use of the hardware: games, video editing, neural networks, etc. I know we could already do these things on lesser machines, but there is no doubt that the level at which they are currently done could not be replicated on a lesser machine. And also remember the efficiency of systems such as web servers: today's average usage would have been a DoS attack in the past.
I do look in disappointment, though, at text editors, window managers, file viewers, etc., which despite all the extra computing power available offer not many more features but still eat tonnes of resources.
I'm currently (slowly) working on a window manager for X11 which tries to bring back some of these ideas, but for modern devices: https://github.com/danielbarry/oakwm/ It's built on top of wm2 (which itself has roots going back to Bell Labs' Plan 9). A lot of the work is in ripping out unnecessary features and making it touch-friendly. The idea is to run it on the PineTab Linux tablet.
I'd like a browser that is just a nice browser. It should fit easily into a few 10s MB and start instantly. It wouldn't waste 100s of MB on implementing dangerous/pointless/wasteful javascript APIs, like battery state or physical screen APIs; it wouldn't support (literally) 4300 options; wouldn't save 200 MB of config data; and wouldn't load many MB of data just to start and display a blank page!
Dillo/NetSurf may be of interest to you. They fit the size requirement as well as the "lack of JS" requirement (i.e. no JS support at all). "appsites" will definitely not work in them, but the document-oriented web and things like HN are fine.
I've used Dillo, but unfortunately most web pages are seriously malformed. Would be nice if it was possible to get some better CSS support.
I would like to see better JS support, but the scope of JS is simply insane for such a small browser. It's unfortunate much of the web is completely unusable without running JS. Perhaps it's possible to first run the page through a larger browser engine and then send the processed content to the small browser (such as Dillo), that would massively widen the scope of what it could display.
For the record, chrome on android is ~60MB, less than half the size of desktop chrome (~150MB). Android chrome supports the same set of javascript APIs. Clearly that isn't the whole picture.
Actually, you can select which Chrome to use for the WebView in Android developer settings if you have installed multiple versions of Chrome. Chrome stands on its own.
So Chrome for Android is 10MB larger than the entirety of Damn Small Linux, which includes a desktop environment with three web browsers (firefox, dillo and netrik).
While 60MB is still humongous, it does make sense that Google would at least care slightly on Android, as the majority of the Android market are low-to-mid-tier smartphones with reasonable performance instead of the "desktop-in-a-pocket" that are the current flagship phones.
But while they care slightly on Android, they don't care at all on desktops.
Depends largely on the state of code caches, microcode and bus structure. Loop unrolling can be the worst thing you can do, on some architectures.
See, the hardware folks have listened in on the compiler people and their problems. They've done things like identifying loops and rewriting them in microcode for optimization. If a short loop fits entirely within the CPU's code buffer, speed goes way up. Unroll the loop and blow the code buffer, and you defeat the optimization and lose all of that.
What you want isn't possible, with the web being like MTV and having moved far beyond its original intention. The best you can do is get one of the small browsers and live with broken pages. Some people do just that. I think you should think in terms of relativity, though. Think about how much a "10 MB" browser took up in main memory back in the day.
> I think you should think in terms of relativity though. Think about how much a "10 MB" only browser took up in main memory back in the day.
This is exactly the point, software has swelled to use the resources available to it, so with each new iteration your machine gets faster but what it runs gets slower. It doesn't feel like this is something we should settle for.
Well that and high quality video. One and only one tangible benefit of this new web is better quality video. Everything else sucks as much or worse than in the days of Altavista and punch the monkey ads.
But apart from the HD video, the search, the aqueducts, the roads, the education, the browsers that don't crash all the time, and the wine... what have the Romans ever done for us?
I disagree. Searching for keywords is practically impossible nowadays, with all smart AI-powered™ search engines that unhelpfully return results that are not relevant to your query. And SEO spam, which has killed the niche high signal-to-noise handcrafted websites for content aggregators.
Over the last 2-3 years, I've seen a dramatic deterioration in search.
Whenever I look for anything that is not very mainstream, I get pages and pages of irrelevant results and SEO-hacked sites.
Remember "popup spam" and then how enough windows would open that none would respond?
I'm guessing anyone who didn't get browser crashes in the 90s was only browsing a couple of sites or something. Certainly anything before IE5.5 was crash city - in particular Netscape 2.0!
Chrome still crashes all the time for me - in fact, it seems to be crashing more than it used to. It's just that now the crash is isolated to a particular tab.
I've been a developer for a while, starting from QBasic, then Pascal, Delphi, C/C++, Flash, Qt, etc. happily switching from one technology to the next when it makes sense.
Today I use Electron and React Native, which needless to say are not very popular, but for sure I could never have developed the kind of cross-platform software I write today with the technologies I used many years ago. Partly for lack of skill, but also of time (cross-platform development was way more difficult), or simply because computers back then were not powerful enough.
I don't have any special nostalgia for old technologies; some stuff was good, some not so much. And as OP is showing, you're still free to use old software if you don't like what's being done today.
MHz level stuff works for 'work' that normal people do - writing documents, emailing, etc. I dreamed for a while about creating a MHz-level processor from the ground up with an open hardware specification and creating a simple kernel + OS to run on top of it, with new secure protocols for networking (websites, not reinventing TCP/IP) and communications (email). Right now every single computer is backdoored by the governments of the most powerful countries: we have Intel MEs and AMD PSPs in the hardware, and the frameworks everything uses - Linux, OpenSSL, etc. - are almost all huge complex codebases that the NSA has tons of zero days for. If you could get open MHz-level hardware and get OpenBSD or something to run on it, and develop applications with a focus on secure code, it would be pretty great.
But ultimately this would all fail. Aside from the network effects - nobody's going to use it - the fast processors and tons of memory are necessary. We just don't think about it unless we're running out of it.
- high resolution photo and video - photo editing programs like GIMP and Darktable can spend some time processing photos; these days even with complex effects you usually never experience lag more than a few hundred ms on many megapixel photos, because our hardware is fast. Same for video, the memory and storage space and bandwidth is a hard requirement, and going back to 360p is not really acceptable.
- high resolution monitors - no point in having great looking 4k 10 bit color video without a 4k monitor, and now you're stuck having to push 20gbps through your displayport cable on a 500 MHz processor. And text also looks much better at 4k. Also you could say this is a bit unnecessary but compositing window managers are pretty great and I would say a core requirement of modern GUIs, and having eye candy like wobbly windows, window shadows (those are actually very helpful, try turning them off), etc. is expensive. My Dell XPS laptop from 2017 couldn't handle wobbly windows without visible stutter at 4k60; my desktop with a $400 GPU from 2019 can keep up at 4k@144hz.
- new technologies - VR/AR, fast voice recognition and neural networks - this is all cutting edge stuff but the use cases are obvious and they have started to be applied more commonly. Also the failures like eye tracking, Kinect - they may have failed commercially but they were good ideas and a valid use of fast computers. Also online meetings with many participants, each with their own video streams that require decompression.
- obviously, video games - not that you can't have fun games with shitty graphics, but good graphics are nice.
I thought of a few more but can't remember them at the moment. Oh, also, Bluetooth audio - 192kbps audio, but you need to decode it since it takes compression to get it to that level, and then you have the additional overhead of sending it over a digital protocol instead of just having your ADC do the work. You would need expansion cards or a dedicated audio core in your CPU to accomplish this if your computer wasn't fast.
And of course, science needs big data and big hardware to process all of that data.
If you're only working with text then slow computers are fine, but as soon as media gets involved they are not practical.
Don't get me wrong - I like hi-res images, and bluetooth (when it works), and I'm still amazed that my phone has more grunt that a Cray.
But the stuff we do day-to-day (excepting graphics) is appallingly inefficient. And most of the look-at-this stuff (speech recognition, and you mention VR and eye-tracking) is not actually used for anything day-to-day; we don't routinely talk to our computers ("Hello Computer" https://youtu.be/v9kTVZiJ3Uc?t=10 ).
That's perhaps because doing so in an office or cafe environment isn't great.
Kids are using speech-to-text more while doing their homework. Word has been beefing up its transcription feature for a while now[1].
I got to see this happen during lockdown, where teachers I know recommended transcription for younger kids who don't have touch typing skills and were still expected to turn in work on a computer. Talking to your computer is much more natural while doing your homework in your room.
When I was a young person with executive function issues (late 90s/early 2000s) voice to text was a lifesaver for my ability to write coherently. I found it much more natural (even then) to "write" by speaking my mind to a computer than to struggle to focus while typing slowly. It wasn't until I was older and using IM heavily as a social outlet (which was much more in the "bursty" configuration that works well for my brain) that my typing ability caught up with my pace of dictation.
Saddest part of speech recognition is with all the gigahertz and storage we have, we send it out to the internet to be processed by something else then get the result.
> Right now every single computer is backdoored by the governments of the most powerful countries: we have Intel MEs and AMD PSPs in the hardware, and the frameworks everything uses - Linux, OpenSSL, etc. - are almost all huge complex codebases that the NSA has tons of zero days for.
You're giving the gubment too much credit. It's easy to throw up your hands and say "aww geez I guess everything is backdoored anyway, why bother?" -- this is exactly what they want. The truth is a lot more complex, and as a lot of leaks have revealed, their capabilities are far from the supernatural omnipotence you seem to be implying.
Not perceived: actual, effective latency is higher. Today's hardware is capable of better latencies, but most software guys don't care about latency all the way from firmware to OS to apps to websites. It does matter to some degree in games and vehicles but it's hard to find an industry that values latency outside of toys or industrial applications. Practitioners say this is because "99% of users don't care" but that's only because 99% of users have either (1) never experienced fast latencies or (2) have forgotten them because latencies have gotten gradually slower over the last three decades.
All large systems of systems have "duct tape", I'd go as far as to say it's an emergent property of systems of systems.
It comes about because unless you design every sub-component system in lockstep with every other (which is impossible; the Romans didn't lay out London's streets for cars, and the Victorians didn't put in sewers with respect to where we'd want to run fiber), you end up with an impedance mismatch at the boundaries.
It's why back-hoe operators from a gas network rip up fiber depressingly often, why the UK transport system has choke points between road, rail and air, and on and on.
Modern computers aren't a unified system; they are lots of separate systems that talk to each other, and frankly, having some (minimal) understanding of what has to happen for Gnome to appear on my screen when I press my power button, I'm amazed that it ever works, never mind mostly without fuss.
Exactly. Even if you do care, you can only exert so much influence on the platform you're developing for, unless you're working for a company like Apple or Sony. Even then, there are many, many layers to the latency problem.
Our systems are doing so much now, and all that "extra" lets us do numerous things at once without blinking an eye. It also allows newcomers to step up to a computer and almost immediately start using it after some exploring, rather than clicking something and waiting 20-30 seconds to see what the program is, or staring at a blinking green cursor wondering what to do next.
> Sure you might be able to do some of those things at a very low resolution on a Mhz system, but not in any meaningful way.
> You must be really young.
Maybe they are, maybe they aren't, but that's an unfounded assumption. There are plenty of middle-aged people who haven't got a solid idea of the limits of pre-Ghz-scale hardware for use cases that weren't all that mainstream back then, and I'm sure there will also be young people who've tinkered with some old Windows 98 box and could give us a pretty solid opinion.
At any rate, I've used a succession of Mhz-scale systems (starting with a 386 that struggled valiantly to run a copy of Windows 3.1 that someone had installed on it) and I'm definitely not very young.
> >* Graphics design
> We did
With rough and simple tools, limited to low-res designs, very limited brushes, super crude brush simulations (if any). Procreate on the iPad blows all of it out of the water effortlessly with its worst brushes, using a finger for a stylus, and that realtime brush simulation has been taxing even for earlier iPads, which have always been orders of magnitude above Mhz systems. I used Photoshop CS and CS2 a lot back in the day; they were resource hogs and still very crude compared to current entry-level apps. We've really gained a huge amount in that department.
> >* Video editing
> Ditto
At what, 640x480, 15fps? I remember having annoying render times for things that would hardly even count as filters nowadays at such resolutions, maybe 800x600, but I'm sure that ball-shaped Logitech webcam could do no more than that. Snapchat replaces whole faces in realtime with eerie accuracy at what must be much higher res.
Color grading Full HD, 30fps? As if. I guess you couldn't even play that without stutters unless with dedicated silicon to begin with.
> >* Animation
> Ditto
Low-res, crude, and anything halfway detailed and interactive: Low-fps. Manipulating anatomy models like those at https://www.zygotebody.com/ in realtime? I doubt my Core 2 Duo laptop would have been up to that. I had similar software for Windows 98, and it was absolutely primitive and still taxing the CPU and GPU.
> >* 3D CAD
> Older than you think
With less precision, much, much lower model complexity, and much cruder tooling. A large number of components, complex shapes, joints, ... that's going to hit hard limits very quickly. I'm hitting hard limits with that sort of thing nowadays, but I'm hitting them somewhat later than even a few years ago.
> >* 3D Rendering
> 3D rendering was born here, and we were playing 3D games perfectly in 1996-2002
I don't play a lot, but between Gothic 1 and Witcher 3, graphics have improved by an incredible amount, it's night and day, and I can't even go to full details in Witcher due to my aging GPU. Technically, those systems could do it, sure – but only with super short visibility, very crude models and extremely limited shaders and animations, crude collision detection, ... Gothic required at least a 700 MHz Pentium III, so it's pretty representative I think. Of course, those limitations work better for some games than for others, but they still were brutal limitations.
> >* Multitasking
> Linux, KDE3.
Also quite limited, though. Just what fits on a single Mhz-scale core. On my Windows 98 PC, I had to close CPU-intensive applications all the time because things would start to stutter; I believe Windows 95 would sometimes just bluescreen under such conditions. Things got a lot better on Windows 2000, but I think that may have been on my 1Ghz Athlon already. Those early Linux desktops were pretty unstable with lots of multitasking as well, I faintly remember lots of freezing. Things did slow down perceptibly at any rate when doing multiple resource-intensive things. Technically possible, sure – but it absolutely helps to have lots of fast cores.
Of course, a lot of the legwork to make those use cases perform is nowadays done by GPUs or huge amounts of RAM, and they profit a lot from multiple cores, but I'd say a Mhz-scale system should have a period video card, too, and MB-scale RAM, and be single-core, otherwise it's kinda pointless. And under those conditions, all of the above things were technically possible, but really severely limited – still are, in some cases (CAD...), but wayyy less than back then.
Does that qualify as "not in any meaningful way"? I guess it depends. It was meaningful for me back then, and it made possible things that hadn't been possible before, and of course we were always content with what we had, not like those young'uns nowadays, and walked uphill to school, in the snow, both ways, every day – but looking back, the capabilities of my Mhz computers feel incredibly crude and primitive by today's standards, and even a lower-end gaming PC has little problem running Autodesk Fusion 360 (for which there's even a free hobbyist license) with models of surprising complexity, and I'm sure that enables many many more things that wouldn't have been feasible on Mhz hardware.
TBH W98 was a joke at multitasking. You can't even compare NT4 or W2000 (not to mention Linux) with Windows 98, which struggled even on a Pentium 4.
On freezing, Mandrake was a joke, but Slackware and Debian were rock solid.
My parents got their first computer in the 90s so my mom could do freelance work editing medical texts on nights and weekends after her day job. She used Word Perfect and she taught me to type with it, I still remember the exact shade of blue and the white text. She’d use Reveal Codes and I thought all the special characters were so cool. I can still picture her sitting at the desk working on it while I’d play video games. I was maybe 7 or 8 years old. They’re vivid memories that are kind of meaningful that I haven’t thought about in decades, tied directly to that application. Weird how that works.
Somehow WordPerfect/MultiMate with reveal codes is still a more intuitive text formatting tool than anything I've used since. Or maybe it's just nostalgia. There's nothing I hate more these days than trying to format text in WYSIWYG editors, in-browser confluence and jira, notepads, markdown, etc.
I wish there was a mandate that companies had to register sourcecode, e.g. with the Library of Congress, so that it could be released as Public Domain after a couple of decades. We are losing quite a lot of our software heritage when companies bury projects like WordPerfect or milk older games for a few more bucks. We could have a thriving, open-source ecosystem around the sometimes excellent software that is currently locked away.
We've decided 20 years is long enough for drug patents, and in my opinion, it should be good enough for almost all intellectual property, really.
Create something new? Great! You're officially granted a monopoly on that thing for 2 decades - plenty of time to monetize it, and use the money to create some other new things. If you can't figure out a way to monetize it in that timeframe, then we'll allow others to compete with you and take a crack at it.
Given that Disney made its fortune by adapting public domain stories and then abusing the copyright protection of their adaptation of these stories, I think there are more loopholes to patch than just the expiration date.
Very cool! I couldn't find a trove where I could browse by category or other metadata, although I did find the search interface. Is there a categorised trove?
I had no idea dosemu2 could translate VGA to S-Lang. That's very nice, and I will look into it.
WordPerfect was a great word processor, I remember using version 5.1 for years because it was wicked fast, rock stable, and most importantly, it was predictable. By contrast, MS Word -- even today -- seems to have a mind of its own. You move an image, and suddenly all your numbered bullet points appear in a different font except for the last one, and nothing short of retyping the whole thing in another document seems to fix it.
That’s a problem that no WYSIWYG has been able to solve yet. It’s probably an unavoidable abstraction leakage.
Hence the unbelievable popularity of markdown.
Imagine telling your 90s self your current machine configuration and how awesome it is. Then explain that we, with all this tech, voluntarily chose to type in plaintext format, with notations that kind of work as a more readable form of markup language, because we got fed up trying to make WYSIWYG work.
Oh yes, and GUIs are for noobs, pros use terminal emulators, but that’s another topic :)
> That’s a problem that no WYSIWYG has been able to solve yet. It’s probably an unavoidable abstraction leakage.
I think it comes from conflating layout with writing. In QuarkXPress or InDesign it really isn't an issue. Those tools are quite uncomfortable for sitting to write in, though.
LyX does a decent job of this, I think, though it's not exactly WYSIWYG.
Have you actually used the GUI version of WordPerfect?
I was forced to use it heavily for a year of pure writing for a previous job. It's incredibly unpredictable. It's like whack-a-mole. You make text in one place bold, and all the sudden some of your footnote text on a different page becomes bold.
The people at that job who were really good knew all of the tricks, digging into the codes that WordPerfect inserts to address various issues. But even then, it was an extra step, and I never became as productive in WordPerfect as I had been in Word.
Plus WordPerfect has been in maintenance mode with Corel for decades at this point, accumulating lots of bugs and half-implemented features.
Replying to say you are correct and GUI/Windows WordPerfect was completely 100% broken when dealing with 'closing tags'. Some people knew & liked to use the codes, but the basic GUI editing was just defective and caused your formatting to spill out randomly. (at least in vs 5 & 6)
Obviously I get this with HTML, but the Mac & MS Word approach of 'object oriented formatting' was just a much better execution for mouse operation.
The only version of WordPerfect I ever used was 5.1 for MS-DOS, and my comments apply to that one only. I heard the GUI version left a lot to be desired, so I never upgraded.
> You make text in one place bold, and all the sudden some of your footnote text on a different page becomes bold.
Tbqh, this is my experience in just about every rich text editor. Well, maybe not quite that bad, but I’ll always bold a word, and then while editing I have to go back and change the word after it, and it will suddenly be bold.
I wish bold/italics/etc acted like caps lock. It’s either on or it’s off, and the computer doesn’t try to guess for me.
Outlook has a setting to automatically capitalize the beginnings of sentences. I turned that off. But it's been long ago, I forgot where that setting is. But I recall it was near the setting that enables smart quotes.
They even do it on the web version of Outlook. The setting to change it is 4 levels deep and two of the options are at the bottom of lists and labelled "more options".
On display? I eventually had to go down to the cellar to find them... It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard."
I haven't seen it with Word, which I use quite a lot, but there is one similar and extremely long-standing bug. When you use a keyboard shortcut to turn bold/italics on, then type a word immediately before a non-whitespace character, then use the shortcut to turn it back off, it will un-bold the thing you just bolded. So annoying.
I am having flashbacks to late 90s Linux. One of the distros back then was Caldera OpenLinux. It had a Linux port of WordPerfect. It got a lot of criticism at the time for being nonfree.
It requires libc5, the C library in common use before glibc took over at the end of the 90s, and then it needs X libraries that are compiled against libc5.
Probably you could find an old Red Hat 5 or something and pull out those libraries to run on a recent kernel.
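For anyone tempted to try, here's a hedged sketch of what "pulling out those libraries" might look like. The RPM path and the `./xwp` binary name are placeholders, not real artifacts, and note that the pre-glibc dynamic loader (`/lib/ld-linux.so.1`) has to come along with the libraries too:

```shell
# Sketch: run a libc5-era binary by borrowing the old libraries rather
# than installing them system-wide. Package path and ./xwp are placeholders.
mkdir -p ~/oldlibs && cd ~/oldlibs

# Unpack an old RPM without installing it (rpm2cpio ships with rpm)
rpm2cpio /path/to/old-libc5-package.i386.rpm | cpio -idm

# See what the binary actually links against; any "not found" lines are
# libraries still to be sourced, including /lib/ld-linux.so.1,
# the pre-glibc dynamic loader named in the binary's ELF header
file ./xwp
ldd ./xwp

# Point the dynamic loader at the extracted libraries for this run only
LD_LIBRARY_PATH="$HOME/oldlibs/lib" ./xwp
```

LD_LIBRARY_PATH only helps once the loader the binary was linked against actually exists; for libc5 binaries that's ld-linux.so.1, which modern distributions no longer ship, so it has to be extracted from the old package as well.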
I think WordPerfect 8 was where Corel rewrote it. It didn't work nearly as well. I vaguely recall they switched from native to some kind of compatibility layer (maybe Wine?) which just wasn't very good.
I wonder if libc5 is so old that it has problems with Docker. But I remember Loki games being statically linked to avoid problems with dependencies. You only needed Linux syscalls and X11 for them to work.
Docker has no concept of the binaries that actually run; it just sets up namespaces, cgroups and a userspace. Worst case scenario, you have to run the Docker container as privileged, which is still no different from running on the host (at least with a properly constructed image, where the binaries wouldn't run as root).
Nota Bene is another text-based word processor from that era. It defaults to showing embedded formatting commands. The primary means of operation is via an integrated DOS-like command prompt.
Nota Bene, although no longer text-based, still exists and is actively developed. Nota Bene used the XyWrite word processor engine; XyWrite was very popular in newsrooms back in the day.
That's a description that also matches Arnor's Protext. The command window was the bottom half of the screen, and one embedded rulers and suchlike into the document text with ">" lines.
This is testing my memory but I think WordStar from the early to mid 80s had something similar. Codes all started with a ^ (again from memory) and their visibility could be toggled on and off.
This week I built a home-made computer using an ESP32 instead of a Z80 and hodge-podge of 74xx logic chips to run CP/M and started playing with Wordstar 3 and 4. I used it many decades ago briefly on the Amiga and it was more graphical but I'm only just getting to grips with it now.
I have known lawyers and legal secretaries who just would not give up on WordPerfect, I think because of the footnotes. The two that I most recently had dealings with have since retired, but I'd be anything but surprised to learn that there are recent .wpd (is it?) files on the General Counsel's Office's shared drive.
I liked WordPerfect well enough from 4.2 through 5.1. Version 6 on Windows was not good. Even a slow typist (me) could get ahead of the cursor. But at some point my employer switched to Word and I haven't been back.
During the 4.X and 5.X era at WP, the company paid very close attention to the needs of the legal market and WP could do things with a couple of keystrokes that would take 10 minutes with MS Word.
WP also had a potent macro capability and a good macro editor as well. I was practicing law at this time and had created 80-99 WP macros that made legal docs appear like magic.
There was no e-filing and court clerks and judges could be more than a little picky about the formatting of the paper documents they received. If a doc wasn’t formatted in the proper manner, some clerks would refuse to file it and return the doc to the lawyer’s office for a re-do.
The acquisition of WP by Novell was a disaster. Way different corporate cultures and leadership styles.
Ashton and Bastian had been generous with stock grants to key people, particularly on the technical side. Most of the key people below the two founders had enough money that they didn’t have to put up with the sharp elbows and confrontational styles that predominated at Novell.
Combine a brain drain with a Microsoft push for Windows and built-in integration with the first passable version of Word and you had a recipe for disaster. Some of the WP people who remained claimed that MS had misled them about the Windows vs DOS roadmap and been slow in providing Windows info necessary for WP engineers to build a decent first release of WP for the new release of Windows, but I could never determine if this was what had really happened or just sour grapes because WP stumbled badly after the ownership change.
It was real. They built their own language and compiler to compile to undocumented windows API calls because MS wouldn't share in the early days while they established a WYSIWYG foothold.
In the Corel days they finally decided to port to actual Win32 code and put aside the custom transpilation.
According to this history written by one of the founders they were given a pre-release copy of Windows to get an early heads up on porting, but the engineers generally preferred to work with DOS:
> I'm not aware of any other full-featured modern word processor that can run in an xterm
Word Perfect was available for UNIX as well. The SCO binaries might run with iBCS. I don't know if that would provide a significantly different experience than running the DOS version in dosemu though.
I tried running WP for SCO under iBCS 10-15 years ago. I couldn't get it to work. Don't remember all the details but it seemed they cheated with some of the video routines and I mostly got a corrupted screen. I didn't spend a lot of time on it, though, and I very well might have just not hit on the right incantations.
Doesn't RR Martin use the original WordStar? I remember reading about him using an old-school physical word processor and a USB floppy drive when he needs to back up/transfer his work.
Version 4 got me through the first couple years of college. My freshman year I actually had a steady stream of people coming into my dorm room to use it on my PS/2. Eventually I switched from English to Math and moved on to vi and the various roff markup tools and then LaTeX, but I think what ultimately killed the magic of that generation of WordPerfect was the change in keyboard layouts that moved the function keys from the left side to the top. Before that when combined with Ctrl, Alt, and Shift there were 40 commands available just a pinky reach away, and it came with a cheat sheet card that fit neatly around them.
Versions 4.1 and 4.2 got me through the first couple of years of college also. I briefly updated to 5.0 but quickly reverted to version 4.2. I have never been as productive in a word processor as I was using WordPerfect and a Model F XT keyboard.
IBM spun out their keyboard division decades back when they spun out printers (into Lexmark). The current inheritor of the keyboards is Unicomp (having bought it from Lexmark/IBM): https://www.pckeyboard.com/
It's a bit cheaper to buy from them than to pay for an antique on eBay.
Also, there's a wave of mechanical options from more recent companies like Das Keyboard or the various DIY kits with choose your Cherry switch adventures.
Reveal codes should be part of any rich text editor. Ever sat wondering why pressing return added a new bullet to a list rather than a gap to the text preceding it? I see it all the time while watching engineers and managers using tools like Jira, Confluence etc. Reveal codes would make it plainly obvious - the cursor was after the list start but before the first bullet.
Nothing more delightful to work with than a requirements management tool whose wysiwyg editor changed the text of the bullet points when the indentation level changed. Never figured out what was going on, but there was somehow two different texts occupying the same line, and if you pressed tab to make the line deeper in the hierarchy it would display the other text. You start wondering how bad DOORS really is if this is how the "modern" web app replacement works.
I really don't want a word processor, per se, so much as I want really robust (realtime) text wrapping/unwrapping as I edit paragraphs. I've tried to get Vim and Emacs to do reflows but both were very flakey compared to something like WordPerfect for DOS.
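For what it's worth, Vim can be coaxed into something close to live reflow. This is a sketch of the options involved (the 72-column width is just a preference, not a requirement):

```vim
" Continuous WordPerfect-style reflow while editing prose
set textwidth=72       " hard-wrap at 72 columns
set formatoptions+=a   " re-wrap the whole paragraph automatically as you type
set formatoptions+=w   " trailing whitespace means the paragraph continues

" One-off alternatives if the 'a' flag is too aggressive:
"   gwip  reflow the current paragraph, cursor stays put
"   gqip  reflow the current paragraph, cursor moves to the end
```

The 'a' flag is what most people miss: without it, gq-style formatting is always an explicit command rather than happening as you edit.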
I am finding that semantic HTML and CSS Grid where you style the elements works nicely for documents. This goes against what Tim Berners-Lee imagined in that I am typing in the HTML tags and happy to navigate them in text.
I prefer the tags as they describe my work. Visually there is no evidence that something is a 'section' or an 'aside' but I make sure my words do have document structure, so a 'section' will probably have a 'h3' at the start of it.
I wish there was a 'what you see is what you mean' editor that was visual but only allowed you to add HTML semantic elements. WYSIWYG editors tend to just add local styles and 'br' elements, resulting in HTML that is useless for my style of HTML.
I can do diagrams in various ways including SVG that I just write out manually. Or I can use the wonders of CSS Grid and a 'dl' list with first of type/last of type CSS selectors to style up a simple linear flow diagram.
I really wish there was a super neat word processor or HTML editor that only wrote lean HTML with semantic markup for document structure. But here I am in 2020 using my IDE or Vim to write documents and even craft diagrams.
I am impressed that I am not alone in spurning drag and drop wonder gadgets. As much as I would like to give WordPerfect a go, I feel the world has moved on from documents printed out as memos and put in pigeon holes. Stuff has to look good in a browser, and with stripped-down HTML that uses the elements and a simple stylesheet that loads a font and puts in some CSS Grid layout, I feel I have the tools I need.
WordPerfect 5 and 6 in many respects are still vastly superior to any version of Word, especially when it comes to aligning text left and right, which is properly treated as line flow instead of as a property of a tab stop.
Without leaving the keyboard I could easily align text on the same line to left aligned, centered and right aligned.
For typing / journaling, the non-WYSIWYG was a feature. I honestly think WordPerfect 5.1 is a big reason why Sublime is so popular now.
Also, I'd use WP 6.1 for DOS in Spanish if it weren't for a "little" inconvenience: we don't use pesetas as a currency anymore, but I think there's a Euro patch somewhere.
EDIT:
On graphics, that's a solved problem. Everyone uses framebuffer today, even on obsolete chipsets it's doable. Just map VGA calls to FB and everything would be fast enough, I think.
I used WordPerfect in my first job as a tech writer. It was beneficial to have a separation between writing and the layout done by desktop publishing systems.
As a programmer building websites, I do the same thing, writing in vim. Learning how to make your editor work for you is a great investment.
What macros do you use? I have some simple programs that generate documents to print out, and I use LaTeX and a bunch of packages. It takes hundreds of megabytes of software with all the packages. I've long toyed with using the MOM macros but can't get motivated to rewrite my programs and learn troff.
It used to bother me that groff doesn't support Unicode (as far as I can tell) but then realized that all I write is English so why am I fetishizing Unicode? groff will get me any accented character that I would realistically need.
You can have UTF-8 with groff using the preconv [1] tool, just pass the "-k" or "-Kutf8" parameter. It will preprocess your source and replace Unicode symbols with their groff special character form. Of course, the built-in standard fonts do not support many glyphs, but it is no big thing to use a more Unicode-friendly font like DejaVu Sans or Noto, check [2].
So this is my setup using MOM macros for nice typesetting.
But I fully agree on your last point, in the end I seldom need the Unicode universe for most of my documents.
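Concretely, the pipeline described above looks something like this (report.mom is a placeholder filename):

```shell
# Typeset a UTF-8 mom document, two equivalent ways.
# preconv rewrites Unicode characters as groff \[uXXXX] escapes
# before troff ever sees them.
preconv -e utf-8 report.mom | groff -mom -Tpdf > report.pdf

# Or let groff run preconv itself: -k enables it, -Kutf8 pins the encoding
groff -k -Kutf8 -mom -Tpdf report.mom > report.pdf
```

The second form is handier day to day; the explicit pipe is mostly useful when you want to inspect what preconv actually emits.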
Little-known fact: groff (more particularly grotty) does colour, albeit that people have secretly turned this off for you. grotty's own man page is in fact in colour.
I have very fond memories of WP5.1.... I still have some Model-M keyboards, so I could really recreate the "glory days" if I wanted. I could print out the little F-key cheat sheet for the keyboard as well.
But...
What about Unicode? Lack of Unicode seems like a dealbreaker for a lot of use cases. Even if you're not using Unicode, seems like you'd eventually want to work with Unicode text.
As an engineer, a lot of the "word processing" I do involves cutting and pasting text from other sources (citations, code samples, names, etc) that tend to be chock full of non-ASCII characters.
Or has this been solved somehow? I see that there are various hacks involved in getting support for the Euro symbol. What about a general-purpose solution? I know that WP5.1 had support for international character sets. Perhaps somebody's cooked up an emulation layer that translates Unicode into whatever it is that WP needs.
It's also an odd little page. It includes directions to buy it, if you already have a copy. If you don't have a copy, it includes some details on how to use eBay.
A former employer had an inventory system based on DOS 6, but the hardware (the PC and the barcode scanner) was getting really old. So I bought a new PC, installed FreeDOS and a new barcode scanner with DOS compatibility, and it works perfectly. DOS, especially FreeDOS, is just great.
Wow. I think WordPerfect may be the last time I enjoyed using a word processor. It was also maybe the first time I realized how important good design is. I couldn't tell you a specific reason it was better than Word, just that it always did what I expected.
Like your style on this. I'm contemplating a similar setup but going a bit more oldschool with Word 6.0, Win 3.1 edition; gotta work out the smoothest way of integrating it with my Linux (Mint) desktop though. Integration is key for me, and I have to say my prior experience with Wine has been less than comforting (working, but only just; not a long-term functional feel).
Wine doesn't work great with Microsoft software, because they would take advantage of undocumented APIs, but W6W31 may be old and basic enough to work...
Nice that you are using WP on dosemu2 on Ubuntu on Windows. I once spoke to someone active in research into keeping data (formats) and programs available for a very long time (like centuries), and I always thought it was a no-brainer because one could just run X on Y on Z on A on B on C to get ancient software D working.
Although I also have to admit that this approach didn't work for my favorite game [0]. But now that I'm googling, it does seem to work! So someone got DirectX 9 or something working?
So like SLRN and Jed? Amazing. I used DOSEmu back in the day for Redneck Rampage under an AMD Athlon (Dear DOSBox users, DOSEmu was zillions times faster than DOSBox, I ran it at native speeds). This is impressive.
While I support your travels with WordPerfect (I wrote many term papers on that puppy), I can't support the quote by the person you responded to on Twitter. He just showed arrogance and silliness when he said
"All that was ever really needed by an OS or Office apps was already there 15 years ago. Everything "invented" beyond that was just vanity changes with negligible long term added value, and constant moving around of UI to appear new and better. Change my mind."
I was a huge fan of Wordstar 6! Wrote so much with that thing ... way better than Word, Pages etc. even still. It's no wonder I prefer Markdown for writing / documents etc.
I should try to see if I can get Wordstar running under dosemu2 (I probably have a 3.5" floppy somewhere with it on, but I don't think I have the disk drive to read it).
Amazing. I used WP6.2 to write papers in high school and remember not only the tables and line drawings but you could also place images and annotate them. It's hard to remember how that worked but in those days it seemed those things were only possible on Macs. As I recall there was a big reference binder that documented the features.
Most of what I do in a word processor today, I could have done in Wordstar 3 under CP/M. Perhaps only the printing (no postscript) would have been a problem. Certainly most of what I do in Excel could be done in SuperCalc.
I remember using WordPerfect on VAX/VMS (on a decterm, no less), and I recall them having some flavor of Unix based binary, but, I can't find references to either out there anymore.
There is Wordgrinder[0] and can export to LaTeX for printing.
In classic Unix tradition, it looks way more plain than the screenshots of WordPerfect you posted :-P.
I had that version of WordPerfect on my 286! I remember how you could install the Trident video drivers and get some pretty high console resolutions. F11/Reveal Codes was super helpful too. I wish modern word processors had something similar. It basically shows you the equivalent of formatting tags.
I have the Win3.1 version of WP from that era in a DOSBox. I've been meaning to compare its grammar check to modern word processors. It was way ahead of its time.
I know Corel bought it, but it looks like it spun off again. I wonder if the current release still has any of the old source code, or if it was just rewritten and kept the name for the brand: https://www.wordperfect.com/en/
Still owned by Corel (they have a weird "Corel Advantage" logo at the bottom of that marketing site and the Contact Us link goes to a bunch of Corel info).
The Corel of 2020 seems a fascinating Enterprise Licensing zombie. They appear to make all their revenue from (very) legacy Enterprise Licenses and government contracts and all of the software seems stuck in a time warp stasis with only the bare minimum of maintenance (presumably just enough to keep the Enterprise Licenses alive and well). It's almost sad to see how many classic computing brands they own and let languish in this state.
I remember the first computer we had at home. I was five, and Windows 95 actually looked kind of beautiful in all its square and right-angle glory. The first time we interacted with it was to print some Disney character's portraits meant to be filled in with color. I also remember playing Rally Championship on that, as well as the first Tomb Raider (1996), which looked incredibly immersive; we were mesmerized by it, and frightened! Entering that ominous cave at the beginning was scary as hell. There's nothing that evokes that kind of feelings for me anymore.
I too have some good memories of Windows 95. I was stunned back then by all the goodies in there after running DOS and Win 3.1. Little did I know what other operating systems Windows ripped off. I in turn had a pirated copy; I was only 12 years old, and sharing was caring back then.
Our first one was Mickey Mouse. I have that memory burnt into my visual cortex. So many fond memories... My brother loved the Lion King, we watched that thing so many times the VHS copy we had (still have, somewhere) was all worn out.
When I worked for Boeing there was some critical factory equipment that still ran on Windows 95 and even Windows 3.1. IIRC, the manufacturers had gone out of business so Boeing actually fabricated parts as needed to keep them running.
I recently got to witness the delicate use of an atomic force microscope from a very crude GUI on a Windows 95 computer. I was told by the person in charge of the lab that of course they could find a more recent tool but it would be too expensive and not worth it while the computer keeps working well.
I enjoy nostalgia as much as anyone, but there's no real need today. While attractive, this was a particularly atrocious line of OS in the reliability department.
The mind boggles at using it in a mission-critical environment and speaks to how dire and ridiculous the situation was in the late 90s due to Microsoft shenanigans and competitor blunders. When folks say to forgive MS now they've changed, perhaps, but point to this post so the damage is not forgotten.
Depending on budget one of these paths should be taken:
- Twenty-year-old hardware is basically free at this point and can run a version of NT. Security/cost continues to be an issue, however.
- Xubuntu or similar + (Wine || DOS/VirtualBox) would be more reliable and secure. If an unmaintained driver is a requirement, this might be the best choice.
- FreeDOS exists and is pretty good, for the truly ancient.
- React OS is a thing as well and there's a decent chance it can run older software.
Personally, I'd pick a Xubuntu or Ubuntu MATE LTS and whip up a native GUI in a robust language with plenty of tests to replace such equipment.
I don't think Ubuntu or any of its variants--or "flavors"--are really well-suited for old machines. I find them to be very "bloated", as it were. I'd pick a bare-bones, basic installation of Arch/Artix and go upwards from there, probably using i3 as the window manager, too.
This is immediately obvious if you install them in a virtual machine - even on the latest fast hardware Ubuntu and Fedora both lag pretty hard. Even Xubuntu is pretty slow. I use a VM for Zoom and MS Teams, so after some time dealing with it I finally gave up and installed Arch Linux with a minimal Openbox configuration.
The VM gets to the desktop within a few seconds with autologin and automatic X starting, and is lightning fast (Zoom and Teams are still a bit slow, but everything else is instant).
I set the Openbox menu to only contain the items I need - Zoom, unofficial teams-for-linux, v4l2 (webcam) test&configuration utility, pavucontrol (volume control), SimpleScreenRecorder, Thunar (file explorer) with my shared folder to get recordings out of the VM, and a shortcut to a .txt with my Zoom meeting IDs and passwords (Mousepad as the editor).
It'd be nice to make a compact disk image from this and distribute it, but there are some machine-specific things that need to be configured for the VM - not hard, but you can't make it a zero-configuration download-and-run type thing (shared folders, selecting USB passthrough for webcam, configuring resource allocation to the vm, etc.). Installing Arch takes about 10 minutes and installing+configuring all the GUI software takes maybe 60 minutes more.
Of course part of the slowness in VMs is due to not having fast graphics which GNOME etc. rely on, but again Xubuntu is pretty slow too, and if you install a GNOME distro on bare metal it still lags pretty often. Plus the boot times for Ubuntu & Fedora are also really high in the VM.
Super fast startup is not a requirement for this application.
Also, I think you are overstating the case with the exception of Gnome 3. It just had a big performance regression fixed. Mate and xfce never suffered from it.
KDE has decent performance these days. Then again, if one goes out of their way to avoid the default DE, it stands to reason they might want to avoid the distro and its questionable choices entirely.
That may have made sense twenty years ago, but more time has passed than perhaps realized.
They're quite light when no unneeded apps are installed, and time is money, as they say. Spending a lot of time fiddling is a cost that is not going to help a no-revenue maintenance project like this succeed.
Perhaps a light CentOS variant would be better as they tend to have longer support windows.
> The mind boggles at using it in a mission-critical environment and speaks to how dire and ridiculous the situation was in the late 90s due to Microsoft shenanigans and competitor blunders.
Or due to the fact that Win95 was actually a very good user-friendly graphical operating system. There were very few that could really compete. Despite all the shenanigans.
> Depending on budget one of these paths should be taken:
You just mentioned mission-critical systems in the previous paragraph. None of the paths are valid upgrade paths. Especially not if you want to "whip up a native GUI" with a requirement for an unmaintained driver.
Just reverse-engineering the driver for a train would be a major undertaking. And that's before you try to add a UI on top of that.
Not necessarily. I gave a range of options on purpose, depending on budget and safety reqs I don't have access to. Also for desktops left behind for whatever reason. You may be overthinking it.
DOS is a true RTOS for modern computers, because it's not multitasking. Just block all the unnecessary interrupts, and you'll get the best response rate possible on the particular machine.
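On a classic PC, "blocking the unnecessary interrupts" comes down to writing a mask byte to the primary 8259 PIC's data port (0x21), where a set bit disables that IRQ line. The actual port write has to happen under DOS; the sketch below only shows the mask arithmetic, using the standard PC IRQ assignments:

```python
# Compute an 8259 PIC mask byte that disables every IRQ except the ones
# we still need. On a real DOS machine this byte would be written to
# I/O port 0x21 (primary PIC, OCW1); here we only show the bit arithmetic.

def pic_mask(keep_irqs):
    """Return the OCW1 mask byte: a 1 bit disables that IRQ line."""
    mask = 0xFF                      # start with everything masked
    for irq in keep_irqs:
        if not 0 <= irq <= 7:
            raise ValueError("primary PIC handles IRQ 0-7 only")
        mask &= ~(1 << irq)          # clear the bit to keep the IRQ enabled
    return mask

# Keep only the timer (IRQ 0) and keyboard (IRQ 1).
print(hex(pic_mask([0, 1])))  # -> 0xfc
```

Under DOS you would write that byte with a single `out 0x21, al` (or `outportb(0x21, mask)` in Turbo C); everything else on the machine then simply stops interrupting your loop.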
DOS was my first command-line environment, on an 8086 with 640 KB of RAM and a whopping 20 MB hard drive. I had so many games and so much software on that HD, including Win 3.1; it embarrasses me to think how much bloat there is in a modern OS. I wouldn't use DOS now, though; I'd rather go with a barebones Linux distribution.
Got a working copy of XP running in a virtual machine on Linux. It gets used for backing up my phone (Nokia E72) and managing the settings on the audio DSP board I use in the studio (miniDSP). All aging software, backed up and ready to reinstall if needed again one day.
The biggest advantage is the battery performance. If you are travelling, trekking, etc., the phone battery can reliably last a week, no charging worries.
It is a distraction-free phone with just calls and SMS.
It is a very robust device, very lightweight and sturdy; I have dropped it from a height of 4 floors and into a swimming pool, and it still worked perfectly fine. The b/w display is high contrast, making it the ideal phone on a construction/industrial site; the modern-day equivalent is the overpriced CAT Android phone.[1]
I would pay good money for a boxed one today. After a few months of usage (Android or iOS), I find the charge hardly lasts till the end of the day, and you are running around trying to find a charger or lugging a battery pack.
Yes, modern phones are computing devices and consume less power for what they do. However, a mobile phone is a phone first; if I can't make calls at the end of the day, what good are all the apps and features?
To you; to me, it's a means to access the internet first.
But I still agree very strongly with your post. Smartphones have become these jack-of-all-trades, master-of-none devices, so almost everyone hates how smartphones don't do The One Thing they use them for the most well enough.
Disabling data will make a big difference on smartphones. I usually use airplane mode for hiking (disabling it only when I need to check the weather or make a call) and have no trouble lasting a week, except in subzero temperatures.
Why couldn't you stick with it? I, too, use a somewhat old Nokia phone (the 6110 Navigator, from 2007), and I absolutely love it. I really like its soap bar form factor, its sliding keyboard, and its battery--it lasts for around three or four days, although it can really take a hit with long phone calls, of which I usually make two or three every week. I don't like smartphones; I think they are the main culprit of many of the ailments of the current world.
Yes, I use it to run Mach 3 CNC with my small desktop Taig CNC milling machine. I run it on an old Pentium 2 computer that was made when computers still had parallel ports (needed to interface with the CNC drive box). It ain't broke, so I'm not fixing it.
Probably running some point of sale, ATM or industrial machines where it isn't cost effective to upgrade. However, they are likely running super-customized versions of it. End users are likely running it in a VM.
I sell a SaaS point of sale program[0] for niche retail stores, and this comment made me laugh and cry in equal measure. It's a web app, and I have dozens of users who access it through IE 5 on Windows 95. Not a VM, not a super-customized version -- just a computer they've kept on life support for twenty years and refuse to get rid of. And this isn't even close to the most antediluvian nonsense I've gone through on support calls [1].
As long as it's not connected to a network, I guess. I wonder, though: is someone still scanning for Win95 machines? Maybe it's safer to use it these days than it ever was during its active period.
Depends on what you're letting past the firewall. Even ICMP requests could (not sure if this was ever patched) be used to DoS a Win95 machine. There are several open TCP/IP bugs with Win9x, the newest one being from 2008.
Lots of lab instruments run Windows 95. It's common on Tektronix oscilloscopes for example.
I remember battling with IT when they told me that it could not be connected to any company network. (I won.)
The new test equipment all runs Windows 10. It's weird to hop between pieces of equipment where some still run MS-DOS and others run Windows 7 or 10. They are all calibrated and still chug along doing what they need to do, just a little slower. The only annoying thing is when you can only grab data off the unit with either a GPIB interface or a floppy.
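For the GPIB case, a GPIB-USB adapter plus PyVISA usually beats the floppy shuffle. A hedged sketch: the `pyvisa` session is commented out since it needs an adapter and a live instrument, the GPIB address is just an example, and the `CURVE?` command shown is typical Tektronix-style SCPI rather than verified against any particular model; only the plain string parsing actually runs here:

```python
# Sketch of pulling waveform data off an old scope over GPIB instead of
# floppies. The pyvisa calls below are commented out (they need real
# hardware); parsing the comma-separated ASCII curve data is pure.

def parse_curve(raw):
    """Parse an ASCII 'CURVE?'-style response into a list of ints."""
    return [int(v) for v in raw.strip().split(",") if v.strip()]

# import pyvisa
# rm = pyvisa.ResourceManager()
# scope = rm.open_resource("GPIB0::1::INSTR")   # address is an example
# print(scope.query("*IDN?"))                   # standard IEEE-488.2 query
# raw = scope.query("CURVE?")                   # command varies by model
# samples = parse_curve(raw)

print(parse_curve("12, -3, 0, 127"))  # -> [12, -3, 0, 127]
```

Whether this works on a given unit depends on the instrument's command set and on the adapter's driver support, so treat it as a starting point rather than a drop-in fix.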
My initial reaction was that IT were correct but actually, as long as it’s sitting behind an appropriately configured firewall on an isolated network segment, this can be done safely.
15 years ago I ran into a govt agency relying on win95 to run a terminal client to access some as/400 app[0]. It was a tailored win95 under some special maintenance contract. Still surprising but alas.
[0] application being the nicest job ROI I've ever had the pleasure to witness, bare terminal UI, zero learning time, zero waiting time, fully dedicated to help you do your task quick and right.
Ha, that brings back memories of trying to get Windows 95 installed on an old 386 machine with a 100 MB hard drive, 2 MB of RAM and only a floppy drive. I think this was 2001 or so, and you could already download the floppy disk set for Windows 95. If I recall correctly, there was a warning during the installation that Windows would be very slow on this machine. The warning was correct. I think it took 10-20 seconds just to open the start menu. I quickly reverted back to Windows 3.11. That machine is stored somewhere at my parents' house. Last time I checked, it still worked.
As a kid, I didn't have that many spare floppy disks. Simple solution: write the first disk image on your father's machine you used for downloading it, start the installation on your machine, and as soon as it asks for disk 2, remove disk 1, insert it into your father's machine and write the disk 2 image to it. It took a while, but man was I proud when I saw that boot screen that looked so much cooler than that boring Windows 3.11 box.
By 2001 you could get a cheap 32 MB SIMM for $10 and run W95 relatively well. Add a 486, which was for sure like $5-$10 (a Pentium MMX was about $20-30), and you'd be set. Opera 6(?) still ran acceptably on a 486, and Lynx flew.
Yup, a friend of mine has a repeater he needed to reprogram but the software has been out of support since 1999. I ended up setting up a Windows 3.11 VM for him but Win 95 would have worked too.
Many years back I picked up a Toshiba Libretto laptop with Windows 95 on it. Bought it from an old man who was selling various old computing-related knickknacks at the MIT Swap Meet. (Among them was an intact Enigma machine for $250,000!)
It sat unused while I pondered what to do with it, then I started watching retrocomputing channels like LGR and the 8-Bit Guy -- and I realized I had a retro PC of my own! So out it came. I installed QBasic and a couple of other things on it. I wonder how long it'll keep going for. But it's real cool to have -- a netbook-sized, full-power PC from well before the netbook era.
The judiciary of Berlin just migrated from Windows 95 to Windows 10. At the moment the system appears to be next to unusable, and some judges refused to work with the new system. Therefore they are considering migrating back to Windows 95 until the problems are solved.
You'd be quite surprised how many industrial machines, CNC or similar, run on Win 95. Meanwhile, the automotive industry basically depends on these machines.
Not Windows 95, but Windows NT 4 lurks below certain theme park rides, reading the PLC diagnostic data and printing out the roller coaster's mood once a day for maintenance to file away in a write-only file cabinet.
IT (us) have never in 10+ years had to help 'em. Can't say the same for the Windows XP-based successor PCs on newer rides though. Something about that 3/4/586 hardware is just solid.
About 10 years ago, one of my jobs in a pharmacology lab involved using a very specialized microscope driven by a PC running windows 95. A new license would cost an obscene sum of money, and it did exactly what it needed to do. The same setup is still in use and will likely continue to be used until it’s impossible to continue.
Then you need to support the train-specific OS, and find developers who know how to write code for this train-specific OS, and IT staff who can install the train-specific updates onto the train-specific OS.
As for why they would not use Linux? Leaving aside that these probably predate the wide adoption of Linux, Windows is often used in embedded devices which need graphical interfaces/displays. Again, ease of finding developers and third party tooling is a big driver.
I don't get it either. Can someone familiar with trains explain what on a train needs a desktop OS? A train is more like a car than an ATM machine, I would expect a collection of computers running either bare-metal or an RTOS like QNX/VxWorks (the pictured train is from 2000, so embedded linux hadn't taken off yet).
You'd probably have an OS with GUI for non-essential services like lights, intercoms, A/C etc. And present some additional info in nice graphics: the state of carriages, situation ahead on the line (as a complement to just flashing a green/red light on the board). The rest would run RTOSes or similar.