Gary Palter has made it work both on Linux and macOS, both x86-64 and ARM64. He also brought the original Open Genera for DEC Alpha to the same software level.
Yeah, my thoughts precisely. Genera is obsolete. I cannot imagine what possible profit could be made. There may be value in the design that could be of benefit even today, but I cannot understand people who hoard old IP.
Please. Dude could say “hey, I want to open source this, someone do the work for me and show me where to sign” and people would trip over themselves rushing to make it happen.
Pain? The hardest part is probably finding a five-inch floppy drive connected to a machine that can push to GitHub. Even that could be outsourced to some trusted, resourceful Lisp enthusiast.
There's a lot of preparatory and legal work involved in releasing a proprietary software project as open source. It's not as simple as pushing the source tree to GitHub and calling it a day. For example, if portions of the code were licensed from other copyright holders, permission from them is required. There are other cases where the code needs to be examined before release in order to edit or remove material that is personally identifying, embarrassing, or could cause legal issues if publicly disclosed.
Then just put up a fundraiser. There's enough nerds that would pitch in I'm sure it could get covered and someone hired to do it.
Though I think it's more than likely that once people had free access to it and saw which modern conveniences were missing, the mystique would be lost and it would be kept as an archaeological asset rather than as an ongoing thing people wanted to maintain and use.
People would also discover that the architecture is not simple, but grew over a decade during which there was rapid development of the basics people now take for granted: standards, languages, networking, operating systems, ... In the mid-80s TCP/IP was an expensive add-on for Genera. Later it was made part of the OS. But development stopped before it could gain things like IPv6, Unicode, basic security, HTTPS, and so on.
One could add it, but then one would have to learn ZetaLisp (70s/80s), Flavors (early 80s), Symbolics Common Lisp (80s/90s), an object-oriented networking stack architecture from the 80s, ...
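To give a flavour of the gap: a Flavors class from that era looks quite different from the CLOS code a modern Lisper would write. The `ship` example below is hypothetical, purely to illustrate the two syntaxes side by side.

```lisp
;; Old-style Flavors (early-80s Lisp Machine code), hypothetical example:
(defflavor ship (x-position y-position)   ; instance variables
           ()                             ; no component flavors
  :gettable-instance-variables
  :settable-instance-variables)

;; Methods were named by message keywords sent to instances:
(defmethod (ship :distance-from-origin) ()
  (sqrt (+ (* x-position x-position)
           (* y-position y-position))))

;; The rough CLOS equivalent in modern Common Lisp:
(defclass ship ()
  ((x-position :initarg :x :accessor ship-x)
   (y-position :initarg :y :accessor ship-y)))

(defmethod distance-from-origin ((s ship))
  (sqrt (+ (expt (ship-x s) 2)
           (expt (ship-y s) 2))))
```

Anyone extending Genera today would have to be fluent in both styles, since large parts of the system predate CLOS.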
You would have a second life in a technology stack somewhere between the past and the future, in a parallel world.
It makes more sense to me for some group to attempt to replicate the IDE experience in a modern Lisp (or similar) flavour.
Still, I also think these kinds of "live editing" environments don't translate well to modern "best practices" around version control, deployment, versioning, etc. I'm remembering my experience with LambdaMOO and similar systems, then later Smalltalk/Squeak, and recently with Julia's REPL. They're fascinating and fairly productive, but I am not convinced of this technique's ability to scale.
> It makes more sense to me for some group to attempt to replicate the IDE experience in a modern Lisp (or similar) flavour.
It may not have the fancy UI, may not have a fancy user experience, may not be an operating system, may have a more primitive Lisp dialect, may not have a good multi-threading story, ... but it might exist and people put a lot of work into it: GNU Emacs and its extensions.
I think half the magic I associate with the Symbolics products is that they also made their own hardware, custom built for Lisp. Which is kinda neat and magical, if entirely impractical.
That plus the lovely keyboards, nice case design, etc. etc.
In the early days they had to - all the hardware was non-standard. Where would one get 36-bit memory cards? Where would one get CPUs with a Lisp-specific instruction set? In those early days the computer (refrigerator-sized) would sit in a machine room (with enough power) and the user would be in his/her room with only a console (plus maybe a second monitor), keyboard and mouse - connected via a long console cable to the machine in the machine room. That was the experience for the programmer/user. The machine itself could have a lot of peripherals: network, tape drive, memory boards, color boards, frame grabber, disk drives, CPU accelerator, ... and the programmer could see the driver code.
At some point in time they produced cards for the Sun VMEbus and for the Mac NuBus. Then the only thing left was the keyboard.
So, I agree, the Lisp Machine concept was a combination of hardware and software. The emulators of today only provide some of the software parts...
Yeah, I have here the book published by the folks who did the Linn Rekursiv computer, which followed much the same philosophy as the Lisp Machine but for a highly object-oriented system: tagged memory, hardware-supported garbage collection, object-structured memory, etc. I believe it was also positioned as a VMEbus card.
My understanding is that Moore's law just ended up making these attempts uneconomical. By the time hardware engineering and manufacturing are done on the custom components, "orthodox" general-purpose MPUs end up leapfrogging them, and a well-engineered software VM can outperform.
Maybe we could start to see this change again, who knows.
Maybe unikernels running highly tuned VMs on the hypervisor could be this generation's equivalent. Full control over the MMU, tagged pointers all the way down... hmmmmm
Symbolics could have built computers without custom hardware. At some point in time the 64-bit CPUs (MIPS, Alpha, SPARC, POWER, x86-64, ...) were all powerful enough to run Lisp. Genera was ported to the DEC Alpha. They would also have been powerful enough to run Lisp as an OS. But that would mean writing device drivers, interfaces to hardware, and a lot of other stuff. It would have cost substantial amounts of money to port Genera to the metal. Who would have bought that? The idea of a high-level language running as an OS on the metal did not have any future. I don't think there are good examples where this was successful on the market.