Owner of Symbolics Lisp machines IP is interested in a non-commercial release (hachyderm.io)
230 points by mepian on July 7, 2023 | 143 comments


Don't hold your breath, these rumors have been going around for well over a decade:

2018: "The current owner of Symbolics displayed interest in open-sourcing Genera a few years ago but nothing happened since then." https://news.ycombinator.com/item?id=17824330

2014: "The problem is that the Symbolics IP is now owned by John Mallery; he has stated he has plans for making it available but so far (several years) has not yet done so." https://news.ycombinator.com/item?id=7882034

The software itself can easily be found these days, if you're interested for hobbyist reasons.


Yeah, and I wouldn’t give Mallery a single cent for any of it unless he can demonstrate that Andrew Topping actually paid the executors of the Symbolics bankruptcy for it; if not, it shouldn’t have been a part of Topping’s estate.

I absolutely do not understand why Mallery is just sitting on it instead of making it available to everyone. It has zero non-historical value. Just put it out into the world and let it be examined as the historical artifact it is.

(That was my plan when I learned of Topping’s death. Unfortunately, Mallery beat me to acquiring it, and then just planted his ass on it.)


> (That was my plan when I learned of Topping’s death. Unfortunately, Mallery beat me to acquiring it, and then just planted his ass on it.)

That reads as someone who sees it as a purely commercial investment, and is waiting for it to mature.


If his plan was to sit on an asset hoping it would appreciate, he would have been better off buying wine, cars, baseball cards, or even beanie babies.


The potential value of an investment is in the eye of each investor. You believe beanie babies are a better investment, but Mallet might not agree with you.


Fortunately, the market will get to decide which one of us is correct. I'd place my bet on almost anything other than an obsolete operating system.


You're a mallet


This is more than a rumor: Gary Palter, one of the last employees of Symbolics and the person who wrote the emulator in the first place, has communicated with the owner.

He co-authored this article about the development of the emulator: http://pt.withy.org/publications/VLM.html

The fact that he still has hope is significant.


Is he the copyright holder? Being the last employee doesn't have anything to do with copyright, if you're an employee of a company.


The company (Symbolics) was owned by Andrew Topping. Topping died, and Symbolics' IP was sold at his deceased estate auction. It was bought by John C. Mallery, a (former?) MIT professor, so he is the current copyright holder. (Or at least he asserts he is–some have expressed doubts over the legitimacy of the transaction, but the only way to get a definitive answer to that would be through a lawsuit, and nobody thus far has wanted to go down that path–the risks are that one might spend a fortune on lawyers and Mallery could win.)

Gary Palter was one of the last employees, and as such doesn't own the copyrights. However, he is in personal contact with Mallery, and the enhancements he has made to Genera (such as porting it to ARM) have been done with Mallery's permission.

Nobody seems to understand why Mallery is squatting on this rather than making it publicly available. Palter has never clearly explained it, although I imagine he doesn't want to burn his bridges with Mallery, and that may limit what he's able to publicly say.


> Nobody seems to understand why Mallery is squatting on this rather than making it publicly available.

Clearly he's hoping John Titor shows up and offers to exchange priceless insights about the future in exchange for access to the code in order to repair some weird embedded system in the far future. :)

More seriously, it's not uncommon for people to have unrealistic expectations of the value of the things they've collected. They liked them enough to collect them, after all. People seeking them out might help cement the inflated valuation.

The sad thing is that when people die, the people who inherit the assets often see no value in them at all and lose or discard them.


> More seriously, it's not uncommon for people to have unrealistic expectations of the value of the things they've collected.

Does anyone know how much Mallery paid at the auction? It makes a big difference whether it was $50 or $50,000.

If he paid a significant sum for it, he may still be clinging to hopes of an eventual return on his investment.


Rumor is he presented an unpaid invoice for on the order of $10K to Topping’s estate and got the IP that way. If it’d actually gone to auction I was prepared to pay quite a bit more.


The answer is obviously “money”. An MIT researcher’s legacy will be that he withheld a huge piece of computing history from the world for personal gain. What a stain.


What money? In what alternate universe is a well-heeled investor or commercial entity going to pay a bunch of money for an ancient single-user operating system written in and for a language that, let's be honest, almost nobody uses?


There is enough money sloshing around in tech that it is plausible someone or some company would buy it for a lot more than Mallery paid for it. It was probably bought for, at most, a few thousand US$, on a lark. It's a lottery ticket.


Why would there be a stipulation for non-commercial use if this isn’t about money?


Maybe the owner thinks Genera will become more valuable in the future? My point is I don't see any possibility of that actually happening. I'd imagine there is some revenue for maintenance/bug fixes of existing installations (David Schmidt has apparently been running the hardware maintenance/support side of things for quite some time), but I don't see how the software does anything other than get less valuable commercially with every passing year.


> I don't see how the software does anything other than get less valuable commercially with every passing year

That’s the worst part of all this. It’s not like he’s sitting on a goldmine, he’s hoarding something of immense historical value because it’s slightly more beneficial for him to do so than it would be to share it.

I can’t imagine steering your legacy from being immortalized as an MIT AI researcher to… that.


Mallery looks like somebody in their 70s. I guess we'll have to wait for the next deceased estate auction?

I know it's not pretty, but it's an option.


> The software itself can easily be found these days

where?


Good article on getting things set up: https://archives.loomcom.com/genera/genera-install.html



archive.org


Some friends and I tried to buy chips from them recently. However, they wanted a relatively significant amount of money for them, reflecting commercial rates rather than historical-collectible prices. Seems like they are still operating as a corporation and may still be protective of their IP.

Background: I maintain the largest die photography collection on the internet. I wanted to take some die pictures for historical reasons. Fortunately I was able to get one to image from a third party, although I haven't posted it yet.


I would think that the few remaining CPUs specially designed to run Lisp have significant historical value. Just having a working original Lisp Machine is extremely rare these days, given that probably only 10,000 were ever made and the older ones (from the early to mid 80s) are extremely fragile.


They’re not that fragile.


I doubt you’ll be able to buy working ICs to decap since they may be needed to bring a broken system back to life.


I have 5 Symbolics Ivory CPUs (1 of which is in a working MacIvory III card.)

Recently I found that not far from where I live, one can rent time on an electron microscope for a very reasonable price. But the lab in question does not have the gear needed for decapping and delayering.

If someone knows of a reasonably-priced (mid five-figures or less) commercial facility that will do it with a good chance of success, I'd be interested. But I won't be sending these chips to some random person who may or may not be able (or even make an honest attempt) to do the job, as I have only this small handful of units and currently no way to get more.


Maybe we can save up and buy it from this dude, then open it up, something like what happened with Blender? I hate the idea of hobbyist licenses - it reminds me of the torturous OpenVMS hobbyist program or whatever the hell that incomprehensible situation is.

Also, curious: how old is the IP owner in this situation? What happens if he croaks with it?


As with any asset, it would become part of his estate, to be sold to pay its debts and/or divided among his heirs or left in his will. It may result in another opportunity for someone to acquire it at probate.


AFAIK Symbolics is basically a one-man operation, and the owner has been unwilling to sell the rights to Genera for any price.


DKS doesn't own the IP, and wouldn't sell (to me at least) any unpublished sources, schematics, etc. for any price at all.

As of several years ago, though, he was willing to part with a number of new-old-stock Ivory CPUs, for roughly the cost of their weight in gold. (And periodically offers Ivory machines for sale on eBay. Here's what these look like: http://www.loper-os.org/?p=2857 )


Damn that’s too bad. Part of computing history and all.. wonder why he is motivated that way


Maybe he's still hoping to use it to implement his "wish": http://web.archive.org/web/20011107164802/www.ai.mit.edu/peo...

"Wish List: a knowledge-based operating system running on a MIMD parallel machine. The system should exceed the productivity of the Lisp Machine by several orders of magnitude and integrate seamlessly with a global knowledge base and with a global computational environment."

Looks like he recently registered a new company: https://opencorporates.com/companies/us_nh/872107


Does he actually have any connection to Genera or even Symbolics?


He developed a well-known web server for Genera: https://en.wikipedia.org/wiki/CL-HTTP


I doubt he was offered >= $1 billion. That's probably the price he's waiting for.


AFAIK it's still owned by one John Mallery, a wealthy MIT prof with military and intelligence establishment connections.

My favourite "conspiratorial" hypothesis concerning this question is that the Symbolics IP was ordered to be perma-buried for "national security" reasons (as it threatens to make practical systems with capabilities much superior to today's state of the art, but running on early 1990s IC fab processes) and that Mallery (who, per rumour, purchased the IP for next to nothing via a backroom deal) was the designated undertaker.

AFAIK it is still unknown whether the chip die masks and the source for the OS/compiler/IC workflow (the parts not included on the tapes/CDs) were even preserved to this day.


Just learned about Blender's proprietary beginnings. It's interesting that they found the most success with the open source model.


In the Blender case, the community bought the trademark and copyrights from the commercial owners (some VC investors if I remember correctly). https://www.blender.org/about/history/ Those donors are still listed in a file in the Blender source: https://github.com/blender/blender/blob/main/doc/license/bf-...


It probably can't be overstated, though, how important it was that Ton, the original author, was spearheading that whole thing. He would have had both an existing relationship with the investors, and enough clout with the community backers to envision a future for it... trust which was obviously well placed.


> https://www.blender.org/about/history/

citing from the link:

> On Sunday, October 13th, 2002, Blender was released under the terms of the GNU General Public Licence, the strictest possible open-source contract. Not only would Blender be free, but its source code would remain free, forever, to be used for any purpose whatsoever.

keep this in mind when you pick a software license.


Yes, it's a good choice, guaranteeing continuing freedom.


This would be a major development should this come to pass. Even if Genera doesn’t get released under an OSS-compliant license, this will be beneficial for hobbyists and historians (though I don’t know how restrictive non-commercial licenses are for people in academia), similar to the recent release of the Apple Lisa source code (which has similar stipulations). I’ve always wanted to use Symbolics Genera, but I was born around the time of the AI winter of the late 1980s.

I wonder if there were any efforts in the 1990s or 2000s to create a FOSS clone of Genera in the vein of the GNU project, the Linux kernel, or 4.4BSD and its descendants. I heard that Genera is quite complex, but complexity didn’t stop ReactOS and Haiku from chugging along after all these years.


While not Genera, Interlisp-D at Xerox was their approach to Lisp Machines, and is available now.

https://interlisp.org/

https://www.softwarepreservation.org/projects/LISP/interlisp...

"Xerox PARC:Interlisp D Programmers Tools"

https://www.youtube.com/watch?v=xgMZ9gRhq8A

"The Medley Interlisp Project: Status and Plans"

https://www.youtube.com/watch?v=x6-b_hazcyk


What advantages does Genera have over Interlisp-D?


> I wonder if there were any efforts in the 1990s or 2000s to create a FOSS clone of Genera in the vein of the GNU project

Of course the GNU project was originally heavily influenced by Lisp machines; it's only later that the lispy aspirations largely died off. Example from the original announcement:

> In particular, we plan to have [...] Lisp-based window system through which several Lisp programs and ordinary Unix programs can share a screen. Both C and Lisp will be available as system programming languages

As I understand it, the idea was to build a free Lisp Machine clone which would have bits of Unix in it.


While nowadays they are a bit more open about which languages to use on GNU projects,

https://www.gnu.org/prep/standards/standards.html#Source-Lan...

Back in the day it was mostly about C, which is why C adoption grew again as GNU/Linux gained adoption, at a time when C was already being taken over by C++ in the Apple and Microsoft/IBM ecosystems.

> Using a language other than C is like using a non-standard feature: it will cause trouble for users. Even if GCC supports the other language, users may find it inconvenient to have to install the compiler for that other language in order to build your program. So please write in C.

-- http://web.mit.edu/gnu/doc/html/standards_7.html

Lisp was only considered as part of specific applications like Emacs, as you can read in that surviving version from 1994.


For a while there Guile (Scheme, not Lisp, but a similar vibe) was being pushed as the preferred glue, extension, and scripting language for the GNU ecosystem, but this didn't really go anywhere. Nor did the elisp->Guile transition in GNU Emacs ever happen.


> but this didn't really go anywhere.

What about guix? It uses guile ...


> I wonder if there were any efforts in the 1990s or 2000s to create a FOSS clone of Genera in the vein of the GNU project, the Linux kernel, or 4.4BSD and its descendants.

Not that I know of, but here's an install guide for the real deal: https://archives.loomcom.com/genera/genera-install.html


Also, the predecessor MIT CADR LispM has had an emulator and OS for a while.

https://tumbleweed.nu/r/bug-lispm/forumpost/7475d8a3db


There is also an emulator for the LMI Lisp Machine [1].

[1] https://github.com/dseagrav/ld



You'd hope that these folks could at least do a GPL-style/copyleft license, which would effectively prevent (non-negotiated) commercial use anyway, while still enabling the open source community to enhance and distribute it?


I bet there is a valid argument that there’s nothing here of any commercial value anymore.

The industry has gone beyond “catching up” to surpassing the values of the Lisp Machine and Genera. Any grand ideas of this era have been considered, and either mined, reimplemented, and exploited or simply rejected as being past their time.

The lack of a Lisp machine or environment like Genera is not holding Lisp (much less the entire modern family of Lisp-y languages) back. And modern IDEs are off the charts, even if they don’t check every single box of what Genera has to offer.

As an industry, we’ve not just stood on the shoulders of giants of the past like Genera, we’ve stepped off and up and moved ahead.

I’ve seen the Genera image that’s floating around; it ran in a VM of some kind. There are a couple of YouTube videos of demonstrations, and maybe I’ve seen the wrong ones, but I’ve just seen them as interesting rather than compelling. It would be wonderful to see a thorough review from a modern perspective.

And, yea, they should set the system free. It’s pushing 50 years old, and we’re in “internet” time, so who knows how much that is in “No one knows you’re a dog” years.


When I got that leaked Genera image going in Linux I felt like I'd found a crashed UFO. It was incredibly inspiring and I don't think it's true at all that we have surpassed it.

Why do I still have a clunky character-mode terminal instead of a Listener that can display rich text, images, mousable forms? Just think what we'd have today if we'd worked on that paradigm for 30 years instead of fetishizing the limitations of 70s minicomputers.

Why is it that when I type a command into said terminal and forget a parameter, I have to delete it, or open another window to type 'man', whereas on Genera I can hit <help> and view (rich, hypertext) documentation for a specific parameter, inline, while still typing in the command? That little feature was a revelation.

Genera's fluid, ergonomic developer experience is something we are turning away from more and more these days. Programming is increasingly surrounded by the most tedious bureaucratic and administrative work. The hoops I have to jump through before I can start creating something in a programming language are only increasing. If people had paid attention to Genera and to Lisp machines it wouldn't be like this.

And I've only mentioned surface aspects of the user experience. I haven't talked about being able to debug anything, or the idea that what look like applications are actually "substrates" that I can potentially use as APIs for my own work. We haven't scratched the surface yet.


Maybe not of commercial value.

But I took the time to get SLIME set up, with SBCL, and it seems like a completely different world. I’ve used languages with dynamic loading and REPLs before, and it seems like we still have something to learn from Lisp. Like, the experience is far from ideal—Lisp has its own problems. But it is just so damn nice to redefine a function in a running system, without having to then get the system back into the state that I need. I’ve used IDEs with dynamic code patching, and I’ve used Python systems that reload modules, but there have always been problems with the ergonomics.
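
For anyone who hasn't tried it, here's a minimal sketch of that workflow (assuming SBCL with SLIME; the function and variable names are made up for illustration):

    ;; Long-lived state in the running image
    (defparameter *cache* (make-hash-table :test #'equal))

    (defun lookup (key)
      (gethash key *cache*))

    ;; Later, after spotting a bug, recompile just this one form
    ;; (C-c C-c in SLIME); *cache* and the rest of the image keep their state.
    (defun lookup (key)
      (or (gethash key *cache*) :not-found))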


Never understood this sentiment. I was editing functions without restarting the whole program using Visual Studio >15 years ago, and the ergonomics were great: I just click the line number next to the function, edit what I need to edit, and click continue to use the new function.

https://learn.microsoft.com/en-us/visualstudio/debugger/edit...


I’ve used that before, but the Lisp stuff still feels a lot better. I’m not just running a program and swapping out a function, but I’m using a REPL, some powerful introspection capabilities, a compiler, and a running system all together in a way that just works really nicely. I can add new files, import new libraries, and redefine structures. Objects that I print out in the REPL can be copied out and pasted directly into code. Rather than running the program and fixing the problems, you keep the system running the entire time you’re working on it. And the condition system provides a lot of help when you screw something up.

To be clear—I’m not making the argument that people should be writing programs in Lisp. Just that there are some things to learn from the way Lisp development works.

In the past, when I’ve used the various edit-and-continue tools, it felt it was just cutting some time out of the edit-compile-run cycle. The Lisp system feels more like a workbench, where you create fixtures to try things out while you are building the system.


Have you tried a Smalltalk environment?

There are plenty of mature options, both commercial and free, and to me, many years ago, they had a similar feeling.


Program Edit and Continue is a fragile self modifying C code hack that you would never dare try in production on a customer machine.

It doesn't simply replace a function binding; it actually replaces code and puts the instruction pointer of existing threads that were running that code into some similar location in the new code. (Which is pretty amazing, to be sure).

What's going on in Lisp code reloading is a lot more pedestrian: new functions simply replace old ones. Threads which are in the middle of running the old functions continue with those. When the last thread is done executing an old function, it can be garbage collected.
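
A rough illustration of that mechanism, assuming SBCL (sb-thread is SBCL-specific) and a made-up function:

    (defun job ()
      (sleep 10)
      (format t "old definition~%"))

    ;; The thread captures the current function object and runs it.
    (sb-thread:make-thread #'job)

    ;; Re-evaluating DEFUN rebinds the symbol's function cell; the running
    ;; thread finishes with the old object, which can be GC'd once unreferenced.
    (defun job ()
      (format t "new definition~%"))

    (job) ; new callers get the new definition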


Editing code live in production should not be part of your workflow, so kind of a pointless feature.

I don't work with life-critical software and yet auditors demand that every code-change to production is peer reviewed and linked to a jira item.


Ticketed and peer-reviewed fixes can be deployed live in production as if they were live code edits. Technically it is the same thing; it's just a process difference.


To be clear, I’m not describing “editing live in production” as the workflow. The workflow is editing code live on your development machine, committing the changes, and going through the normal code review + CI/CD pipeline you’d set up for any other project.

Something like Fix & Continue in Visual Studio is good for testing out smaller changes. By comparison, something like SLIME + Lisp is powerful enough to use for developing new features. The running system on your developer workstation is mostly synchronized with the source code on disk because the interface you use for making changes is the editor. This synchronization is not perfect, but that’s why you have your unit tests and CI/CD pipeline.


Tell that to JPL.


It used to be like that; however, the new hot code reload uses an improved approach, although it still isn't Lisp/Smalltalk.


> puts the instruction pointer of existing threads that were running that code into some similar location in the new code

If you're implying that this isn't thread safe, you're wrong. Edit and continue works just like how you described Lisp code reloading. Works just fine in production if you're crazy enough to push a debug build.

I dunno why Lisp people keep advertising ancient and widely distributed tech as revolutionary. Eg have you looked at LLVM XRay? It's yet another "self modifying C code hack", except it's explicitly designed to work in production on release builds: https://llvm.org/docs/XRay.html


> stood on the shoulders of giants of the past like Genera

I don't buy it. What in the world actually traces its roots to Genera in any way, outside of the Lisp microcosm?


Generally Java and the JVM. The source code for J9 and the HotSpot VM is taken from Smalltalk, which of course had a lot of mixing with Lisp. If you look at Mark Reinhold and John Rose (architects for the OpenJDK project), then you'll also see that they're Lispers.

I don't think the roots are going to be source code-based, but based on cultural transfer.


The question isn't about Lisp influence, though.

But the idea of Java being Lisp influenced is questionable. Though they had Guy Steele, who said something about bringing C programmers halfway to Lisp, pretty much the only Lisp idea in Java is garbage collection.

The JVM is hostile against the efficient implementation of Lisp-like languages; it doesn't let you pack tag fields into pointer values.


Is it that big of a step to assume that old school Lispers know about Genera and have been influenced by it?

>But the idea of Java being Lisp influenced is questionable. Though they had Guy Steele, who said something about bringing C programmers halfway to Lisp, pretty much the only Lisp idea in Java is garbage collection.

Java also has some dynamism through classloaders and reflection.

>The JVM is hostile against the efficient implementation of Lisp-like languages; it doesn't let you pack tag fields into pointer values.

It's a good trade-off for Java; pointer coloring is used by ZGC, for example.


Not Genera, but there is a thread of TI Explorer -> .NET.

Maybe PowerShell can also be described as using some of the same concepts of manipulating data as the Genera UI.


What happened was that a lot of Lisp/AI companies had to let people go, amongst them the Lisp Machine companies: LMI, Symbolics, TI, Xerox, ... but also Lucid and a few Expert System vendors. Some of these people were very experienced developers. There were only a few Lisp/Lisp-like companies/projects to go to: Apple/Dylan, Harlequin, Franz, Clozure, ITA, ... They also worked on Java and .NET infrastructure and languages.

Dan Weinreb wrote an object-oriented database at Symbolics -> Objectstore was founded by former Symbolics people. Their C++ database was said to be influenced by Symbolics Statice.

Patrick Dussud from TI went to Microsoft. Dave Moon went to Apple. Gary Palter worked for Clozure. Steele worked for SUN on language design (Java, Fortress, ...). Weinreb later went to ITA -> worked on the flight search engine written in Lisp. There are a bunch of other examples.

A bunch of language infrastructure or even language designs was influenced.


The GPL does not prohibit commercial use at all.

Often when historical stuff like this is released for "non-commercial use", they're not just talking about not using the code itself in your own non-open-source product, but they mean "you can't run your business on this software."

The GPL certainly doesn't stop a commercial entity from downloading and using a GPL-licensed accounting package such as GnuCash to keep track of their company finances.


Having never used a Lisp machine, and only having a basic understanding of it -- if the whole system is a Lisp image and apps are just calling into the OS as if they were Lisp functions, wouldn't a 'pure GPL' be problematic because EVERYTHING running in it would have to be GPL? Are there clear linking boundaries in Symbolics Lisp between apps and the OS?


The Linux kernel shows that you can have GPL symbols and non-GPL symbols.

Perhaps more relevantly, here is an example of a GPL-ed Lisp implementation which has special provisions that allow proprietary programs to be redistributed which use it:

https://clisp.sourceforge.io/impnotes/faq.html#faq-licensing

One notable rule is that applications that access symbols in non-portable internal packages (considered to be CLISP extensions) must comply with the GPL.

FFI is one of those packages. So this probably means that a proprietary application that extends CLISP via FFI (e.g. to call its own proprietary library) must split off that piece away from the application and make it GPL. I'm guessing that the rest of the code, which depends on the CLISP+extension, doesn't have to be GPL.


GPL treats isolated processes and shared libs as a license firewall. That doesn't happen in Lisp world.


It doesn't. You can trip the GPL on anything that makes your code and the GPL-covered code form the same "program." What that term means exactly is strategically ambiguous; but it definitely does NOT mean "same address space."

Shared libraries are very much NOT a firewall either, Stallman explicitly said otherwise[0] and Lesser GPL is there specifically for people who want it the other way round.

Linux is special - it's licensed with an exception that says user space never trips the GPL. Linus has also further interpreted said exception to mean that code that only touches user space equivalent APIs can live in kernel space without tripping GPL. They even have DRM[1] that enforces this interpretation on LKMs.

Absent that exception, who knows. The GPL copyleft is deliberately written to be as strong as copyright laws are, and we live in a legal environment where APIs can be copyrighted. So it's entirely plausible to argue that a GPL operating system trips its copyleft on all software written for it. A less hypothetical example: packaged emulators. If you sell a proprietary game wrapped inside a GPL emulator, that's a single program now, and you're violating GPL. However, while several emulator developers have had their work used in exactly this way, none of them have been willing to demand a relicense of the game their emulator was packaged with.

[0] https://sourceforge.net/p/clisp/clisp/ci/default/tree/doc/Wh...

[1] Digital Rights Management. Yes, Linus could actually sue a driver vendor that circumvented the Linux kernel linker licensing rules under DMCA 1201. GPLv3 explicitly contravenes such interpretations of the code, but well, Linus ain't touching that license with a ten foot pole.


The emulator example is not a great one; it could be argued it's an aggregation, not a derivative work.


Packaged emulators specifically refer to the case where you're selling your own game and wrapping it in an emulator to do so. For example, all the old iD games on Steam are wrapped in DOSBox.

In that specific case, the player has access to all the game files, and it's trivially easy to extract DooM out of the wrapper and play it somewhere else. I would agree that the GPL's "mere aggregation" language probably covers this particular use case - but I use the weasel word because the FSF's FAQ[0] about it is uncertain. This particular sentence comes to mind:

> But if the semantics of the communication are intimate enough, exchanging complex internal data structures, that too could be a basis to consider the two parts as combined into a larger program.

Pretty much any program whose responsibility it is to host other programs (either OS kernels or emulators) is offering up interfaces that will carry intimate details of the host into the guest. In fact, this is precisely the reason why emulator development is a never-ending rabbit hole of implementing other people's bugs to get games to work. Another example of 'intimate communication' would be something like io_uring, where the OS kernel and user program are writing into ring buffers and trading ownership over subordinate data buffers around.

While the FAQ says a judge would ultimately decide each case on its own, practically speaking judges are going to defer to the guidance of the people who wrote the license unless there are facts to the contrary (e.g. an emulator developer said it's OK to aggregate game and emulator this way and then tried to change their mind). If Stallman says "sharing intimate details of execution constitute combination" then judges will take his side.

Another potential argument comes about if we swap out Steam for, say, the iOS App Store. Apple doesn't provide any ability to extract resources from iOS apps[1], nor do they allow shipping unpackaged emulators[2], so the user cannot meaningfully disaggregate the emulator from the game program as it has shipped on the App Store. Why would this specific case not be combining two programs into one?

[0] https://www.gnu.org/licenses/old-licenses/gpl-2.0-faq.en.htm....

[1] They aren't encrypted per se, but you don't have full filesystem access as the owner of the device, so...

[2] Apple doesn't want users downloading third-party code. This actually has nothing to do with the GPL - you ARE allowed to ship GPL code, if you provide custom EULA language that says that the GPL prevails over the App Store EULA.


Interesting how differently one can view things.

I'd say emulators are the easiest to defend. First and foremost, they emulate something which existed before. And the fact that there are different hardware and software emulations, several of them completely different to one another internally, shows that the program does not depend on the intimate details of the emulator's execution. Also, I read somewhere that whether something existed before makes a big difference when judging a GPL program. If a GPL program clones previously existing behaviour, how can you say with a straight face that an aggregate is derivative?

(Of course, there can be other things like GUI integration which muddies things up.)


If a GPLed shared library is loaded into a process, everything in it has to be compatibly licensed.

GPL-incompatible applications that dynamically load a GPLed library and use it optionally can probably get away with it.

If you make a proprietary program which can optionally use a GPLed dynamic library, which you don't ship with the program, you're likely untouchable in court, if your attorney argues the point with a tongue more silver than the other guys' attorney.


A different definition could be applied here, which would be fine because it would not be an additional restriction but less restriction than what the GPL would demand.

On the other hand, not having this boundary would mean that a lot of commercial use would be prevented, which would work in favor of those who originally were looking for a non-commercial license.


As long as you don't distribute it, you are OK.


For GPLv2


And also GPLv3. It's only the AGPL that has stipulations for software you don't distribute to others.


Kinda. Not really. But I think this is an important point about the GPL: it hinges on this notion of 'linkage' which is a specific technical implementation which is partially on its way out.

But I don't see why it would be a concern here? You're saying to develop a non-GPL application on top of a GPL'd Genera? I guess so.


Lisps have linkage via symbols. Loading a shared library on Unix via dlopen() is just a frankenstein version of loading a compiled Lisp file.

Linkage is a very abstract concept; it just means name references in this piece here connect with name definitions in that piece over there. The tech may change, but the concept won't easily go away.

Function bindings being established or replaced, and used by code in other files, is linkage.

If you load a proprietary compiled file (.fasl or whatever) into a GPLed Lisp program/image such that either uses symbols in the other, that is a GPL violation.

However, a given Lisp can spell out exceptions to the rule. Like that a proprietary module may be used, provided it only uses certain symbols (in the module -> program direction), and certain registration mechanisms for its code being hooked in (program -> module direction). If it uses GPL symbols then it must be GPLed. Likewise, if the program is hacked to bypass the GPL-free module registration mechanism, so that it calls some proprietary symbols directly, that is also a GPL violation.

Same as Linux kernel .ko modules, basically. If you hack the kernel to call some function in a proprietary .ko, then that's a tainting situation. A .ko calling non-GPL functions, likewise. A proprietary .ko's module_init function being called by the kernel is not a GPL violation, and that module_init calling a non-GPL symbol to register something is also okay.
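
As a small, contrived sketch of that kind of linkage (the file and package names are hypothetical):

    ;; Compiling and loading a Lisp "module" binds its definitions to symbols
    ;; in the running image, much like dlopen() resolving names.
    (compile-file "module.lisp") ; produces module.fasl (name varies by implementation)
    (load "module.fasl")

    ;; Caller and loaded code now share one symbol namespace:
    (funcall (find-symbol "ENTRY-POINT" "MODULE"))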


> it hinges on this notion of 'linkage' which is a specific technical implementation which is partially on its way out.

The GPL does not care about linkage status. The GNU Library license, which I originally wrote, is for cases such as this, and the lispm code could be licensed that way.

I don't understand what you mean by "partially on its way out".


GPL3 kinda cleans this up a bit, doesn't it?


I mean I guess I should have explicitly said "derived distribution for commercial projects" but I kinda thought that was obvious, since nobody is going to use Genera as is; it would require substantial upgrading to make it useful for the 21st century.

Which would be a derived work under GPL and require publication of sources if distributed etc. etc.

GPL makes a lot of sense to me for something like this. Nobody is going to make money off Symbolics' old IP, but if they somehow were, clauses in the GPL3 or some variant of a copyleft license would likely force them to contribute back.


I can take something like Octave (very GPL) and do commercial things with it all day. The GPL would only be relevant if I start distributing Octave itself. (Or, in your suggested world, if I started distributing the hypothetically GPL'd Genera.)


I just don’t know why much of the Lisp ecosystem is closed/commercial, like if you wanna do Common Lisp you have to pay for LispWorks. Could it be one reason for Lisp’s low popularity? If I want to pick up Python now, there’s amazing free tooling for it. If I want to pick up Common Lisp, well …


The era when Lisp had its greatest commercial success (the 1980s) was a time when free (as in either beer or speech) high-quality development tools for any language were rare. This was the golden age of proprietary software; people were expected to pay for operating systems, compilers, editors, and other tools, and there was increasingly no expectation of having access to the source code. The GNU project was started in 1983 by Richard Stallman, who used to work on MIT Lisp projects (many of his colleagues became part of either Lisp Machines, Inc. or Symbolics, which all came from MIT’s work on Lisp machines).

Back to proprietary software, during the AI boom of the 1980s, Lisp machine vendors had success selling high-end workstations to customers willing to pay good money for Lisp environments. This dried up during the subsequent AI winter, though some customers were able to move their Common Lisp solutions to commercial Lisp implementations that worked on workstations or servers running Unix or Windows. But at this point Lisp no longer had the same level of commercial interest, though the legacy and mindshare of Lisp grew through the use of Scheme in CS education (e.g., SICP and the team that wrote PLT Scheme, which was renamed Racket) and the advocacy of Lisp from prominent developers and researchers like Paul Graham, Richard Stallman (Emacs), Eric S. Raymond, Alan Kay (while he’s of course a Smalltalker, he’s spoken fondly of LISP 1.5 and also of The Art of the Metaobject Protocol), and many others. There are also many people who used Symbolics Genera in particular and who speak highly of its development environment, sometimes expressing the sentiment that modern development environments don’t compare to it.

There are FOSS Common Lisp implementations with wide usage. The most notable is Steel Bank Common Lisp (SBCL), and I also hear of plenty of people using Armed Bear Common Lisp (ABCL, which runs on the Java virtual machine) and Embeddable Common Lisp (ECL). There are other FOSS Common Lisp implementations that I know less about.


Thanks, this is interesting. I always like to know more about the history of the tech I use.


Then I highly recommend the book "Hackers: Heroes of the Computer Revolution" [1]. It describes the hacker culture and the hackers behind a lot of things we use to this day, from the early 50s to the mid 80s.

[1] https://en.wikipedia.org/wiki/Hackers:_Heroes_of_the_Compute...


Thanks, I'll check it out.


Symbolics is actually the founding reason for Stallman becoming full Stallman and the whole free software / GPL thing.


I will throw MakerLisp in there, which is a hardware Lisp from Luther Johnson.


Just use SBCL - https://www.sbcl.org/

Pick an editor: https://lispcookbook.github.io/cl-cookbook/editor-support.ht...

Good enough for games (and other commercial offerings) - https://kandria.com/
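
For a sense of the out-of-the-box experience, a tiny first session (assuming SBCL and Quicklisp are already installed; the library choice is arbitrary):

    (ql:quickload :alexandria)  ; fetch and load a library from Quicklisp
    (alexandria:iota 5)         ; => (0 1 2 3 4)

    (defun square (x) (* x x))  ; define, then freely redefine, at the live REPL
    (square 7)                  ; => 49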


>Pick an editor

Well, Emacs, SBCL and SLIME are like bread and butter for obvious reasons.


I’ve recently been enjoying using Alive with VS Code (and Copilot). Everyone suggests Emacs+SLIME but it always felt like too many things to learn at once. Being able to use my usual IDE has made it so much more pleasant. Recommend it to newcomers.

https://github.com/nobody-famous/alive


And you’ll get a pile of UNIX tools that are approximately nothing like Genera in actual use.


Yes, in that you can actually use such tools to rapidly develop and ship software that runs on consumer machines (even Windows). Or did you have a more interesting point?


The request was for an environment similar to Genera, not an optimal environment for deploying to consumer systems. While it may work quite well for the latter, it fails at the former.


I don't see how you read that from behnamoh's comment. The 'request' or rather confusion was that the only way to do Lisp, historically and now, is with proprietary stuff like Genera or more modernly LispWorks, so what's there to do if someone wants to get into it now. On the contrary, I pointed out if anyone wants to get into Lisp now, in 2023, there is also amazing free tooling that's very capable just like many other languages for building and doing the sorts of things people typically want to do in 2023. The only other part of the comment I didn't address was just my agreement that proprietary emphasis did hurt Lisp adoption, though not getting Quicklisp until as late as 2010 hurt it more. (But maybe you could argue that was an extension of proprietary focus.) Things don't seem so gloomy now these last several years at least.



Realistically, "good enough for Google" is probably largely an accident of acquisition. QPX, the low airfare search engine, is not exactly a small piece of software. In c. 2007, it consisted of about 1 million lines of code divided unevenly between Common Lisp and C++, which, by then, had been developed for roughly a decade for the airline industry. Rewriting it would have been a non-trivial task.

Furthermore, Carl de Marcken, chief scientist/co-founder of ITA, had chosen Lisp because it was what he was most familiar with. He had told me in 2007 that if he had to choose a language again, he probably would have chosen Java, a choice Google would likely favor over Lisp.

I say this not to disparage Lisp—I enjoy Lisps, and I use Lisp professionally—but to contextualize this particular use of Lisp at Google.

EDIT: Just saw your reply to a sibling comment explaining that you meant SBCL, not Lisp per se.


See my other comment in this subthread.


ITA software is an acquisition, and I’m sure most of the Lisp development at Google is just done by engineers tweaking their .emacs files.

The problem of parsing data from airlines is fiendishly hard. There are a ton of different rules for how to calculate prices, and all sorts of deals and promotions that may affect a particular route on a particular day. If you have a system which can parse this data, then you wouldn’t want to rewrite it. I’ve read a number of articles about ITA Software’s Lisp code and it’s really interesting.

Lisp may be a critical part of ITA’s success, but “good enough for Google” is probably the wrong take here—Google, I’m sure, purchased a company to add its working product to their portfolio. I doubt that Google’s processes would allow someone to make a new product in Lisp.


I meant SBCL specifically, not Common Lisp in general. They could have been using LispWorks or Allegro CL instead, but they aren't.


Can you share some of the articles you mentioned about ITA Software's Lisp code?


Keep in mind that the information is old.

http://www.paulgraham.com/carl.html

I’ve read other stories, and seen presentations, but the information is getting harder to find.


> if you wanna do Common Lisp you have to pay for LispWorks

Nothing could be further from the truth.


Could be part of the reason, but I think a bigger reason is that it's just not very pleasant to read. Syntax exists for a reason! Nobody really wants to hand-write an AST (or read it).

Imagine programming Python by writing out the AST in JSON. That's what Lisp looks like.

I guess it makes the parser really simple and elegant, but they definitely went way too far towards the implementer on the "make it simple for the implementer vs. the user" spectrum.


I find lisp much nicer to read than most languages.

C/Java etc (which I was a professional in) have too many bits to the syntax - braces and semi-colons and the execution is not necessarily shown by the format.

I also like python for similar reasons.


Is there still some kind of revenue stream from Symbolics Lisp or does the owner just want it to die with him?


In 2021 there was a port to Apple M1 macOS, so it seems like there can still be small revenue streams. Open Portable Genera runs on 64-bit x86 and Arm.


Gary Palter has made it work both on Linux and macOS, both x86-64 and ARM64. He also brought the original Open Genera for DEC Alpha to the same software level.


Yeah, my thoughts precisely. Genera is obsolete. I cannot imagine what possible profit could be made. There may be value in the design that could be of benefit even today, but I cannot understand people who hoard old IP.


My guess is that there are maintenance contracts which can be quite lucrative.


Still? That would surprise me.


Someone's in a pickle if they need support. The older the contract, the more it's probably worth.


It’s a pain to do the work associated, with no personal upside, so it’s probably neglect more than anything.


Please. Dude could say "hey I want to open source this, someone do the work for me and show me where to sign" and people would trip over themselves rushing to make it happen.


Pain? The hardest part is probably finding a five inch floppy drive connected to a machine that can push to GitHub. Even that could be outsourced to some trusted resourceful LISP enthusiast.


There's a lot of preparatory and legal work involved in preparing a proprietary software project for release as an open source project. It's not as simple as just pushing the source tree to GitHub and calling it a day. For example, if portions of the code were licensed from other copyright holders, permission from them is required. There are other cases where the code needs to be examined before release in order to edit or remove materials that are personally identifying, embarrassing, or could cause legal issues if publicly disclosed.


Then just put up a fundraiser. There's enough nerds that would pitch in I'm sure it could get covered and someone hired to do it.

Though I think it's more than likely that once people had free access to it and saw what modern conveniences were missing etc the mystique would be lost and it would be kept more as an archaeological asset than as an ongoing thing people wanted to maintain and use.


> mystique would be lost

People would also discover that the architecture is not simple, but grown over a decade while there was rapid development of the basics people take for granted: standards, languages, networking, operating systems, ... In the mid 80s TCP/IP was an expensive add on for Genera. Later it was made a part of the OS. But development stopped before it could add things like IPv6, Unicode, basic security, HTTPS, and other things.

One could add it, but then one had to learn ZetaLisp (70s/80s), Flavors (early 80s), Symbolics Common Lisp (80s/90s), an object-oriented networking stack architecture from the 80s, ...

You would have a second life in a technology stack somewhere between the past and the future, in a parallel world.


It makes more sense to me for some group to attempt to replicate the IDE experience in a modern Lisp (or similar) flavour.

Still I also think these kinds of "live editing" environments don't translate well into some modern "best practices" around version control, deployment, versioning, etc. Remembering my experience with LambdaMOO & similar and then later Smalltalk/Squeak and recently with doing stuff with Julia's REPL, etc. They're fascinating and fairly productive, but I am not convinced about this technique's ability to scale.


> It makes more sense to me for some group to attempt to replicate the IDE experience in a modern Lisp (or similar) flavour.

It may not have the fancy UI, may not have a fancy user experience, may not be an operating system, may have a more primitive Lisp dialect, may not have a good multi-threading story, ... but it might exist and people put a lot of work into it: GNU Emacs and its extensions.

That's how it is...


I think half the magic I associate with the Symbolics products is that they also made their own hardware, custom built for Lisp. Which is kinda neat and magical, if entirely impractical.

That plus the lovely keyboards, nice case design, etc. etc.


In the early days they had to - all the hardware was non-standard. Where would one get 36 bit memory cards? Where would one get CPUs which had a Lisp-specific instruction set? In those early days the computer (refrigerator size) would sit in a machine room (with enough power) and the user would be in his/her room with only a console (plus maybe a second monitor), keyboard and mouse - connected via a long console cable to the machine in the machine room. That was the experience for the programmer/user. The machine itself could have a lot of peripherals: network, tape drive, memory boards, color boards, frame grabber, disk drives, cpu accelerator, ... the programmer would see the driver code.

At some point in time they produced cards for the SUN VMEBus and for the Mac Nubus. Then the only thing left was the keyboard.

So, I agree, the Lisp Machine concept was a combination of Hard- and Software. The emulators of today only provide some of the software parts...


Yeah, I have here the book published by the folks who did the Linn Rekursiv computer, which is sorta the same philosophy as the Lisp Machine but for a highly object-oriented system. Tagged memory, hardware-supported garbage collection, object-structured memory, etc. I believe it was also positioned as a VMEbus card, etc.

My understanding is Moore's law just ended up making these attempts uneconomical. By the time HW eng and manufacturing is done on the custom components, "orthodox" general MPU capabilities end up leapfrogging it and a well engineered software VM can outperform.

Maybe we could start to see this change again, who knows.

Maybe unikernels running highly tuned VMs on the hypervisor could be this generation's equivalent. Full control over the MMU, tagged pointers all the way down... hmmmmm


Symbolics could have built computers with custom hardware. At some point in time the 64-bit RISC CPUs (MIPS, Alpha, SPARC, POWER, x86-64, ...) were all powerful enough to run Lisp. Genera got ported to the DEC Alpha. They would also have been powerful enough to run Lisp as an OS. But that would also mean writing device drivers, interfaces to hardware and a lot of other stuff. It would have cost substantial amounts of money to port Genera to the metal. Who would have bought that? The idea of a high-level language running as an OS on the metal did not have any future. I don't think there are good examples where this was successful on the market.


I look at the Interlisp effort, which is being reworked...

So would it really be difficult to set up a public/non-profit startup/company to bring all this stuff back up, and coordinate all the people involved? Maybe even new hardware could be made again...

Probably the real problems arise from different opinions between owners and former employees about the future and possibilities of the technology?

But this story every time makes me think of the billions of lines of code left in the dark (or on reels of 9-inch tape), representing so many man-years of work, developed and debugged, only to be forgotten and sometimes reinvented: sometimes better (that's progress, of course), and many times only worse, reinventing the wheel every time. Isn't even this a form of waste/pollution?


Just a virtual machine that runs a Lisp OS. Might as well just run Clojure on the JVM.


What makes Symbolics Genera interesting is not the fact it’s a Common Lisp implementation; there’s SBCL, ABCL, and plenty of others. It’s the development environment inside Genera that many users fondly remember and make it stand out even today. The excitement about the broader availability of Genera centers around its development environment.


Even the device drivers were written in Lisp.


"Non-commercial"?! How much money are they making from it now?


what is there that was not supplied on the CD-ROMs and tapes they sold?



