One of the things that Allen only barely hints at, and I'm not sure Gilad's piece really touches on, was the shifting economics of software development. In 1995/96, when Java gut-punched the wind out of Smalltalk and the other OO solutions (CLOS, Eiffel, Beta, etc.), Sun spent a butt load on marketing and gave the stuff away for free. In other industries it would be called "dumping." It wasn't just Sun giving Java away. It was also the beginning of the rise of Linux, also free. I remember a colleague pointing out, "There is no money in tools anymore. There are a few niche islands left, but they'll fail too."
I think neither of these articles covers this counter-trend adequately. Whilst Smalltalk had come out of a freeish academic research zone, with its small band of unpatroned developers trying to eke out a living, the industry was clamoring for free and open source.
What I wouldn't give to have closures that were as simple and robust and approachable as Smalltalk in any of the languages I work in today (Swift, Kotlin, Python, Dart, C).
(Did 20 years of Smalltalk, including 6 years at Cincom)
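For contrast, a Smalltalk block like `[:x | x * x]` is a first-class closure with essentially no ceremony. A rough sketch of the nearest mainstream equivalent (in Java here, not one of the languages listed above) shows both the similarity and the extra restrictions modern lambdas tend to carry:

```java
import java.util.List;
import java.util.stream.Collectors;

public class ClosureDemo {
    public static void main(String[] args) {
        // Smalltalk: #(1 2 3) collect: [:x | x * x]
        // Java lambdas come close, but may only capture
        // effectively-final locals, while Smalltalk blocks can
        // freely read and write variables in the enclosing scope.
        List<Integer> squares = List.of(1, 2, 3).stream()
            .map(x -> x * x)
            .collect(Collectors.toList());
        System.out.println(squares); // prints [1, 4, 9]
    }
}
```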
Not only did Sun lose a lot of money on Java, they got other people (I remember Borland and IBM, but there were probably others) to throw away money for a few critical years. Why the new CEO of the merged ParcPlace/Digitalk thought it would be a good idea to abandon the lucrative Smalltalk niche to focus his company on this bloodbath I will probably never know.
ParcPlace hired a CEO who knew how to solve problems by buying and selling companies. It worked for him once or twice (he previously sold Ashton Tate to Borland).
In context - compilers and operating systems used to be expensive. Really, really expensive. Hundreds/thousands of dollars per seat per language expensive, in 80s/90s money.
But what really killed that model wasn't the cost - it was the lack of lock-in.
For comparison, Oracle is still really, really expensive, but its market lock-in means that it won't be overtaken by Postgres for some time yet.
Languages are more disposable. There's significant pain involved in shifting to a different code base, but it's not nearly as painful as trying to extract your data from a proprietary DB that considers lock-in a feature.
I do embedded on the metal, with no OS or an RTOS, in C. Embedded on Linux (think Raspberry Pi form factor) with Python and other Linux stuff. And Swift/Kotlin/Dart for mobile apps. A different sort of full stack, where we're developing the gadgets and the apps that interface with them through Bluetooth.
Interesting! Once you're at Swift and Kotlin, is there any reason to pick up Dart?
As a web dev, I've done Ruby, JavaScript, and Common Lisp; and have dabbled in Dart, Elixir, and now SwiftUI. If I wanted to start building mobile apps, I'm wondering if that would suffice or if I'd have to pick up Kotlin as well.
From inside ParcPlace, it looked like we were starting to gain traction. There'd been a number of spectacular C++ project failures, and people seemed to be willing to try this weird, new (to them) thing that let them get prototypes up quickly and then grow them.
Then Java arrived and sucked all of the air and mindshare out of the ecosystem.
Java lucked into ideal circumstances, and Sun Marketing played their hand very well. The Web was just taking off, VCs were looking for something new and hot, publishers were looking for the next big thing, and programmers were looking for something that wasn't C++. Java looked familiar enough, cost nothing, had Garbage Collection and Interfaces (putting Gang of Four Patterns within reach of many), and let programmers keep their favorite tools and editors instead of adopting Smalltalk's strange model of doing everything in a (mostly) sealed environment.
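Interfaces made swappable behavior cheap, which is much of what the GoF patterns ask for. A minimal Strategy sketch (illustrative names, written with modern lambda syntax for brevity; the 1996-era version would have used anonymous classes):

```java
// Strategy pattern: an interface lets callers swap behavior at
// runtime, which is part of why many GoF patterns felt natural
// in Java. All names here are hypothetical.
interface Discount {
    double apply(double price);
}

public class StrategyDemo {
    public static void main(String[] args) {
        Discount none = p -> p;             // identity strategy
        Discount tenPercent = p -> p * 0.9; // 10% off
        System.out.println(none.apply(100.0));       // prints 100.0
        System.out.println(tenPercent.apply(100.0)); // prints 90.0
    }
}
```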
> Then Java arrived and sucked all of the air and mindshare out of the ecosystem. Java lucked into ideal circumstances,
Not only that but Java was free and Smalltalk was $50k/seat. There was a chicken-and-egg problem, no one could learn Smalltalk unless their organisation had already committed to it. It sounds weird to say it now but Java won by grassroots adoption.
Also, you could share Java code, or entire libraries or applications, by posting some files on a website. You already had the files, and someone downloading them could use them straight away. To share Smalltalk code on a website, you would go through the file out / file in process, which was considerably clunkier.
Also, I don't think Smalltalks of the time had namespaces, which also makes sharing code harder to get right. Java's reverse-DNS package naming is a surprisingly important innovation.
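The point of the convention is that two independently developed classes with the same short name can coexist. A quick sketch using two `List` classes from the JDK itself (the `java.*` packages are reserved rather than reverse-DNS, but the namespacing effect is the same as with `com.example.*` names):

```java
import java.util.ArrayList;

// Packages namespace classes, so two unrelated classes both
// named "List" can coexist and be told apart by their fully
// qualified names.
public class PackageDemo {
    public static void main(String[] args) {
        java.util.List<String> names = new ArrayList<>();
        names.add("util");
        // A completely different List, from the AWT toolkit:
        System.out.println(java.awt.List.class.getName()); // prints java.awt.List
        System.out.println(names);                         // prints [util]
    }
}
```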
>There'd been a number of spectacular C++ project failures,
Do you have any examples in mind? Big corporate software project failures are always super interesting imo. Especially those that happened during such pivotal times as the 1990s.
C++ programming in the early 90s was, in retrospect, not all that much fun. Templates were very underpowered and not widely implemented, and the standard library was minimal. Everybody defined their own string and vector classes, or used products such as Rogue Wave Tools.h++, which at the time still offered macro-based container classes.
There was a lot of fumbling exploration into what kind of OO design would prove workable long term, and in what existed of the C++ standard library at the time, some classes, like iostream with its multiple inheritance and virtual base classes, may not have set particularly sound examples to emulate.
The arrival of the GoF design pattern book and of the STL were, in my opinion, massive game changers.
From Aaron Hillegass's Cocoa® Programming for Mac® OS X, Third Edition:
Once upon a time, there was a company called Taligent, which was created by IBM and Apple to develop a set of tools and libraries like Cocoa. About the time Taligent reached the peak of its mindshare, I met one of its engineers at a trade show. I asked him to create a simple application for me: A window would appear with a button, and when the button was clicked, the words “Hello, World!” would appear in a text field. The engineer created a project and started subclassing madly: subclassing the window and the button and the event handler. Then he started generating code: dozens of lines to get the button and the text field onto the window. After 45 minutes, I had to leave. The app still did not work. That day, I knew that the company was doomed. A couple of years later, Taligent quietly closed its doors forever.
It is asserted a bit later that C++ was the problem.
Not the OP, and I don't have any specific examples, but this was at the height of Waterfall (not in name, but in principle), where it was generally still taught as the way. Agile wasn't even a glimmer yet, but Extreme Programming was what Agile is now - interesting ideas, but mostly horribly executed.
Plus OO was at its height as well, and C++ was seen as incapable of handling OO sophistication, but Smalltalk could! (Spoiler: it couldn't either.) Then Java came and took over, and while it is a great language, it turns out massive, sophisticated OO approaches just breed their own complexity, and so it picked up the "bad" moniker (it isn't). Now we find ourselves in other waters with a much better understanding of everything, but still with a long way to go!
The ones I'm thinking of were in "boring" places like large power companies and back-end systems for telecom. The stories we heard, which are hard to grok now but make sense if you think back to mid-90s hardware, were of multi-day build and test runs causing projects to grind to near halts. Fertile, but painful, soil for the idea of "Refactoring" to take root in.
It's one thing when a tech company with lots of technical talent like Google decides to build its own (core) tools in a language like C++. It's another thing entirely when large numbers of IT departments, who have vastly more business than technical knowledge, try to build sprawling business applications in it. That's where most of the failures were occurring and why they were looking at Smalltalk, Java etc.
I took my first college-level programming class in 1997, learning Smalltalk.
1. IBM VisualAge didn't run in Linux, so I had to buy a second hard drive and a copy of Windows 95.
2. I wanted to work in a terminal, but Smalltalk only let you work in an IDE. I couldn't figure it out. I couldn't believe that anyone would want to mouse around to poke through the available classes and methods.
3. The text was the IBM manual. It was terrible.
4. I failed the class twice, so I changed majors.
I didn't come back to coding until ten years later.
I'm a contributor to Kubernetes, the Go Standard Library, and I'm a full-time Go developer. Things turned out OK.
Smalltalk, OO, and IDE-based development set my career back by ten years. Don't believe the hype.
You didn't like Smalltalk. It didn't work for the way your brain works, so much so that you failed a class twice. I'm not quite sure how this adds to the discussion. Clearly IDEs and having to "mouse-around through the available classes and methods" works for many people since this is how documentation in Xcode/Android Studio/Visual Studio works. If the documentation for one Smalltalk implementation was bad, that doesn't mean that others were bad. I don't quite see your point other than that Smalltalk (and one specific implementation of it at that) didn't work for you, a sample size of 1.
Absolutely. It's great hearing tales of people who used and loved Smalltalk.
I'm pushing back against anyone repeating the received wisdom "zomg genius! it was too good to live in this fallen world" without having ever used it.
I don't pretend to be a sample size greater than one, but I think it's worth noting that, in this batch of comments, if you only count the people who used Smalltalk, and not the people who saw an old Alan Kay interview on Youtube, you're likely to be somewhere south of n=20.
Not only did I see plenty of interviews with Alan Kay, I learned OOP with Turbo Pascal 5.5 and improved my OOP skills with Turbo Vision 6.0 in TP 6 and Borland C++.
Later moved into Turbo Pascal for Windows and Turbo C++ for Windows, using Object Windows Library, followed by early versions of C++ Builder and Delphi, using Visual Component Library.
Used Smalltalk/V during university, was introduced to the alternative OOP programming models of SELF, SWI Prolog, Native Oberon (including Oberon-2 / Component Pascal) and plenty of other stuff, including being a Java early adopter and pushing C++ with IDE tooling on places still stuck in C.
So yeah, Smalltalk was a genius environment and everyone has a different learning process.
For example I try to stay away from UNIX cli as much as I can, yet I learned UNIX via Xenix in 1993 and used almost every major UNIX variant since then.
> if you only count the people who used Smalltalk, and not the people who saw an old Alan Kay interview on Youtube, you're likely to be somewhere south of n=20.
To support your view:
A competitor "stole" a large maintenance project from us.
I suggested to someone at my company that they would have trouble recruiting as I assumed we were the biggest smalltalk community in Norway.
Their answer was that we were probably the only real smalltalk community.
They might be wrong, but I haven't seen anything to suggest that yet.
And that competitor kept calling all of my colleagues who had - at any point - worked anywhere near Smalltalk :-)
Even though we do the vast majority of our development in other languages.
I sent you a mail to the address in your profile :-)
For anyone else, I'm not in a position to hire anyone but if you know Norwegian, Danish or Swedish reasonably well and want an interesting job with nice people (well, if you like people from the Nordic countries) then feel free to ask me.
We mostly work on larger systems; some of them were born a decade before the youngest people who maintain them, others are brand new today but will hopefully be equally valuable and live a long life as well :-)
I use Smalltalk nearly every day, it's my prototyping / experimentation substrate. I've also done several interactive presentations and taught classes using Squeak as the medium.
I'm convinced that Smalltalk was so far ahead of its time, that it will take several decades [if it happens at all] before "programmers" get equivalent functionality from mainstream development platforms. Which is fine by me, since I am an EE by day and view programming as an artistic process thus do it for fun, not money. Which is another reason why Smalltalk failed in the market. It was made by geniuses and was aimed at art & research, not a bunch of cogs stuck in cubicles.
I took my first college-level programming class a year earlier than you, using Smalltalk, and I hated the language. It just didn’t make any sense to me. But in high school I had programmed in C, and earlier than that in HyperTalk and earlier still in Basic. I was too immature at the time to see the similarities between HyperTalk and Smalltalk, and too hung up on what I thought of as C’s close-to-the-machine nature to appreciate Smalltalk’s high-level power.
I think that my previous programming experience — which really wasn’t very extensive! — made it more difficult for me to understand what was good and desirable about Smalltalk. It really was Blub for me.
But I don’t blame Smalltalk for that; I blame me. I was 18, immature, unable to imagine a world outside my head, and with strikingly little actual experience. I’d like to blame all that on my age, but of course there are folks who were better on these metrics at the same age.
(I stuck with my major, though in large part because later classes were all in C, and maybe one was in Lisp and another in Java — it has been so long now that I cannot remember. Glad I did!)
I get the feeling it has a lot to do with how much your teacher loves the language. I was introduced to Common Lisp at university, didn't get it at all. At least Prolog could do cool tricks. Took me several years of Graham yabbering about it to make the effort, and to my surprise it was all that and more. I realize now that my teacher didn't have enough experience to really understand what was great about Lisp. I have since learned Smalltalk as well, and admire many aspects of it; but with the wrong teacher it would have failed just as epically.
This is how learning OCaml and having access to Native Oberon and Smalltalk/V did to me.
It opened my mind to an alternative universe of safe systems programing and computing models.
This kind of stuff is a bit like explaining monads, I guess: one only groks it by going through the experience oneself, and it is always hard to convince others with words alone.
I had. I wrote some Applesoft BASIC as a kid, and was handy enough with LOGO that they brought in teachers from around the city to have me teach them at age 12. My first and only lisp.
The other side of this story is that I always encourage people to keep their modern-day children away from BASIC, as it only makes learning any other language harder.
My bad Smalltalk experience is hard to imagine today: Hey kid, here's a language that can only be written in one IDE on one operating system. Don't worry that the IDE doesn't run on your Linux computer, because the programs that you write won't run on your computer, either. The artifacts you generate won't be files, they'll be images. We're not going to actually cover that part, so when you're ready to turn this in you'll need to copy-and-paste the working code into your email client. You'll know it's ready when you press "F5" and nothing bad happens.
About 2: even though I enjoy text, I can also accept that Smalltalk placed itself in the interactive graphical realm first, and thus would put you into a visual clickodrome of programming constructs.
> Smalltalk and the ideas it was based on revolutionized personal computing. There is raw, unbridled genius in it and that should be widely recognized.
This is not historically correct. What we understand today by "personal computer" was described by Butler Lampson in his famous paper in 1972 and was first realized in the famous Alto computer. The first desktop GUI and the first WYSIWYG applications for the Alto were implemented in BCPL (not Smalltalk). The first version of Smalltalk that most closely corresponds to today's understanding of Smalltalk did not appear until 1976, and two other important features were not introduced until 1980. So if you are looking for geniuses, you will need to widen your search range a little.
Who's to disagree with your aesthetic judgement? You're not wrong to like it, if you like it.
I found it sufficiently obtuse that I walked away from my life's dream. The nostalgia circuit touts it as an amazing educational environment, in my experience it was not.
I was self-taught in Java, C and Foxpro, and then joined a company of Smalltalkers who had to transition to Java. They used Visual Age for Smalltalk and Visual Age for Java.
For various reasons, for a few months at work, I had a really low end computer that couldn't run Visual Age for Java. (I couldn't afford a computer that could run Visual Age for Java/Smalltalk).
However, I got to learn from some of the best Smalltalkers in the world. Thanks to the wonderful foundation they gave me in OOP, I have since been able to grasp domains very well.
In a different set of circumstances, I got to learn how agile and Test Driven Development work very well even in larger teams. I've worked on teams with 250 odd people from three different companies and two different timezones and cultures. Due to my current work, I get to see how horrible a person's experience can be with agile and with Test Driven Development.
Today, I had an illuminating 90-minute session on Complex Numbers. This was my second class online. After 30-odd years of being furious at Mathematics, I have now come to embrace it.
I have concluded that the right mentorship and learning materials and experience can go a long way in helping make a topic exciting and interesting.
Your hurt is evident. From what you have shared, it seems it was the absence of good tutorials that was the roadblock, not Smalltalk or Visual Age for Smalltalk itself.
> I couldn't believe that anyone would want to mouse-around to poke through the available classes and methods.
So I don't know a ton about Smalltalk outside of what I've read and poking around in Squeak for a few minutes, but is this how Smalltalk apps were supposed to be distributed and run? Inside a sandboxed system like that?
On certain systems, there are tools that make them feel more native. E.g., on Dolphin Smalltalk, which ran on Microsoft Windows, you could run a utility to strip off the unnecessary parts of the image (e.g., the browsers/editors and unused classes), and it got bundled into an .exe that included a VM. A user wouldn't know it was running Smalltalk.
It turned out to be an excellent tool for writing Windows apps during its time. The live environment worked extremely well for working with and figuring out ActiveX and COM objects. Even better than Visual Studio at that time.
I worked for 3 years using VisualAge and GemStone/S.
As Allen says in the article, Squeak was a special case: based on Smalltalk-80, it was oriented toward providing an environment to support experimentation with e-toys (a platform for learning).
Other Smalltalks were different. VisualAge supported native UIs, and you had tools to produce the final product. The usual process was to create a reduced image based on modules loaded from the version control system (Envy, in the case of VisualAge), then package an EXE that used your image.
At some point, one actually does poke around to understand what is going on and how a system works. alrs states that they contribute to K8s. That does require poking around to learn the plumbing of the system.
You could very well write Smalltalk apps without poking around the underlying classes. The Visual Age for Smalltalk environment just made it very easy to do so. You can do this with any language IDE today where you attach the source code of libraries to the project workspace.
I have been using vim for about 2 decades so I can empathise with not having the familiar key bindings and speed of moving around.
A good Smalltalk environment does let a developer navigate the image [1]. On many systems that means it allows you to poke at everything about that Smalltalk system, not just your own code, but 3rd party libraries, as well as the Smalltalk system itself, including what we normally consider the text editor or IDE.
It's true that you lose access to vim [2], but you trade it for the image, as well as wonderful code browsing capability: looking up a type definition, the senders and implementors of a message, stepping into any code, live debugging and editing during debugging, etc.
So Smalltalk is a tradeoff. If one can give up vi and work with the paradigm, it is really really great. [3]
[1] The Smalltalk image is a wonderful and beautiful concept. There is one glaring downside to it for most Smalltalk systems. It creates a divide between the image and the rest of the world.
[2] Since you can redefine how any code in the image works, it's possible to define a few shortcuts so you get a very limited set of vim functionality, but it's not feasible to reproduce the vi environment (even if we wanted to invest the effort) because the paradigms are different.
[3] One can say that's generally true for everything. "Once you can accept X, you get Y", but the argument is that Smalltalk is really great once a developer does that.
> … how someone would want to poke around with a mouse…
Back in the day, not having a TKL keyboard could become a pain.
Back in the day, poking around could be an effective way to learn not just what batteries were included, but whether, with a tweak and a nudge, they might solve the problem of the day.
Poking a running program, not just a source code file.
Speaking as a Smalltalk programmer active in the 1990s: we pretty much dropped everything when Java showed up. It had the things we loved in Smalltalk (portable virtual machine, garbage collection, self-documenting APIs, collection classes, etc.) with the familiar syntax of C++, plus concurrency.
In a way Smalltalk prepared the way for Java in that it showed how great VMs and garbage collection were.
It’s funny because I’ve only started using it recently and what really captivates me is the live environment. The fact that you can dig in and evaluate/change everything in the same environment you are running it provides an amazing feedback loop. To me it really seems like the “killer feature” of the language but I can’t think of any modern language that has adopted it.
Smalltalk was inspirational, but the environment was a double-edged sword. In ParcPlace Smalltalk even things like stack frames were editable classes. One of the first mistakes I made was trying to edit the stack frame class in a running VM. It immediately corrupted the VM.
BTW another brilliant innovation of Smalltalk was incremental compilation. Edit a method, save, and it was instantly part of the runtime. It must have existed in other systems but Smalltalk was the first language where I encountered it.
> BTW another brilliant innovation of Smalltalk was incremental compilation.
Actually, Smalltalk compilation to bytecodes isn't incremental, from a certain point of view. The compiler finishes completely with bytecodes, usually instantly from the POV of the programmer. The compilation is complete. However, the binding is late. (Usually termed "late binding." You see, usually only one method has to be compiled at any given time.)
If the VM is a JIT VM, then there's another compilation step to machine code. Arguably, one could term this "incremental compilation" as well. It fits the colloquial definition of "incremental."
Another weird way to describe Smalltalk is to argue that it's a strongly typed language. Yes, that's right: it's just that the only type is Object. The differing behaviors of instances of different classes are just runtime magic. (Late binding again.)
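The "late binding" being described is roughly what Java calls virtual dispatch, though Smalltalk applies it to every message send, with no static type narrowing the lookup. A minimal sketch in Java terms (illustrative names):

```java
// Late binding: which method body runs is decided at runtime
// from the receiver's actual class, not from the static type
// the compiler saw at the call site.
public class LateBindingDemo {
    static class Animal {
        String speak() { return "..."; }
    }
    static class Dog extends Animal {
        String speak() { return "woof"; }
    }

    public static void main(String[] args) {
        Animal a = new Dog();          // static type Animal, runtime class Dog
        System.out.println(a.speak()); // prints woof
    }
}
```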
The key feature I like here is that code can be changed and recompiled while an application is running.
When an error occurs, the application does not crash but freezes, allowing me to fix the error and resume.
To the point that when the application is a website and an error occurs while a browser is accessing it, the loading of the page just stalls until the error is fixed, at which point it resumes without any user intervention from the browser side.
Heh...Symbolics Genera environment (Lisp) had the same sort of footgun. You could edit pretty much everything down to bare metal on the fly, and the closer you got to the bottom, the more likely you were to touch something that should not be touched by mortals.
Same: the first thing i ever did in a Smalltalk environment - after grinding through swapping floppies in and out of my Mac Plus for twenty minutes - was define an empty LinkedList class, or rather, redefine the LinkedList class that the Smalltalk kernel used to schedule threads to be empty.
They did and do support version control etc. E.g. even Smalltalk back in the day kept a log of every single change of code in what's called a "changes" file.
"Corrupting the VM" just means that you were editing the stack frame class in the live system running on the VM and as soon as you hit "save", the VM crashes. So you simply run it again. And if needed, replay the changes log up to before the crashing change.
This only takes seconds, and is part of the reason why Smalltalk is so productive.
I only did a small amount of squeak programming many years after smalltalk’s heyday but I found the live interface incredibly productive. It was lisp’s image based systems with an incredibly useful gui. I created a small audio app with it very quickly.
I wish more languages would incorporate it. The closest thing I know of is inspecting DOM elements through javascript’s debugger and using the developer console to run operations on them. But that only lasts until you press refresh.
With smalltalk, seeing an object and then being able to edit the object and then making it interact with other objects made OOP come alive, rather than Java’s inheritance soup with dubiously modelled classes that eventually made developers want to throw away OOP.
I can't understand it. People say that the ST ecosystem is much superior to today's languages, so why did people at that time switch to the initial version of Java, which was so bad compared to today's? After the shift, did you miss anything?
I actually went in the opposite direction: I started with C++ and Java, until I had the opportunity to work with a very interesting team using Smalltalk (this was in 2004, so St was not popular anymore, but I liked the team and it was an opportunity to learn).
The article is very good at describing what failed, from business to technology.
Why did people jump from St to Java? Besides the licenses and the OS movement that took many St companies off guard, there were also technical issues:
- Java was better for server-side work. It had mature frameworks for HTTP/XML/database access. Big companies allocated tons of resources to creating Java frameworks and tools... the situation for St was different; you were on your own. For example, I had to debug and fix the WebServices framework available in VisualAge... because I didn't have another choice.
- The JVM was superior in many areas: native threads, different GC algorithms, JIT, and many tools for monitoring and troubleshooting. Most of the St VMs only had green threads, which means that any blocking native libraries (like DB drivers) were problematic.
- As the article says, enterprises moved from desk apps to the web. And web frameworks for St arrived late.
- The classic St-80 was not designed with the idea of packages in mind. Loading a package is basically executing code that modifies your image. It’s powerful, but problematic because it can introduce unexpected bugs. Strongtalk tried to fix that, but it was a research project.
- To the package isolation problem, add the fragmentation between St implementations... which didn't help in creating an open source ecosystem of shared libs like Java's.
What I miss?
The environment!
In St you have absolute control of the environment. You can inspect everything (including your editing tools) and evaluate code everywhere without having to stop the world. Today you have editors or IDEs with plugin APIs, but it's not the same. A comparable experience is modifying things live in the web browser, but it lacks the inspection/authoring tools of St.
I worked on a project at Sybase that attempted to rewrite the Sybase SQL Server using VM-based technologies.
We looked at Smalltalk early on but found a number of problems. They included slow execution speed (there was no JIT), lack of static typing (hard for maintenance), and difficulty to strip the VM environment to create an efficient runtime. Finally it had very limited ability to manage storage and memory directly. Without these it's difficult to build fast database systems.
Sybase tried to acquire a Smalltalk vendor in order to fix the VM problems but could not reach a deal so we ended up moving to other technologies. When Java came along we felt it was the language we had been looking for even with the early performance problems.
I would guess pretty much all the DBMS vendors in the early 1990s had people looking at VMs and were mentally prepared for Java when it arrived.
In addition to other advantages DBMS designers were looking for better ways to compile queries into executable representations (aka bytecode). That path still exists but I think the tendency now is to harness tools like LLVM and do JIT compilation.
No, the project got cancelled before Java showed up. Sybase pretty much went down the tubes after that as a major DBMS vendor. Not because our project failed, but because Sybase had a dud release in 1993 named System 10 that had serious quality problems and tanked sales.
> 1993: Sybase and Microsoft dissolve their partnership. Microsoft receives a copy of the SQL Server code base. (...) Sybase SQL Server version 4.2 and Microsoft SQL Server are identical. (...) From this point the products diverge as Microsoft includes more Windows features whilst Sybase adds Enterprise features (performance and scaling).
> June 1993: Sybase announces its latest generation of software, named the System 10 product family; (...)
Correct. MS SQL Server is based on the Sybase 4.2 release. System 10 was 3 releases later (4.8 & 4.9 added MP and internationalization support respectively).
As I recall, there were a few "ambitious" projects on the UI side, application builders or frameworks or something that were failing around then. I think there was a real loss of management focus on actual products in favor of chasing the next big thing. We also forgot to ship 4.8 (MP) for months as everyone jumped to 5.0 (aka System 10).
The free part certainly helped adoption over time, but at the time Java first arrived everyone was conditioned to pay for IDEs and most compilers for that matter. Java took off because of the reasons I cited up-thread rather than cost. It was initially marketed as a language for web browsers. Most of us just ignored that use case and put it to work in enterprise apps.
I wonder if there are surveys to be found about IDE usage from those times... IDEs were common in some circles in the early 90s but less so in others. Java wasn't a very high-level language even in those days - we had Perl, Tcl, Python, PHP, etc., used for web and non-web things. And on the low-level side we had C. IDEs were less used with those languages.
Ironically, Java was first and foremost influenced by Simula 67 and C++, not by Smalltalk. The first Java version was essentially Simula 67 with a C-like syntax. The influence of Smalltalk is generally overestimated.
As other people here have said, Java was a death knell for large Smalltalk adoption.
I had some early experience. A Xerox Special Information Systems salesperson gave me a one-month Smalltalk license for my 1108 Lisp Machine. I really enjoyed it but I had already gone down the Lisp rabbit hole. Years later, a friend at ParcPlace gave me a license but I got sucked into the Java world soon after (Sun had a link for a year on their main Java web page to a blog article I wrote on Java; for about 10 years I was the number one search result for 'java consultant').
I am back to using Common Lisp almost exclusively for my own projects and research but I do enjoy Pharo Smalltalk, and can't help but wonder if a free Pharo had been available in the mid to late 1990s how that might have changed things. Probably not much because Java does fit a good enterprise niche, but with the high license costs of Smalltalk, that war was lost.
Since Allen was there and I was not he is in a far better position to know. But my impression was that the Smalltalk group had been reasonably happy with their 8086 based NoteTaker computer and expected their partners (HP, Tektronix, Apple and DEC) to use commercial processors instead of designing custom ones. Even if they were far slower than the Xerox Dorado.
About Squeak in a web browser, it was a completely different project than the current SqueakJS. Alan was frustrated that the IT people at schools wouldn't allow Squeak to be installed on their machines. So the native code VM was bundled into a plugin for the then popular web browsers and it became as easy to use Squeak as Flash. Except that the IT people allowed Flash but not Squeak so it ended up not helping. But in terms of performance it was better than Flash, Java (not Hotspot) or Javascript.
Yes, my recollection of the Squeak browser plugin was that the performance was indistinguishable from running regular Squeak, several orders of magnitude better than SqueakJS. And the startup time was vastly better than any Java plugin code.
However, if you were on dial-up, and your browser didn't already have a cached copy of the smalltalk image, then you were in for a quite a wait. (Though no doubt this could have been addressed - there were people arguing at the time for the creation of a much smaller Squeak image, but a combination of the lack of a clear goal to motivate this work, and the fact that for most existing users it was a non-issue, meant that nothing came of this within the timeframe where it might have made a difference.)
But it may be that this is all moot, because it could be argued that any browser-plugin deployment model would ultimately fail.
I used the Squeak browser plugin with a class of university students in the late 90s - it worked really well. Native speed, much more interactive GUI than the browsers at that time, and I could update their essentially desktop app between classes and not have to get it reinstalled by IT.
I did have to do some persuading to get IT to install the plugin in the first place though.
It was doomed tech in hindsight though, together with all the other plugins.
Dan Ingalls talks about Notetaker/Smalltalk-78 in his forthcoming HOPL 4 paper which should be available within the next two weeks.
There were only a few Notetaker machines built and apparently they were quite slow and not really very useful. But the Notetaker image and VM was also ported to the Dorado. In 1979-80 that was the primary version that was being used within the LRG and was used as the starting point for creating Smalltalk-80.
In my article is a link to the invitation letter to Tektronix to participate in the Smalltalk-80 dissemination process. In it they said: "We estimate about one work year for a very expert systems programmer to implement the microcode for the virtual machine and implement the i/o primitives." In the following discussions and visits they only showed us and talked about the microcoded Dorado and Dolphin implementations. They repeatedly said words to the effect of "We want you to design a computer to run Smalltalk". LRG was quite secretive about some things and I don't think I had ever heard of NoteTaker until 1993 when I read Alan's HOPL-2 paper.
One interesting thing I recently learned from Dan was that NoteTaker used a linear stack with overlapping activation records. This was dropped for the Dorado implementation and Smalltalk-80. That was unfortunate as the overhead of heap allocated stack frames was one of the major performance bottlenecks when implementing Smalltalk-80 on conventional processors. It took those of us doing such implementations a couple years to develop techniques for "cheating without getting caught" while using hidden linear stacks. In retrospect, the poor performance of Smalltalk-80's heap allocated activation contexts in combination with reference counting significantly delayed the viability of microprocessor based Smalltalk-80 implementations.
Allen, thanks for confirming the microcode thing. It certainly wasn't the impression I had gotten from the green book, but history does tend to get cleaned up as it is retold.
To be fair to the PARC people, other people had done Alto clones:
I myself got into a graduate computer architecture course even though I was just an undergraduate so I could design a TTL microcoded Smalltalk computer. I was extremely disappointed to learn this was illegal and seriously considered moving away from Brazil at that time (1983, I think).
Yep, I was surprised by the linear stack when I implemented the NoteTaker JS VM with Dan. In the Lively interface, you can stop the running VM and enable the two little checkboxes on the top right to see the whole stack, rather than just the current frame.
Re "the NoteTaker image and VM was also ported to the Dorado": both the NoteTaker image (a.k.a. Smalltalk-78) and the Dorado image were initially generated from Smalltalk-76 running on the Alto. In the Notetaker image there still are branches depending on what system it's running on (see e.g. UserView>>buttons).
Smalltalk-80 was also built on Smalltalk-76 by "backporting" some of the more interesting changes from the Notetaker version, plus adding much more.
Didn't Digitalk Methods and Smalltalk/V also use a linear stack? Self did as well, and to make that simpler they made using a block from a method that had already returned an error (an optimization that some have regretted).
About the NoteTaker being too slow to be usable: having used Squeak on 386 and 486 machines, I am hardly shocked. On the other hand, people would be buying the Osborne 1 a few years after the NoteTaker with just a fraction of the speed (but not trying to run Smalltalk, of course).
By the way, quite a bit of information about the NoteTaker on the web is wrong. So I was really glad when Bitsavers got hold of all the original material (mostly memos):
> the Smalltalk value proposition was: Pay a lot of money to be locked in to slow software that exposes your IP, looks weird on screen and cannot interact well with anything else; it is much easier to maintain and develop though!
Sounds like another framework we’ve got today, except now at least it’s free.
Discussions about Smalltalk almost always remind me of Self[0] from Sun. It is an extremely simple prototype-based programming language. An object can have many "parents", and defines properties on itself to expose an interface. It's really fun to mess around with. The Morphic UI is kind of archaic though.
I've never used Self personally, but I remember reading that JavaScript's OO provisions are modeled after it rather than, say, C++ or Java. Does that sound right to you? I always thought Javascript's class provisions were a bit counterintuitive myself.
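That is my understanding too: JavaScript objects delegate property lookups to a parent object, Self-style, and the later `class` syntax is sugar over that mechanism. A minimal sketch of the delegation (the object names here are mine, just for illustration):

```javascript
// A "parent" object holding shared state and behavior,
// roughly analogous to a Self parent slot.
const point = {
  x: 0,
  y: 0,
  toString() { return `(${this.x}, ${this.y})`; }
};

// Object.create makes a new object whose prototype
// (delegation parent) is `point` -- no class involved.
const p = Object.create(point);
p.x = 3;                    // shadows the parent's x

console.log(p.y);           // found on the parent via delegation
console.log(String(p));     // toString also found via delegation
console.log(Object.getPrototypeOf(p) === point);
```

Seen this way, the "counterintuitive" part of JavaScript classes is that they are a thin layer over delegation between plain objects rather than a C++/Java-style class model.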
What you can download from https://selflanguage.org/ looks rather archaic, unlike the improved version implemented on top of Squeak, which looks and feels really fancy.
Unfortunately the version in Squeak is, in my opinion, fancy looking but a backwards step from the Self version in terms of the underlying system. Morphic doesn't seem to really fit all that well with class based systems.
While being class based does make Squeak's Morphic a bit more awkward than Self's, the big difference is that it is a hybrid system that can run older MVC (Model View Controller) applications with little or no modification. That means that there are two different ways of doing most things and someone looking for examples to copy might run across the MVC version instead of the "proper" Morphic version.
My experience of doing morphic stuff in both Squeak and Self, is that in Squeak there was a bit of a disconnect between the instance side and the class side - you could click together morphs, but in the end you had to write code in morphic classes to reconstruct those morphs.
There wasn't any automatic way to go from the morph instances to equivalent morph constructors, or any way to save the morph instances to copy and use later. So I ended up ignoring the instance manipulation and just coded morph creation methods.
In this way, the morph class hierarchy became much like a hierarchy of factories.
This mattered when I was doing a GUI to run on my Compaq iPaq because a lot of morphs (such as menus) were being created afresh each time they were needed. This was really slow and I only made it usable by caching the created menus and only displaying them when needed.
On Self I would have just copied the previously constructed menuMorph prototype and displayed it (if of course I had been able to port the huge pile of C++ that is the Self VM to the iPaq :)
This was ages ago though and I haven't played properly with modern Squeak or Cuis.
Anyone here shipping a Smalltalk-built application these days? I had fun playing with Pharo and Squeak, but in terms of shipping desktop apps to clients I don't know how it could be done with Smalltalk. Maybe Cincom is the only game in town, not sure. Does anyone have a story to tell?
Check out our (web based) product at https://ag5.com . We’ve built it with VisualWorks Smalltalk among other tools.
Every year we ask ourselves if we should be using such a niche tool, but somehow almost every new employee at our company gets hooked on Smalltalk. So it has now become our secret superpower.
My first encounter was VisualAge Smalltalk, as part of a study to replace the VSE mainframe :-). But even the IBM SE was not that confident, so we passed on it. (And strangely, one of the issues was that the source was too available, that the environment was too open to manipulation... sorry, we were just too used to fixed programs like COBOL and Java rather than these dynamic things :-).)
As for the main history: while searching for HOPL 4 material I read about NeXT's history, and how Steve Jobs felt he had missed two key elements of the lab (Ethernet and Smalltalk). He fixed both on the NeXT computer, and in fact the prototyping and performance of Interface Builder and the underlying Objective-C proved so much more practical that Smalltalk looked too complicated by comparison.
Has anyone recently used Pharo Smalltalk? How did its IDE functionality enhance your experience?
"Pharo is a pure object-oriented programming language and a powerful environment, focused on simplicity and immediate feedback (think IDE and OS rolled into one)."
After trying it, going back to Java/C#/JavaScript was kind of rough. I realize that despite their external polish popular IDEs/languages today have some glaring, fundamental deficiencies.
I never used Smalltalk directly, but I did use its (weirdly) related cousin, Objective-C. My first exposure to OO was reading the Byte Magazine edition about Smalltalk (the one with the balloon on the cover), which led me to experiment with adding OO to plain C in the late 80s (I did not have access to C++ or Objective-C at that time).
Shortly thereafter, C++ arrived in the form of "cfront" which was a cross-language translator that would generate .c and .h files from .cxx and you'd compile those with a C compiler. There was no actual C++ compiler at first.
I have a question to people who used smalltalk. Maybe I have read it somewhere but couldn't understand the unique advantage in modern day terms.
What is so different in smalltalk that is not possible / easy in modern IDE+Debugger environments? Especially those with hot reload or 'Edit & Continue' features?
You have access to the full stack, whereas even on .NET and Java there is a certain separation between the runtime and, let's call it, user space.
On a proper Smalltalk environment, beyond a set of pre-defined VM intrinsics you can access and change everything: GC, JIT compiler, compiler, how the debugger works, IDE-like features, you name it. And it is very uniform; everything is an object, and even when something isn't implemented as such, it is transparent to you as a developer.
The only other kind of environments that went as far were the Interlisp-D and Lisp Machines, to some extent the surviving commercial Common Lisps.
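A faint echo of that liveness survives even in JavaScript: you can patch a method on a live prototype and every existing instance picks up the change immediately, a little like editing a method in a running Smalltalk image (though Smalltalk goes much further, down to the compiler and tools). A minimal sketch, with class and method names of my own invention:

```javascript
class Account {
  constructor(balance) { this.balance = balance; }
  report() { return `balance: ${this.balance}`; }
}

const acct = new Account(100);
console.log(acct.report());   // original behavior

// Patch the method on the live prototype; the already-created
// instance sees the new behavior at once, no restart or reload.
Account.prototype.report = function () {
  return `balance: ${this.balance} (audited)`;
};
console.log(acct.report());   // patched behavior
```

The difference is that in Smalltalk this kind of live change is the normal workflow, applied through the same browser and debugger you use for everything else, not a trick reserved for monkey-patching.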
I'm not actually a "Smalltalk fan", but I recently re-implemented the Smalltalk-80 virtual machine and some additional tools to analyze the original Smalltalk-80 sources and virtual image files. My goal is to run it on LuaJIT and compare the performance with other Smalltalk implementations. See https://github.com/rochus-keller/Smalltalk. I'm also working on a Simula 67 compiler, and a Golang to LuaJIT bytecode compiler is also on my list. As you can see I write everything in C++, which also says something.
Have a look at http://www.squeaksource.com/Hobbes/, but don't expect impressive performance. It's not the code in the Bluebook part 4 though. If you want to implement it yourself be prepared that it might take many weekends until it runs as expected; debugging Bluebook errors took many hours.
This includes projects from various different groups as well as my own projects (which I should have clearly labeled as such).
The key events: from 1977 to 1992 Brazil had a "reserved market" policy that only allowed local companies to make and sell micro and minicomputers. After reading the 1981 Byte magazine about Smalltalk I wanted one, but by 1984 it was clear I would have to build my own. I toyed with the idea of designing my own processor (with TTLs) but went instead with the Motorola 68000 I had already used in other projects.
In 1986 I joined forces with Softec, which had launched the first PC clone in Brazil. Note that the reserved market thing kept the Macintosh out of the country (Apple was able to kill a local clone) so this would have been the only option for a graphical computer. This ended in 1988 after we had built several prototypes and operating systems. I had previously dropped out of the university so I went back to finish with a focus on chip design. In 1990 I got a scholarship to do an object oriented processor, but with the wonderful results the Self group at Stanford had with software JIT compilation I changed the project to be a modified Sparc. Even that got cancelled the next year in favor of a 64 node machine with 68020 processors (bought back in 1987) and I did research on parallel Smalltalk/Self on that.
People tried to convince me there was no longer any point in special hardware and that I should just do software for PCs instead. But in 1998 I was working on a digital cable TV project and came up with a VLIW MOVE processor that would be great for that but could also implement Smalltalk/Self very well. NEC was not interested in it but I thought it was too good to just drop it, so I left the university and set up a company to develop it.
The first products would have to use FPGAs since I didn't have funds for custom chips. Things went slowly since I had to do consulting. With the new Virtex II FPGAs there was enough internal RAM that it made more sense to have a bunch of small processors, each with a little cache, than a single large VLIW processor with external RAM as cache. Compiling for VLIW is not easy at all, as many people have found.
In 2007 I got into a master's program at a local university and in late 2008 I started to collaborate with a group that wanted to do a SqueakPhone and a Croquet cloud. To make it easier for us to work together I adapted a 2004 SqueakProcessor I had sketched into SiliconSqueak, optimized both for running bytecodes and as a target for JIT compilers.
This evolved over the years but with weak single core performance in favor of more cores per silicon area and better energy efficiency. Since late 2018 I have been developing version 5 of SiliconSqueak to have both a very high peak performance for a single thread and many slower threads depending on demand. And the adaptive compilation technology can not only handle bytecodes but efficiently simulate any processor (x86 binaries, for example).
Does anyone know where the strange name VisualAge came from? Wikipedia says "The name "VisualAge" is the result of a contest between the members of the development team." with no reference, and I can't understand how such an opaque and irrelevant-seeming name could win such a contest. "Visual" I get, but "Age"?
The infantile interface and oh-so-precious baby talk that accompanies Squeak (a modern vision of Smalltalk 80) put me off the several times that I started to use it.