I remember the fantasy that more and more people would be able to develop simple apps for their own very personal, customized needs. It was a vision of a more accessible and democratic computing infrastructure, where we'd all be 'makers' creating the compute environment we lived in. Things like HyperCard were part of this vision.
Those days are gone. The article doesn't explicitly mention the fact we all know: very few people have the capacity to create a 'home-cooked meal' like the OP did, for their own use.
In fact, far fewer than could create a little HyperCard app. It's a world with pretty large barriers to most people being software 'makers'; instead we are just consumers who can choose from what is offered to us by various companies trying to figure out how to monetize our needs and desires.
Part of it is the increased complexity of app development. Part of it is the walled gardens of our suppliers.
> I distributed the app to my family using TestFlight, and in TestFlight it shall remain forever: a cozy, eternal beta.
Indeed. Because the very infrastructure ignores the needs of one-off "home-cooked meal" apps and assumes they don't exist, you have to pretend your app is some kind of "beta" of a mass-market commodified product instead (and be glad our suppliers still let you do that for 'free'). Our computers (increasingly, the phone/device is people's primary computer) are no longer ours to do with as we will.
It is sad. If those who created the computing revolution 30 years ago (including prior versions of us) could see where it wound up... the level of technological sophistication of our pocket audio-video-geolocated-self-surveilling-communicating devices is astounding; the lack of 'empowerment' or control we have over them is utterly depressing.
This is so pessimistic! What is the basis for thinking there are large barriers to most people becoming software makers?
I was in high school 12 years ago, when iPhones had just hit the market at like 800 bucks each, and there was no Firebase or React Native, no Medium articles or YouTube tutorials covering every tech stack from the first code-generation command to deployment. The language of the day was Java. There was no npm and no trillion Python libraries with which you can do anything imaginable.
Is there any specific reason a 16 year old today couldn't make the app this guy did?
As a software engineer I like the idea that I do a black magic that nobody could ever understand, but I genuinely don't think it's true.
> What is the basis for thinking there are large barriers to most people becoming software makers?
You have an idea for an app. How do you build it? Your options generally are:
1) Do it the hard way: get Xcode/Android Studio, spend years learning to a) program, b) deal with the bloated and overly complicated frameworks, c) deal with the bloated and overly complicated build and deployment infrastructure. Also pay the platform owner for the ability to make it work / distribute it to someone.
2) Do it the easy way: pay some service so that you can make the app in a simplified/low-code way. Now your app is tied to someone else's service.
3) Do it the expensive way: pay someone else a lot of money to do 1) or 2) for you.
What's missing is option 4): build it in some free, low-code tool and distribute it for free to whoever you want, without entering a relationship with any company whatsoever. That would be "the HyperCard for mobile".
On top of that, these days you're likely to want to have capabilities that are available only either through option 1), or through an API, requiring entering another relationship with some other third party.
Closest thing to 4) I've seen on Android is Tasker, with its ability to dump an .apk containing whatever UIs and if-this-then-that rulesets you created. But it's not exactly ergonomic, it's Android-only, and doesn't solve the distribution problem.
I don't see why it would take years to learn to write an app in Swift or Kotlin (or React Native or Flutter), even starting from zero. I'm not trying to be cute here.
There are probably 50 high-quality end-to-end tutorials for building a video streaming app. I guess using Firebase ties you to a service, but so what? If it's for 4 friends it doesn't have to be infinitely scalable. Just by writing an iOS app you're tied to Apple, right? And surely an app isn't a failure just because it has dependencies?
If I were a non-programmer, I wouldn't have the time to learn the basics of Swift or Kotlin even if it took only a month, if all I wanted was to build a small app for friends. Apple I already have to deal with by virtue of owning an iPhone, but I would be reluctant to enter a relationship with yet another party just for the sake of my small app. On top of that, careful observation of the space tells you that an app is likely to outlive its service dependencies.
Dependencies you don't own are a liability. The less you have of them, the better.
Unfortunately there is an entire generation or two now of mainstream developers who never had the opportunity to experience things like Hypercard firsthand. It is always difficult to describe the full extent of the power of systems like it without being immersed in it, and we don't really have contemporary equivalents to make the case.
12 years ago the author of this post would have had a little website and now he has the ability to record and instantly share video with his family in a pretty delightful app :)
So are you saying the barrier remains the same? Earlier we had less access to stuff, and could write simpler software. Now we have more access, but software has become complex.
I think as computers have advanced in power the technology that runs on them increases in complexity to utilize the new power. With economies that prioritize growth, people in charge of things like web standards want new ways to innovate over their competitors. Developers don't want the power afforded by additional computing cycles to go to waste. They also find things lacking, like the difficulty of accomplishing "holy-grail" webpage layouts. Standards are expanded to accommodate these pain points, and new features are added, but nothing can be taken away as this would break things that already work. This continues to the point where the web implodes under a plethora of animated SVG hammers striking anvils and autoplaying videos.
Thinking of it another way, computers of a long time ago could do only basic things like printing strings or drawing lines. Computers of today can still do those things, in addition to a host of other things that are now only possible due to technological advancement, like streaming video. But in some environments it is possible to limit yourself to having the computer do just the simple things - it's just a matter of deliberately opting-in rather than doing the only thing that's possible to do. For example I'm trying to write a "Web 1.0" webpage in the style of early 2000's design like Praystation[1], ignoring the shiny new technologies that do a lot more, because I believe that "more" is not necessarily "better". I believe you have to be ideologically motivated to do this now, because it seems like a lot of average web users have come to expect SPA-style apps with fancy animations and client-side interactivity, and that's where a lot of interest in web design appears to lie at present.
Of course, for a video-sharing app this doesn't really apply, because the platform it runs on proliferated fairly recently. The oldest model in the smartphone lineage came out at the start of 2007, and in order to build apps for it, it was necessary to install a full-blown developer toolchain with visual layout tools and hundreds of APIs available for use. That's the simplest it gets. In the 1980's you could just 20 PRINT "HELLO WORLD".
I think it would greatly help if the author published the source code, as he briefly mentions considering. At least then it would be possible to judge exactly how technically complex the software is, and people could learn from it also.
However, at the point where you need to rely on complex cloud infrastructure like AWS to build things like video sharing applications, it might turn off anyone not completely interested/invested in app building. Although, I do believe a completely invested/motivated 16-year-old could pull off something similar, given the amount of documentation and free libraries on the web. That's not discounting the difficulty of actually accomplishing such a thing - for a young newcomer it would be necessary to learn many disjoint concepts for such a thing as video streaming. But given a significant amount of effort and motivation, it's at least possible to do in the present age.
> I believe you have to be ideologically motivated to do this now, because it seems like a lot of average web users have come to expect SPA-style apps with fancy animations and client-side interactivity, and that's where a lot of interest in web design appears to lie at present.
A key point the author made in the article was that simplicity is appreciated. Here's an analogy: if Snapchat is the equivalent of an SPA-style site with fancy animations etc, then his app is a static webpage.
Just consider for a moment: the reason these analogies to old technology are even necessary is because what this one amateur did today would have definitely taken a team of engineers to accomplish 12 years ago.
You are absolutely correct, the standards for professional pop tech are always being raised to new levels, just like with every field, e.g. cooking. Nobody is saying opening a successful restaurant is getting any easier. However, amateur gourmet cooking for the family is demonstrably much easier today than ever before in history.
> Part of it is the increased complexity of app development. Part of it is the walled gardens of our suppliers.
And our current culture contributes to both of these issues. We got the kind of computing that fits that culture, which is one that emphasizes profit making over other types of activity, and one soaked through with short term thinking. We need well funded basic research (to create computing media systems in the spirit of hypercard et al) and companies willing to push malleable computing systems out to their customers. Hypercard was extremely popular in its day, and Apple let it die on the vine. It's because they didn't know what to do with it -- the culture had become about shrinkwrapped solutions, and it no longer made sense.
Definitely, as culture is in hefty ways influenced by (and itself influences -- dialectic!) political economy.
> To change this "culture", you have to change our economic model for how we allocate resources in the world.
Yes, but the severity of this change is open to interpretation. Part of the "culture" I was referring to was a business culture that emerged in the late 70s and called for viewing shareholder value as the sole purpose of corporations. This was not necessarily the received wisdom in the prior decades -- decades which gave us places like Bell Labs and Xerox PARC, and the decades in which, arguably, the biggest qualitative leaps in computing occurred.
It's worth noting that most of these leaps in computing come from the ARPA research culture and/or from Bell Labs, the latter of which was a government regulated monopoly and not the case study of a normal private company doing basic research. Ditto for PARC, which was more or less an extension of the ARPA group at a time when the government funding was threatened.
The lessons from this (for basic research in computing) are simple: fund people and not specific projects, while having a general overall vision; fund at the appropriate timescale of half to full decades; don't interfere.
America has public schools, but they mostly suck. I would take a profit driven private school over a public one any day. I would also take dealing with a corporation’s customer service over a government office like the DMV any day.
And while you mention healthcare, the actual problem is that they aren’t profit driven enough and instead exist as this Frankenstein’s monster of a public/private partnership. I currently live in a 3rd world country with no health insurance or government intervention in medicine and healthcare and the system works well(ish). Fixing a broken bone or getting a tooth pulled is only going to set you back $10. And while this is a lot for poor people, it’s still only between 2-10 days salary for an average person.
That being said, there are major trade-offs in quality. I wouldn’t recommend giving birth or getting a life saving treatment here, but for anything somewhat routine, the healthcare provides great value. Also, all meds are over the counter and are very cheap. One med in particular is 100x cheaper here than the out of pocket costs in America.
> in TestFlight it shall remain forever: a cozy, eternal beta.
When I read this I thought -- oh man, I hope Apple doesn't decide to somehow limit this feature in the future. I wouldn't be surprised if it's already against the terms of service to deploy a 'beta' app and then just keep using it forever. (Besides, don't you have to pay to be in the developer program? That may be somewhat acceptable for some adults, but what about kids whose parents would prefer that they "study" instead of "wasting time playing computer games"?)
Anyway, this is one of the reasons I use Android. While becoming root isn't realistic on many devices, I think it allows you to do enough for kids to find their way into programming. Maybe the challenge of getting a root shell/flashing your own OS is enticing to some too. And maybe the tech used to protect devices from their users is interesting to some people too -- and since OEMs need to know how all this stuff works, it also happens to be documented pretty well. (I think this tech is still in its infancy unfortunately.)
What is the equivalent/alternative to "TestFlight as eternal beta" on Android? How do you (or "people") get self-developed, "bespoke", "home-cooked meal" software onto their Androids? Is it easier than on iOS? I'm not familiar; I'm a web developer, not a device developer.
On Android, you can just download an .apk and install it (you need to enable the "Install unknown apps" setting, called "Unknown sources" on older versions). No code signing, app store, or corporate approval necessary. This is the #1 reason I will never be getting an iPhone (which is a shame, since iPhones are beautiful devices in almost every other way).
I sort of feel that way: technological progress over the last 10-20 years has made life significantly more convenient for us as consumers, but not as producers. I'm not going to deny that we have better tools now, like GitHub and Visual Studio, but what's the equivalent of two-day shipping?
There’s a few companies pushing the envelope, whether they know it or not. Notion for example allows people to build shareable webpages with their blocks; if their API ever gets released, you can imagine normal people using it as a CMS for blogs or websites. Another example is Levels.fyi using Google Sheets as a backend; Firebase is hard to use, but Word and Excel aren’t.
It’s difficult to say where this trend will go, but end users shouldn’t be underestimated. We could probably teach more people python by asking them to manipulate numbers on a spreadsheet than with a black terminal and plain text editor.
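To make the spreadsheet-first teaching idea concrete, here is a toy sketch (the data and column names are illustrative, not from the article): a learner who already understands rows and columns can total a "cost" column with a few lines of stdlib Python, treating a CSV export from Excel or Google Sheets as the spreadsheet.

```python
import csv
import io

# A hypothetical "spreadsheet": monthly expenses exported as CSV.
# In practice this would come from a file saved out of Excel or Sheets;
# an in-memory string keeps the example self-contained.
sheet = io.StringIO("item,cost\nrent,1200\ngroceries,340\ntransit,95\n")

# Each row becomes a dict keyed by the header row, just like named columns.
rows = list(csv.DictReader(sheet))

# "Sum the cost column" is a one-liner once the rows are loaded.
total = sum(float(row["cost"]) for row in rows)

print(f"Total: {total}")  # Total: 1635.0
```

The familiar grid gives the learner a concrete object to manipulate, rather than starting from abstract exercises in a bare terminal.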
> There’s a few companies pushing the envelope, whether they know it or not.
Part of the problem is, they're pushing the Envelope-as-a-Service. Notion is all fun and games, until they change something you needed from under you, or get bought and thank you for the incredible journey. Similar for the others.
HyperCard being a desktop product was a feature, not a bug.
> More and more non-CS people are learning to program every day.
We (the computing people) make this much harder for them than it needs to be. Instead of giving them malleable computing systems like Oberon, or Hypercard, Smalltalk, or related, we give them glistening time sharing systems and tell them that "programming" is, really, entering text into a teletype emulator and watching it do things. People are not stupid, but insofar as computing has been dominated by industry and limited by open source, we have given them stupefying options. We have better examples from the past and should know better.
The lack of popularity of these systems often has reasons other than their technical merits, but your comment sounded like the judgment of someone who dismisses a technology as second-rate just because it is not popular.
For every one person learning to program, ten people are "learning to program". I would say that we're going to have a huge problem where most applicants aren't qualified for a given job, but I don't have to - we're already there.
Still, BOTH numbers are trending up, so I guess at least that part is nice.
If we can make software development less of a specific job and more of a companion skill of most jobs, maybe people will start and develop systems for their fellow non-technical co-workers.
We're going the opposite direction though, for good reasons. Have you ever experienced the result of business processes dependent on "hobbyist" systems developed by in-house dabblers? It's not good, and always needs to be cleaned up later.
But it was actually a popular thing in the earlier days of computing, and still happens sometimes.
One of the most likely places for it to currently happen is crazily complex Excel "macros" -- basically software apps written in Excel. So, counter to my original point, maybe those aren't always disastrous (yet?): there are a surprising number of them around powering all sorts of businesses, and their internals would horrify a software engineer, but they are working...
This is just a result of people having finite time and resources. There are thousands of possible hobbies, and making home-cooked software from scratch is just one obscure one.
For instance, can you make a literal home-cooked meal, without acting as a passive mass-market consumer (no grocery stores or supermarkets)? After all, you're living in the richest part of the richest part of the richest time of the world! Can you raise and butcher your own pig, grind your own flour, grow your own vegetables in your backyard, distill your own spirits, and chop your own wood to build your own fire? Cooking has always been a much more popular hobby than anything technical, and likely always will be, and yet almost all its aspects have been outsourced. Why expect software to turn out any other way?
Yes, but... in my childhood in the 80s, simple carpentry (for example) and other "handyman"-type skills were a very popular "hobby" among men (yeah, it was gendered) across the nation. Because it wasn't just about which "hobby" seemed "fun" (although it may have been enjoyable for people who did it); it was about control over your environment, being able to make the things you needed instead of having to pay someone else to, being able to customize them to your needs, etc.
Software in today's world is similar. But does not occupy a similar place as a prevalent "hobby" that can produce useful things for your daily life.
Now it's true that even such "hands-on" skills are much less prevalent in younger generations. What the reasons are, I'm not sure we have totally identified.
But I'm sure it's not just about considering them as "hobbies", if that means recreational activities people might do because they are "fun", like bird-watching or drawing, divorced from their utility. Carpentry/handyman skills were not so popular because people found them more "enjoyable" than other "hobbies".
This is absolutely a false dichotomy. A home-cooked meal is not one conventionally thought of as being from items hyperlocally sourced. At least, not in the last 70-100 years. We’ve had Sears Roebuck and the like for quite some time.
> A home-cooked meal is not one conventionally thought of as being from items hyperlocally sourced.
Exactly. To allow the category of "home-cooked meal" to exist at all, beyond professionals and a few dedicated hobbyists, we have to loosen the criteria. Similarly, "home-cooked apps" don't exist in the strict sense, but do exist with looser criteria, where we allow people to work with standardized, mass-market tools (e.g. drag-and-drop website and form builders). Expecting lots of people to start from raw source code is like expecting home barbecue to start from the pig.
> you have to pretend it is some kind of "beta" of a mass market commodified product instead
Or you could use enterprise distribution, which is the "correct" route for apps that don't go through the app store. But TestFlight is probably the simplest way to distribute an app to a handful of people and the only real problem with it is the need to distribute a new build every 90 days.
Since Apple really loves to lean in on the “anyone can learn to code” message around iOS (see: Swift playgrounds), it would be cool if they could figure out how to do a family-and-friends version of enterprise distribution. Some official way to let small groups of people build things for each other without having to list them in the App Store for anyone to find.