Somehow, everybody pretends as if WebKit were some proprietary, closed-source rendering engine, when in fact it is the great success story of the KDE/Konqueror rendering engine, adopted by some huge corporations (Apple, Google). [1]
While I appreciate that we still have the independent Firefox and its brand new Quantum engine, sometimes I feel like the Konqueror/KHTML team does not receive the appropriate tribute for laying the foundation for the dominant KHTML/Webkit/Blink engine.
While I largely agree, a ton of work was done by Apple before Webkit was usable. Most sites would show subtle rendering glitches and it took a while before it matured.
Not sure why you had issues. In my experience under KDE it was rock solid. It was my primary browser from around 2002 until ~2009. I often tried other browsers and Konqueror led the pack in terms of web conformance and performance.
The way I remember it is that KHTML and Mozilla worked perfectly fine if websites followed the real standards instead of MSIE's de facto standard. Which, back in the early '00s, was often not the case.
As one of the few who paid for a web browser (paid $29 for Opera 7 in 2003), it looked to me like Opera really had no choice.
There was a post from an Opera insider that I can't find but it was basically this: the web's complexity was evolving faster than the Opera team could maintain their proprietary Presto rendering engine.
Switching away from Presto and building on WebKit was a basic matter of survival. Yes, they still became irrelevant but they would have also stayed irrelevant with their Presto engine.
I agree with their assessment because around 2009, I started encountering more and more web pages that broke Opera. The Opera forums had more and more complaints from users reporting broken web pages. The Presto engine was becoming a liability.
It was a vicious feedback loop because web authors wouldn't bother to test their sites with Opera ... which led to more user frustrations. I had to switch to Chrome to get a usable web surfing experience back. I originally paid for Opera because it had the fastest rendering engine which was very helpful for slow dialup connections. As Presto started falling further behind, that speed advantage was negated.
Opera did try some interesting features such as "Unite" which -- if you squint a certain way -- was a form of p2p decentralization. Yes, it's interesting to have a built-in web server in the browser but not enough people cared about it.
The author's prediction that Firefox's Gecko engine would meet the same fate of irrelevance as Presto didn't come true because, unlike Opera, Mozilla from 2004-2014 got massive funding from Google. Mozilla could afford to keep programmers enhancing the Gecko engine. Opera couldn't do the same with Presto.
(I worked at Opera during the WebKit/Chromium transition and work at Mozilla now. But in both cases I'm just a normal individual-contributor type employee with no special insight into strategy or decision making.)
There are always options. Opera could have doubled down on Presto: putting more people on the core team, and focussing efforts to keep up with, and surpass, WebKit/Gecko. After all, that's basically the option that Mozilla took, which has resulted in Firefox Quantum. Would that have worked? It's hard to say. I would guess not, but I'm not sure the alternative really did either. Maybe Opera was already too far down the web-compat death spiral to engineer a way out. I don't know of a long-term bet like Rust that could have come good at the right time.
Certainly the top-level culture of the organisations was different; Opera's leadership were very concerned with maintaining/maximising the value of their shares, whereas Mozilla is more clearly driven by ideological goals around the success of the open web. Opera also had a (historically well justified) belief that they could do more with fewer engineers than other companies. That seemed to work up to a point, but once the difference in resources became too great it was hard to change the approach.
Certainly one lesson is that it's hard, maybe impossible, to be a niche browser with a unique rendering engine. That is arguably a failing of the web, but nevertheless it's a strong indication that arguments that e.g. Mozilla should aim Firefox at small ideologically-driven markets are dangerous. One interpretation of the Opera history is that they were too focussed for too long on the subset of users who wanted a browser with lots of features and configuration possibilities. A product that suits those people might be actively off-putting to other users, so inhibiting marketshare growth when faced with competition targeting simplicity and sane defaults.
I read a post once from another (claimed?) ex-employee who said that around 2009 or so, Opera wasn't doing so great, so they laid off a dev or two, and after they recovered half a year later they never re-hired to fill the gap. The poster attributed the technical falling-behind to that layoff. Do you share this impression?
As for me, I still use Opera 12 almost on a daily basis, and the main issue I have is not broken websites but inaccessible websites, because of HTTPS and Opera not supporting enough recent ciphers. The second most frustrating thing is that the JS engine shows its age performance-wise; pages that make heavy use of it for all kinds of dynamic shenanigans get rather sluggish. So my uneducated guess from these observations is that it should have been possible to keep up if they'd wanted to. Probably management simply thought that using an engine someone else maintains for them made it possible to cut down on devs even more. But that part might just be my make-believe world...
It seems to me to be overly simplistic to take a specific event and say "that's the decisive moment where it all went wrong".
Technically Presto had some unique features; for example the interruptible script engine allowed the browser to feel performant and responsive without having to heavily invest in parallelism via multiple threads or processes. But it also had some architectural differences to other browsers, and never had the market clout to ensure that the Opera-unique features were reflected in platform features and so had to be implemented by the competition, or even to ensure that features that were hard/impossible to implement in Presto did not become required for web-compatibility. For example Presto was unable to implement beforeunload without a significant rewrite of the core document loading pipeline, but that omission from Presto wasn't enough to prevent sites depending on it when it worked in Gecko/WebKit/Trident. Similarly, a lot of effort would have been required to port Presto to multiple processes (similar to the multi-year "e10s" effort for Gecko).
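For readers who haven't run into it, here's roughly the pattern sites depended on: a generic sketch of the web platform's beforeunload event, not anyone's actual code (hasUnsavedChanges is a made-up placeholder for application state):

    // Hypothetical flag standing in for whatever tracks unsaved work.
    let hasUnsavedChanges = true;

    window.addEventListener("beforeunload", (event) => {
      if (hasUnsavedChanges) {
        // Ask the browser to show its "leave this page?" prompt.
        event.preventDefault();   // the standardised signal
        event.returnValue = "";   // the legacy signal many browsers still expect
      }
    });

Supporting this means the engine must be able to pause and possibly cancel an in-flight navigation after running page script, which is presumably where Presto's document loading pipeline got in the way.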
Presto was also highly optimised for memory consumption and so was ideal for running in resource-constrained environments like early smartphones and games consoles. But I think the launch of the iPhone and Mobile Safari changed consumer expectations for the web experience on mobile, and Opera didn't manage to respond in an effective way. I have no idea what the optimal response would have been, but if Presto could have achieved double-digit marketshare on high-end mobiles (as opposed to low-end devices running Mini), we might have avoided many of the compat issues that currently affect the mobile web.
Organisationally I think there were other issues; I already spoke about the focus on the particular use case of making a highly integrated, highly configurable, desktop browser product, which doesn't look much like the more successful mass-market browsers today. I think later there were other problems, but I was far away from the executive decision making, so maybe I'm not best placed to comment on what the actual company goal was.
That Opera couldn't keep up with the standards is also a product of the fact that WHATWG is mostly run by the big orgs. They standardize features at whatever rate the big orgs say they are developing them.
Idiotic behavior like keeping bugzilla secret (sadly again adopted in Vivaldi due to the same people being in charge) didn't help either. They didn't even try all that hard before giving up. Opera had the option of open-sourcing Presto.
How is Quantum doing? On my computer, it's a lot slower; there are slow spinners in the tab titles and another kind of spinner for loading pages that often keeps me from seeing pages even after I already loaded them.
Wow, really? I've gone from regarding Firefox as a complete also-ran that I would never use unless necessary, to making it my default browser. Quantum is significantly faster than Chrome on my machine - it does eat more battery, but I feel it's worth it for something this fast & smooth. It feels like I got a new computer. (I'm using a 2012 MacBook Pro with 16GB RAM, still on El Capitan.)
In my experience, it's spectacular. I've switched back from Chromium to Firefox. It feels snappier than Chrome, and the newer dev tools mean I don't miss Firebug.
I noticed this a few times when having a massive number of open tabs (500+). In these cases one Firefox process consumed 100% CPU and another 40-60%; restarting Firefox removed the excessive CPU consumption and the spinners as well.
It's not a popular opinion here, but FF57 is bad enough that I have given up on it for the first time in maybe 7 or 8 years. I am now using either Waterfox or Qutebrowser for everything.
>Opera did try some interesting features such as "Unite" which -- if you squint a certain way -- was a form of p2p decentralization. Yes, it's interesting to have a built-in web server in the browser but not enough people cared about it.
Unite is a typical example of something that worked fine and disappeared all the same. I haven't found any browser that surpasses Opera 12 to this day. All the new "exciting stuff" is of no use to me. I don't want to adapt to my browser, I want a browser that adapts to me.
Opera also lost most of their distinctiveness (i.e. features) along the way, laid off the bulk of their developers and sold out to a consortium including Qihoo360 (who, as well as the usual FUD—well-founded or not—about Chinese companies, were simultaneously found to be really bad at being a competent Certificate Authority). There’s plenty of scope for Mozilla to have used WebKit and not done the subsequent parts - Brave’s perhaps as good an example of that as any.
That said, roc proposes porting XUL and XPCOM, which are largely dead these days (the former by virtue of Firefox switching to WebExtensions). With browser.html for Servo being a thing, building Firefox’s UI in a way that would work in WebKit is more possible than ever.
I wonder whether this scenario is now going to play out in the opposite way, with Google ditching Blink for Servo once Mozilla finishes morphing Gecko into Servo with the Firefox Quantum project.
Who would win then? Google's money and marketing power, or Mozilla's independence, trustworthiness and being the renderer's creator?
Not sure why this is downvoted. I would not be surprised if Google were working on a next-gen browser engine along the same lines as Servo, or even a Servo fork. After all, there is a Rust toolchain for Fuchsia...
Mozilla's open source license was created 20 years ago precisely to make it easier to distribute proprietary bits with open bits. I'm not kidding, that was a primary goal.
> the more code engines share, the more de facto standardization of bugs we would see, so having genuinely separate implementations is very important.
Well, it's not like there aren't any bugs in the specs. And whether there are bugs in the code or the specification, it's the same process for fixing them: politics :)
There has been a lot of progress recently on the web-platform-tests project [1]. This is a cross-vendor effort to improve interoperability on the web platform through testing.
Historically the process for building web-platform features has been to write a spec and then assume that implementations of the spec would reach compatibility in an ad-hoc manner by patching until sites worked. The big innovation of the last decade — still controversial to some — was the idea that it's OK to adjust the specs themselves when implementations have converged on some other behaviour or when the spec is otherwise wrong. The goal for the future is to apply the same engineering discipline that you would use for developing software to developing the platform itself. In particular the objectives are:
* Every change to a spec that influences browser behaviour must be accompanied by a corresponding test case.
* Every change to a browser that affects a cross-platform feature that isn't already adequately tested must be accompanied by cross-browser test cases (a minimal sketch of such a test follows this list).
* The results of those tests must be visible to browser vendors in a way that makes it easy to identify the highest-value bugfixes (e.g. cases where N-1 browsers agree and one is different, or cases where a spec change makes previously correct behaviour — which may not have shipped yet, but just be enabled on nightly builds or behind a flag — wrong).
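To make that concrete, here is a minimal sketch of what such a test looks like. In wpt these typically live in an HTML file that loads /resources/testharness.js and /resources/testharnessreport.js; test and assert_equals are the harness's real primitives, but the particular assertion here is only an illustration, not one of the cases described above:

    // Script body of a hypothetical testharness.js test.
    test(() => {
      const el = document.createElement("div");
      document.body.appendChild(el);
      assert_equals(getComputedStyle(el).display, "block",
                    "an unstyled <div> should compute to display:block");
    }, "div defaults to display:block");

Because every engine runs the same file, a results matrix falls out naturally, which is what makes the "N-1 browsers agree" triage above possible.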
Apart from the work of actually writing the tests, achieving those goals involves a lot of infrastructure work to ensure that the cross-browser tests are well integrated into the development cycle of each browser and are able to cover as many scenarios as possible (testing often involves manipulating the browser in a way that is not exposed to normal web content). Despite the large scope of the project, I think it's agreed that modernising the way the platform is developed, and prioritising interoperability at all stages of spec and browser development, is essential to avoiding existential threats to the open web in the long term. Certainly Mozilla put a lot of resources into post-hoc fixes for site-compatibility issues, and whilst there will probably always be some bugs that slip through, it will be much more efficient to catch those problems up-front before they ship to end users.
I sadly don't have data to back this up, but I'm pretty sure we're already seeing the effects of this effort, with recent, complex features shipping with fewer cross-browser issues than we would have predicted five years ago.
Web developers code to Web browsers, so bugs in Web browsers lead to sites depending on those bugs, making those bugs unfixable unless you have developers testing in multiple browsers. Spec bugs don't become unfixable that way.
Though spec bugs go the opposite way: if you discover an issue in the spec, then it is very difficult to change it because someone somewhere may have been - at the time correctly - relying on that behaviour.
Web developers can only rely on a spec bug if the browsers they test with actually implement that bug. So it still comes down to what browsers implement, not what the spec says.
Or, you could just be Google, and change your browser to work the way you want, and ignore all the pages breaking by insisting that the change is "within spec", giving a big middle finger to anyone that complains.
It doesn't have to be this way. Apple isn't afraid of breaking old apps sometimes, for example, and they are successful. I think that absurd backwards compatibility should be considered harmful and some breakage should be normal. It's always possible to install an old browser in a virtual machine if the content is so precious.
Apple doesn’t have the competitive pressure that I can take my broken iOS app and run it somewhere else as-is, or tell my customers to do that.
“It works in Chrome” (and previously IE) is a real issue and the browser that broke compatibility like that would have to be supported (potentially through the coercive power of its own market share - see the initial “any open source browser versus IE is a good thing” switching to “we need multiple browser engines, even if they’re all open source”) by the others or face irrelevance.
Someone said the same thing in the comments. In a reply, the author says they don't know why the MD5 fails but confirms that the post is correct:
I wondered if anyone would check :-).
I'm not sure what the problem is with the first post. It's been a long time. You'll have to take my word for it that the first post is the right text :-).
It only has optimistic views of Google promoting WebKit.
It completely ignores the fact that Google effectively took over the project, killing old features it didn't like and preventing any new contribution from making it to the main line unless it fit their plan.
I don't think that's necessary. This isn't really a cryptographic use of the hash. Between the legitimate text of the blog entry and whatever arbitrary (likely nonsensical) string produces an MD5 collision, it's going to be pretty obvious which is which.
Legible, maybe, but I don't think you'll be able to collide the hash with natural language, so you will need an excuse as to why the revealed text has weird nonsense in it that is actually there to collide the hash.
You can also "see" this once you suspect it's going on: if you're a cryptanalyst and have the tools to look inside the hash, you can see that such collisions involve getting the internal state into an awkward place and then forcing it from there to where the attacker wants it to go. That could happen by accident, but only by truly astronomical bad luck, so it's a pretty good smoking gun.
MD5 is a bad idea for situations where a machine will be doing the verifying, because machines aren't good at saying "that's odd..."; enhancing them to do so is _way_ more effort than just using SHA-256 instead. But for something like this, where a human will be examining things by hand, it's fine.
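For what it's worth, the commitment scheme under discussion is tiny. A sketch using Node's built-in crypto module (the prediction string here is a made-up placeholder, not the actual post):

    import { createHash } from "crypto";

    // Commit: publish only the digest; keep the text private.
    const prediction = "placeholder prediction text"; // hypothetical stand-in
    const commitment = createHash("md5").update(prediction, "utf8").digest("hex");
    console.log("published digest:", commitment);

    // Reveal: readers re-hash the revealed text and compare digests.
    function verify(revealed: string, publishedDigest: string): boolean {
      return createHash("md5").update(revealed, "utf8").digest("hex") === publishedDigest;
    }
    console.log(verify(prediction, commitment)); // true

The human-in-the-loop part is the final step: a person also reads the revealed text and judges whether it looks like a plausible blog post rather than collision-shaped gibberish.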
For SHA-1 (which we knew at the time would be broken shortly, and in practice it was less than a year before Google and some academics announced their full collision), we reviewed special "Exception" SHA-1 SSL issuances by hand on m.d.s.policy after the official deadline for SHA-1 issuance. I asked for one application to be explained or rejected on the basis that the requested certificate had bizarre short gibberish values for "Organizational Unit". The applicant provided an explanation (which was maybe plausible, but not up to the standard of transparency needed in the circumstances) but agreed to accept certificates missing the OU value altogether instead. That's the sort of thing you'd catch if a human examines what was hashed rather than a machine. I had no reason to believe that applicant was trying anything sinister, but the point of the manual examinations was to ensure everybody could see there were no shenanigans going on (and to make it a complete pain -- if the process was easy it would have become routine and defeated the purpose of prohibiting new SHA-1 issuance; being annoying was a feature).
At the time this internal skepticism about the future of Gecko was very palpable from outside. Which is why it was infuriating to see Mozilla jumping on every bandwagon they could, eventually ending up with the OS silliness: it really felt like they were trying to run from their own browser and from their own tech, like they were ashamed of not being cool.
Thank god they eventually “saw the light” and they’re now back on track.
FxOS started in 2011, not 2008. But yeah, that's the usual scapegoat to explain any of Mozilla's issues from the last 5 (or 10?) years.
You also totally misunderstand the goal pursued with FxOS, which had nothing to do with "run from their own browser and tech". If anything, the current work to remove XUL and xpcom puts Firefox closer to how FxOS was built, not further away. At the time some employees even built a new desktop browser around the same tech (not the failed Tofino experiment) that was outperforming Firefox because it had a lot of the "new hotness" like e10s and web extensions. Guess what, the desktop team ignored it, only to do the same thing later.
Mozilla is not back on track at all: they are still playing catch-up in the desktop market, which is not growing much, and are totally irrelevant on mobile.
But they have enough money to last years, in a weird way of being "too rich to fail".
> You also totally misunderstand the goal pursued with FxOS, which had nothing to do with "run from their own browser and tech".
FxOS embraced web technologies, but didn't embrace the browser. I think Mozilla is still having a hard time embracing the browser, though at least now the will is there.
Who at Mozilla ever talks about hypermedia? About link text? About navigation? About bookmarks? FxOS was a demonstration of how hollowed-out the philosophy of the browser had become at Mozilla. It used web technologies to faithfully clone non-web UI and OS organization. It would be like reimplementing JavaScript in JavaScript: an interesting intellectual pursuit, but completely useless.
> If anything, the current work to remove XUL and xpcom puts Firefox closer to how FxOS was built, not further away.
Technically yes, but the motivation is different: this is work to make Firefox better, not to make a better-thing-that-is-not-Firefox.
> At the time some employees even built a new desktop browser around the same tech (not the failed Tofino experiment) that was outperforming Firefox because it had a lot of the "new hotness" like e10s and web extensions. Guess what, the desktop team ignored it, only to do the same thing later.
The Firefox Desktop team was too small to pursue much of anything. It was like a dozen people maintaining the Firefox frontend. Progress couldn't happen until the organization was aligned to actually support Firefox.
You are wrong on the long term goal and vision for FxOS: in 2.6 we started to add very webby stuff to the overall UX with "pin the web" features.
And I wonder who the platform team was supporting then, because they were not supporting b2g either...
I'm sympathetic to the idea of building a great desktop browser, but at this point in time that should not be MoCo priority #1. Hopefully the platform improvements done recently will be reused in a different context.
> You are wrong on the long term goal and vision for FxOS: in 2.6 we started to add very webby stuff to the overall UX with "pin the web" features.
(I should say that I don't think there was a more webby version of Firefox OS that would have had more success.)
The webby features weren't the point of FxOS. It was always clear what the point was: create a phone OS/platform built entirely on web technologies, with Gecko as the core. UX was an afterthought, and the basic transitions were all based on phone apps and not the web.
> And I wonder who the platform team was supporting then, because they were not supporting b2g either...
It's interesting that you felt they weren't supporting b2g, because from the other side we got the message that other things couldn't get done because of b2g.
There were really three directions vying for attention with Platform: Firefox OS, Firefox, and Platform's own goals (generally related to advancing web technologies). Notably when Platform was moved into the Firefox group there was also a message that we should stop trying to get ahead on new web APIs, and the focus became very clear: support Firefox. At the time of Firefox OS it wasn't at all clear what the focus was.
When things are confusing I think there's a tendency to fall back on your own competencies, and for Platform that meant sticking to what they knew how to do, not the most gnarly (but potentially very impactful) b2g needs.
> I'm sympathetic to the idea of building a great desktop browser, but at this point in time that should not be MoCo priority #1. Hopefully the platform improvements done recently will be reused in a different context.
The desktop browser is why Mozilla has any money to do anything, of course it should be priority #1! People spend millions of hours every day in Firefox, and those people are worthy of attention.
I'm not sure why you think you know the absolute truth about "the point of FxOS", but that doesn't matter anymore.
The way they played the "platform can't cater to all needs" story is a shame. I was naive enough to think I could trust people in MoCo not to just lie to my face, but they acted like any corp. No real justification was given (I asked for numbers, bugs, etc. for months with no answer), 2 VPs did the dirty work for their CEO, and the top-level module owner declined both to support a community-led project around b2g and to write down that they declined (exercise left to the reader to guess who this is. Hint: Brendan Eich was not top-level module owner anymore).
Read [0], watch [1] and tell me if you are proud of your leadership for not even engaging in a discussion on the topic.
Please read my comment, before trying to read my mind. I'm telling you what it looked like from the outside. Nobody ever cared for the internal politics of Mozilla; what we saw was a wobbly org that looked anxious to build anything that wasn't their own browser. I didn't say the OS was responsible, nobody cares if FxOS ended up making choices that the browser should have made or whatnot. The problem was not FxOS; FxOS was a symptom of an institution that had lost its way before the OS was even in the picture.
> they are still playing catch up on the desktop market
It's not the sort of tide you turn in a month. 57 is a big release, give it time. It had a massive surge of good press, which is a good sign. If they can come up with developer tools that can beat Chrome at something, they will see numbers go up.
> totally irrelevant on mobile.
They are making inroads on Android, which is the only market they will ever play in. iOS will only be cracked open by legal coercion. Anything else is wishful thinking.
> But they have enough money to last years
Mozilla is not just a company, it's basically a public institution. Their role is to champion a view of the web as an open utility, not to be the most popular widget maker. They don't need bazillions of money to do that.
> Their role is to champion a view of the web as an open utility, not to be the most popular widget maker. They don't need bazillions of money to do that.
To maintain Firefox as a viable product they do need a lot of money, and significant market share too.
Without Firefox as a viable product Mozilla would be a very different and much less useful organisation.
Inroads on Android? Got data on that? As much as I love Focus, it's irrelevant in terms of marketshare (and it's also just webkit/chrome under the hood). And Firefox for Android marketshare was going down ever since Mozilla shelved developing it. And Firefox for iOS is neutered thanks to Apple's policies.
I agree about their role, but they need a strong, successful product to defend this ideal of the web as an open/public utility. That is what they are struggling with, and giving up on owning a platform was a mistake imho - they have fewer options than their competitors.
As Fabrice said, FirefoxOS was really the opposite --- doubling down on Gecko by making it the core piece of a huge bet. And it meant a big investment in Gecko improvements such as multi-process, which eventually paid off on desktop.
>* There is a huge overhang of security-critical bugs; we have to choose between addressing that and making forward progress. We are putting code-cleanup projects on the back burner for the same reason.*
Have they considered rewriting it in R... oh, hang on.
10 years is a long time. This was back when I still heard people seriously asking if the internet would last. It was even plausible! Google was only a few years out from its IPO. Few people thought the internet could help elect a president (and I don't mean the current guy).
A lot changed in the last ten years. It may not seem like it because most of those changes were refinements on things we already had at the time. We've just been in a long period of revisions. New programming tools, new social networks, new [insert thing we've had since the '90s]. The late '90s and first half of the '00s were full of radical changes, so it seems less significant in context.
We're probably on the brink of another radical upheaval like the '90s. There's lots of money flowing around, lots of amazing and well-refined tools, and all the low hanging fruit is more or less captured. No one's getting rich on "x as a service" anymore. People will have to really change something to capture the next huge payday, and the first few will set off a cascade of changed expectations.
Wow. You just made me realize that most of the big changes I'd mentally spread out over 20 or 25 years happened just in the last decade. In 2007, I was considered a pretty high-tech guy for just having a blog and being able to edit it. It was still possible to add entries to Wikipedia about things that were not super obscure or recent news. phpBB forums still ruled the Internet. And reddit was a toddler!
It has been like that C. S. Lewis quote - day by day, it felt like nothing big had changed, but looking back, everything is different.
"people seriously asking if the internet would last"
I guess I can see this seeming reasonable in the early 1940s, but it was fully crazy even before Tim built this awful Heath Robinson hypertext system we're using, let alone this century. The Network is not like cars, or pants, it's not a passing fad that societies grow bored of, it's at the heart of what we are as people, it will outlive us all.
I remember trying to explain a lot of this stuff in the early 1990s when I was a teenager - to my parents and being bewildered by how little they understood what had already happened and what would inevitably happen next. It was really like I'm stood on a beach, there has been a distant rumble and the tide is going out very fast, and I'm saying "Inland. Mum, Dad, right now. Don't stop to pick up towels and stare at the horizon, we must hurry inland or we will drown" and they don't get it at all.
Ten years ago my home page showed up on the first page (top ten) of search results for a Google search for just my first name, or just my last name. I've never been particularly famous—there were just no individuals with a web presence, outside of tech.
[But mostly I just came here to practice using an em dash correctly…]
Still, as someone who lived through the browser wars of the '90s, "ancient" is not an adjective I'd use for something that happened only a decade ago.
Or put in a different way, if the browser changes in 2007 are "ancient", how would you describe the ones that happened 10 years before that? Prehistoric? Jurassic?
[1] https://en.wikipedia.org/wiki/WebKit#Origins