
I disagree with the first bit. HTTP is an ambiguous, overcomplicated mess. I regularly have to wedge myself into the stack and deal with things like cookie directives, pipelining, persistent connections, utterly broken caching semantics, etc. It's just horrible from end to end.

HyperText itself is fundamentally a well-engineered concept (I mean, it worked fine for HyperCard etc.), but basing the public-facing WWW on SGML was just a plain horrible idea.

For a number of years I actually preferred Gopher, WAIS and Usenet. I still do now when I think about it, for the sheer simplicity and the fact they're designed to push indexes and information to you rather than to ooze marketoid vomit.
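To make the "sheer simplicity" point concrete, here's a minimal Python sketch of a Gopher exchange as described in RFC 1436: the client sends a selector string followed by CRLF, and the server streams the document back and closes the connection. The `gopher_fetch` helper and the example hostname are just illustrations, not anything from the thread.

```python
import socket

def gopher_request(selector: str) -> bytes:
    """The entire client side of the protocol: selector + CRLF. That's it."""
    return selector.encode("ascii") + b"\r\n"

def gopher_fetch(host: str, selector: str = "", port: int = 70) -> bytes:
    """Illustrative helper: connect, send one line, read until close."""
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(gopher_request(selector))
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
        return b"".join(chunks)

# e.g. gopher_fetch("gopher.floodgap.com") would return the server's root menu
```

Compare that one-line request with an HTTP request's method, headers, and caching directives, and the contrast the comment is drawing becomes obvious.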

Agree with the "wrong kind of professionals" statement though.



I see where both of you are coming from, but I can't help but notice the number of competing network protocols and formats that HTTP/HTML defeated in the market. Did the web succeed despite its technical failings or because of them?


Possibly a mix. The fact that the protocols were free and unencumbered, and that reference implementations of both server and client software were provided, helped immensely as well. Any idiot could come along and either pick up working pieces or modify them. And they did.

Tim O'Reilly has had a few things to say about watching the Web take off. Its primary competition was closed systems: either entirely closed networks, or proprietary protocols, or both. He wasn't willing to bet his company on any such thing. When the Web emerged, it was clear to him that it was the solution he'd been looking for.



