If you are just starting out as a web developer and aren't planning on changing careers anytime soon, the #1 language you should learn is JavaScript. It, or languages that compile to it, will still be in fashion in 5 or even 10 years. That's because the web is shifting to thick clients, and JavaScript is the only choice in browsers. JavaScript is also finding its way into server-side code (Node.js).
"I think PHP is the perfect language for beginning web development for two reasons: Direct HTML embedding."
Server-side view rendering is on its way out. So this is not a real advantage.
"I find it more than a little strange that PHP is simultaneously (a) out of fashion, and yet (b) absolutely everywhere."
This is simply how technologies rise and fall. By the time a technology is ubiquitous, the original problems it solved have changed and new solutions are being created. Just because you have a bunch of legacy code running lots of web apps doesn't mean it's the future--it means it's the past.
Also, why pick a language that pays poorly? Ruby is by no means inferior to PHP, and you will get paid more per hour to program in Ruby.
The seductive advantage of server side rendering is centralized control of what all clients see. Whenever you push too much business logic down to the clients you have to contend with how to ensure all of those clients are using the same logic at all times.
This is just a result of loose architecture. If your models are cleanly abstracted using REST principles on the server-side, then server templating becomes much more appealing.
I say this because I am doing it -- building apps with REST interfaces on the server side. It's not a common pattern yet but I think this is where things will actually go (as opposed to the client), because it's actually easier to use, understand and iterate on.
If you can write JavaScript templates that interface with your back end using REST calls like "GET /accounts", you can do the exact same thing in a server-side template with better performance and dramatically simpler security.
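For illustration, here's a minimal sketch of that idea, assuming a Node/Express app with EJS templates (all names here are hypothetical): the same data source backs both the REST endpoint and a server-rendered page.

```javascript
// Hypothetical Express/EJS setup: the data behind "GET /accounts"
// feeds a server-side template directly, no client templating needed.
const express = require('express');
const app = express();

app.set('view engine', 'ejs');

// The function that backs the REST endpoint...
function getAccounts() {
  return [{ id: 1, name: 'Checking' }, { id: 2, name: 'Savings' }];
}

// ...served as JSON for any client-side consumers...
app.get('/api/accounts', (req, res) => {
  res.json(getAccounts());
});

// ...and rendered straight to HTML on the server.
app.get('/accounts', (req, res) => {
  res.render('accounts', { accounts: getAccounts() });
});

app.listen(3000);
```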
Exactly. Correspondingly, my hope (along with one of the folks below) is that advances in hardware will obliterate any speed difference and enable the back end to be king.
I'm not convinced that it will fade, especially given the rise of mobile devices (as the gentleman has already pointed out) and what I expect will be a marked improvement in browsers in the coming years (Google Chrome/V8 is a good example). The rise of JS libraries that enable you to use models and persistence in the front end makes a lot of people really question why going to the server side and asking for a view is always necessary. Still necessary sometimes, absolutely. But better to go to the server side only when necessary.
There is a place for both. Certain things make sense for server-side rendering. If I have a blog, why render it on the client side? It's evergreen content; it makes sense for the server to deliver the static content. For web apps, there is a lot of value in client-side rendering.
Well said. RESTfulness will ALWAYS have its place, and blogs are the example par excellence. Even in Node, which prides itself on its async/non-blocking capabilities, there are all kinds of ways to incorporate REST.
Twitter is a web site, not a web app. First-page load performance is critical for them. I wouldn't have chosen a client-side rendering architecture.
My perspective is that of web apps. By using client-side rendering I can prefetch more data than the user needs at almost no extra cost, and render it when they click without hitting the server. Not having to contact the server at all beats any server-side rendering architecture. Even web apps that supposedly do server-side rendering will end up with some prefetching for performance (e.g. tooltips and detail panes are already embedded in the page but hidden).
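As a sketch of the kind of prefetching I mean (endpoints and element IDs are made up): detail data is fetched while the user is still looking at the list, so clicks render instantly with no server round-trip.

```javascript
// Prefetch sketch: grab detail data up front, render on click
// from the local cache instead of hitting the server again.
const cache = {};

fetch('/api/items?expand=details') // hypothetical endpoint
  .then((res) => res.json())
  .then((items) => {
    items.forEach((item) => { cache[item.id] = item; });
  });

// Hypothetical client-side render helper.
function renderDetailPane(item) {
  document.getElementById('detail').innerHTML =
    '<h2>' + item.title + '</h2><p>' + item.body + '</p>';
}

document.getElementById('list').addEventListener('click', (e) => {
  const id = e.target.getAttribute('data-item-id');
  if (id && cache[id]) {
    renderDetailPane(cache[id]); // no network request at all
  }
});
```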
The reality is that the big bottleneck in any web app is not the fetching of data but the fetching of the front-end. The more front-end code (images, js, css) that you can fetch from browser cache or CDN the faster your app. The "ideal" case is not having to fetch anything except the data from the server. You can do that today with client-side rendering combined with far-future expiration or HTML5 appcache. The caveat here is that the performance benefits only start compounding for client-side rendering if you can prefetch data. Twitter couldn't do that, so there was no benefit.
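To make the far-future expiration piece concrete, here's a sketch assuming Express and versioned asset filenames (e.g. app-3f2a1c.js), which is what makes caching "forever" safe:

```javascript
// Far-future expiry for static front-end assets: browsers and CDNs
// serve them from cache, and only the data is fetched fresh.
const express = require('express');
const app = express();

app.use('/assets', express.static('public', {
  maxAge: '365d',   // cache for a year
  immutable: true,  // the content at a given URL never changes
}));

app.get('/api/data', (req, res) => {
  res.json({ loadedAt: Date.now() });
});

app.listen(3000);
```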
Like any complex technical issue, there is no right answer in general, only a right answer in context. People should just be careful about what they're measuring. The important benchmark is time to load the page, not time to process the page on the server (a huge difference), and measurements should be globally distributed (you should be measuring transcontinental page visits).
History has shown that these numbers shrink with time; there will come a point when the difference is between two very small numbers and won't be a factor. We're almost there now; a couple of years, maybe. 4G LTE, for example.
The issue in this case isn't (primarily) bandwidth, it's latency. Cell connections have terrible latency, universally, and there are a lot of problems to solve before that stops being true.
You could be right about these numbers becoming meaningless. But in the meantime, there is a noticeable difference, and that can mean life or death in the jam-packed web dev market we see today. If cramming as much application logic as possible into the front end continues to shave off milliseconds, or even seconds in some cases, then I think that trend will continue.
That said, I hope you're right. A world in which front vs. back end didn't matter with respect to time would be one in which developers were free to reveal far less on the client side, and that would be a victory indeed!
The Internet is becoming more ubiquitous. Being "on the Internet" will mean less and less being in front of your computer. The variety of devices you can access a web app from is growing (smartphones, tablets, more coming in the future), so it is becoming more useful for web apps to be like a "smart database in the cloud" and for the view logic and code to be on the client.
I'm really not sure why this is being downvoted. It's not trollish. I believe what I'm saying to be true: mobile is increasing, and this is leading to thicker clients and more JavaScript and therefore less server-side view logic. Please correct me if I'm wrong, but downvoting me is confusing.
Thanks for the courtesy of a constructive discussion. :)
Yes, that's a valid point, if you mean that smartphones have fewer resources (memory, CPU) to handle JavaScript execution. But keep in mind that a native smartphone app is a thick client; it has all of the view logic.
Perhaps I need to back up and explain my point better. Maybe people are misunderstanding me because they think I'm talking about what's going to happen in the next few months. Server-side view rendering is not going anywhere anytime soon. But it's on its way out in that it's an anachronistic way of thinking about the web; it's a relic of the past--of the time of web "pages"--and not of the future, of web "apps" and "resources". As long as you are serving content that's primarily to be "read", like articles, books, blogs, etc., server-side rendering has its place.
But look what's happening to the web. Native iPhone apps have to store their data somewhere, so they communicate with the backend via an API. Cars are reading Twitter. In the web's past, only browsers consumed web content. Now non-browsers consume web data via APIs, and I see this trend increasing.
I see a trend where more web apps will be like a "smart database in the cloud", meaning they hold data and business logic and communicate via JSON APIs. There may be a variety of client devices accessing this data in the cloud: native iPhone apps, desktop browsers, other Internet-enabled devices.
The main point is that clients are no longer only browsers. The clients could be a variety of devices running on a variety of operating systems. So it doesn't make sense for the server to render the view and send it down the pipe for the client to display. It makes more sense for the server to send data to the client and the client to decide how to render it.
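A sketch of that shape, with hypothetical routes: the server owns the data and the business rules and speaks only JSON, and each client decides how to present it.

```javascript
// "Smart database in the cloud": one JSON API for every kind of client.
const express = require('express');
const app = express();

app.get('/api/orders/:id', (req, res) => {
  // Business logic lives here, not in any particular client.
  const order = { id: req.params.id, status: 'shipped', total: 42.5 };
  res.json(order); // no view is rendered on the server
});

app.listen(3000);

// A desktop browser renders this as HTML, a native iPhone app as a
// native view, a car dashboard however it likes.
```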
I'm not claiming that server-side view rendering will go away completely; it has its place with articles and blogs. But with apps, I see a movement toward thick clients and I don't see this trend slowing down. It's hard for me to envision a world in 10 years with fewer smartphones, fewer devices hooked to the Internet, and people accessing web apps and web app data only with browsers.
Let me also put this in context of the article's original point, that PHP has a nice feature of being able to be embedded directly in HTML. Yes, this is nice. But my point is that if you are starting your web career and therefore are looking toward the future, the types of web sites that will use inline PHP will be more like content sites and less like web apps. And I expect a bigger demand, higher pay, and arguably more interesting work for web app creators than web content site creators.
You seem to confuse server-side rendering with page-based apps; these are unrelated. I can build fully dynamic, Ajax-driven, pageless applications that are effectively just as snappy as anything you can build with client-side rendering, while keeping all rendering on the server. Rather than sending down JSON, I'd just send down rendered fragments of HTML and update the contents of some tag.
It's a vastly simpler approach that makes the client-side code nothing more than "update this or that", and it works just as well while still allowing traditional server-side templates. It's not a relic of the past and it's not on its way out, nor is it inferior to client-side rendering; in fact, I think it's better.
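A minimal sketch of that fragment-swapping pattern, with made-up route names, assuming Express and EJS on the server:

```javascript
// Server: render an HTML fragment with an ordinary server-side template.
const express = require('express');
const app = express();
app.set('view engine', 'ejs');

app.get('/fragments/account-list', (req, res) => {
  res.render('account-list', { accounts: [{ name: 'Checking' }] });
});

app.listen(3000);

// Client: the only JavaScript needed is "update this tag".
// fetch('/fragments/account-list')
//   .then((res) => res.text())
//   .then((html) => {
//     document.getElementById('accounts').innerHTML = html;
//   });
```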
> Server-side view rendering is on its way out. So this is not a real advantage.
server side is "on its way out" the way relational databases were "on their way out" when everyone was hysterical about NoSQL 18 months ago. That is, while client side rendering will become much more prominent, server side will remain quite popular as well. Hybrid approaches will likely become the norm.
We've written an app here with 100% client-side templates, and while there are some nice patterns that fall out of it, unless you're writing your whole server in JavaScript as well, you have to give up a lot of conveniences offered by server-side templating.
Agreed, it will be hybrid for a long time, but I expect the trend toward view logic being more on the client and less on the server to continue, simply because the web is less and less about "pages" and more and more about "apps" and "resources".
I don't understand this argument. The way I use server-side templates is much simpler and more powerful than any demo of client-side templating I've seen. I think you're reading a little too much into the hype; otherwise maybe you can share an example of superior JavaScript templating patterns, because I can't find them.
It depends what you are trying to do. If you are only targeting browsers, and especially if you don't need a lot of responsiveness, it's much simpler to just render views on the server side and optimize your server for speed. It takes a lot of effort to write a thick client.
But if you are targeting a browser and a native iPhone app, the view logic lives on the iPhone app. It used to be that you could just serve up a web page and assume that your client was a browser. Not anymore, and that's my point. And my point is that the trend will continue: I think we will have more variety of non-browser clients in the future, not less. Your car, refrigerator, geranium, etc. will all be hooked up to the web.
Keep in mind that I'm not talking about whether you should create your web app in Backbone.js today, but I'm looking years into the future, anticipating trends.
I think I see what you're talking about: more non-browser clients exist so the web servers need to be able to expose an API. I agree with that.
Besides that, the ability to "template" output on the server is still my ideal way to transform the business API into something the client can read. In the case of a browser, I'm templating to HTML. For a mobile app, we may just pass the data through as JSON and let the native mobile app handle the views, or even output HTML5 for mobile in addition.
In all of these cases, server side templating of output plays a role I think. For browser targeted output, server side templates are still simpler in the implementations I've seen.
I don't think the necessity of writing your app so that it exposes APIs has any correlation to whether or not the traditional web server interface to your app can still serve HTML from the server. It's just as easy to wrap API methods on the server and feed that data into a server side template as it is to call the API from a client side method (well, it's actually much easier, no need to manually futz with the browser URL/history and all that).
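Here's a sketch of that wrapping (all names hypothetical): the same API method feeds a server-side template for browsers and a JSON response for everything else.

```javascript
const express = require('express');
const app = express();
app.set('view engine', 'ejs');

// The business API, callable from anywhere on the server.
function fetchDashboard(userId) {
  return { userId: userId, widgets: ['inbox', 'calendar'] };
}

// Browsers get server-rendered HTML: no URL/history futzing required.
app.get('/dashboard', (req, res) => {
  res.render('dashboard', fetchDashboard(req.query.user));
});

// Non-browser clients get the same data as JSON.
app.get('/api/dashboard', (req, res) => {
  res.json(fetchDashboard(req.query.user));
});

app.listen(3000);
```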
It's hard to predict 10 years out, but not too hard to see 5 years ahead. Generally speaking, what's popular now existed in some form 5 years ago.
JavaScript has a monopoly as the only language that runs in browsers. If this changes, it will take years, not months, for browsers to start supporting another language (or to move to a bytecode model). Then it will take more time for developers to start using these new languages and for an ecosystem of tools and libraries to form around them. Then it will take even more time for the "old" language, JavaScript, to seem obsolete.
JavaScript is currently gaining traction and is the lingua franca of the client-side Internet, which is growing. This is why I predict it will still be strong in 5 years, because it's got a lot of momentum and I only see the trends as increasing (mobile devices are increasing, browsers are becoming even more like operating systems with memory management etc., JavaScript as a language is continually being improved).
We really need a quantum leap in language technology. Internet technologies have grown organically, and because of this most of what we use to program the web has been repurposed from something else.
HTML was meant to describe text; now it describes regions of the screen to be repainted as well as the "content" to be repainted.
CSS was meant to add simple effects like bold, italic and simple newspaper-like positioning (float this image left). It's been repurposed to describe visual rendering and even animations.
JavaScript was meant to add some scriptability to the DOM, and now it's morphed into a full-fledged programming language.
So the state of the web is a depressing mishmash of old, repurposed technologies.
It would be nice if there were a programming language that "knew" about the web. It would realize that the programming effort would be shared between two computers, client and server, and would have appropriate security and other conveniences baked in. For example, you should only have to describe your business logic once (as opposed to once on the server and once on the client). Also, this programming language should no longer be tied to the anachronistic concept of rendering a "document" with "styles".
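Node already hints at the "describe it once" part. As a sketch (file name hypothetical), the same validation rule can run on both ends instead of being written twice:

```javascript
// validation.js -- the business rule is defined exactly once.
function isValidUsername(name) {
  return typeof name === 'string' && /^[a-z0-9_]{3,16}$/.test(name);
}

// In Node: require('./validation') and check before saving.
// In the browser: include the same file and check before submitting.
if (typeof module !== 'undefined' && module.exports) {
  module.exports = { isValidUsername: isValidUsername };
}
```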
But alas, it's very hard to simply go into an academic tower and design a language and release it into the wild and expect it to catch on. These things tend to grow organically and incrementally. So whatever next step there is has to embrace where we currently are. This is why you see not merely new client-side languages, but new languages that compile to JavaScript. If they didn't compile to JavaScript, how could they ever be used?
Absolutely agreed. I really love JavaScript for a lot of reasons, but it still astonishes me that our lingua franca was designed in 10 days. In 1995. For NETSCAPE.
I'm not so sure I would describe the mishmash as unambiguously depressing. It's also pretty inspiring seeing people using the mishmash in a sort of bricolage kind of way. One thinks of libraries like Underscore and Backbone that seek to account for some of the "deficiencies" in JS.
And don't even get me started on CoffeeScript.
But I think we would both agree that it's strange that no one has come along attempting to really REALLY break the mold by designing a new browser from the ground up that wasn't tied to JavaScript and the DOM (and maybe not even HTML!).
This is true now, but Dart is also less than a year old. I don't have strong feelings toward Dart either way, but these things have a gestation period. Internet Explorer didn't support JavaScript at first either, and Safari and Chrome didn't exist in 1995. So it really remains to be seen.
I think you're right about a lot of things. Things like Backbone (which I wrote about at http://blog.appfog.com/putting-some-mvc-meat-into-your-app-b...) will make recourse to the back end more and more selective as time goes by. JavaScript is indeed quite a force, and as far as I can tell that will stop only when browsers start reading other languages directly (which is not completely unthinkable, but also not really on the horizon).
My point was not directed so much against JavaScript as against the notion that PHP is simply an anachronism.
There's also nothing stopping you from using lots and lots of front-end JavaScript in a PHP-driven context...
"Ruby is by no means inferior to PHP, and you will get paid more per hour to program in Ruby."
Any decision to learn something based on payback has to factor in what happens if more people choose to learn the same thing. If more people learn Ruby because it pays better (let's assume that is correct), then supply and demand could equalize and the pay will go down. Right now, if pay is high for Ruby, it most likely means there is an imbalance between supply and demand. When that changes, so will the pay. The only place I haven't seen this happen is the rates lawyers charge, and the reason for that has more to do with personal relationships and quality of work.
Right right right. I would add this: Ruby could well be considered "passé" someday (history tells us it necessarily will be), and the Rubyists of the future would really bristle at the suggestion that what they're doing is worth less simply because the market says it is.
The Rails MVC architecture is already past its prime, since it solves the problems of 2004. It's possible that startups may move more toward backend-as-a-service + JavaScript front-end instead of using Rails as their MVP.
In any event, the transition should be slow and I fully expect Ruby to be around for a long time, but people may notice a slow decline in demand for Rails programmers in the next few years compared to other hotter technologies. Sinatra (or some other Ruby framework) may replace Rails as a backend.
I'm a Rails programmer and I don't bristle at the suggestion that market demand changes. I welcome the change and adapt. I don't mind learning new languages.
I think that Sinatra will become increasingly popular within the Ruby community (and financial help from Heroku/Salesforce can't hurt!), but I also think it's interesting that the micro-frameworks inspired by Sinatra (Flask for Python, Express for Node, plus many others) will become the fallback for more advanced developers, because they prescribe a lot less and leave a lot more to the developer.
So I say don't give up on Ruby. And if you insist on Rails-style MVC, there's always Padrino for Sinatra, Tower for Node, etc.
"I think PHP is the perfect language for beginning web development for two reasons: Direct HTML embedding."
Server-side view rendering is on its way out. So this is not a real advantage.
"I find it more than a little strange that PHP is simultaneously (a) out of fashion, and yet (b) absolutely everywhere."
This is simply how technologies rise and fall. By the time a technology is ubiquitous, the original problems it solved have changed and new solutions are being created. Just because you have a bunch of legacy code running lots of web apps doesn't mean it's the future--it means it's the past.
Also, why pick a language that pays poorly? Ruby is by no means inferior to PHP, and you will get paid more per hour to program in Ruby.