Hacker News

> what I am sure about, is that websites don't need to be complicated to be effective

While I agree somehow, we live in 2022 with high broadband and powerful browsers/CPU.

Let's not limit ourselves for the sake of it. We can do simple javascript, lightweight images, etc...



> we live in 2022 with high broadband and powerful browsers/CPU.

Some of us do. I think it's important to keep in mind that especially those of us living in tech hubs are in a highly distorted bubble when it comes to tech — for us, things like gigabit internet and 1-3 year old top of the line phones and laptops are the norm.

Beyond that bubble however are a lot of slow internet connections (sub 1mbps DSL is still a reality for many North Americans) as are computers that are either pushing between 5 and 10 years of age or are of similar power to computers that old (think bargain bin x86 laptops and Chromebooks).

Occasionally I'll pull out my circa-2008 Dell laptop (which can still run modern operating systems fine) and use it for a few hours to remind myself of this. It mostly does fine until I have to use some unnecessarily heavy website.


This, tbh.

In my area, AT&T only provides internet that runs at 18 Mbps maximum. It's infuriating when I am browsing a JavaScript-heavy website at peak hours and it takes forever to load. I don't think HTML-only is necessarily a good idea, but less JavaScript for basic things that HTML does well anyway is certainly welcome.


That's the one that kills me: when I am on a barely functional internet connection, and I need to download megabytes of JS just so it can then do a fetch for three paragraphs of text.
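A sketch of the pattern being described, with hypothetical filenames and structure — a script-driven shell that ships no content versus the same text simply inlined in the HTML:

```html
<!-- App-shell style: the page is blank until bundle.js downloads,
     parses, executes, and then fetches the article over the network —
     multiple round trips before any text appears -->
<body>
  <div id="root"></div>
  <script src="bundle.js"></script> <!-- often megabytes -->
</body>

<!-- Plain HTML: the text is readable as soon as the first response arrives -->
<body>
  <article>
    <p>First paragraph…</p>
    <p>Second paragraph…</p>
    <p>Third paragraph…</p>
  </article>
</body>
```

On a slow or flaky connection, every extra round trip before first text is a chance for the page to stall entirely.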


To add to this, something occurred to me recently, during a train ride:

I don't know how many high-speed connections a moving train has, but when I cannot load a simple, mostly-text website, then some people on board are apparently doing other things than I am on that shared connection. One hunch I have is that they are downloading many megabytes of bloated JS libraries, while I am just trying to read some text on a website and have most JS blocked. Some more might even be watching movies or running big downloads or Windows updates or whatever.

Anyway, one result of bloated websites, even if we have high-speed connections in our homes, is that we struggle on shared connections, like on a train. If every Billy needs to load Facebook, Instagram, YouTube and whatnot, it surely is not going to improve the situation for other people on the train. Of course, another reason might be that the train's connection is bad in the first place.


> Some of us do.

I live in a suburb outside of a large metro area, not at all considered rural. The only internet provider available when I moved from my old house to this new suburb was Xfinity (Comcast). The ISP I had previously did not have service in my area, and after an exhaustive search, the best company I could find that WAS NOT Xfinity was a commercial DSL line with a dedicated 5Mbps up and down for the same cost as an Xfinity line with 400Mbps up and 15Mbps down. It wasn't even close. I also had to have this company install the line, which would've cost even more money.

In the end, it was pretty surprising how many areas still only have a single choice for their internet service.


> 400Mbps up and 15Mbps down.

The other way around, I assume?


Yes, you are correct. Thanks for the correction.


> Occasionally I'll pull out my circa-2008 Dell laptop (which can still run modern operating systems fine) and use it for a few hours to remind myself of this. It mostly does fine until I have to use some unnecessarily heavy website.

But this doesn't have to limit you to HTML-only websites. 2008 (or even 2006) JS was perfectly fine for most tasks.


Yeah, I'm never going to make the argument that HTML alone is adequate in most cases. Light JS, of the sort you said was featured on most sites of that era, is perfectly fine, since the utility added is significant and the drawbacks very minimal. Same goes for images… highly optimized small PNG glyphs and small JPEGs are fine; you only start getting into trouble when loading multiple megabytes of images for purely ornamental purposes. My single-core G5 and P4 machines handled such sites with ease, even with the (relative to now) badly optimized web engines of the 00s.

Problem is, light JS and small/optimized images are becoming more the exception than the rule. When devs have ample bandwidth and powerful machines they're much less likely to carefully weigh every dependency and unnecessarily large image.


I have a 2017 MacBook. Simple sites that should just f**ng work (e.g. JIRA) are completely sluggish.

Yes, let's limit ourselves a lot, I shouldn't require a modern M1 to use the web. In fact, Javascript shouldn't exist, because the industry apparently can't use a super optimised runtime correctly.


I have an M1 Pro and Jira is barely usable every other day. I don't think it's a matter of processing power.


Jira was just an example, 2 out of 3 sites nowadays are pure CSS/JS-induced waste of energy.


> we live in 2022 with high broadband and powerful browsers/CPU

That's a bit smug. I live in one of the richest nations in the world, and still much of the country has only slow connections available. Even those have often become intermittent since waves of climate-change enhanced disasters have started regularly sweeping away much of our infrastructure. Those disasters have also further impoverished much of the population, making it hard enough for many to keep a roof over their heads (thousands living in tents and caravans), let alone 'powerful CPU's.


> we live in 2022 with high broadband and powerful browsers/CPU

It would be nice if this were true but we're far from it. High broadband is not a given, browsers are slow (yes), CPUs stopped scaling vertically and don't compensate for bad programming anymore.

It seems to me that the choice of not using HTML-only has more to do with the inability to do so, rather than the desire to not limit oneself.


> It seems to me that the choice of not using HTML-only has more to do with the inability to do so, rather than the desire to not limit oneself.

That's why I wrote

> Let's not limit ourselves for the sake of it. We can do simple javascript, lightweight images, etc...


A fair portion of the time people are on phones where they have no idea how fast/reliable their connection is going to be from moment to moment.

Whether that (among other cases) creates an obligation in everyone to account for semi-failure and full failure (let alone retreat to pure markup) is another thing, of course, but the industry would be better and its practitioners more deserving of the term "engineer" if we did.


I'm not necessarily sure of this analogy, but I want web browsing to be like a book. I "open" a web page and it's there in its entirety, waiting and ready for wherever I want my eyeballs to go.

I'm not averse to JavaScript or to images (my site is almost all images), but the slowness of so many sites these days says the high broadband and CPUs can't keep up.

And maybe someone can explain this to me, but why does going back in the browser seem slower or more intensive than loading a new page? Is that because the browser itself is trying to restore the previous state?


What if my high-speed internet flat rate is used up? After that, I only have a slower speed (64 kbit/s).

This should be enough to render text-based sites, e.g. HN or news sites. But the reality is that basically no site besides HN is usable at that internet speed. So much content out there could be accessible at that speed. Sure, no videos or images, but everything text-based should still work.

There are probably a lot of tools in the webdev toolbelt that would allow even image-heavy news sites, for example, to be usable while images and scripts load.
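For instance, a minimal sketch using only standard HTML attributes (filenames and sizes hypothetical): `loading="lazy"` defers offscreen image fetches, explicit `width`/`height` reserves layout space so text doesn't jump around while images arrive, `srcset`/`sizes` lets slow connections pull a smaller file, and `defer` keeps scripts from blocking the first render.

```html
<article>
  <h1>Headline</h1>
  <p>Body text is readable as soon as the HTML arrives.</p>
  <!-- width/height reserve space; loading="lazy" defers offscreen fetches;
       srcset lets the browser pick a smaller file on narrow screens -->
  <img src="photo-800.jpg"
       srcset="photo-400.jpg 400w, photo-800.jpg 800w"
       sizes="(max-width: 600px) 400px, 800px"
       width="800" height="450"
       loading="lazy" alt="News photo">
</article>
<!-- defer: download in parallel, execute only after the document is parsed -->
<script src="app.js" defer></script>
```

None of this requires a framework; on a 64 kbit/s link the text above would be readable long before the images finish.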


Let’s not though, unless there’s some benefit to the user.


If I run a website, to some degree, I’m running it for my benefit. If I want to put fancy schmancy JavaScript on there to show how clever I am, or because I think it will make me a billionaire, I’m gonna do it. Tech-Puritanism is not going to stop me.

Now if you tell me I shouldn’t because it’s less likely to make me a billionaire, I might want to listen.


You shouldn’t, because it’s less likely to make you a billionaire.


> While I agree somehow, we live in 2022 with high broadband and powerful browsers/CPU.

There's plenty of situations where one or both of those statements are temporarily untrue and plenty more where they're permanently untrue.

Most users don't have flagship phones or brand new MacBooks. They don't have ultra fast WiFi or 5G. Even if they have more powerful devices they might be on shitty school/Starbucks/public/office WiFi.

You don't need to build everything like it's 1996 but it's absurd to simply assume every user is on a MacBook with gigabit Ethernet. The web is full of terribly built web pages pretending they're "apps" and using megabytes of JavaScript to show some text and images.


> While I agree somehow, we live in 2022 with high broadband and powerful browsers/CPU.

There are millions of Americans who do not have this luxury. And let's not pretend it's not a luxury.

Just a couple of weeks ago there was an item in the news that a million people in New York City do not even have cell service in their homes.

I think it was in the Times article about the 5G towers popping up everywhere.


I burned 3 billion tons of coal and all I got was this lousy javascript.



