
I suppose someone who is running a ten-year-old system doesn't tend to care too much about the input latency of their terminal emulator?


A ten-year-old system is a Core 4xxx- or 5xxx-series, which is plenty fast. The main struggle would be the integrated GPU at higher resolutions, but presumably a ten-year-old laptop has a pretty low resolution.


It is about testing in general.

Also, you never know: one patch that improves perf on latest-gen systems might do the opposite on much humbler ones, which still represent a sizable portion of the computers running today. It is important to know what you are gaining (or not) and where.

And I don't agree with your initial premise. The fact that one is running an old system that might be super slow on most JS-hungry websites, playing AAA games, or compressing 4K videos doesn't mean slowness has to be expected in native and/or small or simple apps.

Also, while we can expect an old computer to be slow at things it wasn't designed for, like decompressing/compressing videos at much higher bitrates/resolutions or handling traffic with more CPU-intensive encryption algorithms, we should not accept a computer being slower at things it was designed for and was doing well a decade ago. My motorbike is probably marginally slower from wear and tear on its engine, yet it still goes like stink off the green light and has no issue reaching the speed limits. My well-maintained 35-year-old bicycle is still as fast as it ever was, probably even faster thanks to better tires. My oven and gas stove still heat my meals the same way they did before. Why should an old computer be slower at doing the same things? This is a disgrace, and the whole IT industry should be ashamed of accepting and enabling it.


Don't get me wrong, I get where you're coming from. I just hate the overall sentiment.

Here we have a person who

... observed an interesting change in an open-source project

... built a tool to carry out his own testing

... shared the firmware for said tool

... ran some benchmarks and visualized the data

... took the time to write an informative article outlining the thought process and motivation, and sharing the results

And your comment essentially came down to

> "Testing methodology bad, not representative of your personal usecase, should have been done different, data is trash"

I think it's incredibly rude and a steep ask to expect benchmarking to be done on a meaningful set of legacy hardware. Sure, legacy systems are a non-insignificant portion of computers operating today. But the author used a system he had available and showed the results he gathered. I'm sure his time is better spent on other projects or another blog article than on benchmarking the same stuff on the n-th processor generation.

The Linux community can be accused of many things, but not caring about performance certainly isn't one of them. The beauty is, if you deeply care about a specific configuration that you feel is being neglected, you can step up and take over some of the optimization effort. If you don't care enough about that scenario, why should anybody else?

Limiting this to that specific instance: the hardware is affordable to obtain, and the firmware is linked. If you want to benchmark a different system of yours, go for it and publish an article. I'm gonna read it.


If you want it to "do the same things", you can still do that: go run the same software from that time, then.




