Not perceived: actual, effective latency is higher. Today's hardware is capable of better latencies, but most software people don't care about latency at any layer, from firmware to OS to apps to websites. It matters to some degree in games and vehicles, but it's hard to find an industry that values latency outside of toys or industrial applications. Practitioners say this is because "99% of users don't care," but that's only because 99% of users have either (1) never experienced fast latencies or (2) forgotten them, because latencies have gradually worsened over the last three decades.
All large systems of systems have "duct tape"; I'd go as far as to say it's an emergent property of systems of systems.
It comes about because unless you design every sub-component system in lockstep with every other (which is impossible: the Romans didn't lay out London's streets for cars, and the Victorians didn't route sewers with respect to where we'd want to run fiber), you end up with an impedance mismatch at the boundaries.
It's why backhoe operators working on a gas network rip up fiber depressingly often, why the UK transport system has choke points between road, rail, and air, and so on.
Modern computers aren't a unified system; they are lots of separate systems that talk to each other. Frankly, having some (minimal) understanding of what has to happen for Gnome to appear on my screen when I press my power button, I'm amazed that it ever works, never mind mostly without fuss.
Exactly. Even if you do care, you can only exert so much influence on the platform you're developing for, unless you're working for a company like Apple or Sony. Even then, there are many, many layers to the latency problem.