iOS-style runtime permissions (for things like location access) were introduced in Android 6.0, which was released back in 2015. That's quite the long play.
The biggest reason was that the Apollo program was able to return film from the surface of the Moon to Earth. Many lunar probe photos, by contrast, weren't returned to Earth in film form. Instead they were developed onboard the spacecraft, scanned, transmitted via radio, and then recorded onto tape on the ground. Every step of that process incurs a decent bit of quality loss.
You can actually tell which probes had a reentry module, because the photo quality is pretty stunning compared to the photos transmitted over radio. Look at Zond 8, which returned its film negatives to Earth.
While I greatly dislike the nationalistic undertones in your comment, you are – unlike what some responses claim – not wrong in that the video cameras for the Apollo 11 mission [1] were indeed American, made by Westinghouse [2] with lenses by Fairchild [3]. I'm not sure I would argue that they were “consumer technology” though, at least not at the time.
As for the more famous non-video cameras, they were indeed made by Hasselblad [4] (Sweden) with lenses by Zeiss [5] (West Germany). I am of course acutely aware of the latter as I shoot Zeiss lenses and prefer their aesthetics to pretty much anything else I have tried, although these days a chunk of them are made by Cosina [6] (Japan). If you want to dig deeper, I liked Hasselblad’s official homepage on the matter [7].
The delay after taking long exposures is caused by the camera taking a dark frame with the same exposure length as your actual image. When you take a long exposure, the sensor can exhibit noise caused by "hot pixels". The dark frame is taken with the shutter closed so that the camera can get an image of just the hot pixel noise, which it then subtracts from your actual long exposure.
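In numpy terms, the correction amounts to something like the sketch below (the function and variable names are just mine for illustration, not anything the camera actually runs):

    import numpy as np

    def subtract_dark_frame(light, dark):
        # 'light' is the long exposure; 'dark' is a same-length exposure
        # taken with the shutter closed, so it contains only the
        # hot-pixel / thermal signal.
        diff = light.astype(np.int32) - dark.astype(np.int32)
        # Clip where the dark frame overshoots, then restore the dtype.
        return np.clip(diff, 0, np.iinfo(light.dtype).max).astype(light.dtype)

Since the dark frame needs the same exposure length as the real shot, the total wait time roughly doubles, which is the delay you're seeing.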
You can turn this off in the settings, but I imagine it's way easier for the camera to correct for this specific kind of noise than for Lightroom to do so.
I believe it kicks in once the exposure length is >2" on my Sony A7ii.
On the one hand that's correct. On the other hand you could completely avoid that if you did long exposures the way a Google Pixel (the Pixel 2 and later, I think) does it (in software):
Take lots of short exposures and fuse them. If the device is handheld you get variability in positioning for free; if it's on a tripod, it will automatically wiggle the OIS slightly to achieve the same effect.
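Roughly, the fusion step could look like this (a minimal numpy sketch, assuming the frames have already been registered/aligned, which is the genuinely hard part; the names are mine):

    import numpy as np

    def fuse_frames(aligned_frames):
        # Stack N short exposures along a new axis.
        stack = np.stack(aligned_frames).astype(np.float32)
        # A per-pixel median rejects outliers: thanks to the slight wiggle,
        # a given hot pixel lands on a different scene position in each
        # aligned frame, so it shows up as an outlier and gets discarded.
        return np.median(stack, axis=0)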
It seems like movement would be effective for averaging out random noise, but the described process isn't for eliminating random noise; it's for eliminating persistently hot pixels on the sensor.
Indeed, and in fact the Pixel does exactly the same dark frame trick to identify hot pixels. I don't know that it takes as long, though; perhaps the delay is hidden by the fact that you can keep doing other things while the photo is processing.
You can also try turning off long exposure NR in the menus. I don't know if it makes enough of a difference on the a6000 to be worth leaving enabled and paying the extra cost in time, but that's a way to find out.