This all will be better discussed after the recovery or memorials or whatever (Godspeed to them, I honestly hope for a miracle)
This metaphor is extremely double-edged, especially in the Silicon Valley ethos. When it fails, everyone piles on about the game controller and the lack of 'flight worthiness.' When it works, they are lauded as heroes for the extreme savings of using proven COTS products. The drone pilots are a great example: some officer got the nutty idea to try a game controller because his command had "more experience" with them, and their success rates improved; now we are talking about it.
Space and the deep sea are the extreme limits, I think ‘regulation’ is really difficult because it is risky no matter who does it. In a way, this is the ultimate regulation that is going on here. And we still don’t know what happened or what could have failed yet.
When (most) people talk about the shitty controller, the options in their heads aren't "3rd party Logitech" or "$10,000 bespoke controller". They're "$200 elite OEM controller", "$60 standard OEM controller" or "$30 notoriously bad piece of shit Logitech".
>Space and the deep sea are the extreme limits, I think ‘regulation’ is really difficult because it is risky no matter who does it.
But this is why regulations exist in the first place. That's like saying we shouldn't regulate surgery, because it's always risky.
We should require certain credentials and safety measures for people who want to take civilians to the ocean floor in a tiny sub. I don't think that should be a controversial take.
The onus to make sure that a service they are going to use isn't any more dangerous than it has to be, shouldn't be on the consumer. I don't want to live in a society that thinks it's acceptable to hand-wave death and suffering away as if human lives are acceptable collateral in the "free market" correcting itself.
Customers would still understand that the thing they're going to do is super dangerous, but they should be able to rest easy knowing that the company they're using for this meets some minimal level of safety.
Regulation in areas of extreme innovation is by definition difficult. When you have people onboard any space or deep sea craft, you do have some level of moral obligation.
Whatever you feel about Elon Musk, I appreciate his distinction in approach between crewed and uncrewed craft at SpaceX. I'm paraphrasing, but he has said that when something is uncrewed, you can push it to the limit over and over again to push the technology forward. But the second you have people on board (i.e. Crew Dragon), the margin for error goes down to zero. There are unavoidable risks, but you want to be damn sure you've minimized the avoidable ones. So you can be both -- a fast-moving startup and a "safe" organization.
Obviously we still know very little about the Titan, but what upsets me is that the 'uncrewed' extreme testing does not (afaik) seem to have been particularly rigorous.