Everything is theater if one is cynical enough. One can very obviously find value in blocking the camera, even if other sensors remain active.
That said, I do see merit to flagging these. Related surprises are usually e.g. speakers being possible to use as microphones, and accelerometer data being possible to use for location tracking in lieu of GPS / any kind of radio, just not remotely & live ofc.
What value is there in blocking a camera if the microphone isn't blocked? No one can glean anything from the camera while my phone is in my pocket or lying face down (or up) on a table, but they can glean a lot from the microphone.
And why, when I am nude, would my phone be in a position where anyone could see anything but my face? While I'm in decent shape, if I lost my job I wouldn't be opening an OnlyFans account.
IU was correctly used everywhere else in the article except that one place with the mistake, so the LLM didn't hallucinate a correction; it correctly summarized what the bulk of the article actually said.
That's a bit of a non-sequitur, isn't it? The point under debate is how oral intake can pan out as a delivery method specifically (and what its limits are), not the dosage limits of Vitamin D in general. Think consuming a drug vs. injecting it.
I don't get why + addresses always come up in this discussion. They're machine-undoable by design: stripping everything from the + up to the @ recovers the real address.
Using randomized relay addresses instead gives you immensely higher confidence that when a given contact address starts receiving spam, the misuse stems from a specific entity. Especially if you rotate addresses at a fixed interval, because then you can even establish a starting timeframe.
Still not perfect, but it can never really be, and not through any fault of email's. As long as DNS and IP addressing rule the world, there's only so much one can do. Once identity is private by default, it becomes a secret-handling problem at its core, a capability these schemes were never designed to provide.
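To illustrate the asymmetry, here's a minimal sketch. The stripping rule is the common Gmail-style plus-address convention; the relay-address generator and the `relay.example` domain are hypothetical stand-ins for a relay service:

```python
import secrets


def strip_plus_tag(address: str) -> str:
    """Undo a plus-address: anyone can do this mechanically."""
    local, _, domain = address.partition("@")
    base = local.split("+", 1)[0]  # drop everything after the first '+'
    return f"{base}@{domain}"


def new_relay_address(domain: str) -> str:
    """A randomized relay address carries no recoverable link to the
    real mailbox; only the relay operator can correlate them."""
    return f"{secrets.token_hex(8)}@{domain}"


# A plus-address leaks the real inbox to any spammer:
print(strip_plus_tag("alice+shop@example.com"))  # alice@example.com

# One relay address per contact lets you attribute a leak:
addr_for_shop = new_relay_address("relay.example")
```

The point being: the left column of the trade-off is a pure naming convention, while the right column requires an actual secret held by a third party.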
One big reason I can think of that would make one want a permanent data purge feature is that the data is not on their premises but on the service provider's. I think the GDPR might even require such a feature under a similar rationale.
So maybe a better formulation would be to force the user to transfer out a copy of their data before allowing deletion? That way, the service provider could well and truly wash their hands of this issue.
Forcing an export is an interesting idea. But, like, from the article it sounds like almost anything would be a better flow. It didn't even warn that any data would be deleted at all.
One further refinement I can think of is bundling a deletion code into the export archive, e.g. a UUID. The service could then require the user to enter that code into a confirmation box, thereby "guaranteeing" the user did indeed download the whole thing and that the service provider is free to nuke it.
It wouldn't really be a guarantee in technical actuality, but one would have to go well out of their way to violate it. I guess this does make me a baddie insofar as this is probably how "theaters" are born: rituals that do not / cannot actually carry the certainty they project, only an empirical approximation of it at best.
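The scheme above could be sketched roughly like this. Everything here is hypothetical, the file names and flow included; it's just the "code rides along inside the archive" idea in its simplest form:

```python
import io
import json
import uuid
import zipfile


def build_export(user_data: dict) -> tuple[bytes, str]:
    """Bundle the user's data and a one-time deletion code into an
    export archive. Returns (archive_bytes, deletion_code)."""
    deletion_code = str(uuid.uuid4())
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("data.json", json.dumps(user_data))
        zf.writestr("DELETION_CODE.txt", deletion_code)
    return buf.getvalue(), deletion_code


def confirm_deletion(submitted: str, expected: str) -> bool:
    """Only someone who downloaded and opened the archive can know the
    code -- an empirical guarantee, not a cryptographic one."""
    return submitted == expected


archive, code = build_export({"posts": ["hello"]})
print(confirm_deletion(code, code))  # True
```

Note that nothing stops a user from downloading the archive, reading only the code file, and deleting anyway; the ritual only raises the effort bar, which is exactly the "theater" caveat above.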
I don't think such an idea is consistent with the existence of trashbin features, or the non-insignificant use of data recovery tools on normally operating devices.
I can definitely see the perspective in clarifying that ChatGPT didn't lose anything, the person did, but that's about it.
It's always interesting to see how hostile and disparaging people can start to act when given the license. Hate AI yourself, or internalize its social standing as hated even just a little, and this article becomes a grand auto-exposé, a source of immense shame and schadenfreude.
The shame is not that he was such an imbecile as to not have appropriate backups; it is that he is basically defrauding his students, his colleagues, and the academic community by nonchalantly admitting that a big portion of his work was AI-based. Did his students consent to having their homework and exams fed to AI? Are his colleagues happy to know that most of the data in their co-authored studies was probably spat out by AI? Do you people understand the situation?
It's not that I don't see, or even agree with, the concerns around the misuse and defrauding angle of this; it's that it's blatantly clear to me that's not why the many snarky comments are so snarky. It's also not as if I were magically immune to such behavioral reflexes myself; it's just regrettable.
Though I will say, it's also pretty clear to me that many taking issue with the misuse angle do not seem to think that any amount or manner of AI use can be responsible or acceptable, rendering all use of it misuse - and that is not something I agree with.
It seems you are desperately trying to build a strawman without any sensible argument. I don't personally think it is "snarky" to call things as they are, plain and simple: you, a supposed expert and professional academic, post a blog on Nature crying that "AI stole my homework". It's only natural you get the ridicule you deserve; it's the bare minimum. He should be investigated by the institution he works for.
A reasonable amount of AI use is certainly acceptable, where "reasonable" depends on the situation; for any academic-related job this amount should be close to zero, and no material produced by any student/grad/researcher/professor should be fed to third-party LLMs without explicit consent. Otherwise what even is the point? Regurgitating slop is not academic work.
Sorry to hear that's how my comments seem to you; I can assure you I put plenty of sense into them, though I cannot supply that sense on your behalf.
If you think that considering others desperate, senseless, and erroneous in their reasoning without any good reason improves your understanding of them, and that snarky commentary magically ceases to be snark, or becomes all-okay, because it describes something you consider a great truth, that's on you. We'll have to agree to disagree on that one.
I have a hard time chuckling at data loss. Especially given that exporting and backing up one's data from online services has an even smaller tradition than backing up local data, which is sadly rare enough in itself on the individual level.