Is there support for this idea? I've seen it in plenty of novels, but I'd want some stats on how it affects a known, relatively modern, deployed gait recognition system.
I wanted to go RGB free when I built my desktop, but ran into the exact issue you describe. I kinda just shrugged and accepted it, but maybe I should have looked more deeply into their configurability. Off or all white would be a much better look IMO
What drives me crazy is that recently I had to download three separate bloated Electron app packages just to turn off the RGB in my new mobo, RAM and GPU because in 2025/26 we still don’t have vendors using a common protocol to control RGB.
I once tore apart my laptop and ripped out all the blue LEDs and replaced them with green, amber, and red ones. If you hate it that much you can just mod it. All it takes is a soldering iron, and a magnifier if you're over 30.
So what? I honestly don't get the issue. It's pretend.
When sex is taboo our brains will often associate the two things and we end up enjoying the taboo-ness. Incest is a big taboo, so it's sex with extra taboo slathered on it. People will want to fantasize about it. It's how brains work. As long as it's in the realm of fantasy I don't think there's ANY issue.
I'll go further and say that if two consenting adult siblings want to have sex it shouldn't be anyone's business but their own. It's not the state's job to decide that they aren't allowed because it makes the rest of us uncomfortable.
Everyone else is throwing in their 2¢, so here's my pet proposal.
Here's the undeniable fact: everyone (ok, almost everyone, but it's a rounding error) hates the switchover in spring, when you have to get up an hour earlier. Conversely, everyone (or a rough approximation) likes the switchover in the fall, when we get to sleep in an extra hour. So why don't we just get rid of the switchover in the spring and keep the one in the fall?
I hate both. The time jump in fall means sunset starts happening depressingly early (almost exactly 5pm where I am, which means no sunlight after work).
That's the setting for The Electric Church, by Jeff Somers. There's a very small capital class who own all the automated manufacturing plants, etc., and they live in obscene grandeur. Then there's the rest of the world which is basically a slum, brutally oppressed by the police force to keep them from rising up. The population gets everything they need to survive, but it's all fairly shitty - the technology for a post scarcity society exists in the setting, it's just hoarded by the most obscenely wealthy.
Pretty decent sci-fi, just pretend he didn't write any sequels tho.
Benj Edwards, one of the authors, accepted responsibility in a Bluesky post[0]. He lists some extenuating circumstances[1], but takes full responsibility. Time will tell if it's a one-off thing or not, I guess.
I agree that the work culture promoting this is bad, but being sick is still simply not an excuse to fabricate quotes with AI. It's still just journalistic malfeasance, and if Ars actually cares about the quality of their journalism, he should be fired for it.
If everyone who rarely makes a mistake and owns it completely were fired, we'd all be homeless.
To err is human; what matters is owning what you did. This is the first time I have seen Ars make a mistake of this kind at any scale, so I think this is a good corrective bump given Ars' track record on these matters.
Maybe we should learn to be a bit flexible and understanding sometimes. If you live by the sword, you die by the sword, and we don't need more of that right now.
I agree, I think this should be taken in context and his past work should be reviewed by Ars to ensure this isn’t a pattern. If he made a mistake one time this is a learning experience and I doubt he would ever make it again. You don’t need to fire someone every time they make a mistake. Especially if the mistake was made in good faith.
I don't know about that - I'd say it's the manager's responsibility to make sure employees don't feel pressured to work when they're too ill to function.
And also brings to mind the IBM one million dollars story:
(...)
A very large government bid, approaching a million dollars, was on the table. The IBM Corporation—no, Thomas J. Watson Sr.—needed every deal. Unfortunately, the salesman failed. IBM lost the bid. That day, the sales rep showed up at Mr. Watson’s office. He sat down and rested an envelope with his resignation on the CEO’s desk. Without looking, Mr. Watson knew what it was. He was expecting it.
He asked, “What happened?”
The sales rep outlined every step of the deal. He highlighted where mistakes had been made and what he could have done differently. Finally he said, “Thank you, Mr. Watson, for giving me a chance to explain. I know we needed this deal. I know what it meant to us.” He rose to leave.
Tom Watson met him at the door, looked him in the eye and handed the envelope back to him saying, “Why would I accept this when I have just invested one million dollars in your education?”
Should he? Where does that mindset come from? The author has owned up to his mistake. Unless there is a pattern here, why would we not prefer to let him learn and grow from this? We all get to accidentally drop the prod DB once, since that’s what teaches us not to do it again.
He's not some junior developer with his first job, he's the senior editor. If a senior editor plagiarized an article, he would rightly be fired because it's a serious violation of journalistic ethics. He knew using AI tools like that was against company policy and he did it anyway. That's well beyond just making a mistake.
There are degrees of plagiarism and you could argue this is not really plagiarism at all. Paraphrasing instead of directly quoting is probably about as mild as it can get. Most publications wouldn’t even note the mistake.
This wasn't paraphrasing either. The tool couldn't access the subject's website and instead fabricated quotes, which neither Benj nor anyone else in the editorial process bothered to vet.
Have you met any professional journos? It's not exactly a laid back profession. I could easily imagine the people I know pushing through illness to get a story out.
> I have been sick with COVID all week /../, while working from bed with a fever and very little sleep, I unintentionally made a serious journalistic error in an article about Scott Shambaugh.
Being under stress and being ill at the same time can change your modus operandi. I know, because that happens to me, too.
When I'm too tired, too stupid, and too stressed, I stop after a point. Otherwise things go bad. Being sick adds extra mental fog, so I try to stop sooner.
Paste the original blog post into ChatGPT asking it to summarize or provide suggestions. Unintentionally copy and paste quotes from the ChatGPT output rather than the original blog post.
Okay. I've been harsh on Ars Technica in these comments, and I'm going to continue to hold an asterisk in my head whenever I see them cited as a source going forward. However, at least one thing in this apology does seem more reasonable than people have made it out to be: I think it's fine for reporters at an AI-skeptical outlet to play around with various AI tools in their work. Benj Edwards should have been way more cautious, but I think that people should be making periodic contact with the state of these tools (and their pitfalls!), especially if they're going to opine.[1]
We don't know yet how widespread these practices are at Ars Technica, or whether this is a one-off. But if it went down like he says it did here, then the coincidental nature of this mistake -- i.e., that it's an AI user error in reporting an AI novel behavior story at an AI-skeptical outlet -- merely makes it ironic, not more egregious than it already is.
[1] Edit: I read and agreed with ilamont's new comment elsewhere in this thread, right after posting this. It's a very reasonable caveat! https://news.ycombinator.com/item?id=47029193
That's a poor mea culpa. It begins with a preamble attempting to garner sympathy from the reader before it gets to the acknowledgement of the error, which is a sleight-of-hand attempt to soften its severity.
> which is a sleight-of-hand attempt to soften its severity
That’s not sleight-of-hand, I think we all immediately recognize it for what it is. Whether it is good form to lead with an excuse is a matter of opinion, but it’s not deceptive.
I speculate that curious minds, with a forensic inclination and free time, will go back to previous articles and find out it happened before...When you see a cockroach...
It's not really important whether it's a one-off thing with this one guy, he's not relevant in the big picture. To the extent that he deindividualizes his labor he's just one more fungible operator of AI anyway.
People are making a bigger deal about it than this one article or site warrants because of ongoing discourse about whether LLM tech will regularly and inevitably lead to these mistakes. We're all starting to get sick of hearing about it, but this keeps happening.
My immediate thought when I read the description of the book was that I have some internet friends who are into ABDL (adult baby diaper lover) stuff, and it sounds like the book's somewhat like that. I haven't GRILLED them about their motivations or why they're into it, but they like pretending to be a baby sometimes (not always in a sexual way) - maybe it's freeing to let go of responsibilities and pressure, etc. Anyway, it doesn't hurt anyone, and they get something out of it that makes them happy.
This ruling is sad IMO, because I have the feeling that Australia is increasingly hostile to The Weird Stuff, and I'm worried about what it might mean for people over there who are into abdl and the like.