I don't think "usage" is exactly the metric they're going for, more like "usage in line with our developmental strategy." Transcripts of people using Claude to write code are probably far more valuable to them than transcripts of OpenClaw trying to set up a calendar invite.
I mean, they don’t train on your data unless you have the setting enabled.
Do you really think they are reading your prompts at all?
Free inference providers sure, but Anthropic?
I have a transformer attention mechanism which seems to be more data-efficient than the usual dot product, and I'm trying to write a performant backwards kernel for it.
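A useful sanity check when hand-writing a backward kernel is to verify the analytic gradients against finite differences. The comment's custom similarity function isn't specified, so this is a minimal NumPy sketch of the *standard* dot-product attention backward (softmax(QKᵀ/√d)V) as a reference harness; swapping in a different score function means re-deriving the `dS` step.

```python
import numpy as np

def softmax(x):
    # numerically stable row-wise softmax
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attn_forward(Q, K, V):
    d = Q.shape[-1]
    S = Q @ K.T / np.sqrt(d)      # scores
    P = softmax(S)                 # attention weights
    return P @ V, P

def attn_backward(Q, K, V, P, dO):
    # analytic gradients for O = softmax(QK^T/sqrt(d)) V
    d = Q.shape[-1]
    dV = P.T @ dO
    dP = dO @ V.T
    # softmax Jacobian-vector product, row-wise
    dS = P * (dP - (dP * P).sum(axis=-1, keepdims=True))
    dQ = dS @ K / np.sqrt(d)
    dK = dS.T @ Q / np.sqrt(d)
    return dQ, dK, dV

rng = np.random.default_rng(0)
n, d = 4, 3
Q, K, V, W = (rng.standard_normal((n, d)) for _ in range(4))
O, P = attn_forward(Q, K, V)
# scalar loss L = sum(O * W), so dL/dO = W
dQ, dK, dV = attn_backward(Q, K, V, P, dO=W)

def loss(Q, K, V):
    return float((attn_forward(Q, K, V)[0] * W).sum())

# central finite differences on Q
eps = 1e-6
num_dQ = np.zeros_like(Q)
for i in range(n):
    for j in range(d):
        Qp = Q.copy(); Qp[i, j] += eps
        Qm = Q.copy(); Qm[i, j] -= eps
        num_dQ[i, j] = (loss(Qp, K, V) - loss(Qm, K, V)) / (2 * eps)
```

Once this passes for the custom score function, the same test doubles as a reference implementation to diff a fused GPU kernel against.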
No, AI has real, immediate, large-scale industrial and scientific applications. Cryptocurrency is a massive disappointment, but AI has genuine potential value.
I don't condone violence, but the contract he's signed with the US military is a credible threat to everyone in the US. OpenAI will now certainly be called on to assist in domestic mass surveillance, under threat of the kind of severe penalties Anthropic has faced. So why did he agree to that contract, unless he's willing to provide that assistance? It's gone well beyond conversation, though not to a point where violence is appropriate. Boycotts and hostility are definitely appropriate at this point IMO, though.
> The only solution I can come up with is to orient towards sharing the technology with people broadly, and for no one to have the ring. The two obvious ways to do this are individual empowerment and *making sure democratic system stays in control.*
OK! So he's going to renege on the contract he's signed with Hegseth, which effectively commits OpenAI to serving as the IT Department for Trump's secret service?