
My problem is that all AI features are currently wildly underpriced by tech giants, who are subsidizing them in the hope that we become reliant on them. On top of that, we're feeding all kinds of our own behavioural data to these giants with very little visibility.

Any new feature should face a very simple cost/benefit analysis. The average user currently can’t do that with AI. I think AI in some form is inevitable. But what we see today (hey, here’s a completely free feature we added!) is unsustainable both economically and environmentally.



Actually, frontier lab pricing is well above the actual cost of inference. Look up the prices for e.g. Kimi K2 on OpenRouter to see the real “unsubsidized” costs: they can be up to an order of magnitude lower.


Summarizing text can very easily be done by local AI: low-powered and free. For this type of task, there is essentially no reason to pay.
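
For what it's worth, here is a minimal sketch of that kind of local summarization against a locally running Ollama server. The model name ("llama3.2"), the prompt wording, and the input text are just placeholder assumptions; any pulled model works:

  # Minimal sketch: summarize a passage with a locally running Ollama server.
  # Assumes Ollama is installed, listening on its default port (11434), and
  # that some small model (here "llama3.2") has already been pulled.
  import json
  import urllib.request

  def summarize(text: str, model: str = "llama3.2") -> str:
      payload = json.dumps({
          "model": model,
          "prompt": f"Summarize the following text in a few sentences:\n\n{text}",
          "stream": False,  # return one JSON object instead of a token stream
      }).encode()
      req = urllib.request.Request(
          "http://localhost:11434/api/generate",
          data=payload,
          headers={"Content-Type": "application/json"},
      )
      with urllib.request.urlopen(req) as resp:
          return json.loads(resp.read())["response"]

  print(summarize("Long chapter text goes here..."))

Everything stays on your own machine and the marginal cost per summary is just local compute.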


This is totally true and points to why calibre's feature adds value. However, I think the big players see exactly what you see and are scrambling to become people's go-to first. I believe this is for two main reasons. The first is that it's the only way they know how to win, and they don't see any option other than winning. The second is that they want the data that comes with it so they can monetize it. People switching to local models could take all of that away, so cloud providers are doing everything they can to make their models easier to use and more tightly integrated.


Which is not what is happening here. I think a lot of people’s objections would be resolved by a local model.


> Currently, calibre users have a choice of commercial providers, or running models locally using LM Studio or Ollama.

The choice is yours: if you want local models, you can use them.



