They have a really expensive infrastructure that serves 800 million monthly active (but non-paying) users.
Even worse, they train their model(s) on the interactions of those non-paying customers, which makes the model(s) less useful for paying customers. It's a kind of "you cannot charge for a Porsche if you only satisfy the needs of a typical Dacia owner" situation.
I don't pay Meta any money either. Yet, Meta is one of the most profitable companies in the world.
I give more of my data to OpenAI than to Meta. ChatGPT knows so much about me. Don't you think they can easily monetize their 800 million (close to 1 billion by now) users?
> Don't you think they can easily monetize their 800 million [...] users?
I am pretty sure they will be able to monetize it. But there is a big difference between "generating revenue" and "generating profit". It's way cheaper to put ads between posts from your friends (the way FB started out with ads) than to put ads next to an LLM's response: LLM responses have to be unique, while one of your holiday photos might be interesting to all of your friends, and LLM inference is quite expensive, while hosting holiday photos is cheap. IMHO this is why the 5th generation of ChatGPT models tries to answer all possible questions in one single response, kinda hoping that I am going to be happy with it and just close the chat.
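To make the margin argument concrete, here is a back-of-envelope sketch. Every number in it is an illustrative assumption I made up for the example, not a real figure from Meta or OpenAI; the point is only the shape of the comparison (cheap cached content vs per-response inference against the same ad revenue).

```python
# Back-of-envelope sketch of the unit economics argued above.
# ALL dollar figures are made-up assumptions, chosen only to
# illustrate the structure of the argument.

ad_revenue_per_impression = 0.005  # assumed: half a cent of ad revenue per view
photo_serving_cost = 0.0001        # assumed: serving one cached photo is near-free
llm_inference_cost = 0.01          # assumed: generating one unique LLM response

# A holiday photo is generated once and shown to many friends,
# so each extra impression costs almost nothing.
photo_margin = ad_revenue_per_impression - photo_serving_cost

# An LLM response is unique per user, so every ad impression
# carries the full inference cost with it.
llm_margin = ad_revenue_per_impression - llm_inference_cost

print(f"margin per photo impression: ${photo_margin:.4f}")
print(f"margin per LLM response:     ${llm_margin:.4f}")
```

Under these assumed numbers the photo impression is profitable and the LLM response loses money per ad shown, which is the "revenue vs profit" gap the comment describes.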