
I have no idea what kind of compute power something like this relies on. Would this be able to run on a consumer desktop?


They note that the model has 12B parameters, which in terms of order of magnitude puts it right between GPT-2 and GPT-3 (1.5B and 175B respectively). With some tricks you can run GPT-2 on good personal hardware, so this might be reachable as well with the latest hardware.
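For scale, here's a back-of-the-envelope estimate of the memory needed just to hold the weights (a sketch: the 12B figure is from the paper, the precision options are my assumptions about what an inference runtime might support):

    # Lower-bound memory footprint for the weights of a 12B-parameter model.
    # Ignores activations, caches, and framework overhead, so treat these
    # numbers as floors, not totals.
    PARAMS = 12e9  # parameter count from the paper

    for dtype, nbytes in {"fp32": 4, "fp16": 2, "int8": 1}.items():
        gib = PARAMS * nbytes / 2**30
        print(f"{dtype}: ~{gib:.0f} GiB just for the weights")

That comes out to roughly 45 GiB at fp32, 22 GiB at fp16, and 11 GiB with 8-bit quantization, so reduced precision is where a high-end desktop starts to look plausible.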

EDIT: I'm assuming you mean for inference; for training it would be another kind of challenge and the answer would be a clear no.


In the linked CLIP paper they say it was trained on 256 GPUs for 2 weeks, i.e. roughly 3,600 GPU-days, which backs up the "clear no" for training at home. No mention of the size of the resulting model.


Depends on how fast you want it to generate results, but yes, it can run on a desktop provided there's enough RAM.
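A quick way to sanity-check a given machine (a sketch; the fp16 assumption and the psutil dependency are mine, not from the thread):

    # Check whether fp16 weights for a 12B-parameter model fit in free RAM.
    # psutil is a third-party package: pip install psutil
    import psutil

    needed = 12e9 * 2  # 12B params x 2 bytes each, assuming fp16 weights
    free = psutil.virtual_memory().available
    print(f"need ~{needed / 2**30:.0f} GiB, have ~{free / 2**30:.0f} GiB free")
    if free < needed:
        print("won't fit as-is; consider quantization or disk offloading")

Even when it fits, generation speed on CPU will be far below what the same model does on a datacenter GPU.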



