
Would always be nice, for sure, but for deep learning I use a big, cheap box with loads of RAM, and the 16GB in my M1 MBP with the 1TB SSD has been plenty so far. I don't have any use cases yet where I notice a problem (other than, yeah, the overhead of having to run DL jobs on a different machine).

What causes you to hit the limits? I don't do any video editing, but I could see that being a slowdown for that use case.



Being able to run the integration tests for my work locally on my 32GB ThinkPad has been a godsend during the 'rona times. If I only had 16GB (or even 24GB), I'd have to carefully consider whether to close Firefox before running them.

These tests spin up a bunch of containers and run certain components with instrumented runtimes that take up gobs of RAM, so having a fat laptop is very useful.
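For a sense of how quickly such a suite eats RAM, here's a minimal back-of-the-envelope sketch; all container names and per-container figures below are hypothetical placeholders, not numbers from the comment:

```python
# Rough RAM budget for a container-based integration-test run.
# Every name and size here is a made-up illustration.

def total_ram_gb(containers, headroom_gb=4.0):
    """Sum per-container budgets plus headroom for the OS and a browser."""
    return sum(containers.values()) + headroom_gb

# Instrumented runtimes can easily multiply a service's normal footprint.
suite = {
    "database": 2.0,
    "message-broker": 1.5,
    "service-under-test (instrumented)": 6.0,
    "dependency-service": 3.0,
}

print(total_ram_gb(suite))  # prints 16.5
```

With numbers like these, a 16GB machine is already over budget once the desktop environment is counted, which is the kind of squeeze the comment describes.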



