Hacker News
replwoacause | 40 days ago | on: MacBook Pro with M5 Pro and M5 Max
Sounds pretty beefy. What kind of local LLM is that thing capable of running? Does it open up real alternatives to cloud providers like OpenAI and Anthropic, or are the local models this hardware can run still pretty far behind?