FuckButtons | 80 days ago | on: So you wanna build a local RAG?
A 120GB RAM MacBook Pro will run gpt-oss-120b at a very respectable clip, and I’ve found it to be quite serviceable for a lot of tasks.
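(A minimal sketch of what using such a local model looks like in practice, assuming the model is served through an OpenAI-compatible endpoint, as LM Studio's local server provides; the port, API key, and model name below are assumptions and may differ from your setup.)

```python
# Sketch: chat with a locally served gpt-oss-120b via an OpenAI-compatible endpoint.
# The base URL, port, and model identifier are assumptions -- adjust to your server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # assumed local server address (LM Studio's default port)
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="gpt-oss-120b",  # whatever name your server reports for the loaded model
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize why unified memory helps run large local models."},
    ],
    max_tokens=256,
)

print(response.choices[0].message.content)
```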
adastra22 | 80 days ago
I bought one for this purpose, but LM Studio doesn't seem to want to run even the most quantized versions. Any suggestions?
FuckButtons | 79 days ago | parent
Are you using the mixed-quantization versions? Also, there’s a setting that disables the memory-allocation guardrails they put in place, which are irrelevant since macOS handles OOM quite gracefully.
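(If LM Studio still refuses the load, a common fallback, not specific to this thread, is to drive a quantized GGUF directly with llama-cpp-python, which imposes no loading guardrails of its own. A minimal sketch; the file name and context size are placeholders.)

```python
# Sketch: load a quantized gpt-oss-120b GGUF directly via llama-cpp-python,
# offloading all layers to Metal on Apple silicon. Model path is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="gpt-oss-120b-Q4_K_M.gguf",  # placeholder; point at your actual quantized file
    n_ctx=4096,        # modest context to keep KV-cache memory down
    n_gpu_layers=-1,   # offload every layer to the GPU (Metal)
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Say hello in five words."}],
    max_tokens=32,
)
print(out["choices"][0]["message"]["content"])
```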