wmf on March 22, 2023 | on: Nvidia Announces H100 NVL – Max Memory Server Card...
Running LLMs on your own servers doesn't mean PCs, which is what this thread is about. An A100/H100 is fine for a business, but people can't justify one for personal use.