treprinum on Aug 1, 2023 | on: Nvidia H100 GPUs: Supply and Demand
There's the A6000 Ada for that (you can rent servers with 4xA6000 at Lambda Labs). Moreover, the 4090 has only 24 GB of memory, while the H100 has 80 GB.
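
Not part of the original comment, just a minimal sketch (assuming a CUDA-enabled PyTorch install) of how one might list each visible GPU and its total VRAM, e.g. to confirm that a rented multi-GPU node actually exposes the expected capacity (48 GB per A6000 Ada, versus 24 GB on a 4090 or 80 GB on an H100):

    import torch

    # Print the name and total VRAM of every CUDA device visible to this process.
    if torch.cuda.is_available():
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            vram_gb = props.total_memory / 1024**3
            print(f"GPU {i}: {props.name}, {vram_gb:.1f} GB VRAM")
    else:
        print("No CUDA devices visible")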