Hacker News
wongarsu on Jan 11, 2023 | on: NanoGPT

GPT2 can be run locally (on a somewhat beefy consumer GPU).
karmajuney on Jan 11, 2023

Can you add some info on what consumer GPU would be needed for this? Would a 3080 be able to handle this?
wongarsu on Jan 11, 2023

Yes, assuming you get the 12GB version of the 3080. A 2080 Ti is another option. You can also reduce precision or use one of the smaller GPT-2 versions to run on smaller cards.
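To see why the 12GB card matters, here is a back-of-the-envelope sketch of the VRAM needed just for the weights of each standard GPT-2 variant at full (fp32) and half (fp16) precision. The parameter counts are the published ones; real usage will be higher because activations, the KV cache, and framework overhead are not counted here.

```python
# Rough VRAM estimate for GPT-2 weights alone (excludes activations,
# KV cache, and framework overhead). Parameter counts per OpenAI's releases.
GPT2_PARAMS = {
    "gpt2":        124e6,   # "small"
    "gpt2-medium": 355e6,
    "gpt2-large":  774e6,
    "gpt2-xl":     1.5e9,
}

def weight_gib(params: float, bytes_per_param: int) -> float:
    """Memory for the weights in GiB at the given precision."""
    return params * bytes_per_param / 2**30

for name, n in GPT2_PARAMS.items():
    fp32 = weight_gib(n, 4)  # 4 bytes/param at full precision
    fp16 = weight_gib(n, 2)  # 2 bytes/param at half precision
    print(f"{name:12s} fp32 ≈ {fp32:.2f} GiB, fp16 ≈ {fp16:.2f} GiB")
```

Even GPT-2 XL at fp16 fits in well under 4 GiB of weights, which is why dropping precision lets the larger variants run on mid-range cards.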
int_19h on Jan 11, 2023

Let me slightly rephrase the question: what is the *best* model that one can run on high-end consumer grade hardware? Let's say RTX 3090.
minimaxir on Jan 11, 2023

The original GPT-2 small (the 124M one) can run on a CPU, just slowly and not scalably.
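For reference, a minimal sketch of CPU inference with the Hugging Face transformers library (an assumption here, not something the thread specifies; requires `pip install transformers torch` and downloads ~500 MB of weights on first run):

```python
# Run GPT-2 small (124M) on CPU via the transformers text-generation pipeline.
from transformers import pipeline

# device=-1 pins the pipeline to CPU.
generator = pipeline("text-generation", model="gpt2", device=-1)

out = generator("The meaning of life is", max_new_tokens=20)
print(out[0]["generated_text"])
```

Generation takes on the order of seconds per sentence on a modern CPU, which is what "slowly and not scalably" amounts to in practice.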