Hacker News

yes, but Intel now has mainstream GPUs that are reasonably performant (roughly RTX 3060 level), which is what I assume this is for



Are you talking about Intel Arc? I have yet to see any ML-relevant benchmarks on Intel Arc. If you are aware of any, please let me know.


Same here. At 16GB of VRAM and only $349, it could fill a really nice slot for DL.


You won't for some time, as Arc is only supported on Linux kernel 6+. And honestly, their hardware is pretty bad for ML given that you can pick up a used 3090 for $600. I just don't think there will be an ML push for Arc with the current generation.


> And honestly, their hardware is pretty bad for ML given you can pick up a used 3090 for $600.

How do you know it's bad? Do you have benchmarks?

Just checked, used 3090s are going for $700-900 on eBay.


It won’t be able to make use of CUDA, which is more than enough reason.


hardwareswap.reddit.com is the best way to buy used computer hardware.


I think a recent Linux kernel requirement shouldn't hamper benchmarking if you're a somewhat technical user.

You can get the latest kernel packages for, e.g., Ubuntu here: https://wiki.ubuntu.com/Kernel/MainlineBuilds - other distributions likely have similarly easy options, since bug reporting so often requires users to determine whether a hardware-support bug comes from a distro-specific kernel change or not.
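Before installing anything, it's worth checking whether the running kernel already meets the Arc support threshold mentioned upthread (6+). A minimal sketch, assuming a simple major-version check is sufficient (it ignores point releases and vendor backports):

```shell
#!/bin/sh
# Sketch: check whether the running kernel's major version is >= 6,
# the threshold cited above for Intel Arc support.
required_major=6
running=$(uname -r)          # e.g. "6.2.0-39-generic"
major=${running%%.*}         # strip everything after the first dot

if [ "$major" -ge "$required_major" ]; then
    echo "Kernel $running looks new enough for Intel Arc"
else
    echo "Kernel $running is older than $required_major.x; consider a mainline build"
fi
```

Note that a distro may also backport the i915 Arc support into an older kernel, so this check is only a first approximation.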


It presumably has to support Aurora urgently: https://www.alcf.anl.gov/aurora



