Hacker News

Throw your nVidia card in the bin and be free. After many years with nVidia I removed the card and I've been happily using Intel hardware for a year now.



Because God forbid you actually have to use the card you paid for, oh no, we can't have that. No one should work with machine learning, edit video, or play games; Intel HD should be enough for everybody. /s


The Lenovo X1E is a laptop... So removing the GPU will be a bit of a challenge ;)

I thought of turning the GPU off under Linux, but external displays (through the thunderbolt port) can only be used with the nVidia GPU. I need external displays for my work, so disabling the nVidia GPU is not an option for me.


You may be able to either:

- Use Nouveau instead of the proprietary driver

- Use a nested X/Wayland server, which should then presumably support fractional scaling

- Use non-fractional scaling and adjust font sizes
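For the Nouveau route, a minimal sketch (assuming a Debian/Ubuntu-style system with the proprietary driver installed): blacklist the proprietary kernel modules so the in-tree nouveau driver binds to the GPU on the next boot. The filename below is an arbitrary choice; any `.conf` file under `/etc/modprobe.d/` works.

```shell
# /etc/modprobe.d/blacklist-nvidia.conf
# Prevent the proprietary driver stack from loading so the
# kernel falls back to the open-source nouveau driver.
blacklist nvidia
blacklist nvidia_drm
blacklist nvidia_modeset
blacklist nvidia_uvm
```

After saving the file, regenerate the initramfs (`sudo update-initramfs -u` on Debian/Ubuntu) and reboot; `lsmod | grep nouveau` should then show the nouveau module loaded.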


I guess you don't play videogames on your computer.


How's ML training on Intel integrated graphics? :)


Do you have to use the nvidia as your graphics adapter if you are only using it for ML training?


It definitely won't work in the bin.


Desolder the NVIDIA card on a laptop?


Or just disable it in the BIOS?


Throw your Linux OS in the bin and be free. After many years with Linux I removed it and I've been happily using Apple/Windows hardware for a year now. /s


The commenter is lamenting the fact that he can't run Ubuntu. I, on the other hand, am perfectly happy with what I have.



