
> you'll have at least double the performance of a base m4 mini

For $500 all included?




The base M4 mini is $599.

Here's a config for around the same price: all brand-new parts for $573. You can spend the difference improving any part you wish, or get a used 3060 and go AM5 instead (Ryzen 8400F). Both paths are upgradeable.

https://pcpartpicker.com/list/ftK8rM

Double the LLM performance. Half the desktop performance. But you can use both at the same time. Your computer will not slow down when running inference.
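
A minimal sketch of what that looks like, assuming a CUDA build of llama-cpp-python and a quantized model that fits in the 3060's 12 GB (the model path and parameters are placeholders). With every layer offloaded, the CPU stays mostly free while tokens generate:

  # Sketch: full GPU offload so inference doesn't compete with desktop work
  from llama_cpp import Llama

  llm = Llama(
      model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder path
      n_gpu_layers=-1,  # offload all layers to the GPU's VRAM
      n_ctx=4096,
  )

  out = llm("Explain PCIe lanes in one sentence.", max_tokens=64)
  print(out["choices"][0]["text"])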


That’s a really nice build.


Another possible build is to use a mini-PC and its M.2 slots.

You'll need a mini-PC with two M.2 slots, like this:

https://www.amazon.com/Beelink-SER7-7840HS-Computer-Display/...

And a riser like this:

https://www.amazon.com/CERRXIAN-Graphics-Left-PCI-Express-Ex...

And some courage to open it and rig the stuff in.

Then you can plug a GPU into it. Load times should be decent: better than an eGPU, worse than the AM4 desktop build, and fast enough to beat the M4 (once the weights are in the GPU's VRAM, the link speed doesn't matter; see the rough numbers at the end of this comment).

It makes for a very portable setup. I haven't built it, but I think it's a reasonable LLM choice comparable to the M4 in speed and portability while still being upgradable.

Edit: and you'll need an external power supply of at least 400 W :)
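
For a rough sense of what the M.2 link costs you, here's a back-of-envelope estimate. The bandwidth and model-size figures are assumptions (PCIe 4.0 x4 out of the M.2 slot, an ~5 GB quantized model), not measurements:

  # Back-of-envelope: model load time over the M.2 riser vs. a full x16 slot
  model_gb = 4.7            # e.g. an 8B model at Q4 quantization (assumption)
  pcie4_x4_gbps = 7.0       # practical PCIe 4.0 x4 throughput, roughly
  pcie4_x16_gbps = 28.0     # practical PCIe 4.0 x16 throughput, roughly

  print(f"x4 riser: ~{model_gb / pcie4_x4_gbps:.1f} s to push the weights to VRAM")
  print(f"x16 slot: ~{model_gb / pcie4_x16_gbps:.1f} s")
  # Once the weights sit in VRAM, token generation speed is the same either way.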



