This is a long time coming. I'm normally not a big fan of large companies building products in the embedded space that could potentially destroy competition and future innovation but this is needed.
Nvidia's embedded boards are EXPENSIVE. So expensive it limits the applications dramatically. They also require a different skillset to set up, which drives up the cost.
We did an analysis for a security project that required visual inference. It turned out that, with all the extra costs to set up the TX boards, it actually made more sense to use mini desktops with consumer GTX cards.
I am excited to see the performance of the inference module. If it's decent at a good price, that opens up so many pi/beagle/arduino applications that were limited by both cost and form factor of existing options.
Not sure how much this kit will cost, but I wouldn't get my hopes up that it will be cheaper per unit of compute than Nvidia's Tegras. It could be a good alternative for lower-end compute, though.
Nvidia provides a line of embedded systems for accelerated compute called Tegra. It's pretty awesome kit but costs $150-500, depending on the compute necessary. A new one will probably be announced in a month's time, hence Google is trying to get ahead.