Training models is not OS-dependent. RAM usage depends on the model's size, and I would argue a model like this should be a lot easier to fine-tune with less GPU RAM.
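To put rough numbers on the "RAM depends on size" point, here is a back-of-envelope sketch (my own arithmetic, not from any library) of how much memory just the weights take at different precisions; activations and KV-cache add more on top:

```python
def weight_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate GB needed to hold the weights alone."""
    return num_params * bits_per_param / 8 / 1024**3

# A hypothetical 7B-parameter model at common precisions:
for bits, label in [(16, "fp16"), (8, "int8"), (4, "int4")]:
    print(f"7B model @ {label}: ~{weight_memory_gb(7e9, bits):.1f} GB")
```

This is why quantized small models fit on a phone-class coprocessor while full-precision training still wants a big GPU.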
Nonetheless, the end goal will probably be downloading a model like this, or paying for fine-tuning, and then running it through an optimized neural chip.
It's currently more a question of when this will happen. The newest Windows certification already requires a neural chip, and even my Google Pixel 8 Pro can host small models (I know the Pixel is not a cheap phone, but the coprocessor should still be much more affordable than a big GPU).