I have to say that running this model locally, I was pleasantly surprised by how well it ran. It doesn't use many resources and produces decent output, comparable to ChatGPT. It isn't quite at OpenAI's level, but for a lot of tasks, since it doesn't burden the computer, a local model is good enough.
Next I want to try to use Aider with it and see how this would work.
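One way I might wire this up (a sketch, assuming the local model is served through Ollama; the model name here is illustrative, not the one from this post) is to point Aider at the local endpoint:

```shell
# Assumption: the model runs behind a local Ollama server.
# Tell Aider where the Ollama API lives, then pick a served model.
export OLLAMA_API_BASE=http://127.0.0.1:11434
aider --model ollama_chat/llama3
```

If it works, Aider would then send its edit requests to the local model instead of a hosted API.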