Let's say I'm writing Flask code all day and I need help with various parts of my code, with questions like "How do I add 'Log in with Google' to the login screen?" Can I do that with a FOSS model today or not?
Longer answer: in theory, yes, but it'll require a bunch of glue and multiple models depending on the specific task you need help with. Some models are great at working with code but suck at literally anything else, so if you want it to help you with "do X with Y" you need at least two models: one that can reason its way to an answer, and another to implement said answer.
There is no general-purpose FOSS LLM that comes even close to GPT-4 at this point.
If you have sufficiently good hardware, the 34B Code Llama model [1] (hint: pick the quantised model you can fit based on "Max RAM required", e.g. q5/q6) running on llama.cpp [2] can answer many generic Python and Flask questions, but it's not quite good enough to generate entire code blocks for you the way GPT-4 does.
It’s probably as good as you can get at the moment, though; and hey, trying it out costs you nothing but the time it takes to download llama.cpp, run “make”, and point it at the q6 model file.
So if it’s no good, you’ve probably wasted nothing more than like 30 min giving it a try.
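For concreteness, the whole "give it a try" step looks roughly like this. Treat it as a sketch: the .gguf filename is just an example of a quantised file you might grab per [1], and the binary name/flags can differ a bit between llama.cpp versions.

  # sketch only: exact flags and the model filename are assumptions,
  # adjust to whatever quantised file you actually downloaded per [1]
  git clone https://github.com/ggerganov/llama.cpp
  cd llama.cpp
  make

  # point the main binary at the q6 model and ask it something
  ./main -m ./models/codellama-34b-instruct.Q6_K.gguf \
      -c 4096 -n 512 \
      -p "How do I add 'Log in with Google' to a Flask login page?"

If the q6 file doesn't fit in your RAM, drop down to q5 (or lower) per the "Max RAM required" column on the model page.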