
This looks exactly like what I need for a project I've been working on. How do you get your own code into the model? Or is that the future fine-tuning step you're talking about?



Tabby collects the source code from the relevant repositories and builds it into a dataset. I already have some proof-of-concept pipelines ([1] and [2]), but I still need some time to polish the data pipeline.

[1]: https://github.com/TabbyML/tabby/blob/main/tabby/tools/repos...

[2]: https://github.com/TabbyML/tabby/blob/main/tabby/tools/train...
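For anyone curious what "building repo code into a dataset" might look like, here's a minimal sketch: walk a repository, keep files with recognized source extensions, and write one JSON record per file. The extension set, record schema, and function name are my own assumptions; Tabby's actual pipeline (in the linked tools) may filter and structure things differently.

```python
import json
from pathlib import Path

# Assumed extension whitelist; a real pipeline would likely also
# filter by file size, license, and deduplicate content.
SOURCE_EXTENSIONS = {".py", ".rs", ".js", ".ts", ".go", ".java"}

def collect_repo_dataset(repo_dir: str, out_path: str) -> int:
    """Write a JSONL dataset of source files; return the record count."""
    count = 0
    with open(out_path, "w", encoding="utf-8") as out:
        for path in sorted(Path(repo_dir).rglob("*")):
            if path.is_file() and path.suffix in SOURCE_EXTENSIONS:
                record = {
                    "path": str(path.relative_to(repo_dir)),
                    "content": path.read_text(encoding="utf-8",
                                              errors="ignore"),
                }
                out.write(json.dumps(record) + "\n")
                count += 1
    return count
```

Each JSONL line can then feed a tokenization/training step downstream.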


Ok, that makes sense! Is there a prompt token limit like the OpenAI models have? Codex, I believe, has an 8k prompt/response limit.
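Whatever the limit turns out to be, a completion server typically trims the prompt to fit the model's context window, keeping the most recent code. A rough sketch, using whitespace-delimited tokens as a crude stand-in for real BPE tokenization (actual token counts will differ):

```python
def truncate_prompt(prompt: str, max_tokens: int) -> str:
    """Keep only the trailing `max_tokens` tokens of the prompt.

    The tail is kept because the code nearest the cursor matters
    most for completion. Whitespace splitting only approximates a
    real tokenizer's count.
    """
    tokens = prompt.split()
    return " ".join(tokens[-max_tokens:])
```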


You can train this on all the code in your own repositories? I would assume that makes the completions a lot better.




