As a chronic premature optimizer, my first reaction was, "Is this even possible in vanilla Python???" Obviously it's possible, but can you train an LLM before the heat death of the universe? A perceptron, sure, of course. A deep learning model, plausibly, if it's not too deep. But a large language model, i.e. the kind of LLM necessary for "from vanilla python to functional coding assistant"?

But obviously the author already thought of that. The source repo has a great motto: "It don't go fast but it do be goin'" [1]

I love the idea of the project and I'm curious to see what the endgame runtime will be.

[1] https://github.com/bclarkson-code/Tricycle

Why wouldn't it be possible? You can generate machine code with Python and call into it with ctypes. All your deep learning code is still in Python, but at runtime it gets JIT-compiled into something faster.
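A minimal sketch of that trick, assuming Linux on x86-64 and the System V calling convention (the "add" function here is just a stand-in for a real kernel; on macOS/arm64 you'd need different bytes and MAP_JIT):

    import ctypes
    import mmap

    # x86-64 (System V ABI) machine code for: int add(int a, int b) { return a + b; }
    #   89 f8    mov eax, edi
    #   01 f0    add eax, esi
    #   c3       ret
    code = b"\x89\xf8\x01\xf0\xc3"

    # Allocate an anonymous read/write/execute page and copy the code into it.
    buf = mmap.mmap(-1, mmap.PAGESIZE,
                    prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
    buf.write(code)

    # Wrap the buffer's address in a ctypes function pointer and call it.
    addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
    add = ctypes.CFUNCTYPE(ctypes.c_int, ctypes.c_int, ctypes.c_int)(addr)

    print(add(2, 3))  # 5

Swap those five bytes for an emitted matmul kernel and you have, in spirit, the JIT path being described, all without leaving the standard library.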
