
Creator here. Thank you for sharing!



Pretty cool!

> My research goal is to train models using various hardware telemetry data with the hope that the models learn to interpret sensor inputs and control actuators based on the insights they glean from the sensor inputs. This research direction may open up exciting possibilities in fields such as automation, space, robotics and IoT, where L2E can play a pivotal role in bridging the gap between AI and physical systems.

This part was easy to miss but quite interesting. Could you expand a bit here? What does L2E stand for?


It's the name of the project - Llama2Everywhere.


Thanks! I am still curious about the sensor inputs part.

I am trying to replace the 2.4 GHz controller on my electric skateboard to make 0-5 km/h acceleration and braking more pleasant, and maybe use gyroscopes to do away with the controller altogether. What would tokens be in that case? Do you create a CAN-style representation and feed that to the LLM? What kind of throughput do you foresee being possible, and on which hardware?


I am still thinking about how I'll pull this off, but basically the idea is to collect tons of telemetry converted to ASCII/text.

Telemetry in the sense of sensor streams, both commands and responses.

Then we'll train a small model on that for long enough and see how it responds.

That's the plan, more or less.
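To make that concrete, a sample could be flattened into one ASCII line per reading, something like the toy sketch below (the field names and key=value format are invented for illustration, not something L2E actually defines):

    /* Hypothetical sketch: serialize one telemetry sample as an ASCII line
     * so it can be fed to a text tokenizer. Field names and the key=value
     * format are invented for illustration, not part of L2E. */
    #include <stdio.h>

    struct sample {
        unsigned long t_ms;   /* timestamp in milliseconds */
        float speed_kmh;      /* wheel speed */
        float throttle;       /* commanded throttle, 0..1 */
        float gyro_z;         /* yaw rate, deg/s */
    };

    static void emit_line(FILE *out, const struct sample *s)
    {
        /* one line per sample, key=value pairs, newline-terminated */
        fprintf(out, "t=%lu spd=%.2f thr=%.2f gyro=%.2f\n",
                s->t_ms, s->speed_kmh, s->throttle, s->gyro_z);
    }

    int main(void)
    {
        struct sample s = { 1200, 4.8f, 0.35f, -1.2f };
        emit_line(stdout, &s);   /* prints: t=1200 spd=4.80 thr=0.35 gyro=-1.20 */
        return 0;
    }

Lines like that, interleaved for commands and responses, would become the training corpus.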


In this case, I would ask the LLM to suggest an algorithm to minimize acceleration, jerk and snap based on the expected sensor input data, and then just implement that. Probably in memory on whatever runs the board.

A straightforward control problem of bringing the board from 5 to 0 km/h smoothly?
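For what it's worth, a jerk-limited braking ramp is one simple shape for that; here is a toy sketch (all constants invented, not tuned for any real board):

    /* Toy sketch of a jerk-limited braking ramp. Each control tick the
     * commanded deceleration is ramped up no faster than JERK_MAX allows,
     * then integrated into the speed setpoint. A full S-curve would also
     * ramp the deceleration back down near standstill. */
    #include <stdio.h>

    #define DT        0.02f   /* control period, s (50 Hz) */
    #define DECEL_MAX 1.0f    /* m/s^2 */
    #define JERK_MAX  2.0f    /* m/s^3 */

    int main(void)
    {
        float v = 5.0f / 3.6f;  /* 5 km/h expressed in m/s */
        float a = 0.0f;         /* current commanded deceleration */

        while (v > 0.0f) {
            a += JERK_MAX * DT;           /* ramp deceleration, limited by jerk */
            if (a > DECEL_MAX) a = DECEL_MAX;

            v -= a * DT;                  /* integrate into the speed setpoint */
            if (v < 0.0f) v = 0.0f;

            printf("v=%.3f m/s  a=%.2f m/s^2\n", v, a);
            /* in a real controller, v would be sent to the ESC here */
        }
        return 0;
    }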


Let's see.


Basic control theory will work better than AI here. The mathematical models used in control theory have been used in computing since at least the 1950s (Kalman filters). I suspect you won't have issues with computational power.

Figuring out exactly which model to use, and how, may take some work. Understanding control theory will also let you do things like traction control.
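As a flavor of how little compute this needs, here is a toy one-dimensional Kalman filter smoothing a noisy speed reading (the noise constants and measurements are made up for illustration):

    /* Toy 1-D Kalman filter smoothing a noisy speed measurement.
     * Constants and readings are placeholders for illustration only. */
    #include <stdio.h>

    int main(void)
    {
        float x = 0.0f;         /* estimated speed, m/s */
        float p = 1.0f;         /* estimate variance */
        const float q = 0.01f;  /* process noise variance */
        const float r = 0.25f;  /* measurement noise variance */

        /* pretend these came from a wheel-speed sensor */
        const float z[] = { 1.3f, 1.1f, 1.4f, 1.2f, 1.35f };

        for (int i = 0; i < 5; i++) {
            p += q;                    /* predict: uncertainty grows */
            float k = p / (p + r);     /* Kalman gain */
            x += k * (z[i] - x);       /* correct estimate toward measurement */
            p *= (1.0f - k);           /* shrink uncertainty */
            printf("z=%.2f -> x=%.3f\n", z[i], x);
        }
        return 0;
    }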


In addition, I agree.


Correct :)


Thank you. So L2E stands for Llama 2 Everywhere. Thanks again for checking it out.


> What does L2E stand for?

um...


What does

    cat /dev/llama
do? Would I get a kind of LLM stream of consciousness? That's incredible :-)


cat /dev/llama is not yet implemented. However, we have a module to which you can pass a prompt as a parameter. It's still buggy.

Our goal is to write a proper kernel module to implement three things:

1. a character device

2. the 1st backend is an LLMZip, i.e. you write to, say, /dev/l2ezip and you get a compressed stream out

3. the 2nd backend is an LLM, i.e. you write a prompt to, say, /dev/llama2 and you get a completion back

So the 1st backend could be useful for compressed telemetry. The 2nd backend could be useful for an IoT LLM, or for our more ambitious plan of responding to telemetry, i.e. taking action such as controlling motor speed.
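To show the general shape, here is a very rough sketch of a minimal misc character device (not our actual code, and the buffer handling is deliberately naive); a real backend would replace the canned reply with a compressed stream or a streamed completion:

    /* Very rough sketch of a minimal misc character device; NOT the real
     * L2E module, just the general shape of one. A write buffers a
     * "prompt", a read returns a canned string where a real backend would
     * stream back a completion (or a compressed stream for the zip case). */
    #include <linux/module.h>
    #include <linux/miscdevice.h>
    #include <linux/fs.h>
    #include <linux/uaccess.h>

    static char prompt[256];

    static ssize_t l2e_write(struct file *f, const char __user *buf,
                             size_t len, loff_t *off)
    {
        size_t n = len < sizeof(prompt) - 1 ? len : sizeof(prompt) - 1;

        if (copy_from_user(prompt, buf, n))
            return -EFAULT;
        prompt[n] = '\0';
        return len;              /* a real backend would kick off inference here */
    }

    static ssize_t l2e_read(struct file *f, char __user *buf,
                            size_t len, loff_t *off)
    {
        static const char reply[] = "completion would go here\n";

        return simple_read_from_buffer(buf, len, off, reply, sizeof(reply) - 1);
    }

    static const struct file_operations l2e_fops = {
        .owner = THIS_MODULE,
        .read  = l2e_read,
        .write = l2e_write,
    };

    static struct miscdevice l2e_dev = {
        .minor = MISC_DYNAMIC_MINOR,
        .name  = "llama2",       /* appears as /dev/llama2 */
        .fops  = &l2e_fops,
    };

    static int __init l2e_init(void)
    {
        return misc_register(&l2e_dev);
    }

    static void __exit l2e_exit(void)
    {
        misc_deregister(&l2e_dev);
    }

    module_init(l2e_init);
    module_exit(l2e_exit);
    MODULE_LICENSE("GPL");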


Can one write installable apps for this OS that Llama can call out to?

We need to go deeper!


We will go deeper in the coming versions.


Easily the first practical use I've seen after hearing about cosmo libc. Very cool!


:)


This is hilarious (in a good way), I love it. Thanks for creating it.


:)


The ISO is 68 MB or so, so the model is not there; how does it work? (I tried it in an Oracle VM and it didn't work, but it seems that's not supported.)


Try QEMU or boot from a pendrive.

QEMU:

    qemu-system-x86_64 -display gtk,zoom-to-fit=off -m 512 -accel kvm -vga virtio -cdrom l2eos.iso


The model is included.



