Chidori – Declarative framework for AI agents (Rust, Python, and Node.js) (github.com/thousandbirdsinc)
153 points by transitivebs on July 27, 2023 | 39 comments



Friend of mine just open sourced his agent framework.

It's written in Rust and supports some cool features like time-travel debugging, embedded code interpreter (Deno + Starlark), human-in-the-loop, and agent evals.

Sharing here cause he's an introvert :) Would love to hear what the HN community thinks!


As someone who might well be interested in using this (i.e. someone likely to make use of better tooling around LLMs), I'm really struggling to understand what it does. Can anyone provide a summary of what an agent is in this context, and an example of why this library makes things better?


In what I think of as engineering terms: an agent is a long-running service that invokes LLMs during its execution, in contrast to an LLM-driven application whose primary function is synchronous with the end user (like ChatGPT). There's a blurry line there, but that's how I think about it.

AutoGPT and BabyAGI are probably the two most well known examples so far.

A significant struggle when building these types of applications is understanding and debugging behavior many execution steps deep. This tries to assist with that by giving a framework for structuring the way your agent runs.

Maybe a similar concept is breaking out a web application into services, or individual route handlers, rather than implementing everything as one massive loop that responds to events.
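To make that concrete, here's a rough sketch of that shape in TypeScript. This is not Chidori's API, just the general pattern; callLLM and the goal text are placeholders I made up:

    // A long-running "agent": each iteration asks an LLM what to do next and
    // feeds the result back in as context for the following step.
    async function callLLM(prompt: string): Promise<string> {
      // Stand-in for a real OpenAI (or other) API call.
      return `pretend model output for: ${prompt}`;
    }

    async function runAgent(goal: string): Promise<void> {
      let context = `Goal: ${goal}`;
      for (let step = 0; step < 10; step++) {
        const action = await callLLM(`${context}\nWhat should I do next?`);
        console.log(`step ${step}:`, action);
        context += `\nStep ${step}: ${action}`;
        if (action.includes("DONE")) break; // model signals completion
      }
      // By step 7 or 8 it's already hard to see why the agent took a given
      // path, which is the "many execution steps deep" debugging problem above.
    }

    runAgent("triage yesterday's new support tickets");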


So essentially a long-running process that responds to events, etc. My understanding now is: when stuff happens, say a database gets updated or a timer triggers, the agent wrangles the data, passes it to OpenAI for some LLM heavy lifting, then wrangles what is returned somehow. Is that correct?


Yeah, but the value they provide IMO (or promise to, it’s only promised in the readme, not shown) is the observability.

Creating one of these event-driven systems is somewhat trivial; Node has EventEmitter built in, for example (toy sketch below).

Being able to monitor, debug, scale it and ensure it’s robust to failure is much harder.
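To illustrate: a toy TypeScript sketch of the EventEmitter wiring (the event names and the fake LLM call are mine, not from the readme). The wiring is short; the observability isn't.

    import { EventEmitter } from "node:events";

    const bus = new EventEmitter();

    // Fake LLM call so the sketch runs without an API key.
    const callLLM = async (prompt: string) => `model says: act on "${prompt}"`;

    bus.on("db_updated", async (row: { id: number }) => {
      const plan = await callLLM(`row ${row.id} changed`);
      bus.emit("plan_ready", plan);
    });

    bus.on("plan_ready", (plan: string) => console.log(plan));

    bus.emit("db_updated", { id: 42 });

    // The wiring is the easy part; tracing which event caused which LLM call
    // ten hops later, replaying a bad run, and surviving failures is not.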

In that respect it sounds kinda like tray.io. [1]

1: https://youtu.be/g_2HNv8HZcA


That is correct


Hello! I'm the author. If anyone has questions or feedback I'd really appreciate it!


What are some exciting ways you think people will use Chidori?


I hope that people will be able to build agents in a way that makes deliberate, methodical progress toward successfully executing the tasks they're given. Part of this could be tracing alone, which is something I explored using OpenTelemetry on my own agents. But I found that I also wanted to be able to understand the specific structure of the agent as I iterated on it. So Chidori attempts to let you do that, while having some properties that I feel assist with managing complexity!

Agents that work 80-90% of the time rather than 30-40% of the time would be awesome. And I think part of that is being able to iterate on specific pieces of the agent's behavior in a deliberate way.


You stole the name from Naruto...didn't you? Which is fine. It's been what, over 15 years since the episode/manga with that technique debuted? I guess I shouldn't be surprised someone young who watched it grew up and made a thing.


It's definitely a reference to it! It also broadly does mean thousand birds.


I don't know much about Naruto so initially I thought it could be a Persona 3 reference.


Lore dump: This is not a coincidence, as the characters themselves say the chidori is named for sounding like a thousand birds. The alternate name used by Kakashi, Raikiri, is related to https://en.wikipedia.org/wiki/Tachibana_D%C5%8Dsetsu


Interestingly, in the Shona language it means 'puppet', which I found very appropriate, as it is a functional framework that you make behave in whatever way you want.

This was my primary assumption re nomenclature


Ah, nostalgia.

Someone better make another called Rasengan.


Hello, I’m interested in adopting your framework for what I’m building.

Your docs are WIP, and I have a bunch of queries. How do I contact you?


The Discord server linked from the GitHub repo would work! I'd love to find out what the specific questions are. Alternatively, a Twitter DM would also work: @kveykva.


Well... I'll be damned. I've been wanting to toy with LLMs in a local-only, simple "command<->LLM glue" style of development. If I understand this project right, it looks like Chidori implements most of the foundational stuff I wanted to do, notably defining commands (nodes in this, I think?), monitoring node IO and LLM IO, etc.

Looks like I'll need to give this a try. Awesome!

edit: My goal is a single-binary, local-only install though, so it sounds like Chidori struggles here since OpenAI is required. Unfortunate for now.


I've built something similar with a focus on returning structured data from the Agent and making it simple to expose an Agent as an OpenAPI. If that sounds interesting, I've written up some details here: https://wundergraph.com/blog/beyond_functions_seamlessly_bui...


Smaller target, but "Agent Protocol" (https://news.ycombinator.com/item?id=36854102, https://github.com/e2b-dev/agent-protocol) was submitted yesterday, and it reminds me of this a little. This is definitely a bigger, more overarching thing.


I like the idea of the agent protocol they define; I do feel that long term there might be a common interface for agent <-> agent or user <-> agent communication.


Hey, CEO of e2b here (we're the ones currently pushing the protocol). Congrats on the launch of Chidori! Great work!

> I do feel that long term there might be a common interface for agent <-> agent or user <-> agent communication.

Definitely! We're just starting really small so we don't build the wrong thing or the wrong abstraction.

We would love to hear your thoughts and feedback on the direction of the protocol, and how it could help Chidori and your users. Essentially, how we can get it to more agent devs and make it useful for them.

Would you be up for a chat? I'm @mlejva on Twitter


While people think this has something to do with Naruto, I would like to think it is some sort of future AI "Black technology" from Whispered.

(Yes, Chidori has a lot of other uses in Japanese.)


I love the Naruto reference.


This is great. I had been wanting a framework like SAM or Serverless for Lambda functions, but for running LLM apps.


Dude this is so cool. Reliable agents are freakin hard to build, excited to try this out


This looks very cool!

I'm most interested in the reactive runtime though, and I like the temporal.io and Timely Dataflow references. Is it possible to just have access to that in Rust and leave out the agent bits?


You could build a system that never invoked prompt nodes, and I think that would be equivalent! Re: timely dataflow, there are only hints of that in there right now - but I intend to expand on it.


can’t wait for Rasengan and Sharingan frameworks


The temptation to call something related to the replay functionality Sharingan is pretty strong now that I've used this name haha.


Mangekyō


Requires an OpenAI API key. It feels like that dependency is becoming too prominent.


Can this be integrated with local LLMs, or does it only support OpenAI?


Currently it only supports OpenAI. I'm looking into patterns for supporting local LLMs! I think the best approach might be to allow you to point the API endpoint at your own, and assume you're running something that mirrors the structure of their API.

The main escape hatch for everything right now is custom nodes, though. But then you'll need to bring your own templating pattern.
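If it ends up working that way, the consumer side might look roughly like this (TypeScript sketch; the URL and model name are hypothetical, assuming a local server that mirrors OpenAI's /v1/chat/completions shape, and Node 18+ for global fetch):

    // Sketch only: same request shape as OpenAI's chat completions API,
    // pointed at a made-up local endpoint instead of api.openai.com.
    async function localChat(prompt: string): Promise<string> {
      const res = await fetch("http://localhost:8080/v1/chat/completions", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "local-model", // whatever your local server exposes
          messages: [{ role: "user", content: prompt }],
        }),
      });
      const data = await res.json();
      return data.choices[0].message.content;
    }

    localChat("Say hello").then(console.log);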


Ah, yeah, I definitely need local-only support.

My goal is to build a single-binary, local-only command<->LLM interface. Chidori looks to overlap with that significantly, but of course I'd need custom LLM resolution.


[flagged]


Is this a sarcastic joke, or is it a serious comment? I honestly cannot tell.


Yes. Poe's Law.


Also, even if it didn't have it, you could use GPT-Migrate (shameless plug): https://github.com/0xpayne/gpt-migrate


It's got JS and Python too.



