
If I were to self-host an open source model like Mistral or Llama, are there options similar to this as an API gateway to proxy and authenticate, create API keys, monitor spend per API key, etc.? How are people running open source LLMs in production? Thanks



We do support self-hosted models, as long as they're exposed as an API. Would this endpoint work for you? https://github.com/bricks-cloud/BricksLLM?tab=readme-ov-file...
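A common pattern is to run the self-hosted model behind an OpenAI-compatible server (vLLM, Ollama, llama.cpp's server, etc.) and put the gateway in front of it, so clients only ever see a gateway-issued key rather than the backend's address. Here's a minimal client-side sketch of that pattern; the gateway URL, proxy route, port, and key below are placeholders for illustration, not BricksLLM's documented defaults, so check the README for the actual proxy endpoint and key-creation API:

    # Client-side sketch: call a self-hosted model through an API gateway
    # that speaks the OpenAI wire format. URL, route, and key are placeholders.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8080/api/providers/openai/v1",  # gateway proxy route (placeholder)
        api_key="gateway-issued-key",  # key created in the gateway, not an upstream provider key
    )

    resp = client.chat.completions.create(
        model="mistral-7b-instruct",  # whatever name your self-hosted server registers
        messages=[{"role": "user", "content": "Hello from behind the gateway"}],
    )
    print(resp.choices[0].message.content)

The point of the indirection is that authentication, rate limits, and spend tracking live at the key level in the gateway, while the model server stays private on the internal network; revoking a key cuts off a client without touching the backend.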





