
> The predictive nature of LLMs are perfect for voice assistant use cases

But people in general don't seem to be interested in voice assistants. There was huge hype for a while — you practically had to uncheck boxes to avoid accidentally ordering an Amazon Echo — but the hype vanished as quickly as it arrived. To my knowledge, Amazon has yet to make any substantial revenue from Alexa.

With LLMs I would expect this to be even more of an issue, given how costly inference alone is. Yes, the utility will likely increase substantially with LLMs compared to previous language-processing techniques, but I don't think that will outweigh the general disinterest in the technology.




I'm not interested in voice assistants because they suck at the moment.

When they achieve something like 99.9999% accuracy and near-undetectable latency, I will become much more interested in them.

As it stands now, if I have to repeat something -- ever -- I'd rather just type it.


People aren't that interested in voice assistants atm, though I know plenty of people who use them to look up stuff, start music, etc.

When voice synthesis, long context, etc. mature, I'm pretty sure a lot of people would love a private "Her"-like friend/assistant that could be a therapist, a PA, and a friend all at once, and I don't think we're that far from it. Dystopian and lonely as that is.



