Amazon blew Alexa's shot to dominate AI, according to employees (2024) (fortune.com)
68 points by williamstein 1 day ago | 57 comments





> Overall, the former employees paint a picture of a company desperately behind its Big Tech rivals Google, Microsoft, and Meta in the race to launch AI chatbots and agents, and floundering in its efforts to catch up.

I obviously don't have any inside information, but this is a weird take to me from the outside.

Google has butchered Assistant since the advent of LLMs. My Google Home devices have lost basically all of their functionality, but in the meantime the new Gemini "replacement" is still by all accounts a disaster.

Microsoft has gone through all the right motions to satisfy investors—they've pushed their Copilot button onto new keyboards, pushed their Copilot tech into all their cloud products, and started selling "AI-ready" stickers on laptops. But from the consumer perspective, the reception has been not just mixed but overwhelmingly terrible! No one asked for these features, and no one wants them.

Meta, meanwhile, has released Llama, for which we're all grateful, but in terms of products what do they really have to offer? A much-maligned AI-powered fake social media feed?

None of the pre-existing giants are performing particularly well at actually "winning" the AI assistant space. Out of the three named, only Microsoft has any claim to serious mindshare, and that only through their relationship with OpenAI.


> Google has butchered Assistant since the advent of LLMs. My Google Home devices have lost basically all of their functionality

I’ve seen this said a few times lately, yet I’ve not really seen much of an explanation of exactly how it’s regressed.

Maybe it’s just a difference in how we use ours, but it still works the same as it always has.

Our uses are fairly basic, but I’m not sure how much more complex it needs to be:

- tell us the current/forecasted weather

- pick random numbers/flip a coin (playing board games)

- ask it for information about X

- play music on Spotify

- carry out actions on our home devices

All of these have always worked, and continue to work just fine. Far from “lost all of their functionality” (that claim feels exaggerated).

Can you/others go into more details on how Google Home has become broken lately?


I had mine set up to turn the lights off when I said 'lights out'. Now when I say 'lights out' it tells me that 'Lights Out' is a 2016 movie.

Yeah, I have noticed that some commands are now interpreted differently/incorrectly.

But I’ve just started being less ambiguous with commands now. E.g. simply “turn lights out” has worked for me.

So I guess I wouldn’t call that “broken” or “butchered”.


Ours doesn't answer questions of any sort and hasn't for months. "Sorry, I can't answer that, try asking something else". The only advantage the Home has had over Alexa was question answering, and that's gone now.

Yes, most of the features you describe are still there, but I can also implement those myself with the right hardware and plan to as a replacement. Google killed what unique value there was to their product.


The current favourite frustration on Reddit seems to be that in the past you could cancel requests by saying "Never mind".

Apparently now saying "Never mind" gets you an in-depth speech about Nirvana's album Nevermind, or it just starts playing it.


I am plagued by this and by misinterpretations of my regular speech. I must rename some devices more uniquely now.

I have a 20 dollar per month subscription to Perplexity Pro. I can ask it about various topics and it provides me with accurate and well-sourced information with citations. I have access to hundreds of these searches per day.

To me, Perplexity Pro is clearly winning in terms of quality when I use it for this purpose. I have tried the 20-dollar ChatGPT and the 20-dollar Google Gemini plans as well. ChatGPT is pretty good and Google Gemini is pretty bad even at 20 dollars a month, but Perplexity Pro is by far the best AI for me. It's gotten even better with the recent DeepSeek integration that they have done.

Anyway, when I compare Perplexity and ChatGPT with the things coming out of the Big Tech companies, I can't help but shake my head. Hasn't Amazon spent billions and billions on Alexa? Even then, Amazon is clearly behind...


Off-topic question from the main thread: is the Perplexity Pro subscription better than the ChatGPT one for queries mainly around daily programming questions?

I use Claude 3.5 Sonnet for coding assistance (and pay $20/month for the "professional" plan).

I recommend it!


> ...the new Gemini "replacement" is still by all accounts a disaster

Small nit: the initial rollout of Gemini was a dumpster fire. It is now quite good. For my use cases, I can't get any other current LLM to give me better results than Gemini 2.0 Flash. It's also free.

But even that kind of proves your point, right? Pretend you 100% believe me without verifying: that means that within a year or two the "winner" mantle has transferred between ~3 companies. This is not a cheap mantle to keep passing around. The AI wars are going to get heated over the next year or two.


I'm happy to see this because it's my experience with Gemini too. Google did terribly with Bard (clearly an emergency launch to say "Hey, we're here too!") and Gemini 1.0, and even Gemini 1.5 was only decent in the top 'Pro' tier.

Gemini 2.0 has been a huge step forward and I'm using Flash daily at work for coding. Very liberal limits and much cheaper than OpenAI too. I'm easily in the $0 tier as I don't exceed 15 requests per minute or 1500 per day. I use it with Chatbox AI and my Google Gemini API key.
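
If anyone wants to try the same setup, the API side is only a few lines. A minimal Python sketch, assuming the google-generativeai client and a free API key from Google AI Studio; the exact model name ("gemini-2.0-flash") is an assumption and may differ by the time you read this:

    # Minimal sketch of calling Gemini 2.0 Flash on the free API tier.
    # Assumes GOOGLE_API_KEY is set in the environment; model name is an assumption.
    import os
    import google.generativeai as genai

    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    model = genai.GenerativeModel("gemini-2.0-flash")

    response = model.generate_content("Write a Python one-liner to reverse a string.")
    print(response.text)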

They've been a bit late to the party though. But the fumbling from Amazon and Apple has opened an opportunity to STILL be ahead and launch the first powerful AI assistant this year. It's no doubt one of the reasons they developed Gemini 2.0 as an agentic AI and I'm only waiting for hardware refreshes now.


Just to be clear, are we talking about the Gemini based Assistant replacement? The last time I checked that version couldn't even set reminders or timers, so if that's improved since then I may actually try switching.

And yes, agreed that this space is very fragile right now for all the companies.


Don't forget Apple. They just forced an update on my laptop and now I have a text to image playground that it feels the need to keep telling me about.

All these years how did I live without a text to crappy image generator?


> keep telling me about

You may have something misconfigured. I have gotten exactly one feature notice that I forgot about until this moment.


Nitpick: it wasn’t a forced update, and I got exactly one notification about the Playground stuff that I remember.

This is what baffles me most. When sales goons and marketing nitwits get control over software, it smacks of utter desperation and a lack of confidence in the product. And for something as core as an OS?

Insanity.

My (seemingly) core OS has no business nagging me.

You know, I bought a car, and it nags me about literally 4 or 5 things every time I use it? Some of which can never be deactivated?

Sheer stupid.


Amazon Lab126's Alexa team had the following interrelated issues:

1. A terrible tech stack and an absolutely poorly maintained pipeline

2. Generally unenthused employees

3. Office politicking and a broken culture

4. An unbelievable amount of penny-pinching and poor budgeting

I had a colleague on the Alexa team who called the place a sort of purgatory for people who couldn't get hired elsewhere. Another colleague went to Google Brain (when it was not as prestigious) and told me it was not the money that made him leave. Individually, they may have been good engineers and researchers, but the team was broken.

What is surprising is not that Alexa failed. What is surprising is that it managed to even build a half-decent product that was, at a point, a leading AI product and platform.


This. We tried to work with the Alexa team and it was a dysfunctional mess compared to the rest of AWS. It wasn't just a different culture, it was an entirely different country.

We ended up "closing down the project with enthusiasm".


I keep waiting for Amazon to admit defeat and partner with OpenAI or someone else to upgrade Alexa. That said, it also appears that the majority of folks just don't care about having a voice-activated assistant. Most people just don't use their Alexa beyond "Play music" or "What's the weather today". I'm unsure if a smarter AI would help in any real way.

When the Echo started adding “by the way …” to responses without a way to turn that off, I gave up on it. I got a HomePod Mini to replace it and it’s fine.

I’d love to know what the people in Amazon who came up with that thought they were doing. Did they not use the product themselves? Surely they knew their users mostly didn’t want that. Why not add a way to turn that off?


My WTF moment was when my ecobee thermostat, which had somehow integrated itself with my Amazon account, started flashing lights and randomly announcing that there were notifications for me.

One time it was some package delivery notification and I let it slide. The next time was an ad for some Amazon service, which was an immediate “handle the crying baby for a few minutes while I get this shit eradicated from our house” moment.


I'm trying to imagine billionaire Jeff Bezos being happy to be told "by the way" after every interaction.

Who am I kidding though, the odds he's using Alexa are zero, eh?


Isn’t the story that Echo mostly exists because of Bezos? As in, it was his baby and wasn’t immediately discontinued because he wanted voice assistants to be a thing.

Whether that’s still true in 2025 I’m not sure, but the chances of him using it seems fairly high given its origins.


People as rich as Bezos have an army of human personal assistants to satisfy their every whim.

He probably has 10 actual people named "Alexa" just to cater to his every whim.

You mean Amazon probably routes his Alexa to 10 people to process.

No, I mean he isn't under the same corporate surveillance the rest of "alexa" users are. He has actual humans named Alexa ready to take any order he gives them.

I'm being facetious of course, but that's probably what I would do if I were a tech mogul with billions and billions of dollars.


> Overall, the former employees paint a picture of a company desperately behind its Big Tech rivals Google, Microsoft, and Meta in the race to launch AI chatbots and agents, and floundering in its efforts to catch up.

I feel like this is the feeling at every big tech company

Being on the outside looking in it seems like these companies rapidly innovate

But when you actually work at one, it feels like everything moves at a glacial pace and the “right” answers are never the ones implemented

A good friend of mine has worked at Nvidia for years and even he was complaining that “nothing gets done” there


Don’t forget kitchen timers! Besides music, a top-3 use case is being able to ask for a timer when your hands are messy with ingredients or loaded with a dish going into the oven.

We tried the Alexa/Home lighting solutions this last year, and it gets so annoying to ask every morning to turn the lights on. Even worse when one of the bulbs drops from the network or the Wi-Fi is down.


Wi-Fi is really sub-optimal for light connectivity. Without venturing into high-end home automation systems like RadioRA, your best options for lighting as far as connectivity are concerned are probably Lutron Clear Connect > Z-Wave > Zigbee/Thread. And then any of HomeKit/Home Assistant/Hubitat to handle the automations.

> Lutron Clear Connect > Z-Wave > Zigbee/Thread

Lutron Clear Connect is great, but it’s not even a standard within Lutron. It’s a whole bunch of things, often wildly incompatible with each other. At least most of their stuff is compatible with Pico.

But I’m surprised you put Z-Wave above Zigbee. On the one hand, Z-Wave is a nice ecosystem where approximately everything is compatible with everything else, and even generally unfriendly companies publish detailed integration guides. On the other hand, the protocol itself is unbelievably bad. It does not recover well from transient connectivity issues. Meshing is janky, picks poor routes, and has seconds of latency even when working correctly. The range of non-LR links is beyond pathetic, at least in the US regulatory region. (I found some references suggesting that US rules on power spectral density play poorly with the protocol and the EU region is a bit better.) At least long range mode has tolerable, but far from amazing, range.

I haven’t used Zigbee and Thread much, but my understanding is that Zigbee’s mesh should expect much lower latency than Z-Wave, and that Thread is even better. And Thread can supposedly sort-of-transparently use Ethernet backhaul to extend the mesh. Sadly the Thread ecosystem appears to be a mess.


> Lutron Clear Connect is great, but it’s not even a standard within Lutron. It’s a whole bunch of things, often wildly incompatible with each other. At least most of their stuff is compatible with Pico.

Yeah, what I was trying to get at was specifically their Caseta line with 433 MHz Clear Connect (and not their more expensive solutions, which are also great, but a completely different tier).

> But I’m surprised you put Z-Wave above Zigbee.

I was a late-comer to Z-Wave, so I haven't used anything below Z-Wave 800 devices, and it's been solid for me, though I've been very judicious in the quality of devices I pick up, and am careful with device configuration to limit chatter, which can wreck Z-Wave networks. Zigbee has also been solid for me - the real upside for me with Z-Wave is simply the better range/reduced interference of 900 MHz vs 2.4 GHz.

> I haven’t used Zigbee and Thread much, but my understanding is that Zigbee’s mesh should expect much lower latency than Z-Wave, and that Thread is even better. And Thread can supposedly sort-of-transparently use Ethernet backhaul to extend the mesh. Sadly the Thread ecosystem appears to be a mess.

Thread's been pretty good for me too - but again, I'm pretty careful about devices. I use Ethernet-connected Apple TVs on each floor of my house for the backhaul and don't have any Thread devices that aren't also HomeKit compatible.


> better range/reduced interference of 900 MHz vs 2.4 GHz.

What country / regulatory region are you in? With Z-Wave, non LR, in the US, with 800 series devices and USB stick (properly extended on an extension cable) or 800 series devices meshed to each other, I sometimes struggle to get a connection at a range of 20 feet or so, with a single ordinary drywall + wood wall in the way or even sometimes line of sight. LR is better, but not fantastic.


I'm in Canada, so same frequency/power as the US, AFAIK. Using a HomeSeer G8 on a Home Assistant Green with a USB extension cable. I can't really speak much to the range - it's been enough for the handful of devices I'm currently using, which are all Shelly (rebranded from Qubino) mains-powered Z-Wave devices.

20 ft doesn't necessarily sound unreasonable with a wall in the way, though... I wouldn't expect Zigbee/Thread to perform better with comparable devices. I haven't found any of the low-power mesh networks to have especially great range when not taking advantage of the mesh.


The trick is a local-first solution like Home Assistant, and then use automations. No need to ask to turn on the lights; it just does it.
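
For the lights specifically, the "it just does it" part is normally a small automation inside Home Assistant itself; the sketch below is just the equivalent service call made from an external Python script (e.g. on a schedule), with the host, token, and entity ID made up:

    # Hypothetical sketch: ask Home Assistant to turn on a light via its REST API.
    # In practice you'd usually write this as a YAML automation with a time or
    # sunrise trigger instead; host, token, and entity_id here are placeholders.
    import requests

    HA_URL = "http://homeassistant.local:8123"
    TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"

    resp = requests.post(
        f"{HA_URL}/api/services/light/turn_on",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"entity_id": "light.kitchen"},
        timeout=10,
    )
    resp.raise_for_status()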

We ask Google Home many questions, but it’s always “sent the result to your phone”. Things like historical facts and trivia, since we put our phones away during meals and family time but still want to query the world.

I would love a way to ask for recipes and have it look them up in my NYT Cooking subscription and walk me through them step by step.


I think Amazon hasn't managed to find a way to monetize Alexa. It was Bezos' pet project, and when he left it languished. Our main uses for Alexa are setting timers, setting alarms to wake up to, and the random trivia question.

> Most people just don't use their Alexa beyond "Play music" or "What's the weather today".

That's because voice assistants are and have always been terrible; the only things they could do with any advantage over other input methods were playing music, telling you the weather, setting an alarm, and setting a timer. People who liked using them for these things liked them. If you didn't need them for these things, there was no good reason to put a direct open line to Amazon, Google, or Apple in your house just to have it read you what Wikipedia says about something.

People will love real assistants to death, especially local ones that learn from their personal habits. During the wave of voice assistants, they were promised that futuristic secretary through fraudulent advertising and endless industry marketing, and instead they got talking clock radios connected to advertising companies.


There's a single alternative use case worth mentioning (two, actually): "Hey Siri, delete all my alarms" (as you can't multi select via the gui)

...and "Hey Siri, move my calendar event at 3pm tomorrow to 2pm Friday"

You have to be pretty deep in the weeds to consider doing these operations with any confidence via Siri, but hey! They mostly work! :-P


They don't really need OpenAI. There's literally a dozen LLM providers that would be sufficient for what an LLM-powered Alexa would do. Amazon has their own LLMs too. The underlying LLM technology is hardly the issue, at least now. (I'm sure it seemed like it was a year ago.)

IMHO the basic modalities have to change to make use of LLMs in a voice assistant. Alexa and other assistants are built around short utterances that map to a command or simple question answering. There's not much room to make that smarter; things like chained commands aren't just hard to parse (except with LLMs), they are hard to compose.
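
To make the LLM-parsing part concrete, here is a hypothetical sketch of turning one chained utterance into structured commands; the prompt, JSON schema, and model name are all made up, and actually executing the resulting actions (the genuinely hard part) is left out:

    # Hypothetical sketch: use an LLM to parse a chained smart-home utterance
    # into structured commands. Schema, prompt, and model name are assumptions;
    # dispatching/executing the commands is not shown.
    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM = (
        "Convert the user's smart-home request into JSON of the form "
        '{"commands": [{"action": "...", "target": "...", "args": {}}]}. '
        "Return only JSON."
    )

    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": "dim the kitchen lights to 30% and play some jazz in there"},
        ],
        response_format={"type": "json_object"},
    )

    for cmd in json.loads(resp.choices[0].message.content)["commands"]:
        print(cmd)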

There are a lot of household use cases they could tackle, but it's almost like building a whole new product. I doubt they have the organizational capacity to do that.

There's probably another category of development in making it more socially aware, also building on its position in a household. But it would have to change some of its privacy expectations to be "aware" beyond its wake word. They have pretty good speaker identification, and I don't know what all they could do with their mic arrays and a sort of audio/spatial awareness, but I bet some interesting stuff is possible. By socially aware I mean adapting length of response, follow-ups, giving more continuity over time, and so on. That's something LLMs do well, but it involves hidden state and theories of interaction that these big companies are very reluctant to implement. They are only comfortable implementing it at the training stage, where it ends up in opaque model weights and no one takes any responsibility for the result.

From the article: "Research scientists who worked on the LLM said Amazon does not have enough data or access to the specialized computer chips needed to run LLMs to compete with rival efforts at companies like OpenAI."

To me this sounds like folks who don't know what they are trying to achieve and are coming up with excuses. (I thought there must be a term for this, and asked ChatGPT, and am quite delighted with all the options it gives: Strategic self-handicapping, Pretext paralysis, Deferral delusion, Obstruction illusion, Motivated impossibility – all of these seem to be hallucinated terms, but each focuses on a different aspect.)

But it's true they could also go deep with something like GPT's Advanced Voice Mode. That actually does seem like a big technical lift. It's a change in the product, but besides the technical difficulties of better turn-taking, speech generation, memory, etc., it would otherwise be possible to drop it into the product. It would change the relationship of Alexa with the household, but it's pretty clearly compelling.


When it mirrors the exact same experience as talking to a human (OpenAI's ChatGPT is slowly getting there), a human-like AI that does everything for you and interfaces with other AI agents (from big and local businesses to your friends' and family's agents) to get things done for you, then I believe that's the tipping point.

I use and talk to ChatGPT while driving to get things done and as a knowledge base. For example, I wanted to get my car towed and junked for the best deal. GPT told me the nearest junkyards and all the details needed, like needing the title and having to sign it over to them, but it didn't interface with the local junkyard's AI agent to schedule it all for me and send me an email or text confirmation. I wish it did.


I want to look at stuff, not hear iterated lists. A smart TV plus a computer-use agent would be pretty close to Star Trek. Just need it to map intentions like "Computer, what is my schedule like next week?" to opening the calendar app, scrolling to next week, and showing it on the screen.

Yeah, I'm in the camp where AI just doesn't offer too much. Especially given how much worse it makes the basic timer and radio tasks. It is incredibly frustrating when it falls down on the basics.

If anything, I'd love ditching the cloud for the basics.


How would a smarter Alexa allow them to “dominate AI”? What even is the vision here that would allow domination over competitors like Google and allow a massive boost in profits? It all seems like a giant waste of time for them.

Millions of people have a box in their home that they talk to and have for a decade.

Imagine if that box were able to (somewhat) intelligently answer questions.

I'm sure everyone here has asked Alexa random questions... sometimes you can get an answer from Wikipedia, but mostly it's just "Sorry, I don't know that", and people settle for a subset of standard voice commands to play music or dim the lights.

Amazon had 10,000 people working on this product for a very long time, and had the lead in market share of a device that people talk to regularly.

It's one of the biggest technology balls to ever be dropped.


Every reference to Amazon and Alexa in your comment would still be completely accurate if replaced with Google Home or Apple HomePod.

So the vision for Amazon is their box and underlying AI would forever be ahead of Google and Apple? So incredibly far ahead in fact, that Google and Apple’s massive amounts of user data and decades of habitual use (through Google search, Apple/Google maps, YouTube, Apple/Google pay, and a dozen other multi-billion dollar services) still isn’t enough to prevent Alexa’s AI domination? Again, it all seems like a massive waste of time.


Maybe look at how terrible Siri and Gemini are.

I think we have seen there isn't a huge incentive to be the firm exploring the cutting edge of this tech when you can save big by waiting for other firms to spend billions on dead ends and on incrementally larger models, which at any point can be distilled by a competitor for 1/10th the price of the final training run. I imagine we’ll start to see that model structure and weights are very easy targets of low-grade corporate espionage. A company with Amazon’s resources should be able to catch up in under a year if they poach the right talent.

It seems to me that they’re doubling down on Anthropic/AI for commercial use, at least anecdotally from my friends at AWS. I'm not sure what Amazon proper is looking to do with it in the consumer space, or if it’ll have any real impact.

To what end? Alexa is a huge financial failure that cost the company billions. This will only increase costs and not increase the ROI.

It's not too late. Google and Apple have not released improved AI based home voice assistants.

I'm expecting it to be available any day now, included with a paid subscription to Prime+ (new), Google One, and Apple One.


The one that had the gaffe about surreptitiously providing employee access to customer data? Yeah, yikes.

Nah, Google was number one, they just dropped the ball as they always do.

Give ownership of the company to the workers?

(2024)

mirror:

https://archive.is/JJsuW

Alexa's answers are so embarrassing that I always talk to it in whisper mode.



