[flagged] Apple WWDC Event Will Show Whether It Can Be a Force in AI Industry (bloomberg.com)
23 points by mfiguiere 3 months ago | 52 comments



The AI industry needs to show that it's useful, not that it's innovative. People won't come if you just build stuff. Outside of the tech industry, I have found people have literally no interest in this, or just disdain for it. Personally, I haven't found a single use for any of the current AI tools out there that materially improves my life, other than the de-noise in Adobe Lightroom.

I suspect Apple are the most likely organisation to be able to change this, as they have a track record of taking existing technology, making it useful, and demonstrating it effectively.

If they can't do this, or it's a disaster, I expect it will, by proxy, reflect badly across the entire industry. This will be an interesting year.


I simply don't understand this take. Yes, I think AI is incredibly over-hyped at the moment, and I've come to totally agree with Yann LeCun's assessment that LLMs have nothing to do with AGI, but I still think it's an incredibly useful tool if one is aware of its limitations. Things like:

1. ChatGPT is great at translations (at least for English-French), and I've found it to be a great language learning assistant.

2. It's incredibly useful for me when it comes to writing a first draft of complicated SQL queries.

3. It's essentially a faster document search and summarization tool when I don't know the right keywords for what I'm looking for, e.g. when I want to ask "I want to code X, which libraries can help me do that?"

4. I use cursor.sh as my IDE and I find it incredibly useful, especially for finding answers about a large codebase when I'm a newbie to the repository.

5. I've found ChatGPT to be great for trip ideas and planning.

In short, I think generative AI is a powerful if flawed tool, and I think the disillusionment comes when people think it's more than that.


I think the disillusionment here comes when you find out that these things are mostly smoke and mirrors that tend to generate very suboptimal solutions. For example, I am forever having to shoot down naively and lazily generated ChatGPT code before it hits production, and the engineer is not aware of the problem at all. Yes, it can write SQL, but does it understand the index strategy on the destination table, or the consequences of running thousands of those queries a minute against a production database? No. Does it have a deep understanding of context rather than just syntax (ask a French translator what they think of it)? No.

As for trip planning and ideas, which is a speciality of mine, it is always better to find someone local and ask them, because a huge portion of the relevant information isn't something a model can infer; it isn't written down anywhere. Case in point: I'm off to the Azores islands shortly, and it can't generate anything useful for me. I had someone suggest that already. On top of that, it is compelled to over-explain, sometimes to the point of incorrectness, which it likes to do with travel advice. That becomes dangerous. I found someone local to talk to instead.

So personally I think the LLM and question-based uses are limited and quite ridiculous viewed objectively. They barely scratch the surface of the problem, and the hit rate is lower than I'd get doing the legwork myself, which I do not find useful. And the information is hard to verify and judge as safe without deeper contextual knowledge. This is a complete mire.

The other use cases, like classification and search, are where I see it winning, but those aren't sexy and don't play on our emotions well enough to make us part with cash. They are what will be left after the hype, of course.

Really I think people are buying into a romantic fantasy at the moment.


>Yes it can write SQL but does it understand the index strategy on the destination table or the consequences of running thousands of those queries a minute in a production database? No.

It does if you instruct it well enough.
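As a concrete sketch of what "instructing it well enough" might look like in practice: the prompt has to carry the schema, the indexes, and the load constraints, not just the question. Everything here (the helper name, the DDL, the constraints) is hypothetical and only illustrates the idea from this comment, not any real workflow mentioned in the thread.

```python
# Hypothetical sketch: pack schema, indexes, and operational constraints
# into the prompt so the model can reason about more than syntax.
def build_sql_prompt(question: str, ddl: str, constraints: list[str]) -> str:
    """Assemble an LLM prompt that includes the context needed to
    reason about index usage and query volume."""
    lines = [
        "You are drafting SQL for a production Postgres database.",
        "Schema and indexes:",
        ddl.strip(),
        "Operational constraints:",
    ]
    lines += [f"- {c}" for c in constraints]
    lines += [
        "Explain which index each clause uses before writing the query.",
        f"Task: {question}",
    ]
    return "\n".join(lines)

# Illustrative schema, not from any real system.
ddl = """
CREATE TABLE orders (id bigint PRIMARY KEY, user_id bigint, created_at timestamptz);
CREATE INDEX orders_user_created ON orders (user_id, created_at);
"""

prompt = build_sql_prompt(
    "Latest 10 orders for a given user_id",
    ddl,
    ["~2000 executions/minute", "no sequential scans on orders"],
)
print(prompt)
```

Whether the model actually honors those constraints is exactly the open question in this thread, but without them in the prompt it has no chance to.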

We're still really early and companies are figuring out how to best use it. It's been 1 year since GPT4 was released. We didn't invent the lightbulb, electric motor, telephone, radio, refrigerators etc. until many years after the invention of electricity.

Right now, LLMs are very useful for many tasks. Over time, they'll continue to get more useful and inventions will be built on top of them.


It does if you instruct it well enough, with maybe $100k in GPU time and a team of data scientists who have expertise in the domain.

The other option you have is to hire somebody who can write a SELECT with a JOIN in SQL.
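For reference, the "SELECT with a JOIN" baseline this comment invokes is a few lines of SQL. A minimal sketch using an in-memory SQLite database (the schema and data are made up for illustration):

```python
import sqlite3

# Toy schema: users, their orders, and an index on the join key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
CREATE INDEX orders_user ON orders (user_id);
INSERT INTO users VALUES (1, 'Ada'), (2, 'Bob');
INSERT INTO orders VALUES (1, 1, 9.5), (2, 1, 20.0), (3, 2, 5.0);
""")

# The kind of query the comment means: a join plus an aggregate.
rows = conn.execute("""
SELECT u.name, SUM(o.total)
FROM users u JOIN orders o ON o.user_id = u.id
GROUP BY u.name ORDER BY u.name
""").fetchall()
print(rows)  # [('Ada', 29.5), ('Bob', 5.0)]
```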


None of those things were commoditised in even a 20 year span. Many more things died.

Effectively everyone is running on faith. That’s not good enough.


But electricity didn't die. Many products built on top of electricity died.

I don't think AI is running on faith in 2024. I think it's running on very realistic projections and what is available now.

I have the exact opposite view from yours. I actually think the AI hype is not overblown. I think it's "underblown". I think we have a very straightforward path in the next few years for AI to be useful everywhere.


> I use cursor.sh as my IDE

How does it compare to Copilot or the AI Assistant in JetBrains IDEs?


I can't answer that anymore. When I first used it, it was much better than Copilot, especially for "ask my repo this question"-type queries, but I think Copilot may have added that functionality since I last used it.


>The AI industry needs to show that it's useful, not that it's innovative. People won't come if you just build stuff. Outside of the tech industry, I have found people have literally no interest or just disdain for this.

you could've said the same thing about computers, back when they were room-sized machines full of vacuum tubes, capable of rudimentary calculations and not much else. would you have imagined that less than a century later, we would have devices with more computing power than a million of those in our pockets?

when the AI industry stops obsessing over lobotomizing their models into being useless for anything except basic bitch customer support, content moderation, and a mediocre Stack Overflow substitute, there will be tons and tons of applications for generative AI. my own industry - gamedev - is less than a decade from dialogue and audio being generated on the fly, with human writers doing maybe 10% of the work and voice actors just providing a few samples. of course, that will not happen until LLMs are allowed to produce anything but soulless garbage that strictly adheres to FAGMAN sensibilities.


> you could've said the same thing about computers, back when they were room-sized machines full of vacuum tubes, capable of rudimentary calculations and not much else

Electronic computers were seen as useful right from the start; they clearly fulfilled an existing need. No-one was handing out VC money to make computers of possible speculative future interest in the 1940s.


I must be the only one on HN who finds GPT4/GPT4o extremely useful and uses it multiple times a day for different things.

I'm aware of its limitations and work around its limitations. For example, I'm not going to blindly push the code it gives me without checking it carefully and testing it. I'm not going to completely trust the legal document that it generates for me but I will use it for the first draft and ask it questions.

ChatGPT is excellent for tasks that are on the low end of the risk spectrum but are tedious.


Not really. Your point is hyperbole. Personal computing was a paradigm shift. This is a relatively minor change in a small subset of it.


it was a paradigm shift, yes, but you would never guess that if you were looking at ENIAC in the 40's.


It's not like "AI" is anything new though, it's been around since the 1950's and we've been through several AI winters since then:

https://en.wikipedia.org/wiki/AI_winter

There's no reason to believe that this time it'll be different.


I'm wearing Dick Tracy's watch so the ideas were already out there...

The point is that idea was actually useful!


> Personally I haven't found a single use for any of the current AI tools out there which materially improves my life other than the de-noise in Adobe Lightroom.

Huh odd. I use the copilot mobile app continuously to find things that would be laborious or annoying with search.


Well, anecdotally, an Uber driver who is also a real estate agent told me that he used ChatGPT daily and found it very helpful.

Just to put it out there.


Kinda the opposite in my circles - techies are just meh, while creatives use it for tons of writing or summarising stuff. Of course, tons of friends have never even heard of it.


Even if they can't do anything better than other companies, if they can do it in a way that avoids having my questions sent to the cloud, I'll be much more likely to use it. I don't mind if it's a generation behind (and will increasingly feel this way as things progress), if the tradeoff is rock-solid privacy.


I'm more interested to see if Apple can make something useful out of this AI mess... And if Apple hasn't just become another trend-chaser.


I’m increasingly bearish on Apple. I have the impression that they basically make toys for adults these days (no, not that kind of toys). My initial amazement at the Vision was immediately dampened by the fact that it was released by Apple and by realizing they get to decide what I’m allowed to run on it (you know, like on a kids' device). I’m sure their AI will follow the same pattern: locked down and neutered to the point of being a gimmick.


Apple sells consumer devices. If you want to tinker with AI, get a TPU.


And they sell them to a massive audience who generally don’t give two hoots about AI.


They have the largest installed base of AI engines, in every iPhone, iPad, and Apple silicon Mac.

If they can't do anything useful with AI, no one can.

Hopefully more than just identifying faces in photos. Siri needs an upgrade; it could become much more "conversational" and take advantage of local processing instead of sending everything off to Apple servers to be processed.


This.

Apple can gradually add AI capabilities to their products, as they do not need to sell it urgently, unlike the rest of the tech companies out there screaming about it from the rooftops.

Like I said before, Apple will likely announce a hybrid AI solution for Siri for both online and offline capabilities. I expect them to eventually use a local LLM to race everyone to $0.


So you are saying I should sell Nvidia?


There is a different reason to sell Nvidia, one that affects Apple as well: the geopolitical issue between the US and China over Taiwan.

Won't be surprised to see a scare in the AI industry and a wider disruption over chip manufacturing as tensions escalate.

At least Apple is de-risking their chip manufacturing outside of China. [0]

[0] https://www.businessinsider.com/companies-leaving-china-dive...


The thing is, the challenge is monetizing AI. If AI doesn't let Apple sell more devices, it won't work out.


This might be more defensive, about not letting someone else with better AI features sell more phones (if that's what consumers want).


The problem is that there is no indication so far that users want an integrated AI experience. It feels like people are just too attached to app-specific habits like "I use this app for this."

We will see though.


I think one of the most interesting aspects of Apple and AI is the fact that they will use OpenAI's models.

I used to think that OpenAI would like to compete with Apple one day by turning the entire OS into a giant LLM. We got a hint of this when ChatGPT was big on its third-party apps and Function Calling. They may still have that goal, but it's not realistic in 2024. In the meantime, they have to compete with Google, because Google can and will integrate its LLMs directly into Android, which means billions of people worldwide will use Google's models. OpenAI needed a way to compete. It makes a ton of sense, then, that OpenAI would partner with Apple to distribute its models to billions of devices instantaneously. Apple also needs OpenAI because they're probably many years behind.

For Apple, I assume they will get OpenAI's models directly instead of calling OpenAI's API, which runs on Azure. Apple will likely need to build its own AI infrastructure to run inference on OpenAI's models for its billions of devices, and they'll likely market it with privacy features that ChatGPT doesn't bother with right now.

I suspect that Apple will integrate smaller LLM models that can run on a tiny iPhone SoC NPU and make them free. Then they'll charge for Apple Intelligence Pro that does most of the inference in the cloud.

One of my questions for Apple, if my speculation is correct, is how they are going to launch a service that runs inference on a model as large as GPT4 for their billions of devices worldwide. They didn't invest in server NPUs like Google did with their TPUs. They can't buy nearly enough Nvidia GPUs because those are supply constrained. They can't possibly use M2 Ultras because they're suboptimal for AI inferencing and very expensive to manufacture (134 billion transistors, with two Max dies glued together).

Perhaps Apple will only launch the service in the US first and charge a subscription for the GPT4-backed AI features. I don't see any other way Apple could scale so quickly.

I did not think the Apple/OpenAI marriage would happen but it makes a lot of sense for both companies after some thought.


Does it need to be a force in the AI industry? Are we confident the AI industry is healthy and sustainable? I'm certainly not.

Could Apple continue to just make good, desirable hardware and decent os decisions that many people like?


I’m sort of enjoying Gemini in Android Studio. It’s really helpful, and I’m the most AI-skeptical person.

I’m half expecting some AI in Xcode too but, you know, it’s Xcode. It’s like the worst IDE in history.


I'll make a prediction:

Apple will not mention "AI" during the event. The only time "AI" is uttered is when they're referencing OpenAI the company.


You’re already wrong. They’ve mentioned the word AI many times in the lead up to the event.


I did specifically say “during the event”

So far they’ve talked about machine learning, as they usually do; not a single mention of AI.


They were smart enough to invent a new term, Apple Intelligence. You know, AI


It's referring to the whole platform of partially on-device, partially in-their-private-cloud processing.

During the presentation itself, they talked about machine learning, not "AI".


Apple doesn’t need to be a force in the AI industry. It needs to integrate the concepts into devices to make them more useful for everyday people.


Just like it showed with the Vision in the VR industry?


I genuinely disagree with the premise. Bolting generative AI on to its existing offering isn't in Apple's playbook.

If they can genuinely improve their products by the use of these tools, then I expect to see something pretty special. But I doubt you'll see a drop-in Siri replacement using ChatGPT. Apple watched Google try essentially that and get ridiculed.


https://archive.is/GeGwm

  THE CHATGPT EFFECT

  When ChatGPT launched in late 2022, everything changed. Federighi, the software chief, became a convert that Christmas break after he began playing around with the Microsoft-owned GitHub AI coding tool called Copilot, which is powered by OpenAI’s technology, said people familiar with his experience.

  After that moment, across Federighi’s software-engineering organization, employees were tasked with coming up with new ways of incorporating generative AI into products and given resources to pursue these projects, said former executives and engineers. At internal meetings, Federighi said that he had come to appreciate generative AI technology and that it would be incorporated into all aspects of Apple’s software.

Craig played with Copilot => "AI all the things!". Same as Google: just throw spaghetti at the wall. I dread the upcoming iOS updates; it will be a Clippy-everywhere nightmare.


Apple products usually have a decent level of polish, I hope it’ll be done in a useful way.


It isn't difficult to predict at this point: It will be added as a minor gimmick to every app and mostly rolled back next version as the hype dies down and the embarrassing half assed implementations are mocked.

I wouldn't even be surprised if they add it to the calculator app at first. Next version they'll keep the vaguely plausible integrations that people actually use. Which will be very few.

Some of this will be justified internally as experimentation and data collection. This used to be called taste.

In actuality it is because they don't want to be perceived by the markets as "falling behind". We gotta do something, this is a thing, and therefore.. arm flail

All the incentives at play point squarely at enshittification.


I suspect similar. None of the AI stuff they have is particularly fantastic at the moment. I mean, it's mostly used to suggest things I am not interested in and to mix up cows and horses in Photos.

Working with OpenAI is interesting. That means they can cast the partnership off or replace it later. Even they know it's a massive risk.


They will be if they were smart enough to exclude everyone who worked on Siri from their new AI projects.


I don't get the hate for Siri. If you know the constraints it's extremely useful and reliable.


I think the frustration comes around the constraints. It’s great for reminders and timers, and those meaningfully improve my life. It sometimes can successfully make a call. It interrupts me in the most obnoxious way at least once a week.

But it seems almost willfully stupid when compared to its potential.


I use the Overcast app for listening to podcasts. Sometimes when I say "play Overcast" it will play, other times it will not. When it doesn't play, it will either say it doesn't understand, or it will say I need to unlock my iPhone. Or it will start a free trial of Apple Music and start playing songs by some band called Overcast. This is crazy. The app is extremely popular, and the command is very simple. I have also tried using different syntax ("tell the app Overcast to play", "play the Overcast app"), and have not been able to find anything that reliably works.

It is similarly unreliable with navigation and calling restaurants. If I tell it to call Restaurant X in Palo Alto, it will tell me that it couldn't find that in my Contacts. No shit — I'm not asking about my Contacts. I have to literally tell it to search for the restaurant in Palo Alto, then listen to it tell me about the restaurant (how many stars, how far in which direction, that it can navigate or call), then tell it to call. There's no reason it shouldn't be a simple "call X in town Y". IME this used to work, but has been increasingly broken.


Ah I just use it to control music when driving and adding stuff to the shopping list. I find any other tasks require a human. Not because of Siri as such but because the problem is deeply concerned with context and semantics.


What command do you use to add things to a shopping list?

I think Siri used to be more reliable, but as it has grown to be less closed-universe, it has (somewhat understandably) become worse at doing tasks that it used to be able to handle very reliably.



