
Apple is going exactly where I predicted they'd go. They've already had machine learning hardware on their phones for multiple generations, mostly "just" doing on-device image categorisation.

Now they're moving into more generic LLMs, most likely supercharging a fully local Siri in either this year's or next year's iOS release.

And I'm predicting that an opportunity for third-party developers to plug into said LLM and provide extra data will be announced during WWDC. So you could go "Hey Siri, what's the latest news on CNN" -> the CNN app provides data to the Siri language model in a standard format, and it tells you what's going on.
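
For what it's worth, the existing App Intents framework already looks like the natural hook for this kind of thing. Here's a rough sketch of what exposing data to an assistant might look like; the intent name and the headline source are made up, and the Siri/LLM side is pure speculation on my part:

    import AppIntents

    // Purely illustrative: an app exposes its latest headlines as an App Intent
    // that Shortcuts (or, speculatively, an LLM-backed Siri) could invoke.
    struct GetLatestHeadlinesIntent: AppIntent {
        static var title: LocalizedStringResource = "Get Latest Headlines"

        func perform() async throws -> some IntentResult & ReturnsValue<String> {
            // A real app would pull these from its own data layer or API.
            let headlines = [
                "Headline one",
                "Headline two",
                "Headline three"
            ]
            return .result(value: headlines.joined(separator: "\n"))
        }
    }

If an LLM-backed Siri does ship, I'd expect the "standard format" to be an extension of intents like this rather than a brand-new plugin system.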




The secret sauce will most likely be tight integration with Shortcuts, enabling the language interface to drive and automate existing apps.


I really hope that Shortcuts gets a UX overhaul. It feels so painful to write and test new shortcuts.


"play an hourly chime, oh and by the way remind me to get coffee when i'm on the way home tomorrow" no ux beats that but text/voice.


Shortcuts is getting to a point where I'd prefer to just write actual code instead of fighting with the stupid Scratch-style coding blocks.


That sounds a lot like AppleScript... https://en.wikipedia.org/wiki/AppleScript

I wonder how hard it would be for Apple to rebuild Shortcuts around an AppleScript backend so that power users could edit the scripts directly.



