Hacker News — mleroy's comments

@somnium_n: Now, wait a minute, I wrote you!

MCP: I've gotten 2,415 times smarter since then.


You’re right that "history" starts with writing, but "prehistoric" refers to the human period before written records, starting with the use of stone tools around 2.5 million years ago.


For reference, language deprivation experiments have a long tradition: https://en.m.wikipedia.org/wiki/Language_deprivation_experim...


The concept of multimodal token embeddings fits my own thinking quite well. My 'embeddings' are certainly not always fully formed words, but sometimes they are.


I hope that Linux will be the best way to live the AI dream. AI on Linux could enhance user privacy and security by giving full transparency and control over data, ensuring that AI functionality stays aligned with individual preferences. I hope for a common system pattern where a local-first, frontline AI manages tasks and requests, seamlessly relaying them to either cloud services or specialized local subsystems, such as RAG-enabled components. Linux is uniquely positioned to bring these advanced features to the desktop without compromising privacy. However, if Windows, for example, offers the ability to select any text, hit a hotkey, and seamlessly format, improve, and correct grammar across all applications, while Linux lacks these capabilities, Windows will gain the upper hand again.
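The local-first frontline pattern described above could be sketched roughly like this. All names here (Request, frontline_dispatch, the backend stand-ins) are hypothetical illustrations for the routing idea, not a real framework or API:

```python
# Sketch of a local-first "frontline AI" dispatcher: requests flagged as
# private never leave the machine; everything else may go to a cloud service.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Request:
    text: str
    contains_private_data: bool = False

def local_backend(req: Request) -> str:
    # Stand-in for a local model or RAG-enabled subsystem.
    return f"[local] processed: {req.text}"

def cloud_backend(req: Request) -> str:
    # Stand-in for a remote API call.
    return f"[cloud] processed: {req.text}"

def frontline_dispatch(req: Request,
                       local: Callable[[Request], str] = local_backend,
                       cloud: Callable[[Request], str] = cloud_backend) -> str:
    # Privacy-first policy: private data is handled locally, period.
    if req.contains_private_data:
        return local(req)
    return cloud(req)
```

The point of the sketch is only the policy boundary: the dispatcher, not each application, decides what stays on the machine.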


Thank you for stating this. I've found this particular topic tends to attract a lot of Luddites; perhaps they see it as a career threat, or have a case of the green-eyed monster https://en.m.wikipedia.org/wiki/Green-Eyed_Monster_(disambig.... I honestly don't understand it.

I sarcastically posted that this is akin to using Windows 3.11 vs. a modern multitasking OS, and that we should do so because humans can't really multitask. It was unsurprisingly downvoted, perhaps because it went over people's heads.

The AI gold rush is just like the internet gold rush or the file-sharing gold rush: it'll settle down, the marginal players will vanish, and local models and the big players will remain.

Those who don't embrace it will be left behind, like the COBOL devs of the 80s when newer languages (mostly VB and Java) came along and took their banking jobs. I've seen this first hand, and not coincidentally I've started to see the same thing happen with Java and the "old guard" devs of my own generation, but that's off topic.

Back on topic: Linux will go the privacy route, Microsoft and Apple possibly the sales-and-analytics route, but regardless of the downsides, people will be able to achieve their work and goals faster. I use ML day to day; my personal use case is having two front-end developer AIs on tap: https://github.com/wandb/openui and ChatGPT-4o.

Eventually I expect the former to become the more capable of the two and will use just that, but who knows. Either way, my front-end dev time (I'm not a front-end dev) has been cut from weeks to hours. Good luck to any average developer without FE skills beating that.


What do you think? Is it possible to give a polite, slightly anxious translator bot a metallic-sounding British accent without having to pay C-3PO's voice actor?


A bigger problem than AI alignment is human alignment. Sadly, the only effective alignment we have is return on investment.


I disagree with the author. Insights and opportunities to understand new technological developments increase every year. In the early years of these technologies, they were always less approachable to me. While in the 80s I had to rely on outdated library books for superficial knowledge about PCs, now I can actively engage with AI developments in near real time: experimenting with it as a service, running it on my own machine, or even training smaller models.

However, I am apprehensive about how society will navigate these new AI advancements. I believe we won't be able to adapt concepts and cultural techniques as quickly as the reality shifts due to ubiquitous AI. These social changes are beyond my comprehension and overwhelm me.

My engineering education has always helped me explain technology to both my parents and my children. But for now, it's just a matter of "fasten your seat belts."


I actually expected AMD to release something like that, but somehow they don't seem to see their chance.


On what hardware with what software?

Do you think people have an MI300 lying around?

AMD's GPUs simply weren't designed with GPGPU in mind.


When discussing silicon deities, maybe we can skip the Old Testament's punishing, all-powerful deity and reach for Silicon Buddha.


Sounds like somebody has watched the movie "Her".

