Hacker News

The hilarious part to me is the number of otherwise intelligent people concerned that this sort of stupidity is a threat to humanity.

The only real threat is from people willing to trust AI.




You're (somewhat abrasively) stating a version of my own opinion as well, but the "real threat" you mention is very real. While "AI" (really, machine learning) is not good at most things, it does appear to be very, very good at convincing people that it is (for whatever reason). The danger of it being put in charge of things when it has (quite literally) no idea what it is doing is not small.


In other words, the real threat is stupid people, not stupid machines.


Many combinations of atoms are in my jeans. My jeans are not very dangerous. Therefore, other combinations of atoms will not be dangerous.

Nobody is worried about GM's chat bot.

People are worried that LLMs will be abused and many people will suffer for it.

People are also worried that significantly more advanced forms of AI will cause us to no longer be the dominant species on the planet.


People wear jeans like yours to do bad things and people will suffer for it.

People are worried that maybe your jeans are dangerous and should be regulated.


Because a poorly implemented chatbot using someone else's LLM API is comparable to what you can accomplish with 10^n rounds of inference applied cleverly. Computers are useless without error correction, and LLMs may be as well. That's not to say that LLMs will form their own goals, but that the people in control of them will be wielding dangerously capable agents.


Trust isn't enough. Humans constantly error-checking AI output get worn down, and then the stupidity leaks in.

You can't use AI as a crutch; it eventually does the thinking for you.

Agent Smith: "I say your civilization, because when we started thinking for you, it really became our civilization."


Trinitrotoluene is great for mining. What could go wrong?


It isn't. It isn't. It isn't. It is.

We have no idea where that point is.

It's worth comparing where we are now to where we were a century ago. That's the gap between now and where my kid will be when he's grown up.


Your kid will be grown up in less than 20 years, not 100. But even still, in 100 years, will there be 4x as many people? Will humanity be consuming 10x the energy that we do today? Will we have computers that are a million times faster?

The point is, exponential progress is incredible, but at some point it ceases to be exponential. And the progress of the last 100 years was fueled by exponential population growth and exponential energy usage. We're already at +1.5°C because of that; how hot will it be when your kid is grown up?


If you look at the rate of change of humanity, it's been exponentially increasing.

If you look at the direction, it's not predictable. A very different set of things will come to pass.

A child born today will live O(100 years), and will be in a very different world than I am today. Computation, in particular, is continuing to change. LLMs are a huge change, as is being interconnected, as are many other things. That's not "faster," like Moore's Law of yesteryear, but it is change.

Also: Change isn't always progress.


Just a guess but I'd say "this point" is some time after real signs of understanding and intelligence are displayed.

The concept of *money* and commerce might be a good place to start trying to teach this techno parrot how to actually think.

A 5 year old has way better thinking ability. Maybe we should regulate 5 year olds as being potentially dangerous. You never know --- at "some point" one of them could easily decide to destroy humanity.


Once a technology has been developed and made available it can be used by any number of governments and corporations to do... whatever the fuck they want. You may have the resources to say "no" but they have the resources to get millions of people to give an enthusiastic "yes". Most people will do whatever marketing campaigns and figures of authority tell them they can do. Hold a radiating box by your brain a few hours a day and have it sit next to your crotch the rest of the time? Sure. Take 3-plus shots of a vaccine developed with new technology and in record time? Of course. Get into a metal tube and soar through the skies like an absolute lunatic? You're the boss!

In some cases, like nuclear proliferation, a concerted effort by powerful actors can slow the spread of certain technologies. Otherwise, your "no" will amount to about as much as the anti-vaxxers.


Don't underestimate the amount of cost pressure to put the artificial idiot somewhere it may actually cause some damage.


Logical fallacy there.



