
AZ trained in self-play mode for millions of games, over multiple generations of a player pool.
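For readers less familiar with what "self-play over a player pool" looks like in practice, here's a rough sketch in Python. It uses a toy Nim-style game and a tabular value function purely for illustration; AlphaZero's real training uses Monte Carlo tree search plus a deep network, and the game, agent, and update rule below are my own simplifications, not DeepMind's method.

  # A minimal, illustrative sketch (not DeepMind's actual AlphaZero code):
  # one learning agent plays a toy Nim-style counting game against frozen
  # snapshots of its past selves ("generations of a player pool") and updates
  # a tabular value estimate from game outcomes.
  import random
  from collections import defaultdict

  PILE = 21       # starting pile size
  MAX_TAKE = 3    # a player removes 1..3 stones; taking the last stone wins

  def legal_moves(pile):
      return list(range(1, min(MAX_TAKE, pile) + 1))

  class Agent:
      """Tabular agent: prefers the move with the highest learned value."""
      def __init__(self):
          self.value = defaultdict(float)  # (pile, move) -> estimated value

      def choose(self, pile, explore=0.1):
          moves = legal_moves(pile)
          if random.random() < explore:
              return random.choice(moves)
          return max(moves, key=lambda m: self.value[(pile, m)])

      def update(self, history, reward, lr=0.1):
          # Credit every (state, move) the learner made with the final outcome.
          for pile, move in history:
              key = (pile, move)
              self.value[key] += lr * (reward - self.value[key])

  def play(learner, opponent):
      """One game; the learner moves first. Returns (learner's moves, winner index)."""
      pile, turn = PILE, 0
      players = (learner, opponent)
      histories = ([], [])
      winner = None
      while pile > 0:
          move = players[turn].choose(pile)
          histories[turn].append((pile, move))
          pile -= move
          if pile == 0:        # whoever takes the last stone wins
              winner = turn
          turn = 1 - turn
      return histories[0], winner

  def train(generations=20, games_per_gen=2000):
      pool = [Agent()]         # frozen past generations the learner trains against
      learner = Agent()
      for _ in range(generations):
          for _ in range(games_per_gen):
              opponent = random.choice(pool)           # sample from the player pool
              learner_moves, winner = play(learner, opponent)
              learner.update(learner_moves, +1 if winner == 0 else -1)
          snapshot = Agent()                           # freeze the current learner
          snapshot.value = defaultdict(float, learner.value)
          pool.append(snapshot)
      return learner

  if __name__ == "__main__":
      trained = train()
      # Optimal play from a pile of 5 is to take 1 (leaving the losing pile of 4).
      print("Learned move from a pile of 5:", trained.choose(5, explore=0.0))

The point of the pool is just that the learner's opponents are earlier versions of itself, so all of its training data (its "experiences") is generated entirely by its own play.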



I am familiar with the literature on reinforcement learning.


They're saying the board games AlphaZero played with itself are experiences.


And I am saying they are confused because they are attributing personal characteristics to computers and software. By spelling out what computers are doing it becomes very obvious that there is nothing that can be aware of any experiences in computers as it is all simply a sequence of arithmetic operations. If you can explain which sequence of arithmetic operations corresponds to "experiences" in computers then you might be less confused than all the people who keep claiming computers can think and feel.


> By spelling out what computers are doing it becomes very obvious that there is nothing that can be aware of any experiences in computers as it is all simply a sequence of arithmetic operations.

By spelling out what brains are doing it becomes very obvious that it's all simply a sequence of chemical reactions - and yet here we are, having experiences. Software will never have a human experience - but neither will a chimp, or an octopus, or a Zeta-Reticulan.

Mammalian neurons are not the only possible substrate for intelligence; if they're the only possible substrate for consciousness, then the fact that we're conscious is an inexplicable miracle.


If an algorithmic process is an experience and a collection of experiences is intelligence, then we get some pretty wild conclusions that I don't think most people would actually claim, as it'd make them sound like a lunatic (or a hippy).

Consider the (algorithmic) mechanical process of screwing a screw into a board. This screw has an "experience" and therefore intelligence. So... the screw is intelligent? Very low intelligence, but intelligent according to this definition.

But we have an even bigger problem. There's also the metaset of experiences: the collection of several screws (or the screw, board, and screwdriver together). So we now have a meta intelligence! And we have several, because there are different operations to perform on these sets.

You might be okay with this, or maybe you're saying it needs memory. If the latter, you hopefully quickly realize this means a classic computer is intelligent, but due to the many ways information can be stored, it does not solve our above conundrum.

So we must then come to the conclusion that all things AND any set of things have intelligence. Which kinda makes the whole discussion meaningless. Or, we must need a more refined definition of intelligence which more closely reflects what people actually are trying to convey when they use this word.


> If an algorithmic process is an experience and a collection of experiences is intelligence

Neither; what I'm saying is that the observable correlates of experience are the observable correlates of intelligence - saying that "humans are X therefore humans are Y, software is X but software is not Y" is special pleading. The most defensible positions here are illusionism about consciousness altogether (humans aren't Y) or a sort of soft panpsychism (X really does imply Y). Personally I favor the latter. Some sort of threshold model where the lights turn on at a certain point seems pretty sketchy to me, but I guess it isn't ruled out. But GP, as I understand them, is claiming that biology doesn't even supervene on physics, which is a wild claim.

> Or, we must need a more refined definition of intelligence which more closely reflects what people actually are trying to convey when they use this word.

Well, that's the thing: I don't think people are trying to convey any particular thing. I think they're trying to find some line - any line - which allows them to write off non-animal complex systems as philosophically uninteresting. Same deal as people a hundred years ago trying to find a way to strictly separate humans from nonhuman animals.


Continuing this reductio ad absurdum, you might reach the fallacious conclusion, as some famous cranks in the past did, that intelligence is even found in plants, animals, women, and even the uncivilized savages of the new continent.

Intelligence appears in gradients, not a simple binary.


> Intelligence appears in gradients, not a simple binary.

Sure, I'm in no way countering such a notion, and your snarky comment is a gross mischaracterization of my comment. It's so far off that I have a difficult time believing it isn't intentional.

The "surprise" is not that plants, animals, or even women turn out to be intelligent under the definition of "collection of experiences" but that rocks have intelligence, atom, photons, and even more confusingly groups of photons, the set of all doors, the set of all doors that such that only one door per city exists in the same set. Or any number of meta collections. This is the controversial part, not women being intelligent. Plants are still up for debate, but I'm very open to a broad definition of intelligence.

But the issue is that I, and the general fields of cognitive science, neuroscience, and psychology, and essentially everyone except for a subset of computer scientists, agree that intelligence is more than a collection of experiences (even if that collection has memory). In other words, it is more than a Turing Machine. What that "more" is, is debated, but it is still generally agreed that intelligence requires abstraction, planning, online learning, and creativity. All of these themselves have complicated, nuanced definitions that are much more than what the average person takes them to mean. That's a classic issue where academics use the same words normal people do but place far more restrictions on their meaning, which often confuses the average person when they are unwilling to accept that words can have different meanings in different contexts (despite the fact that we all do this quite frequently, and such a concept appears in both our comments).


You seem to use the word intelligence to mean `consciousness` (if you replaced the former with the latter, I would agree with your argument).

I would define "intelligence" as (1) the ability to learn or understand or to deal with new or trying situations and (2) the ability to apply knowledge to manipulate one's environment.

It turns out that this is also the Merriam-Webster definition [0]. By that definition, yes, AlphaZero was learning and understanding how to deal with situations, and it is intelligent; and yes, most machine-learning systems, and many other systems that have a specific goal and manipulate data or the environment to optimize for that goal, are intelligent.

By this definition, a non-living, non-conscious entity can be intelligent.

And intelligence has nothing to do with "experiences" (which seem to belong in the "consciousness" debate).

[0]: https://www.merriam-webster.com/dictionary/intelligence


This is a common retort. You can read my other comments if you want to understand why you're not really addressing my points: I have already addressed why reductionism does not apply to living organisms but does apply to computers.


The comments where you demand an instruction set for the brain, or else you'll dismiss any argument saying its actions can be computed? Even after people explained that lots of computers don't even have instruction sets?

And where you decide to assume that non-computable physics happens in the brain based on no evidence?

What a waste of time. You "addressed" it in a completely meaningless way.



