I mean, the tendon in my knee can "react" to "negative stimuli" when a doctor strikes it with a rubber hammer.
I guess the question is how these words are being used: whether they refer to automatic biological reflexes or to something like conscious experience, sentience, or sapience, depending on which term you mean.
Sentience always requires more than just reaction; it also requires a memory component. That is, the organism can take a proactive response to avoid that negative stimulus in the future. However, measuring both of those is generally very difficult, and so determining if something is sentient can be difficult.
The bar for sentience is basically "is it more than automatic?", which is fairly low and easy to clear. It doesn't even require that there be consciousness.
>However, measuring both of those is generally very difficult, and so determining if something is sentient can be difficult.
That makes sense. So then, we should probably be very cautious about statements that openly entertain such possibilities, given that they may be hard to substantiate meaningfully.
>The bar for sentience is basically "is it more than automatic?", which is fairly low and easy to clear. It doesn't even require that there be consciousness.
Huh, every definition I've seen when I read about this stuff in college, and that I currently see when I google, explicitly relates in some way to consciousness. And anything that meant "reaction," "sense," or "detect" in a sense pertinent to sentience meant some kind of conscious sensing, conscious reacting. Did the definition change at some point?
> Huh, every definition I've seen when I read about this stuff in college, and that I currently see when I google, explicitly relates in some way to consciousness. And anything that meant "reaction," "sense," or "detect" in a sense pertinent to sentience meant some kind of conscious sensing, conscious reacting. Did the definition change at some point?
You're sort of there, and sort of not. Sentience does require feeling, yes. But feeling doesn't require consciousness, no.
Sentience requires that the subject can desire pleasure, and desire to avoid pain, but neither of those things explicitly requires conscious thought. They are feelings, which you'll find in every definition of sentience, but desire doesn't have any need of a conscious mind. Collective behaviours, like in trees, can exhibit desire and can show pain, without the need for a consciousness to exist.
Claiming that sentience requires mere feeling, and that such cases fall into a different category than consciousness, is an interesting intellectual exercise, and I'm quite familiar with that exercise, but as I said in a different comment, I don't think it's actually as consistent with normal definitions as you appear to believe.
>Sentience requires that the subject can desire pleasure, and desire to avoid pain, but neither of those things explicitly requires conscious thought.
I understand this as a recently popular argument that exists in the ether, and at best I can say I think there are numerous problems with it, although it's a legitimate and interesting idea. Whatever it is, though, it's certainly not the accepted mainstream usage of the term in normal circumstances; it's more a position in an argument seeking to draw distinctions between biological reflexes at one extreme and full-blown consciousness at the other.
>Collective behaviours, like in trees, can exhibit desire and can show pain, without the need for a consciousness to exist.
I think that is only the case if you water down the meaning of words like desire and pain in such a way that estranges them from sentience, which, of course, makes it problematic to invoke those words, used in that way, to illustrate case examples of sentience.
I also have to note that I find it pretty condescending of you to declare that I'm "sort of there, sort of not," then repeat back to me some pretty bog-standard material I'm perfectly familiar with as if it's new information, and conclude by endorsing a fringey-at-best definition that is not, despite your protestations to the contrary, the usual or accepted meaning of the term.
I don't think the definition has changed philosophically, but using consciousness as the definition makes decision-making very hard. We have no way to prove which animals are and aren't conscious, and what evidence we do have either way is extremely hazy. Using a line like "is it more than automatic" to define what we think is probably sentient is pretty reasonable. Maybe it catches some things that are just complex machines with minor learning capabilities, but if you're more worried about excluding sentient beings than about including non-sentient ones, it's a reasonable heuristic for decision-making.
I do personally think the line can be pulled back a bit to exclude creatures whose learning ability is extremely inflexible, but in the end it comes down to one's philosophical stance on what is likely to indicate a conscious creature.
Okay, I think this is entirely a different discussion than I was trying to have.
We had one person pretty confidently claiming that "sentience" meant XYZ, which was a definition I had never heard before.
Now we have this, which acknowledges that it's a non-standard definition but suggests we should use it anyway because, gee, it's hard to get evidence, and what if your objective is [insert objective] and you don't want [insert inconvenience due to strict definition] to happen, so let's use a definition that blurs the line between speculation and evidence. I have a number of problems with that, but it's a different conversation anyway, and it loses sight of what I was asking about.
About six comments up this thread, if anyone can remember that far, we had people putting octopuses, lobsters, and even amoebae in the same category. At the bottom, the reason for doing that boiled down to people deciding it was okay to redefine sentience in a way that lowers the bar enough to include them all.
Trying to get to the bottom of what was driving that conversation leads to things like this, which aren't quite focused answers but invitations to detour in any number of directions.
I agree; my thought is less that I think an amoeba feels pain (I don't) and more that I don't personally believe things with insect-level complexity feel pain. That they react to negative stimuli definitely doesn't prove sentience, and I personally find the totality of how insects interact with the world far too mechanistic to believe they feel at all. This is true of crabs as well; I've never really looked into lobsters enough to have an opinion on them.
I've enjoyed the Journey into the Microcosmos YouTube channel. This one is titled "Making Decisions Without a Brain" (subtitled "poking protists"): https://youtu.be/1LyeWQZ7ZR0