
I would say that you are jumping to a lot of conclusions here. Let's dig deeper.

"It doesn't have a culture. It doesn't have thoughts"

These are conclusions. What is your reasoning?

To what degree would you say that human decision making can be explained by this statement:

"It is a tree of probabilities with a bit of a randomization on top of it."




Once again, it seems to me that the burden of proof for claiming that a piece of software is sentient should not fall on the side rejecting the claim.

What is your reasoning for claiming that it has culture and thought, and that its abilities go beyond mimicking human discourse?





