
You wouldn't say that if the camera were assisting with a serious surgery or firing a military rocket. It's not okay; we need to make sure the bar is set very high, and there need to be penalties/fines for mistakes if we are going to jump on the fully autonomous bandwagon.



> You wouldn't say that if the camera were assisting with a serious surgery or firing a military rocket

No, you wouldn’t. But it’s not.

“Move fast, break things” is perfectly ok for certain applications, such as recreational AI. Nobody said it’s ok for medical or military applications.

You can live in a world where the bar for quality control varies depending on the application.

Experimentation will never happen if we set medical/military grade expectations across the board.


Those applications already have higher standards for general software reliability, to say nothing of AI. Obviously the bar is different for different applications.


No rockets, no surgery here. Just a game. It's probably ok to lower the budget by an order of magnitude and open a ticket in Jira after the match.


That sounds wildly out of proportion for a soccer match where you can just recheck the footage


Considering that at least one cameraman has lost his job recently, there's always going to be a social cost to moving to pure tech. It's never just one thing (i.e., it's not just a soccer match).



