I do not disagree, my hope is that, as with other technologies, especially dual use ones, we will be able to keep them biased toward the positive uses.
Robotics, like genetic engineering, nanotechnology, and data mining, can do good things and bad things. It is important to focus on the good things, remain cognizant of the bad things, and maximize the value. I would be sad to see robotic quadrupeds "banned for civilian use" because they can be weaponized, when they can also get to people in need in dangerous and hard-to-reach places.
In a more current-events sort of way, I am in favor of severe punishments for people who weaponize drones and fly them in public places, or people who manufacture and sell such drones, but I am not in favor of banning personal ownership of drones. I am willing to accept the risk that someone will show up where I am with such a device, and the risk to my personal safety if they employ it in a deadly way, in exchange for the freedom to own and experiment with drones in a responsible way.
Both the parent comment and Chuck's reply appear to contain an important semantic error, using "amoral" to mean "immoral". Corporations tend toward amorality, which encompasses both moral and immoral behavior, because this is more profitable. I believe Chuck is hoping we can keep morality more profitable, on the whole, than immorality. I am less sanguine.
I don't consider most things capital does to be immoral. In an optimally run profit-generating business, the course that produces the most profit will be pursued. That's basically a tautology, and it says nothing about morality at all!
I don't buy that argument. Criminal capital (if such a term exists; I would define it as capital that is amoral in origin) tends toward profit over morality. To the extent that technology is exploitable with a small amount of capital[1], the likelihood of it being so exploited increases.
We've seen drug lords building submarines to transport drugs up the coast, but the economic cost of really effective submarines is still too high relative to the profit such devices provide. And perhaps more importantly, the economic risk of 'losing' a submarine means its lifetime value must exceed its cost to build and deploy.
Things like DNA printers, for example, worry me much more than robot dogs.
[1] The same cost reductions that make fielding a web server for $5/month possible enable large-scale data mining for very little cash investment.
> we will be able to keep them biased toward the positive uses.
Iron Man II (2010). How you intend to use them is irrelevant if all it takes is one guy on the dev team (or a compromised janitor) dropping in some code to take control of the entire army.
I'm less worried about a rogue staffer than I am about the systemic use by large corporations to further their interests (it wasn't that long ago that corporate-employed thugs were breaking the heads of strikers).