There is no conflict, though: the Third Law explicitly applies only "as long as it doesn't conflict with the First or Second Laws."
1. Don't harm humans
2. Always obey humans
3. Don't harm yourself
Given that order of precedence, humans can't order robots to harm humans, but they can order them to harm themselves, as the...