Asimov was a forward thinker. His Three Laws of Robotics:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Now, as long as nobody comes along and botches the code that's supposed to implement these, we're golden... oh. Wait.
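Speaking of the code: the laws are really just a priority-ordered rule hierarchy, so for fun, here's a minimal sketch of how that ordering might look in Python. Everything here is made up for illustration; the `Action` type, its fields, and the `permitted` function are hypothetical, not anything from Asimov's stories or a real robotics system.

```python
from dataclasses import dataclass

@dataclass
class Action:
    # Hypothetical fields describing a candidate action.
    harms_human: bool       # would this action injure a human?
    prevents_harm: bool     # would *not* acting allow a human to come to harm?
    ordered_by_human: bool  # was this action ordered by a human?
    endangers_robot: bool   # does this action risk the robot's own existence?

def permitted(action: Action) -> bool:
    """Evaluate a candidate action against the Three Laws, in priority order."""
    # First Law: never injure a human...
    if action.harms_human:
        return False
    # ...and never, through inaction, allow a human to come to harm.
    # This outranks every consideration below, including self-preservation.
    if action.prevents_harm:
        return True
    # Second Law: obey human orders (any First Law conflict was caught above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, but only as the lowest priority.
    return not action.endangers_robot
```

Note the structure: each law is only consulted if the laws above it don't settle the question, which is exactly the "except where such orders would conflict" ordering in the original text.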