Killer Robots Won't Be What The Movies Show: Here's Where The Real Threats Are

Hollywood has a reputation for foretelling the future. Indeed, Robert Wallace, the former head of the CIA's Office of Technical Service and the US counterpart to MI6's fictitious Q, has recalled how Russian operatives would study the latest Bond movie to see what technology might be coming their way.

In light of this, Hollywood's ongoing fixation with killer robots should perhaps worry us. The latest example of the genre is Dolly, a sex robot courtroom drama coming to Apple TV.

I never imagined I would write the words "sex robot courtroom drama," yet there it is. Based on a 2011 short story by Elizabeth Bear, the premise centers on a billionaire who is murdered by a sex robot that then requests legal representation to defend its deadly deeds.

Killer robots: Dolly is the latest in a long line of films about killer robots, from HAL in Stanley Kubrick's 2001: A Space Odyssey to Arnold Schwarzenegger's T-800 in the Terminator series. Indeed, the first full-length science fiction film, Fritz Lang's 1927 classic Metropolis, centered on a war between robots and humans.

But practically all of these films get it wrong. Killer robots won't be intelligent, humanoid machines out to do you harm. Such robots might make for a compelling plot and a box-office hit, but the technology is decades, if not centuries, away.

Robots may never be sentient, despite current fears to the contrary.

The technology we should be concerned about is far simpler, and it is already beginning to appear on the front lines in places like Nagorno-Karabakh and Ukraine.

How war has changed: The most accurate depictions of the real future of killer robots come from films like Angel Has Fallen (2019) and Eye in the Sky (2015), which feature considerably simpler armed drones.

The nightly TV news shows how increasingly autonomous drones, tanks, ships, and submarines are changing modern combat. These robots aren't much more sophisticated than the ones you can find at your neighborhood hobby shop.

Additionally, their algorithms are being entrusted with a growing number of decisions, including how to locate, track, and eliminate targets.

This is putting the world in peril and raising a host of ethical, legal, and technical issues. These weapons will, for instance, further destabilize an already fragile geopolitical environment.

A pledge not to weaponize: Six major robotics companies announced this week that they will never weaponize their robot platforms. They include Boston Dynamics, maker of the Atlas humanoid robot, which can perform an impressive backflip, and the Spot robot dog, which would not look out of place in the TV series Black Mirror.

Robotics firms have voiced concern about this unsettling future before. Five years ago, I organized an open letter, signed by Elon Musk and more than 100 other founders of AI and robotics companies, calling on the United Nations to regulate the use of killer robots. The letter even knocked the Pope into third place for a global disarmament award.

A combined effort to protect our future: The only way to protect ourselves from this horrific future is for nations to act collectively, just as they did with chemical, biological, and even nuclear weapons.

Such regulation won't be perfect, just as the regulation of chemical weapons isn't perfect. But it will stop arms companies from openly marketing these weapons, thereby curbing their proliferation.

That is why the UN Human Rights Council's recent unanimous decision to examine the human rights implications of emerging technologies such as autonomous weapons is even more significant than any pledge from robotics companies.