It's happening, and it's happening sooner than any of us would have predicted. The era of killer robots taking the front line on the battlefield is looming, and that has many people in the tech world extremely worried. To get their concerns across to those with the authority to push that timeline back as far as possible, some of the most prominent names in artificial intelligence (AI) and robotics have come together and drafted an open letter seeking a ban on killer robots, or lethal autonomous weapons.

In a recent development, the world's top AI and robotics companies have used the platform of the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne to collectively urge the United Nations to ban killer robots, or autonomous weapons systems, from the battlefield.

The world’s biggest artificial intelligence conference saw the unveiling of an open letter signed by 116 founders of robotics and AI companies from 26 countries around the globe. The letter was written in light of the UN recently delaying, until later this year, a meeting to discuss the topic of robots in warfare.

The open letter, which marks the first time AI and robotics companies have made a joint statement on the issue, was released at the conference by Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales.

Some of the renowned names that have signed the letter include Elon Musk, founder of Tesla, SpaceX and OpenAI (US); Mustafa Suleyman, founder and Head of Applied AI at Google’s DeepMind (UK); Esben Østergaard, founder and CTO of Universal Robotics (Denmark); Jerome Monceaux, founder of Aldebaran Robotics, makers of the Nao and Pepper robots (France); Jürgen Schmidhuber, leading deep learning expert and founder of Nnaisense (Switzerland); and Yoshua Bengio, leading deep learning expert and founder of Element AI (Canada).

In December last year, 123 member nations of the UN’s Review Conference of the Convention on Conventional Weapons agreed to formally discuss the issue of autonomous weapons. Of these 123, 19 have already called for an outright ban on such weapons.

According to the open letter, lethal autonomous weapons have the potential to become the third revolution in warfare. It further warns that once developed, they will permit armed conflict to be fought at a scale greater than ever before, and at timescales faster than humans can comprehend.

Comparing the weapons to a Pandora's box, the letter states, “These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

The 2017 open letter follows a 2015 letter unveiled at the IJCAI conference in Buenos Aires, which alerted the world to the dangers of autonomous weapons. That letter was endorsed by Apple co-founder Steve Wozniak, British physicist Stephen Hawking, and cognitive scientist Noam Chomsky.

Every technology can be used for good or for bad; it all depends on the intentions of the people using it. According to Walsh, AI is capable of solving many of the problems mankind faces right now, including the challenges posed by climate change, inequality and poverty, and the ongoing global financial crisis. However, the same technology can also be used in autonomous weapons to industrialise war. He urges the UN to decide which of these futures we want for our future generations, and calls on it to impose a ban on autonomous weapons similar to the bans on chemical and other weapons.

Here's the open letter:

An Open Letter to the United Nations Convention on Certain Conventional Weapons

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm. We warmly welcome the decision of the UN’s Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations. We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.

We regret that the GGE’s first meeting, which was due to start today, has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.

We therefore implore the High Contracting Parties to find a way to protect us all from these dangers.