By AC

Tech Development In Autonomous Weapon Systems

Updated: Oct 21, 2019

| A + T |


Technological development has become a relentless race. In the competition to lead the emerging-technology race and the futuristic warfare battleground, artificial intelligence (AI) is rapidly becoming the center of the global power play. As seen across many nations, the development of autonomous weapons systems (AWS) is progressing rapidly, and this increasing weaponization of artificial intelligence appears to be a highly destabilizing development. It brings complex security challenges not only for each nation's decision makers but also for the future of humanity.


The reality today is that artificial intelligence is leading us toward a new algorithmic warfare battlefield that has no boundaries or borders, may or may not have humans involved, and will be impossible to fully understand and perhaps to control across the human ecosystem in Cyberspace, Geospace and Space (CGS). As a result, the very idea of the weaponization of artificial intelligence, in which a weapon system, once activated across CGS, can select and engage human and non-human targets without further intervention by a human designer or operator, is generating concern.


The prospect of an intelligent machine or machine intelligence being able to perform any projected warfare task without human involvement or intervention, using only the interaction of its embedded sensors, computer programming and algorithms with the human environment and ecosystem, is becoming a certainty that cannot be ignored.


AI WEAPONIZATION


As AI, machine learning and deep learning evolve further and move from concept to commercialization, the rapid acceleration in computing power, memory, big data and high-speed communication is not only creating an innovation, investment and application frenzy but is also intensifying the quest for AI chips. This swift, ongoing development indicates that artificial intelligence is on its way to revolutionizing warfare and that nations will undoubtedly continue to develop the autonomous weapons systems that AI makes possible.


As nations independently and cooperatively accelerate their efforts to gain a competitive advantage in science and technology, the further weaponization of AI is unavoidable. Consequently, there is a need to visualize what an algorithmic war of tomorrow would look like, because building autonomous weapons systems is one thing; using them in algorithmic warfare against other nations and against other humans is another.


Reports are now emerging of complex algorithmic systems supporting more and more aspects of war-fighting across CGS; the truth is that the commoditization of AI is now a reality. As seen in cyberspace, automated warfare (cyberwarfare) has already begun, and anyone and everyone is a target.


The rapid development of AI weaponization is evident across the board: navigating and utilizing unmanned naval, aerial and terrain vehicles, producing collateral-damage estimates, deploying "fire-and-forget" missile systems, and using stationary systems to automate everything from personnel systems and equipment maintenance to the deployment of surveillance drones, robots and more. So, as algorithms come to support more and more aspects of war, we face an important question: which uses of AI in today's and tomorrow's wars should be allowed, which restricted, and which banned outright?


While autonomous weapons systems are believed to reduce the operating costs of weapons systems and will likely enable greater speed, accuracy, persistence, precision, reach and coordination on the CGS battlefield, the need to understand and evaluate the technological, legal, economic, societal and security issues remains.



ROLE OF PROGRAMMERS & PROGRAMMING


Amid these complex security challenges and vast future unknowns, what remains significant for the safety and security of humankind is the role of programmers and programming, along with the integrity of semiconductor chips. The reason is that programmers can define and constrain the nature of AWS, at least in the beginning, until AI begins to program itself.


Nevertheless, if a programmer were to intentionally or accidentally program an autonomous weapon to operate in violation of current or future international humanitarian law (IHL), how will individuals control the weaponization of AI? Furthermore, because AWS are centered on software, where should the responsibility for errors, and for the manipulation of AWS design and use, fall? That brings us to a crucial question: if and when an autonomous system kills, who is responsible for the killing, whether or not it is justified?



CYBER-SECURITY CHALLENGES


In brief, algorithms are by no means secure, nor are they immune to bugs, malware, bias and manipulation. Moreover, since machine learning uses machines to train other machines, what happens if there is malware in, or manipulation of, the training data? While security risks are everywhere, connected devices increase the possibility of cyber-security breaches from remote locations, and because the code is opaque, security is very complex. So when AI goes to war with other AI, the ongoing cyber-security challenges will add enormous risks to the future of humankind and the human ecosystem in CGS.
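To make the training-data concern concrete, here is a minimal, hypothetical sketch showing how silently flipping the labels of a fraction of a training set can degrade a simple classifier while the system still appears to "work." The synthetic dataset, the logistic-regression model and the 20% flip rate are illustrative assumptions, not a description of any real weapons system.

```python
# Minimal, hypothetical sketch of label-flipping "poisoning" of training data.
# All choices (synthetic data, logistic regression, 20% flip rate) are
# illustrative assumptions only.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline model trained on clean labels.
clean = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# An adversary silently flips the labels of 20% of the training examples.
y_poisoned = y_train.copy()
flip = rng.choice(len(y_poisoned), size=len(y_poisoned) // 5, replace=False)
y_poisoned[flip] = 1 - y_poisoned[flip]
poisoned = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)

print("clean accuracy:   ", accuracy_score(y_test, clean.predict(X_test)))
print("poisoned accuracy:", accuracy_score(y_test, poisoned.predict(X_test)))
# The poisoned model still produces plausible outputs, which is precisely
# why manipulated training data is hard to detect once a system is deployed.
```

The point of the sketch is not the accuracy numbers themselves but that nothing in the deployed model reveals the tampering; detecting it requires auditing the data pipeline, not just the output.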


Even though autonomous weapons systems are here to stay, the question we all, independently and cooperatively, need to address is whether artificial intelligence will drive and determine our strategy for human survival and security, or whether we will.


It is important to understand and assess what could go wrong if the autonomous arms race cannot be prevented. It is time to acknowledge that simply because technology may allow the successful development of AWS does not mean that we should pursue it.


Perhaps it is not in the interest of humankind to weaponize artificial intelligence.

