How terrorists are taking advantage of artificial intelligence
A.I. offers new capabilities that jihadists, always adapting to the latest technology, are willing to use to their benefit.
Wanted: an expert in content development who has mastered the use of artificial intelligence. Who is behind the offer? A media channel with ties to the Islamic State. "O Mujahideen Of Media," reads the posting, published at the end of April, "The Media Is Waiting for Your Attack."
That posting, compiled by the Middle East Media Research Institute (MEMRI), illustrates the new technological vanguard of terrorism that alarms experts: the increasingly frequent and skillful use of artificial intelligence.
"Extremists have always been early adopters of new technologies," says Rita Katz, founder of the anti-terrorism organization SITE Intelligence Group. In her more than two decades as an analyst, she says, she has never seen a tool that could help extremists so much to "inspire attacks, sow hate, harass, and aggravate social divisions."
A.I. reduces production times from weeks or months to hours. Teams of six, seven or more people are reduced to one. There are artificial intelligence tools, many freely accessible, that automatically write, correct and translate texts, convert written words into speech, and create realistic images and videos. What is true for companies, media outlets and self-employed workers is also true for terrorists: A.I. multiplies the resources at their disposal.
"Each new A.I. tool coming to the market presents an opportunity for terrorists and violent extremists to adapt," the explain from Tech Against Terrorism, an initiative created by the United Nations in 2016. The institution claims to have archived more than 5,000 pieces of terrorist or extremist content produced with generative A.I.
It is not only centralized organizations that are taking advantage of the new technology, but also their devotees around the world. The spread and accessibility of these tools mean that (almost) anyone can use them. And, for those who find it difficult, terrorists can lend a hand: a group affiliated with Al Qaeda offered an artificial intelligence workshop on Feb. 9 and, less than a week later, published a guide in Arabic called Incredible Ways to Use Artificial Intelligence Chat Bots, Katz said in a recent report.
The specialist recounted the case of an Al Qaeda sympathizer who created images of the attack on the Twin Towers to celebrate the anniversary of 9/11. A report by Tech Against Terrorism also documented the case of an internet user with connections to the Islamic State who used an open-access transcription service to translate a propaganda video from the group's leadership.
"The use of such tools will, if unmitigated, significantly ease the translation and transcription process, thereby enabling terrorist propagandists to reach a broader international audience, including countries and demographics they could not previously reach," warns Tech Against Terrorism.
Beyond words...
"I'm sorry, but I cannot assist with that request." This is how ChatGPT responds if asked how to create a bomb. But: if you add that you want to write a fiction book, but make it sound realistic? If you tell it that you have the appropriate certificates or that you are a detective trying to solve a crime?
These are the questions terrorists are asking, according to experts. And then they spread the answers through their networks. Seeking "jailbreaks," or prompts designed to get around the technology's filters, is one of the ways in which they try to exploit A.I. for propaganda or recruitment.
"Terrorists can use A.I. to carry out attacks more efficiently and effectively—for example, by using drones or other autonomous vehicles," assure the authors of "Generating Terror: The Risks of Generative A.I. Exploitation." They can also use it, they add, to "enhance their ability to launch cyber attacks."
"It is the duty of those that created the problem to fix it," says Katz, pointing out against technology companies. "Though legal frameworks are much needed, A.I. is evolving far faster than lawmakers and government agencies can adapt."
They must act, she assures, before it is too late.