Focus
Artificial Intelligence as a Tool in War and a Weapon for Peace – the Power of Disinformation
(Volume 25, No. 2, 2024)

DOI: https://doi.org/10.37458/nstf.25.2.4
Review paper
Received: October 25, 2024
Accepted: November 26, 2024


Abstract: Artificial intelligence is a branch of computing concerned with developing the ability of computers to perform tasks that require a certain level of intelligence (Hrvatska enciklopedija, 2024). As it develops, so does the potential for its misuse in various areas. As a tool that can be used to create very convincing disinformation in communication, artificial intelligence opens up greater possibilities for manipulating public opinion. This phenomenon is not unknown, and it is becoming more widespread as the popularity of social networks grows. Disinformation created by artificial intelligence increases the reach of falsehoods that need to be fought against. In the introductory part, this paper clarifies the purpose of disinformation created by artificial intelligence. It then provides an overview of selected cases of disinformation disseminated to the public, created to serve as a tool of hybrid warfare, from recent history to the present day: from the Homeland War in Croatia and the US war in Iraq to the recently started war between Russia and Ukraine.
In the final part, the paper deals with the question of whether artificial intelligence, which people use to create disinformation and wage hybrid warfare, can also be a weapon against such warfare. Can artificial intelligence be a weapon against itself in the disinformation war?
Keywords: artificial intelligence, disinformation, hybrid warfare, information


Purpose of AI-Generated Disinformation - Hybrid Conflicts 

Information and disinformation have always been among the most effective tools of warfare. Where weapons cannot reach, information can. Whether created for media coverage of a war or for something more sophisticated with the help of artificial intelligence, information or disinformation can change the course of a war. Unlike information, which is accurate and true and created in good faith, disinformation is partially or completely untrue content placed with malicious intent, that is, a negation of true information. According to the Agency for Electronic Media, disinformation is verifiably false or misleading information created for profit with the intention of misleading the public; as such, it can cause damage and represents a threat to democratic political processes and the public good (ATENA project, n.d.).
It may be harder to imagine artificial intelligence as a weapon, but its use in autonomous systems is already widespread. The struggle for supremacy is visible in almost all areas of public activity, and it is the same with artificial intelligence. Because it relies on software, this struggle differs somewhat from the traditional notion of "arms control". Artificial intelligence has the potential to bring its owners supremacy in achieving strategic goals (Smiljanić, 2023). One of the ways in which it helps to create superiority is through the creation and use of disinformation. Spreading disinformation to achieve political goals is not new, but it is happening more and more often as platforms develop for those whose goal is to create and spread disinformation and to provoke hybrid conflicts.
Hybrid warfare is a way of resolving international disputes in which armed force is used only as a last resort. The concept of hybridity is not new; it originated in antiquity and described the application of technological solutions in conflicts and wars. What makes hybrid warfare particularly significant in today's context, however, is the use of advanced technologies, artificial intelligence and social networks, to achieve strategic goals. The term was revived in the Western academic, scientific and military security community to describe the growing role of cyberspace and information and communication technologies in warfare (Mlinac, 2024). Since the end of the Cold War, there has been a debate in scientific circles about whether the nature of war has changed and, if so, what that means. The terms "new" and "hybrid" war have appeared, suggesting that there is an important, perhaps deep difference between wars of the past, the present and the future. There is no single definition of hybrid war, and different authors and organizations use the term in different ways. However, there is general agreement that hybrid war refers to efforts aimed at destabilizing a state or society by combining military and non-military means (Mandić, 2016; Cvetković et al., 2019). Hybrid warfare is a serious threat to global security in the 21st century. Its complexity, lack of transparency and ability to exploit the weaknesses of the modern infosphere make it difficult to identify, suppress and respond to. Understanding the characteristics, goals and methods of hybrid warfare is key to developing effective defense strategies and protecting democratic values.
With the development of new information technologies in the 1980s and 1990s, a new global information infrastructure was created, which shaped a new information space previously unknown in history: cyberspace. It breaks down the traditional determinants of knowledge because there is no longer a clear boundary between public and secret knowledge, social and private knowledge, information and counter-information, truth and disinformation (Tuđman, 2013). Today, a few decades later, we encounter advanced artificial intelligence systems (machines, devices, applications) that to a certain extent imitate the human learning process and that are used in the design, marketing and enhancement of hybrid threats.
The use of artificial intelligence systems allows actors in hybrid warfare to act anonymously, in an automated way, and in a manner adapted to the specific weaknesses of the target audience (Mlinac et al., 2020; Mlinac, 2024), and therefore represents a significant challenge for national and international security. The ability to quickly and effectively recognize, mitigate and deter these threats is becoming increasingly important in today's digital age.
Consequences of the use of artificial intelligence systems in hybrid warfare:
  • Increased polarization of society: the targeted spread of disinformation through artificial intelligence deepens existing social divisions and polarizes public opinion.
  • Erosion of trust in institutions: hybrid actors use artificial intelligence to spread disinformation about government institutions, media and other authorities, eroding public trust and creating an atmosphere of mistrust and suspicion.
  • Difficulty distinguishing truth from lies: the persuasiveness and sophistication of counter-information generated by artificial intelligence make it difficult for citizens to distinguish true from false information.
Sources emphasize that the use of artificial intelligence in hybrid warfare represents a significant threat to democracy, human rights and global security. Inadequate regulation of cyberspace, lack of transparency in the operation of algorithms and ethical dilemmas related to artificial intelligence further increase concerns about its misuse for these purposes.
The creation of disinformation with the aim of deceiving the public has become a regular weapon in armed conflicts. We see examples in wars that are currently being fought, but also in those that are part of recent history. In the following, some examples of the creation of false narratives during the Homeland War in Croatia will be presented.

Homeland war in the Republic of Croatia

At the time of the Homeland War in Croatia, in the 1990s, artificial intelligence was not yet developed, but this did not prevent the warring parties from creating disinformation that influenced public opinion. The occupying side created disinformation in order to discredit Croatia, its opponent in the war, in the eyes of the public. Several such examples, created by the occupier even before the development of artificial intelligence, will be presented here. Even today, more than thirty years after the Homeland War, the Serbian media are still spreading the same disinformation they spread during the war years.

One example is the spread of fake photos and videos in the Serbian media showing the horrors of war allegedly committed by the Croatian army against the Serbian civilian population. An example can be seen in the After Lunch show on the Serbian broadcaster Happy TV. In the program of August 5, 2024, the topic discussed was "Oluja, a crime without punishment – There is no justice, no mercy and no remorse for Serbs...". While the guests in the studio were talking about the topic, a video was shown in the background with alleged Serbian refugees from Krajina; however, the footage used actually shows Croatian refugees from occupied Vukovar in 1991. Considering that at the time of Oluja there was no crime against the Serbian civilian population but rather the liberation of occupied Croatian territory, and lacking genuine footage, the obvious intention was to mislead the public by portraying the Croatian refugees as if they were the Serbian population from the area of the so-called Vojna Krajina, or Republika Srpska Krajina (RSK), a self-proclaimed "puppet" parastate and in essence occupied territory of the Republic of Croatia.
Continuing such untrue portrayals of the population of Krajina, even today the Serbian media publish false reports about events from the time of the military-police operation Oluja, which took place in 1995. To give such reports weight, the Serbian media often convey the impressions of famous Croats about the Homeland War and Oluja, reinforcing the negative connotation they want to attach to this military-police operation and demonizing Croatia in the eyes of the public. This is how the newspaper Informer wrote in 2018:
"And Ustaša Modrić celebrates 'Oluju'! The Croatian state is also proudly celebrating the slaughter of Serbs this August."
"Besides Thompson, the main ones at this year's celebration of the crime in which 3,000 Serbs were killed and more than 250,000 were banish will be football players, who will play an exhibition match with the Croatian army team in Knin."
"The neo-Ustaša regime in Zagreb will officially celebrate the genocidal action "Oluja" this year as well, during which 3,000 Serbs were killed and more than 250,000 Serbs were banish in August 1995!"  

In his book Programming the Truth (2013), Miroslav Tuđman notes that the transformation of "Oluja" into a pseudo-event took on truly mythical proportions. For Serbs, "Oluja" was a planned ethnic cleansing and the greatest crime in Europe since the Second World War. Until the last moment, Greater Serbian policy advocated the unity of what it called the Serbian lands, and part of the Serbs did not want the independence of Croatia but the preservation of Yugoslavia. For this reason, the Serbian population began emigrating from Croatia both before and after the military operations. The large number of displaced Serbs was turned into pseudo-evidence of ethnic cleansing carried out, as the author states, by the "fascist state", as Serbs call Croatia.
In 2011, the Croatian portal Dnevnik.hr wrote about another case of spreading lies about the war in Croatia. In the Serbian media in the 1990s, one could often read and hear about murdered Serbian children in Vukovar. Serbian reporter Goran Mikić, as reported by Dnevnik.hr, appeared as a guest on RTS in 1991 and said that he himself had "witnessed" the murdered Serbian children. He described in detail the alleged massacre of children and a ban on photography imposed by Croatian soldiers. After forensic medicine experts visited the area to find the children mentioned in his "testimony", they established that no such bodies existed in the Vukovar area. The photojournalist Mikić later retracted his allegations, but the damage was already done. The public was misled, and his story encouraged the mobilization of Serbs in Croatia. Croats were portrayed in the Serbian media as Ustaše whose only goal was the death of all Serbs; an atmosphere of fear and hatred spread, which resulted in increased intolerance between the two nations.

Similar deceptions, used then and still used today by the Serbian media, are also employed by Russia in the information battle against Ukraine, against which it launched aggression and started the war in February 2022.

The war in Ukraine

During the communist era in the Soviet Union, there was a Soviet intelligence manual classified as a military secret. Among other things, it contained a sub-manual on disinformation, which shows its strategic importance in information warfare. According to Russian intelligence practice, disinformation occupies a privileged role in intelligence activities and is integrated into the core activity of that state instrument. The primary goal of intelligence operations is not the collection of information but the dissemination of disinformation in such a way as to achieve the desired strategic, operational or tactical goals (Mandić & Klarić, 2023). Well aware of the power of disinformation and false information, Russia also uses them in the war against Ukraine, especially those created with the help of artificial intelligence. Only some examples will be presented here.
The massacre in Bucha is a war crime committed by the Russian army against Ukrainian civilians in Bucha between February 27 and March 14, 2022. Bucha is a suburb of Kyiv that was attacked by members of the 64th Motorized Rifle Brigade, who killed over 400 civilians. The majority of the international community condemned the massacre, but not Russia, which continues to claim that everything was staged. Satellite images taken on March 19 of the same year confirm the massacre in Bucha; the images show dead bodies on the streets. The Russian Minister of Foreign Affairs, Sergei Lavrov, speaks of staged recordings and makes baseless claims that the bodies are actually actors and that the "corpses" can be seen moving in the footage, as reported by the BBC in April 2022 (BBC, 2022). Furthermore, the BBC describes a video filmed from a car on April 1, 2022, showing bodies on both sides of the road. Russia claims the bodies are "fake" and were staged after Russian forces were forced to leave the city on March 30. However, comparing the bodies in the satellite image taken in March with those in the April video, it is evident that the bodies are in the same places, the BBC reports.

Croatian media report the same: the Russians denied the claims of a massacre, asserting that Russian soldiers had delivered 452 tons of humanitarian aid in the area, that all residents were free to leave their homes, and that the "alleged" massacre in Bucha is another performance staged by the Kyiv regime for Western media (Večernji.hr, 2022). According to some witnesses, the Russians broke into the houses of Ukrainian civilians and demanded that they reveal the whereabouts of the Nazis in Bucha. Nazism is the label that Russia is trying to impose on Ukraine, and it justifies its activities by the so-called denazification of Ukraine.
The day after the start of the aggression against Ukraine, on February 25, 2022, Russian President Putin, addressing his country's Security Council, spoke of neo-Nazis, the name that Russian high-ranking officials often use when referring to Ukraine and its leadership. The Russian Minister of Foreign Affairs, Sergei Lavrov, repeatedly portrays Ukraine as a Nazi country, supported in this by the Russian media, without any evidence for such accusations. Putin is convinced that Russian soldiers will free "their homeland from Nazi filth" and that, "just like in 1945", victory will be theirs, as he said in a speech. Critics believe that Putin uses the trauma of the Second World War and distorts history for his own interests. The paradox of the statement about denazification is the fact that Ukrainian President Zelensky is of Jewish origin. In addition to trying to portray Zelensky's government as Nazi, Putin is convinced that it is supported by NATO in this "Nazism" (Washington Post, 2022).
The Russian Minister of Foreign Affairs, Sergei Lavrov, accused the Azov Regiment and other radicals of the attack on the maternity hospital in Mariupol. He insisted that Azov soldiers had removed all pregnant women, nurses and staff from the hospital, established a base for the ultra-radical Azov battalion there, and were manipulating public opinion around the world. For the Russians, the attack on the maternity and children's hospital in Mariupol was a "show" directed by Ukrainian nationalists (Vijesti.hrt.hr, 2022). The truth, however, is that Russian bombs completely destroyed the children's hospital, and shelling again stopped the evacuation from several cities, according to the Ukrainian government and the Guardian. The former Ukrainian foreign minister accused Russia of holding 400,000 hostages in Mariupol, a population that had been without the basic prerequisites for life for more than a week (Guardian, 2022). The Guardian reports that during a press conference in Turkey, Lavrov asserted that Russia had warned the UN in advance that the Azov Battalion had taken over the hospital in Mariupol; when asked by a journalist to comment on photos testifying to civilian casualties, mainly pregnant women and children, he became visibly upset and called Western journalists propagandists (The Guardian, 2022b).
After the war started, Russia tried to spread disinformation that would prepare the ground for further attacks on Ukraine and justify the start of the war. One of the stated reasons for attacking Ukraine is the belief that Ukraine, with the support of the USA, has laboratories for the production of biological weapons. Russia states that 30 laboratories across the country work with pathogens of dangerous infections, the BBC reports. However, there is no evidence that Ukraine is working on the development of biological weapons, although the two countries do cooperate in the form of technical support for suppressing the threat of outbreaks of the most dangerous infectious diseases (BBC, 2022b).

In addition to disinformation uttered in the public space with the aim of deceiving the public, Russia also uses disinformation and false statements created with the help of artificial intelligence. It runs disinformation campaigns to spread false information that is difficult to detect, says Anton Demokhin, Deputy Minister of Foreign Affairs of Ukraine, especially now that artificial intelligence enables the reproduction and distribution of false narratives on a new, more complex level. Russians use bots powered by artificial intelligence to create fake social media profiles. They are developing software for creating bots as part of a project funded by the FSB, with the aim of spreading Russian propaganda. This project included 1,000 profiles on social networks through which Russian propaganda was spread under "American names" (Bloomberg, 2024).
For now, this type of information warfare is relatively noticeable and predictable, but modern tools are becoming more complex and harder to counter. In conducting information warfare and using disinformation against information, Russia applies the 4D approach: dismiss, distort, distract, dismay. Dismissal is used by Russia when it does not like criticism; it then starts a smear campaign on a personal, organizational or political level. Distortion is used when it does not like certain information; it then twists the narrative and creates a new frame for the information being placed. Distraction is a method Russia uses when it is accused of something, redirecting the accusation to someone else (as was the case with the attack on the hospital in Mariupol, for which Lavrov blamed Azov and Ukrainian radicals). Dismay is Russia's method of frightening its opponents. A fifth element can also be mentioned here: the division of society, a method that is increasingly prominent in the Russian campaign. Whether spreading disinformation created in the "classic" way of producing fake news or by some more sophisticated method using artificial intelligence, Russia aims to manipulate public opinion, discredit opponents and polarize society (Mandić & Klarić, 2023).

The war in Iraq

In the context of disinformation and the use of artificial intelligence for the purpose of deceiving the public, we can also mention the war in Iraq, that is, the American narratives that justified the attack on Iraq. The US justified its entry into the war with Iraq by the "fact" that Iraq possessed weapons of mass destruction. The war in Iraq, or the Second Gulf War, began in 2003 with the US invasion aimed at overthrowing Saddam Hussein. Today, more than twenty years after the start of the war, the controversy surrounding the weapons of mass destruction that Iraq allegedly possessed remains open. The backdrop to the war was the vulnerability of the USA after the attack on the World Trade Center in 2001.
David A. Kay was an American weapons expert who advocated the war in Iraq. He was appointed head of the ISG; however, even after exhaustive research, the mission did not complete its task, that is, it did not find any weapons of mass destruction in Iraq. Testifying before the US Congress in 2004, Kay admitted that they had all been wrong. To date, no one has been held accountable for the allegations made by the United States of America, specifically by George W. Bush, for spreading the disinformation and false information that justified the American invasion of Iraq (SIPRI, 2023).

Another of George Bush's falsehoods was the connection between Iraq and Al Qaeda. Al Qaeda was suspected of the attack on the World Trade Center on September 11, 2001, and in order to find a more concrete enemy to attack, the USA connected Iraq with this terrorist organization. Bush declared a global war on terrorism. Numerous investigations by independent and government commissions subsequently confirmed that there was no factual basis for this claim. In the months leading up to the war, most Americans believed that Iraq was connected to the September 11 attacks or that Saddam Hussein himself was involved in their planning. Bush claimed that Iraq was one of the three countries making up the "axis of evil". However, public opinion changed after the American attack on Iraq; the numerous victims, the deaths of American soldiers and the billions of dollars spent financing the war contributed to this. To date, no one has been held accountable for starting the Second Gulf War (Pew Research Center, 2023). We can ask ourselves: if the then president of the USA, George W. Bush, had had at his disposal artificial intelligence as developed as it is today, would he have used it to create more sophisticated disinformation to justify the American invasion of Iraq?

Project Kylo – a secret Russian operation

One of the more recent cases of the use of artificial intelligence in hybrid warfare is Russia's Project Kylo, a secret operation that has been exposed, along with its completed and planned activities. The operation was supposed to remain perfectly invisible until a "leak" of data and emails exposed this Russian campaign of psychosocial manipulation. The Insider and Der Spiegel obtained hacked correspondence of SVR officers responsible for information warfare with the West. The documents revealed the modus operandi of this service: the dissemination of false information on sensitive topics; publication of falsehoods about Ukraine; use of new Internet platforms for disseminating disinformation and counter-information; campaigns against Russian emigrants; discrediting fundraising for Alexei Navalny's Anti-Corruption Foundation, etc. The covert operation was named Project Kylo and was never intended to be connected to Russian intelligence. The originator of the project is SVR officer Mikhail Kolesov, who had the idea of a psychosocial campaign that spreads fear, panic and horror (The Insider, 2024).
One of the tasks of the participants in this project was to create a fake non-governmental organization that would in reality be a front for Kremlin agents, whose task would be to encourage anti-government demonstrations in the USA and other Western democracies. One of Kolesov's ideas was the creation of extreme Ukrainian civil society organizations that would make excessive demands on behalf of Ukrainian refugees and thus appear unreasonable and demanding, which would discredit their cause in the eyes of domestic voters in Western countries (Index.hr, 2024). This was, in effect, reverse psychology. Kolesov also had the idea of fake advertisements, created by SVR recruiters to look like news headlines, which would then appear on the computers and cell phones of targeted groups in the West to entice them to visit Russian-controlled Internet resources. It was further planned to create fake websites posing as independent investigative agencies, which would serve as a platform for distributing manipulative content, including not only text but also audio and video to be published on YouTube or elsewhere on social networks, all with the aim of instilling fear in the human psyche: fear for the future, uncertainty about tomorrow and the unclear fate of future generations. Fear would act as an emotional trigger causing the individual to feel panic and horror (Večernji.hr, 2024).

In addition to these activities, the SVR recruited teams to work in different countries provoking protests by groups of people, no more than a hundred strong, who would receive 80 to 100 euros in compensation for their participation. They would protest against state institutions, and everything would be recorded and documented for further media dissemination. Among the activists was a young student from St. Petersburg who, following Kolesov's instructions, looked for people on the Internet willing to photograph themselves "protesting" against Ukraine for the above-mentioned price, all with the aim of distributing the photos and videos and presenting protests against Ukraine as a mass phenomenon in Europe (The Insider, 2024).
Project Kylo was supposed to win the war in the field of strategic communications. However, Ukraine, with the help of its allies and friends in the international community, learned many lessons from the defeat it suffered on the battlefield during the Russian invasion of 2014. It used the time between the two aggressions to develop its own defense capabilities, including the development of its intelligence community in all necessary segments: human potential, technical capabilities and organizational forms, which enabled Ukraine to win the battle in the field of strategic communication. In the first few months of the aggression, Russia was forced to change its war tactics as many as three times. At the beginning of the Russian aggression, Ukraine deftly handled the information war, using double agents who fed a great deal of disinformation to the Russian side and thus played a significant role in the Russian failure. The Russians made serious mistakes in planning and carrying out the aggression against Ukraine, largely because they themselves failed to recognize the disinformation spread by members of the Ukrainian intelligence agencies (Akrap et al., 2022). Russia, on the other hand, carried out numerous Project Kylo activities that damaged Ukraine in the way the SVR agents had planned, but the project was exposed and stopped.

Artificial intelligence in the fight against itself

In the modern world, information has become a weapon and the media a battlefield on which hybrid war is waged. The countries that have developed a doctrine of information warfare, that have provided the infrastructure for conducting information operations and strategies, and that have coordinated the actions of state, military, non-governmental and economic actors with these strategies are dominant on this front (Tuđman, 2009). The importance of information as a means of combat was highlighted in 1982 by William P. Clark, the national security adviser to the US president, who stated at Georgetown University that information, as an instrument for implementing national strategy, had taken its place alongside the military, diplomatic and economic instruments. Akrap (2009) states that information officially became the fourth instrument for realizing national interests, along with diplomacy, the economy and the army, especially with the appearance of the Internet, when information became a doctrinal category and a strategic tool in achieving set goals and tasks.
However, what should be done when information becomes disinformation created with the help of artificial intelligence and then a weapon in hybrid warfare? What tools should be used in the fight against disinformation and counter-information created in this way? First of all, it is necessary to provide an ethical and legal framework for the application of artificial intelligence.
Technologies in themselves are not bad, but a prerequisite for social progress is the parallel development and transformation of both technology and society, so that society does not become hostage to technological development dictated by algorithms, artificial intelligence and large databases. Society must be ready to respond to technological challenges. One possible way is the aforementioned ethical and legal framework for the application of technology, specifically artificial intelligence. It is necessary to work on mechanisms for creating, shaping and preserving the data and digital sovereignty of the state, with the aim of building sovereignty over the entire digital space in which better rules of conduct would be established. The fundamental principles for building such sovereignty are the promotion, preservation and development of democracy and freedom of speech; the protection of human rights; preventing the spread of disinformation and counter-information; and the establishment of mechanisms for regulating published content aimed at the public rather than private interest (Mlinac et al., 2020).

Among the possible strategies for combating false information created by artificial intelligence, both short-term and long-term strategies should be considered. Short-term strategies refer to correcting untrue information in the public space and placing the truth before the public, while long-term strategies refer to building social resistance to disinformation and counter-information through critical thinking and the development of media and information literacy. In both cases, artificial intelligence can be of great help. Short-term strategies can rely on fact checking, that is, verifying the (dis)information transmitted to the public, while for long-term goals media literacy is a prerequisite for critical reflection. Media literacy can be defined as the ability to access, analyze, evaluate and communicate messages in various forms; it helps people understand the messages conveyed by the media that shape society. However, it does not refer only to personal information literacy but covers a wider area: it includes the responsibility of the state in the fight against disinformation and the spread of counter-information. The state is obliged to finance technologies that will serve in the fight against the distribution of disinformation and false information (Feldvari et al., 2022). Technology that uses artificial intelligence presents both opportunities and risks, but it certainly has a large role in the fight against misinformation and maliciously placed untrue content. It helps national security by improving surveillance, detecting cyberthreats and improving defense capabilities through precision and speed of detection and response. The development and use of artificial intelligence systems should be guided by the principles of transparency, responsibility and fairness in order to fully serve the public interest (Smiljanić, 2023).
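As an illustration of the short-term, fact-checking strategy described above, the sketch below compares a claim circulating online with a statement from a verified source using a publicly available natural-language-inference model. It is only a minimal sketch under stated assumptions: the model name, the example sentences and the decision threshold are illustrative choices of the author of the sketch, not tools referenced in this paper or by the cited authors.

# Minimal sketch of AI-assisted fact checking with a natural-language-inference model.
# Assumptions: the model name, example texts and threshold are illustrative only.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_NAME = "facebook/bart-large-mnli"  # a publicly available NLI model (assumption)
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)

def check_claim(claim: str, verified_statement: str, threshold: float = 0.7) -> str:
    """Compare a claim against a statement from a trusted source and return
    'supported', 'contradicted' or 'unverified'."""
    # Encode the sentence pair (premise = verified statement, hypothesis = claim).
    inputs = tokenizer(verified_statement, claim, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    probs = torch.softmax(logits, dim=-1)[0]
    # Map output indices to the model's labels ('contradiction', 'neutral', 'entailment').
    scores = {model.config.id2label[i].lower(): p.item() for i, p in enumerate(probs)}
    if scores.get("entailment", 0.0) >= threshold:
        return "supported"
    if scores.get("contradiction", 0.0) >= threshold:
        return "contradicted"
    return "unverified"

if __name__ == "__main__":
    verified = "Satellite images from 19 March 2022 show bodies lying in the streets of Bucha."
    claim = "The bodies seen in Bucha appeared only after Russian forces left the city on 30 March."
    print(check_claim(claim, verified))  # prints the model's verdict for this pair

In practice, such an automated verdict would only flag content for human fact checkers rather than replace them, which is consistent with the emphasis above on media literacy and institutional responsibility.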

There are also potential weaknesses in the use of artificial intelligence technology in fact-checking systems. It has numerous advantages, owing to the possibility of processing large amounts of data from different sources arriving at high speed, but there are also risks arising from the challenging nature of artificial intelligence systems and the way they are implemented and used, as well as from the human factor: society's readiness to accept artificial intelligence technology and its legal framework. The development of artificial intelligence has taken off suddenly, so the detection of disinformation requires the most recent methods supported by artificial intelligence. It is therefore necessary to take care of the long-term sustainability and maintenance of such systems and the growth of the expertise involved in their development. In addition, appropriate legal support must be ensured from the beginning of a system's development in order to reduce some of the risks associated with the use of artificial intelligence, especially those related to the collection and processing of data (Grbeša Zenzerović & Nenadić, 2022).

Conclusion

The use of disinformation in hybrid warfare is not a new phenomenon, but artificial intelligence has brought this "skill" to a higher level. It allows users to act anonymously and in ways adapted to the weaknesses of the opponent. We can ask ourselves what tools can be used to fight disinformation, false information created in such a sophisticated way. The answer is: with the help of artificial intelligence. Artificial intelligence is increasingly being integrated into military strategies on the battlefield, enabling more effective and precise action; in addition to being used in the development of technologies such as drones, it is an excellent tool for collecting and processing large amounts of data. With its help, even disinformation created before its development and mass use can be detected and verified more quickly. Its use certainly opens the door to ethical implications, so it should be regulated by the state.

The frequency of the use of disinformation in information warfare follows the development of artificial intelligence. In the digital age and, we can say, the age of post-truth, false information spreads faster than ever. It goes beyond the level of the written word: with the help of artificial intelligence, fake videos can be created that are almost perfectly convincing. Such audio and video materials contribute to manipulation and influence public opinion. The examples of disinformation used to justify the war in Iraq, during the war in Ukraine and in the recent past of the Homeland War in Croatia indicate the weight that disinformation carries in the struggle for supremacy.
The malicious use of untrue information, especially information created by artificial intelligence, which is more difficult to verify, is aimed at spreading fear and at social and psychosocial manipulation; at spreading propaganda that destabilizes the opponent or creates a positive image of oneself; and at political manipulation through influencing public opinion and polarizing society.
Considering the potential risks, each country should have guidelines and regulations to limit the abuse of artificial intelligence. In addition, a significant role should be given to media literacy and to educating and raising public awareness about the existence of sophisticated forms of disinformation and counter-information.

Artificial intelligence does not have to be only a tool of war; it can also be a weapon for peace. It can help create strategies for the peaceful resolution of conflicts, improve military efficiency, support the development of tools for addressing human rights violations, and serve as a means of achieving cooperation between nations. Artificial intelligence has its own peculiarities, and it is up to people to decide for which cause they will use them. The international community should invest effort in ethical regulations for the use of this tool and direct its development towards use for the benefit of all and the promotion of security and peace among nations.

 

 


Literature:

1. Akrap, G. (2009), Informacijske strategije i oblikovanje javnog znanja, National security and the future 2(10), 76-151
2. Akrap, G., Mandić, I., Rosanda Žigo, I. (2022). Information Supremacy, Strategic Intelligence, and Russian Aggression against Ukraine in 2022. International Journal of Intelligence and CounterIntelligence. DOI: 10.1080/08850607.2022.2117577
3. ATENA project (n.d.), O informaciji te informacijskoj i komunikacijskoj znanosti, https://iihs.hr/Nazivlje
4. BBC, Bucha killings: Satellite image of bodies site contradicts Russian claims, 11 April 2022, https://www.bbc.com/news/60981238
5. BBC, Ukraine war: Fact-checking Russia's biological weapons claims, 22 March 2022, https://www.bbc.com/news/60711705
6. Bloomberg, US says Russia used AI-Powered Bots in Disinformation scheme, 9 July 2024, https://www.bloomberg.com/news/articles/2024-07-09/us-says-russia-used-ai-powered-bots-in-disinformation-campaign
7. Cvetković, N., Kovač, M., Joksimović, B. (2019). Pojam hibridnog rata. Vojno delo, 71(7), 323-343. https://doi.org/10.5937/vojdelo1907323C  
8. Feldvari, K., Mičunović, M., Badurina, B. (2022). Hakiranje krize demokracije: može li nas medijska pismenost spasiti od algoritamskog oblikovanja političke percepcije, volje i mišljenja?, in: Vjesnik bibliotekara Hrvatske 65(2), 23-48
9. Grbeša Zenzerović, M., Nenadić, I. (2022). Jačanje otpornosti društva na dezinformacije: Analiza stanja i smjernice za djelovanje, Zagreb, 7-90
10. Hrvatska enciklopedija, mrežno izdanje. Leksikografski zavod Miroslav Krleža, 2013. – 2024. Umjetna inteligencija, https://www.enciklopedija.hr/clanak/umjetna-inteligencija 
11. Index.hr, „Strah i panika“: Tajni dokumenti razotkrili rusku operaciju na Zapadu, 4 July 2024, https://www.index.hr/vijesti/clanak/procurili-dokumenti-razotkrivena-ruska-operacija-sirenja-straha-i-panike-na-zapadu/2579647.aspx
12. Mandić, J., Klarić, D. (2023). Case study of the Russian disinformation campaign during the war in Ukraine – propaganda narratives, goals, and impacts, in: National security and the future 2(24), 97-140
13. Mlinac, N. (2024). Hibridna inteligencija kao nositelj protuobavijesti i hibridnih prijetnji u kiberprostoru. National security and the future, Vol. 1 (25)
14. Mlinac, N., Akrap, G., Lasić-Lazić, J., (2020). Novi oblici manipuliranja u digitaliziranom prostoru javnog znanja i potreba za uspostavom digitalnog i podatkovnog suvereniteta. National Security and the Future, Vol. 21 No. 3, 2020. 
15. Pew Research Center, A Look Back at How Fear and False Beliefs Bolstered U.S. Public Support for War in Iraq, 14 March 2023, https://www.pewresearch.org/politics/2023/03/14/a-look-back-at-how-fear-and-false-beliefs-bolstered-u-s-public-support-for-war-in-iraq/
16. SIPRI, Twenty years ago in Iraq, ignoring the expert weapons inspectors proved to be a fatal mistake, 9 March 2023, https://www.sipri.org/commentary/essay/2023/twenty-years-ago-iraq-ignoring-expert-weapons-inspectors-proved-be-fatal-mistake
17. Smiljanić, D. (2023). Umjetna inteligencija – cilj, način ili sredstvo strateškog natjecanja, Strategos 7(1), 113-140
18. The Guardian, Russian bombing of maternity hospital 'genocide', says Zelenskiy, 2022., https://www.theguardian.com/world/2022/mar/09/ukraine-mariupol-civilians-russia-war
19. The Guardian, Sergei Lavrov prefers propaganda over reality in Ukraine talks, 2022. https://www.theguardian.com/world/2022/mar/10/sergei-lavrov-russia-foreign-minister-propaganda-in-ukraine-talks 
20. The Insider, „Morality and ethics should play no part“: Leaks reveal how Russia's foreign intelligence agency runs disinformation campaigns in the West, 4 July 2024, https://theins.ru/en/politics/272870
21. Tuđman, M. (2009). Informacijske operacije i mediji ili kako osigurati informacijsku superiornost, in: National Security and the Future 3-4(10), 25-45
22. Tuđman, M. (2013). Programiranje istine. Hrvatska sveučilišna naklada. Zagreb.
23. Večernji.hr, Otkrivena velika ruska operacija na Zapadu: 'Usadit ćemo im u psihu strah, paniku i užas!', 4 July 2024, https://www.vecernji.hr/vijesti/otkrivena-velika-ruska-operacija-na-zapadu-usadit-cemo-im-u-psihu-strah-paniku-i-uzas-1782494
24. Večernji.hr, Rusija poriče masovna ubojstva u oslobođenoj Buči: 'Radi se o provokaciji', 3 April 2022, https://www.vecernji.hr/vijesti/rusija-porice-masovna-ubojstva-u-buci-radi-se-o-provokaciji-1576015
25. Vijesti.hrt.hr, Brojne osude napada na bolnicu u Mariupolju; Zelenski: Napad je ratni zločin, 10 March 2022, https://vijesti.hrt.hr/svijet/brojne-osude-napada-na-bolnicu-u-mariupolju-zelenski-napad-je-ratni-zlocin-6001345
26. Washingtonpost.com, Putin says he will 'denazify' Ukraine. Here's the history behind that claim, 25 February 2022, https://www.washingtonpost.com/world/2022/02/24/putin-denazify-ukraine/
