
Poisoned truth

Psyops is military jargon for the strategies used in an armed conflict to influence the enemy, usually in an attempt to demoralize it or win it over to our cause. It could be described as social engineering applied to the battlefield.

From Genghis Khan’s tactics to the aggressive and elaborate defacements characteristic of today’s cyber-wars, the history of warfare is littered with attempts to psychologically change the enemy’s mind by manipulating its perception of the facts. Achieving this requires modifying the information the enemy receives. The only difference from the strategies of the past is that the battlefield has changed.

At first glance it might seem that a decentralized structure like the Internet, built on scattered sources of information, should be extremely difficult to poison or manipulate, since no one can take control of every source of information available to its users. However, a series of incidents throughout 2009 demonstrated that this hypothesis is far from accurate, and that the Net is terribly vulnerable to a well-crafted lie. In fact, a successful attack could follow a two-stage strategy:

  1. First, we analyze the information available about the target until we find a reasonable number of key sites, from which our lie will "propagate itself" once injected.

  2. Once these sites have been located, it’s time for astroturfing: spreading false information – disinformation, strictly speaking – while making it look as if it came from many independent sources. By fooling others into believing there is broad support for a cause, the attackers can foster a favorable opinion of it. The key to success is convincing the public that all the opinions come from a large number of unaffiliated, geographically scattered individuals, which is easy to achieve with the appropriate means.

A sufficient number of astroturfers can decide what makes the headlines – or bury important news – on sites like Digg or Menéame, vote up videos on YouTube, keep false Wikipedia entries alive or rewrite the facts there, and even influence Google search results.

In recent years we have seen numerous examples of astroturfing with a wide range of targets, from ridiculing political campaigns to boosting the sales of a product. In fact, the governments of China and Israel have acknowledged using groups of astroturfers – volunteers and paid staff, respectively – to flood political and human-rights advocates’ forums with opinions favorable to their agendas.

Javier Barrios
S21sec e-crime



© Copyright S21sec 2013 - All rights reserved