Canadian Government Executive - Volume 24 - Issue 03
Automated software applications (“bots”) have been weaponized to unduly sway public opinion and online government consultations. A good defence starts with knowing the mischief each type of bot can get up to. Not all bots are bad: many Internet-enabled services rely on them, and some fight the good fight by improving political transparency. Knowing your bot allies is the second element of an effective defence.

Scraperbots pull personal information from Web pages and online public records. The data is used by other bots to impersonate real people in official submissions. Without follow-up checks, most identity-theft victims will never learn of advocacy made in their name.

Hackbots roam the Web looking for vulnerabilities to exploit. Once a weakness is discovered, the bot alerts hackers and infects the system with nefarious code. Data breaches and corruption can undermine confidence in a voting or consultation process.

A honey pot is a software enclave that attracts computer viruses so they can be studied up close. Similarly, a honey bot acts as a decoy for other bots, recording their messages and tactics. Findings are publicized, sent to authorities, and used to devise counter-measures.

Clickbots trigger online advertisements. Originally, these bots defrauded those paying for ads on a pay-per-click basis. During consultations, clickbots drain the ad budgets of opponents or solicitation campaigns while making the ads seem successful.

The ballot stuffers of the bot world inflate the tallies of online polls and petitions. They can also hijack public consultations by flooding them with written submissions to give particular policy stances the veneer of widespread public support (a simple way to flag such flooding is sketched at the end of this article).

Denial-of-service attacks attempt to shut down Web sites by bombarding them with bot traffic launched from hijacked computers and connected devices. These attacks can shut down consultations or censor Web content.

Newsbots spread propaganda and gossip. Real news that fits their agenda is amplified by reposting to social media sites. Sensational stories distract the public and muddy the facts. Misinformation is spread to manipulate political participation.

Alertbots monitor the activity of politicians, activists, or government processes and publicize activity that would normally go unnoticed. That transparency can raise awareness of consultations (or the lack thereof) and the submissions of various players.

Spambots spew unsolicited e-mail messages at targets. Even with low success rates, the sheer volume of messages ensures some influence. Spam bombardments hinder consultations by drowning out other voices, and embedded links and attachments can infect systems.

Helpbots were originally designed to fight parking tickets, file tax returns, or otherwise overcome convoluted bureaucratic processes. Machine learning discovers the tactics most likely to result in successful submissions, which can help those without technical expertise.

“Pushing hands” are not bots but low-wage workers doing the same work as political bots, often aided by automation tools. They are mobilized through piece-rate crowd-sourcing platforms (e.g. Amazon’s Mechanical Turk) or clandestine networks. Impersonating citizens on social media and in comment threads, these conversationalists flood the zone with advocacy, polarize debate, sow confusion, mock opponents, and otherwise discourage good-faith dialogue.
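For readers curious about the defensive side, the short Python sketch below shows one simple way a consultation team might flag ballot-stuffing in a batch of written submissions. It assumes the submissions have already been exported as a list of text strings; the normalization step, the similarity measure (Python’s standard difflib), and the 0.9 threshold are illustrative assumptions rather than a prescribed standard.

    # A minimal sketch: flag exact and near-duplicate submissions in a
    # consultation export. Thresholds and the similarity measure are
    # illustrative choices, not a prescribed method.
    from difflib import SequenceMatcher
    from collections import defaultdict

    def normalize(text: str) -> str:
        # Lower-case and collapse whitespace so trivial edits do not hide copies.
        return " ".join(text.lower().split())

    def find_duplicates(submissions: list[str], threshold: float = 0.9):
        """Return exact-duplicate groups and near-duplicate pairs of submissions."""
        normalized = [normalize(s) for s in submissions]

        # Pass 1: exact copies (cheap; catches classic ballot stuffing).
        seen = defaultdict(list)
        for i, text in enumerate(normalized):
            seen[text].append(i)
        exact = [idxs for idxs in seen.values() if len(idxs) > 1]

        # Pass 2: near copies (pairwise comparison; fine for small exports).
        near = []
        for i in range(len(normalized)):
            for j in range(i + 1, len(normalized)):
                ratio = SequenceMatcher(None, normalized[i], normalized[j]).ratio()
                if ratio >= threshold and normalized[i] != normalized[j]:
                    near.append((i, j, round(ratio, 2)))
        return exact, near

    if __name__ == "__main__":
        sample = [
            "I strongly support option A for the new transit plan.",
            "I strongly support Option A for the new transit plan!",
            "Option B would better serve rural communities.",
            "I strongly support option A for the new transit plan.",
        ]
        exact, near = find_duplicates(sample)
        print("Exact duplicate groups:", exact)
        print("Near duplicates:", near)

Run against the sample data, the script groups the identical first and fourth submissions and flags the lightly edited second one, the kind of pattern that warrants a closer look before tallying results.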