These private companies, operating across nationless finance, PR, strategic communication, behavioural change and technology, promote private interests and act as kingmakers seeking to enhance their own profit and power.
What seems to have been lost in much media coverage of Cambridge Analytica is an understanding of where its roots lie: deep within the military-industrial complex. A British corner of it is populated, as the military establishment in Britain is, by old-school Etonian Conservatives, and now by the American defence establishment too. Commander Steve Tatham served in Iraq and was head of psychological operations for British forces in Afghanistan for some years. Cambridge Analytica has to be understood in terms of a military contractor using military strategies on a civilian population: the tools of psychological warfare are being used on us.
To have so much data in the hands of a bunch of international plutocrats to do with it what they will is absolutely chilling. I agree. The implications for the integrity of our democratic system and for citizens' rights in so-called western liberal democracies are hidden in plain view.
The problem far exceeds the revelations about the wrongdoings of Facebook and Cambridge Analytica. Psychological manipulation of citizens by both corporate entities and governments is now the norm. Mass surveillance, data profiling and behavioural modification strategies are embedded in the corporate sector and are now being used in a way that challenges the political canon of liberal democratic societies, where citizens are traditionally defined by principles of self-determination.
The Conservative Party are facing another investigation from the Electoral Commission following evidenced allegations that they operated a secret call centre during the general election campaign, breaching electoral law, an undercover investigation by Channel 4 News has revealed.
The investigation has uncovered underhand and potentially unlawful practices at the centre, in calls made on behalf of the Conservative Party. On film, call centre employees working on behalf of the party used a script that certainly appeared to canvass for support rather than conduct market research.
On the day of the election, call centre employees contacted voters to promote individual candidates. Anya Proops, a QC specialising in information law, told Channel 4 that political parties had to ensure that third parties working on their behalf followed the law.
Blue Telecoms is run by Sascha Lopez. A whistleblower at the call centre told Channel 4 News that they had been making potentially unlawful phone calls to voters. Undecided voters were fed key Conservative campaign messages, including references to the Brexit negotiations and warnings about a hung parliament.
Callers were also recorded quoting media articles that were pro-Conservative. As the election campaign started, the information commissioner, Elizabeth Denham, wrote to all the major political parties reminding them of the law around telephone calls and data protection. The call centre confirmed it was employed by the party, but has so far denied canvassing on its behalf.
The Channel 4 undercover reporter has captured evidence that certainly seems to refute that claim. The test data is supplemented by recent issue surveys and information from online surveillance; together, these are used to categorise political supporters, who then receive psychologically tailored canvassing messages, social media targeting, phone calls and doorstep visits. The micro-targeting of voters has been around for a while, but the Conservative operation has deepened the intensity of the effort and the use of vast resources of psychological data.
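The pipeline described above, combining survey responses with online-activity data to sort voters into categories for tailored messaging, can be illustrated with a deliberately simplified sketch. Every field name, category and threshold here is invented for illustration; this is not the actual method of any firm mentioned in this article.

```python
# Hypothetical illustration of the categorisation step described above:
# merge survey answers with online-activity signals and bucket voters
# for tailored messaging. All fields, labels, and thresholds are invented.

def categorise(voter):
    """Assign a crude targeting category from toy signals."""
    if voter["survey_support"] >= 4:
        return "core supporter"
    if voter["survey_support"] <= 1:
        return "opponent"
    # Undecided voters are split by which issues they engage with online.
    if "brexit" in voter["top_online_topics"]:
        return "undecided: send Brexit messaging"
    return "undecided: send economy messaging"

voters = [
    {"id": 1, "survey_support": 5, "top_online_topics": ["sport"]},
    {"id": 2, "survey_support": 2, "top_online_topics": ["brexit"]},
    {"id": 3, "survey_support": 0, "top_online_topics": ["economy"]},
]
labels = [categorise(v) for v in voters]
print(labels)
# ['core supporter', 'undecided: send Brexit messaging', 'opponent']
```

Even a toy version makes the democratic problem visible: each bucket receives a different message, and no two voters ever see the full picture.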
However, the methods being used, which entail the manipulation and management of public perceptions and voting behaviours, resemble those of authoritarian regimes, not those of a healthy liberal democracy.
Authoritarian propagandists attempt to convey power by defining reality. The reality they portray is usually very simple. However, its role as an authoritarian prop for an ideological imposition on the population has always been apparent to some of us, especially given the more visible evidence of political narratives and the stage management of our democracy via an extremely manipulative mainstream media over recent years. However, in democratic societies, governments are traditionally elected to reflect and meet public needs.
This is profoundly undemocratic. In fact it turns democracy completely on its head. This means that the individually tailored messages are not open to public scrutiny, nor are they fact checked.
A further problem is that no one is monitoring the impact of the tailored messages and their potential to cause harm to individuals. The reality is that, often, authoritarians construct an incongruent, flimsy and meaningless language of democracy in order to erect a fact-proof screen around an undemocratic reality.
They offer a lot of glittering generalities to the public. However, those apparently incoherent, meaningless slogans are specially designed to signal intent to groups from which the government wants to gain approval. Dog whistling and wedge issues are used extensively by the right. Dog whistling is closely associated with a broader wedge strategy, whereby a political party introduces a divisive or controversial social issue into a campaign, aligning its own stance with the dissenting faction of its opponent party, with the goal of causing vitriolic debate inside the opposing party, the defection of its supporters, and the legitimising of sentiment which had previously been considered inappropriate.
Political campaigns use wedge issues to exploit tension within a targeted population, and undermine unity. UK voters are being targeted with highly specific and manipulative messages in an attempt to influence their vote.
The shadowy world of online political advertising has until recently gone largely unmonitored, despite the huge power and reach of Facebook and despite social media messaging now thought to have contributed to the election of Donald Trump and the Vote Leave victory. The new forms of psychological electioneering are invisible to all but the individual people they are designed to reach and influence.
This meant that the Conservatives reached 17 million people per week, while Labour reached only 16 million in their best month. The algorithms these platforms produce are frequently hidden from scrutiny, and we see only the results of any insights they might choose to publish. In turn, these can have an enormous influence on what we learn, how we feel, and how we vote. Nigel Oakes founded the Behavioural Dynamics Institute (BDI) and also set up Strategic Communication Laboratories (SCL); using the new methodology from BDI, SCL ran election campaigns and national communication campaigns for a broad variety of international governments.
All of those operations are based on the same methodology, Target Audience Analysis, and, as far as can be discerned from the outside, SCL and its affiliates have very obscure corporate structures with confusing ownership. CA was involved in 44 US political races. Then, using a sophisticated electronic data delivery system, CA is able to provide TV advertising campaign data that may be used to inform media buyers about shows that have the highest concentrations of target audiences and the least amount of waste, all of which leads to higher media ROI [return on investment] and more voter conversions.
The company is heavily funded by the family of Robert Mercer, an American hedge-fund billionaire. Mercer made his money as a pioneer in the field of computational linguistics. He later became joint CEO of Renaissance Technologies, a hedge fund that makes its money by using algorithms to model and trade on the financial markets.
This is a billionaire who is trying to reshape the world according to his personal interests, beliefs, wishes and wont. Mercer is known for his anti-welfare and right libertarian views.
Mercer and his family are major donors to conservative political causes such as Breitbart News. Most political campaigns run highly sophisticated micro-targeting efforts to locate voters. SCL promised much more, claiming to be able to manipulate voter behaviour through psychographic modelling.
This was precisely the kind of work Mercer values. SCL claimed to be able to formulate complex psychological profiles of voters. CA collects data on voters using sources such as demographics, consumer behaviour, internet activity, and other public and private sources. The company is proprietorial about its precise methods, but says large-scale research into personality types, based on hundreds of thousands of interviews with citizens, enables them to chart voters against five main personality types — openness, conscientiousness, extraversion, agreeableness and neuroticism.
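The five-factor ("OCEAN") charting described above can be sketched in miniature. The trait names come from the article; the survey items, scoring scheme and respondent below are invented, and real psychographic models are trained on far richer data than a handful of questionnaire answers.

```python
# Hypothetical sketch of charting a voter against the Big Five ("OCEAN")
# traits named in the article. Questions and scoring are invented.

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Each survey item maps to one trait; answers are 1-5 Likert scores.
ITEM_TRAIT = {
    "enjoys_new_ideas": "openness",
    "keeps_tidy_schedule": "conscientiousness",
    "talks_to_strangers": "extraversion",
    "trusts_others": "agreeableness",
    "worries_often": "neuroticism",
}

def profile(answers):
    """Average the 1-5 answers for each trait into a 0-1 score."""
    totals = {t: [] for t in TRAITS}
    for item, score in answers.items():
        totals[ITEM_TRAIT[item]].append(score)
    return {t: (sum(v) / len(v) - 1) / 4 if v else None
            for t, v in totals.items()}

respondent = {"enjoys_new_ideas": 5, "keeps_tidy_schedule": 2,
              "talks_to_strangers": 4, "trusts_others": 3,
              "worries_often": 1}
scores = profile(respondent)
print(scores["openness"])      # 1.0
print(scores["neuroticism"])   # 0.0
```

The point of such a profile is not the scores themselves but what is layered on top of them: a voter scoring high on neuroticism, for instance, can be sent fear-based messaging that a high-openness voter never sees.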
Jennifer Rubin in The Washington Post has written about this. In essence, a new, dark, subliminal propaganda war is being waged against citizens by those who wield power, serving the narrow interests of those who do, funded by a hidden few who want to wield power also.
Lynton Crosby has been a close advisor in the Conservative election campaigns of Australia, Canada and the UK, and is well known for his racist dog whistling and wedge strategies, which have been influential at an international level. Crosby is said to focus on delivering simple messages, targeting marginal constituencies and using lots of polls and data. Mark Textor, co-founder of the private company, was mentored by the late Richard Wirthlin, a pollster who was chief strategist to US President Ronald Reagan.
As a result, a number of media outlets have called her credibility into question, with some refusing her requests for one-on-one interviews.
You would be forgiven for thinking that the world and the media are being run almost exclusively by a small number of elitist, pan-nationalist aliens. But, while the criticisms of the campaign were valid, the Kony phenomenon likely had a mixed impact.
Despite rumors of money mismanagement, for example, the organization seems to have used its funds for their intended purposes. Invisible Children generated over 30 million dollars from the video alone and had also received grants for its work. Some public figures have noted that Invisible Children had legitimate programs on the ground — despite their problematic framing of the conflict.
The campaign also briefly made Kony a household name, fulfilling the stated goal of the initiative. Regardless of any benefits of the Kony video, though, the execution and messaging of the campaign is an important case study for advocates. The primary lesson is that the story advocates tell needs to be accurate and, often, nuanced. Emotional appeals that lack substance can fall apart in the face of expert opinion. Other campaigns have fallen under similar criticisms to Kony, but with different outcomes.
The movement suffered from the naturally short lifespan of social media trends, though, and critics pointed out that the return of the schoolgirls would not be a quick task. The campaign had also inadvertently given leverage to Boko Haram, which used the high profiles of the young girls to conduct drawn-out negotiations for their release. Five years after the tweet, some of the schoolgirls had been released while others remained missing. Boko Haram, however, had kidnapped scores of youth, and in particular young women, prior to this instance.
The size of the kidnapping in Chibok made the case particularly egregious and apparently well suited for social media trends.
The mounting publicity also meant that those among the schoolgirls who were returned were treated differently than other youth who had been trafficked. The BBC reported that the "Chibok girls", as they had become known, had not been allowed to return home and had strict limitations placed upon them after being rescued.
Another unfortunate side effect of the social media limelight was that other youth who were being trafficked did not receive equal attention. Boko Haram had kidnapped thousands of young women and many young men in crises spanning years. However, only the case of the Chibok girls received a significant amount of outside support and attention. Without the campaign, though, authorities may not have taken action as quickly, or at all.
Crucially, the campaign had local support and was initiated by local voices. The spontaneous and uncoordinated nature of the #BringBackOurGirls social media trend also demonstrated the power of social media to act in emergency situations. The results emanating from that single tweet show the potential in social media for raising awareness.
Critics often point to the inconsistent and trendy nature of campaigns on social media as evidence of insincere participation. Many argue outright that online activists are naive and have little impact. Research shows that social movements can draw exponential strength from typically unengaged users when they participate in viral social media campaigns.
Research out of New York University revealed a few important aspects of social media trends and political networks. The study also looked at the way in which social networks spread information depending on the subject matter. For example, networks that shared news about protests or social causes behaved differently than networks that disseminated entertainment news. Notably, networks that spread information about civic actions were just as reliant on the network periphery as on the network core.
The participation from casual users creates a collective impact that reaches beyond anything the core users of the network could create on their own. The key difference between networks that spread civic information and networks geared toward entertainment news was the activity of the core.
The core sustains a high level of activity in civic networks, while it provides very little of the overall activity in entertainment networks: fans talk about celebrities more than celebrities talk about themselves. The challenge is for organizers to harness the power of the periphery in meaningful ways. In the field of communications, reinforcement theories are concerned with all media, online and offline, and center on the idea that people seek information that confirms or reinforces their current world view.
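The core-versus-periphery contrast described above can be illustrated with a toy calculation. The accounts and posting counts below are invented; the point is only how the core's share of total activity differs between the two kinds of network.

```python
# Illustrative sketch (not from the NYU study itself): compare how much
# of a network's total posting activity comes from its "core" versus its
# "periphery". All accounts and numbers are invented for demonstration.

def activity_share(posts_by_user, core_users):
    """Return (core share, periphery share) of total posts."""
    total = sum(posts_by_user.values())
    core = sum(n for u, n in posts_by_user.items() if u in core_users)
    return core / total, (total - core) / total

# Toy civic network: the core stays highly active, the periphery adds volume.
civic = {"org_a": 40, "org_b": 35, "fan1": 5, "fan2": 5, "fan3": 5, "fan4": 10}
core_share, periph_share = activity_share(civic, {"org_a", "org_b"})
print(round(core_share, 2))    # 0.75

# Toy entertainment network: almost all chatter comes from the periphery.
ent = {"celebrity": 2, "fan1": 30, "fan2": 30, "fan3": 38}
core_share_e, _ = activity_share(ent, {"celebrity"})
print(round(core_share_e, 2))  # 0.02
```

In the civic case the core supplies most of the activity; in the entertainment case almost none of it, which is the pattern the study reports.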
Studies looking at the behavior of networks on social media have revealed that, indeed, users divide themselves into online political communities with little contemplation of opposing views. A study out of Indiana University, for instance, looked at the structures of Twitter networks and how their structures impacted information flows.
Researchers found that users formed distinct communication patterns that isolated them within political camps. Social media is also increasingly raising alarm bells among watchdog groups and regulators for its role in government surveillance.
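The clustering into isolated political camps that such studies describe can be sketched with a toy retweet graph. The accounts and edges below are invented, and real analyses use richer community-detection methods than the simple connected-components approach shown here.

```python
# Toy sketch of the structure such studies describe: build a retweet
# graph and find its connected components. Accounts and edges invented.

from collections import defaultdict

def components(edges):
    """Group users into clusters connected by retweets (union-find)."""
    parent = {}
    def find(u):
        parent.setdefault(u, u)
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    def union(u, v):
        parent[find(u)] = find(v)
    for u, v in edges:
        union(u, v)
    groups = defaultdict(set)
    for u in parent:
        groups[find(u)].add(u)
    return sorted(groups.values(), key=len, reverse=True)

# Two camps that retweet internally but never across the divide.
retweets = [("a1", "a2"), ("a2", "a3"),
            ("b1", "b2"), ("b2", "b3"), ("b3", "b4")]
camps = components(retweets)
print(len(camps))      # 2
print(len(camps[0]))   # 4
```

With no edges crossing the divide, the graph falls into two isolated camps, the structural signature of the echo chambers the researchers observed.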
The Brennan Center for Justice released a report examining the way in which the U.S. government monitors social media. Agencies continue to collect data, and reports indicate they sometimes focus on citizens opposing U.S. government policies, as The Nation, for example, has revealed. Outside of the U.S., experts have noted that some media coverage of China's online-based credit system has glossed over details and the current circumstances; many stories have depicted a fully operational system that is already in place. Nonetheless, the amount and use of online surveillance currently taking place in the U.S. is extensive.
These examples of political polarization and state surveillance reveal that social media has introduced opposing forces into politics. For example, polarization and surveillance on social media are directly opposed to the increase in communication channels and civic participation also enabled by the platforms.
The remaining portion of this section examines more background and examples of social media reinforcing existing power structures and political polarization.
A report by Freedom House, an independent watchdog organization, gave a comprehensive and global overview of how social media is used for political purposes. The report highlighted three primary ways in which governments and political regimes manipulate social media. Authors noted that 38 out of the 65 examined countries saw the spread of politically-motivated misinformation.
The use of legal measures has increased over the years as well, and researchers found that 47 countries recorded arrests of social media users for political, social, or religious speech. Forty countries were found to have instituted advanced government programs to collect and analyze social media data. One conclusion of the report is that social media restrictions erode other rights as well.
The authors argue that the violation of privacy rights then leads to discrimination as well as restrictions on speech and assembly, which in turn erodes institutions and the rule of law. Perhaps the most extreme example of online government surveillance is the XKeyscore program revealed by Edward Snowden. The program is a computer system used by the U.S. National Security Agency to search and analyse the internet data it collects worldwide.
The XKeyscore program goes far beyond social media surveillance, however. By some accounts, users of the XKeyscore program can see virtually everything an individual does online, including emails, browser history, and metadata. The system also collects data beyond internet use, including phone calls, video messages, and even faxes.
The Intercept has reported that virtually no data escapes the eyes of XKeyscore users. A series of revelations about the XKeyscore program demonstrated the power of U.S. surveillance capabilities.
Uncovered documents and later discoveries from the leaked files exposed the extent of U.S. surveillance operations. The program is not only used to spy on high-profile targets for certain pieces of information, either; some have pointed out that the system is automated to track people who trigger certain conditions.
Given the sheer volume of exchanges and activity on social media, it stands to reason that even the most advanced online surveillance programs begin with data from these platforms. XKeyscore likely represents the most sophisticated means of surveillance currently known to the public. The program is extensive and many resources have been dedicated to the topic. For our purposes of understanding the impact of social media on politics, the XKeyscore program shows that no information on social media platforms is private and it can likely be retrieved by one or more government surveillance programs.
While XKeyscore demonstrates the capabilities of national governments to spy on users, the Cambridge Analytica scandal brought to light what type of data actors are able to retrieve on users.
Documents later showed that Facebook was aware of the breach and failed to take action. Some commentators pointed out that the firm may not have been able to make much use of the data, however. As such, the breach was not likely a deciding factor in the election. Ironically, the firm may have had some effective data analysis techniques.
The technique, dubbed OCEAN, was publicized as a groundbreaking innovation and provides an important example of how user data can be assessed by political campaigns. Reports have also surfaced that Cambridge Analytica was involved with a group called Leave.EU to promote the Brexit campaign as well. While the two groups did not have a formal contract, evidence suggests that Cambridge Analytica provided Leave.EU with datasets on potential voters. Social media data, then, has become an increasingly natural target for firms hired by political campaigns. The case of Cambridge Analytica also shows how political reconnaissance can resemble state surveillance and contribute to political polarization. In Sudan, social media sites were disabled in an attempt to quell demonstrations in the capital, Khartoum.
Through the winter and into the following summer, protestors demanded democratic elections and an end to the regime of President Omar al-Bashir.
Al-Bashir, who had ruled over the country for thirty years, responded to protests with physical violence and online censorship.
Social media sites were, at times, specifically blocked in an effort to disrupt information flows. At other times, the internet was disabled entirely. Ultimately, though, the protests succeeded in ousting Omar al-Bashir without consistent internet access. Thousands of people march to downtown Khartoum today, to commemorate one year since Sudan protests began. Military have closed off the road in front of the military HQ, where the sit-in camp once was.
No sign of Rapid Support Forces. The ousting of Omar al-Bashir shows that social media and internet interruptions will not fatally disrupt information flows for organized social movements. Nonetheless, the tactic is being used at an increasing rate by governments around the world. Shutdowns have become so frequent that NetBlocks, a private organization working on digital rights and security, developed a tool to determine the cost of internet shutdowns to the economy of any country in the world.
The economic costs of an internet disruption persist far beyond the days on which the disruption occurs. Indeed, the negative effects of a disruption on the economy may extend for months, because network disruptions unsettle supply chains and have systemic effects harming efficiency throughout the economy.
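A back-of-the-envelope sketch in the spirit of such cost calculators follows. The formula and every number below are assumptions for illustration; they do not reflect NetBlocks' actual methodology.

```python
# Hypothetical shutdown-cost estimate: the digital economy's daily
# contribution, times days offline, scaled by a spillover multiplier
# for the knock-on effects described above (supply chains, delayed
# transactions). All figures are invented for illustration.

def shutdown_cost(annual_gdp, digital_share, days, spillover=1.0):
    """Estimate lost output from an internet shutdown, in the same
    currency units as annual_gdp."""
    daily_digital_output = annual_gdp * digital_share / 365
    return daily_digital_output * days * spillover

# Hypothetical economy: $100bn GDP, 5% digital share, 7-day shutdown,
# with knock-on effects adding 50% to the direct loss.
cost = shutdown_cost(100e9, 0.05, 7, spillover=1.5)
print(round(cost / 1e6, 1))  # 143.8 (million dollars)
```

Even with conservative assumptions, a week-long shutdown in a mid-sized economy runs into nine figures, which is why such tools make shutdowns legible as economic, not just political, events.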
Online access and censorship are often discussed as part of the same phenomenon. However, there are distinct differences and important reasons to examine each issue separately. In other words, censorship has to do with the content of expression whereas access issues are concerned with the medium of expression. In some cases of online censorship, governments may simply block certain content related to politically sensitive issues. Critics have even accused some social media platforms of collaborating with governments to censor content.
The hashtag quickly took off, with thousands of tweets in less than 15 minutes. Then suddenly, all mentions of the hashtag vanished. Users reportedly questioned Twitter over the incident but never received a response. The silence led some to speculate whether the company was complicit in the action or whether it had been hacked and was wary of revealing politically sensitive vulnerabilities. The incident is not unique, though, and demonstrates the power of governments to control information on the internet even outside of their national boundaries.
Deleting social media posts, however, is a heavy-handed way for governments to censor information. In most cases, such major deletions are reported in the media and damage not only the reputation of the government behind the censorship but also that of the tech companies responsible for the platform. Other methods of censorship have emerged in the social media era that are much more subtle.
Demonstrators gained traction online by using the hashtag #Triumfalnaya. Russian government forces responded with bots that posted content using the same hashtag, but with contradictory messages, in order to suppress and confuse the online conversation. According to the student researchers, the Russian political spamming effort involved tens of thousands of fraudulent accounts, which sent hundreds of thousands of tweets. More subtle than deleting content or blocking access to accounts, this technique is likely being developed further by these actors.
One of the most famous incidents of manipulating information flows over social media comes from the U.S. presidential election. The dissemination of misinformation likely goes back to the beginnings of mass media.
However, the spread of misinformation represents a much larger issue online than it ever did in print media. Misinformation on social media can accelerate political polarization, and political tensions can be stoked with relatively little effort and few resources. The authors of the report note that the difference may be explained by the fact that novel information spreads faster and farther than everyday news.
Notably, the authors found that false news stories spread faster and farther than the truth in many categories such as natural disasters and terrorist attacks. Yet, the effect was most pronounced for political misinformation.
Contrary to popular assumptions, the study also found that robots were not the primary culprits in the spread of misinformation. Researchers found that humans were primarily responsible for the exaggerated spread of misinformation; robots treated true and false articles in the same manner. With respect to the U.S. presidential election, three primary sources appear to account for the misinformation: (1) passionate Trump supporters, (2) bloggers seeking to make money off of visitors to their sites, and (3) Russian intelligence and propaganda organs.
You should anticipate the needs and expectations of your audience in order to convey information or argue for a particular claim. Your audience might be your instructor, classmates, the president of an organization, the staff of a management company, or any number of other possibilities. Lay audiences usually need background information; they expect more definition and description, and they may want attractive graphics or visuals.
The managerial audience may or may not have more knowledge than the lay audience about the subject, but they need knowledge so they can make a decision about the issue. The most important reader is probably the instructor, even if a grader will look at the paper first. Ask yourself what you know about your teacher and his or her approach to the discipline.