Straight Talk Media & Entertainment
Last week’s US Senate Select Committee Hearing on Social Media Influence in the 2016 US Elections threw into sharp relief the extent to which social media sites – such as Facebook, Twitter and, to a lesser extent, Google – now represent many online users’ dominant source of news and current affairs, to the point where social media could now reasonably be regarded as a mainstream media (MSM) channel. However, social media has several defining characteristics that make it riper than traditional MSM for the kind of manipulation the Senate Committee Hearing examined. Social media companies must therefore understand how the structure of their platforms leaves them open to misuse, and their users open to abuse, with potentially serious consequences not only for individuals and groups, but also for societies and nations. Vigilance is vital: the key concern that prompted this Senate Committee Hearing was whether Russia’s use of social media influenced the outcome of the 2016 US presidential election.
As global internet companies, social media providers have so far not been subject to the legislation and regulation that govern traditional media channels in many countries. The onus is, therefore, on the social media companies to self-regulate. But self-regulation is only as effective as a company’s readiness to set aside its entirely legitimate objectives of growing revenue, reach, or both, in order to best serve its customers. For example, social media sites’ efforts at self-regulation must balance the needs of their end users against those of their advertisers, and these can conflict. The drive to generate revenue by selling affordable and appealing advertising products means that social media providers have invested heavily in developing advertising inventory and ad-targeting platforms. However, they have not sufficiently scrutinized who purchases their advertising, where that advertising is placed, and what the advertisers intend – particularly with regard to providing transparency around political advertising. This is a shortcoming that Facebook, Twitter, and Google acknowledged in their opening statements to the Senate Committee Hearing, and one they have already committed to addressing.
MSM outlets, as well as specialized news channels and interest groups of all kinds, use social media as a content distribution platform, which means that users can self-curate the content they see by liking or following pages or accounts. In addition, social media sites use technologies such as recommendation algorithms and bots to suggest similar content to users based on their interactions, which can help users discover new sources of information or content. But the same algorithms and bots can leave social media platforms open to misuse, ranging from individual users trolling others to entities using bot technology to amplify their own potentially harmful or misleading content (e.g., fake news).
As social media platforms become mainstream media platforms, their responsibilities towards their users grow accordingly. For instance, Facebook has more than 2 billion users – far more than traditional MSM companies such as the BBC or CNN, whose global audiences number in the low hundreds of millions. The size of Facebook’s user base makes it a tempting target for those who wish to subvert its platform, whether subtly or overtly. To give credit where it is due, Facebook is keenly aware of the task it faces – as are Twitter and Google – and has initiated various internal and external efforts, such as updating its News Feed Publisher Guidelines, hiring more employees to vet content and advertising published on its platform, and founding projects such as the News Integrity Initiative. But much of what Facebook, Twitter, and Google have done has been with the benefit of hindsight. Social platforms must be able to anticipate the unintended consequences of the products and services they provide, including the underlying technology, to prevent further harmful exploitation of their platforms. It is incumbent on them to be vigilant and proactive, rather than complacent and reactive.
Straight Talk is a weekly briefing from the desk of the Chief Research Officer. To receive this newsletter by email, please contact us.