
What brands can do to prevent misinformation

Preventing the manipulation of information and its dissemination on social media has become a vital priority for the public and private sectors alike. Speaking at the 2019 Asia-Pacific StratCom, Pierre Robinet, Founder of Live With AI and Senior Consulting Partner at Ogilvy Consulting Asia, said that finding a solution would hinge on striking the “right balance between regulation and empowerment.”

Fallout from the 2016 US presidential election means that foreign interference is at the forefront of public consciousness, and platforms are under intense scrutiny. “If you don’t trust what you’re seeing on Twitter, you’re not going to use my platform,” says Philip Chia, Senior Public Policy Manager (Asia-Pacific) at Twitter. Regaining that trust is crucial, but far from simple. Chia cites the recent election in Taiwan as “problematic” for outside interference, yet contrasts it with India, where internal interference was the greater concern. The composition of actors is changing, and so are their tactics.

It’s not just about spreading salacious, false information; the playbook is becoming more sophisticated and far more nuanced. What if a headline is fake, but the article is true? “It worries everyone we’re partnered with,” says Chia. “We’re no longer addressing basic literacy; we’re talking about basic trust.” Twitter is using platform data to try to identify the ingredients of good understanding and healthy discourse. “If anyone mistrusts the platform, it’s usually a proxy for the fact that they mistrust the content on it, or the way society engages with it,” says Chia.

From a technical standpoint, automated fact-checking offers a partial solution. But fact-checking by an AI doesn’t remove the problem of trust; it simply moves it a step away. “Facts do not necessarily mean truthfulness,” says Jakab Pilaszanovich, Innovation Strategist at Telefonica Innovacion Alpha. “Facts can support lies as well. Facts have to be put into context and understood through narration in order for us to be able to say if something is true or false.” We must also consider users’ optimism bias, or quite simply their naivety: a great many individuals simply do not believe they bear any responsibility for helping to spread misinformation.
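
To make that limitation concrete, here is a minimal, illustrative sketch of automated claim-checking in Python. The reference corpus, the similarity measure and the threshold are all assumptions made for the example; no platform’s or fact-checker’s actual system is being described. The point it illustrates is the one Pilaszanovich makes: the verdict is only as trustworthy as the reference data and matching logic behind it, and even a correct match says nothing about the narrative context around a claim.

```python
# Illustrative sketch only: a claim is checked by finding the closest match
# in a (hypothetical) corpus of already-verified claims. The corpus, the
# similarity measure and the threshold are assumptions for this example.
from dataclasses import dataclass
from difflib import SequenceMatcher


@dataclass
class ReferenceClaim:
    text: str
    verdict: str   # e.g. "supported" or "refuted"
    source: str    # provenance: who verified this claim


REFERENCE_CORPUS = [
    ReferenceClaim("Example verified statement.", "supported", "fact-check org A"),
]


def check_claim(claim: str, threshold: float = 0.75):
    """Return the closest reference claim's verdict, or None if unverifiable.

    Even a confident match only moves the trust question one step away:
    it now rests on the corpus, its sources, and the matching logic.
    """
    best, best_score = None, 0.0
    for ref in REFERENCE_CORPUS:
        score = SequenceMatcher(None, claim.lower(), ref.text.lower()).ratio()
        if score > best_score:
            best, best_score = ref, score
    if best is None or best_score < threshold:
        return None  # unverifiable: human judgement and context still needed
    return {"verdict": best.verdict, "source": best.source, "match": best_score}
```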

On the company side, a renewed focus on cyber-security is needed. This is hardly a new concept, but where it once belonged solely to the IT department, cyber-security must now be adapted to an increasingly interconnected world. That means lifting it out of a single department and giving ownership and responsibility to the C-suite. Cyber-security must be an ongoing effort, continually updated, with education spanning the entire organisation from the top down.

On the user side, things are more complex. It’s easy to forget that Facebook is only 15 years old, and other tech giants are even younger. These are teenage companies, says Robinet, and we haven’t necessarily taken the time to acclimate and adapt our behaviours to live with these adolescent platforms. “We need to give back power to the consumer, the citizen,” he says. “Everyone wants to live in a world where people can absolutely control their own data. But is that enough? Are we ready for that? I’m not sure the entire population would be able to understand and manage their own data yet. There is a huge leap in education to overcome…. There are plenty of literate, mature markets in Asia, like Singapore, but I’m concerned about countries like Indonesia and the Philippines, where digital literacy rates are low.”

There is a technological component to behaviour change, but also a cultural one. Smoking, for example, used to be a habitual, social activity whose negative health effects people either didn’t understand or didn’t care about. Initiatives designed to explain the consequences, such as graphic medical imagery on packaging, had limited impact. What worked was changing both policy and the social rituals around smoking: once smokers were legally required to go outside, smoking came to be perceived as anti-social, and rates began to drop.

But should it all come down to policy? GDPR prompted companies to hand data back to their users to own and control, but people don’t necessarily understand the value of that data, or how to realise its potential. When it comes to educating the consumer, it’s up to companies to bring transparency, traceability and utility to that data, in a way that makes sense for a user’s specific market, culture and values.
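
As a rough illustration of what “transparency, traceability and utility” could mean in practice, the sketch below assembles a user data export that records not only what is held, but where each item came from and why. The field names and categories are assumptions for the sake of the example; this is not a GDPR compliance recipe or any company’s real export format.

```python
# Illustrative sketch: a data export that pairs each stored item with its
# provenance (where it was collected) and purpose (why), so the export is
# legible to the user. Field names and categories are assumptions.
import json
from datetime import datetime, timezone


def build_data_export(user_id: str, records: list[dict]) -> str:
    """Bundle a user's records with provenance and purpose into a JSON export."""
    export = {
        "user_id": user_id,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "records": [
            {
                "category": r.get("category", "unknown"),          # e.g. "profile", "activity"
                "value": r.get("value"),
                "collected_from": r.get("source", "unspecified"),  # traceability
                "purpose": r.get("purpose", "unspecified"),        # why it was collected
            }
            for r in records
        ],
    }
    return json.dumps(export, indent=2)
```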
