Intro
Everything is different. This means that all areas of society have to learn, change, and adapt to the way technological disruptions are shaping this new world of communication.
How do people stay well informed in the digital age?
Digitalization has completely upended how we inform ourselves and communicate. Society isn’t adapting quickly enough. People still lack the information and communication skills they need to really master these interactions.
This is first of all a democratizing process, because many more people today want their voices to be heard and want to be part of the discussion. And now they can. But this complete change in the communications landscape also introduces entirely new challenges in terms of regulation, in terms of responsibility, and in terms of who the relevant actors are.
That makes it difficult for us as a society to react quickly enough, because the technological disruption is simply gigantic, and it is happening very, very quickly. These days, you can’t expect to simply clean up afterwards with good regulation and policy approaches, because by then the world is already changing again, and new AI tools like ChatGPT and Midjourney are becoming relevant.
And that leaves society overwhelmed. So almost all social systems have to learn and adapt very, very quickly, whether that means the education system, the political system, or the healthcare system. All of them are suddenly part of this public sphere, and once there, they have to learn how to communicate professionally and inform people in a way that brings them along. But it’s really not easy to navigate this huge flood of information and find your way from A to B without somehow falling for, or getting stuck on, advertisements or disinformation or cat videos, or whatever.
Can platforms shield users from disinformation through quality mechanisms?
Platforms can make the most important contribution to protecting people from disinformation. If we think back to the pandemic, completely new information offerings suddenly appeared on the various platforms.
On Google, when I searched for COVID-19, a sidebar suddenly came up that gave me quite a bit of relevant information about the pandemic: what the incidence was in my country, how many people had been vaccinated, or where I could find relevant health information. Google did a good job there of taking me by the hand as I searched for information.
Other platforms approached this differently, sometimes less well. For example, Instagram flagged any post that contained the words “corona” or “COVID-19” in any way, and said, “Watch out, this is about the coronavirus.” Other platforms like YouTube tried to do a better job of curating what showed up in response to keyword searches.
The moment a platform decides to help people, maybe by curating certain content or adding a separate piece of information that assesses the quality of a source, that’s a huge help. This could be information about the source, explaining why it’s trustworthy, or additional information and links. The open question is how much people trust the platform itself, because that trust determines whether the additional information actually leads to a different outcome.
Ideally, we’ve learned a lot from the pandemic and now actually know how to think through all these issues in advance, for example when building a new platform: by setting up a quality system that determines from the outset which sources are even allowed to be included. That means thinking beforehand about a set of rules that clearly says: this is trusted information, and this is disinformation that has no place on our platform.
I think we can learn a lot from what Google has done, from how YouTube has labeled sources, and from how other platforms have provided additional links. We can use these ideas to offer genuinely high-quality information even in a new communications environment.
So yes, platforms have a critical responsibility for whether people leave the platform informed or disinformed. It has to do with how they prepare information, with whether they think about how the information ecosystem can be secured or curated, and with what additional information they can provide so that users can better decide whether something is a trustworthy source or just disinformation crap they’d be better off avoiding.
Expert
Alexander Sängerlaub is the Director and Co-founder of futur eins. He takes a holistic approach to digital public spheres and explores how the utopia of an informed society can be achieved. Previously, he helped establish the “Strengthening Digital Public Sphere” department at the Berlin think tank Stiftung Neue Verantwortung, where he led projects on disinformation (“Fake News”), fact-checking, and digital news literacy. He studied journalism, psychology, and political communication at Freie Universität Berlin.