How is the Internet shaping today's society?

Self-help against manipulation on the Internet

“The questions we are concerned with are: What should online environments look like so that they respect human autonomy and promote truth? And what can people themselves do to avoid being so easily misled and influenced in their behavior?” says Anastasia Kozyreva, first author of the study and researcher in the Adaptive Rationality research area at the Max Planck Institute for Human Development. Together with her colleagues, she first examined how the online world differs from the offline world and identified four major challenges.

Recognizing manipulation and dangers

(1) User behavior is influenced by manipulative choice architectures, so-called “dark patterns”. These often steer users toward behavior they do not actually want: advertising disguised as page content or embedded in the navigation so that one clicks on it, or confusing privacy settings that lead people to share more information than they intend.

(2) Information architectures supported by artificial intelligence do not present content to us neutrally, but personalized on the basis of data collected about us. This means that two people who enter the same term into a search engine will very likely get different results. This can be helpful if we are looking for a specific restaurant nearby and the search engine shows hits in our neighborhood first, rather than a restaurant with the same name on the other side of the world. However, if we receive news or political content filtered by our user preferences, there is a risk of filter bubbles forming in which we no longer encounter other opinions.

(3) The research team sees false and misleading information as another challenge for citizens on the Internet. Videos and posts containing conspiracy ideologies and unconfirmed rumors can spread rapidly via social media and cause real harm: for example, when people forgo vaccination because of misinformation about vaccines, risking illness themselves and endangering others.

(4) Distracting online environments constantly vie for our attention, whether through push notifications, flashing displays, pop-up advertisements, or a steady stream of new content. The underlying business model is to capture attention and hold it as long as possible. At the same time, we catch ourselves spending far more screen time than we intended, without really benefiting from it and at the expense of our attention for other things.

Hiding distracting apps

To cope with these four challenges, the research team identifies concrete options from a behavioral science perspective. With so-called “boosting tools”, i.e., competence-enhancing cognitive tools, people can train new skills and make more autonomous decisions in the online world.

Self-nudging is one of the cognitive tools from behavioral science that people can use to create “healthier” choice and information environments for themselves. On the Internet, self-nudging can help shape the digital environment in a self-determined way: this includes, for example, muting apps, tidying up the smartphone's home screen, and keeping visible only the applications that are really needed, such as the calendar, camera, map service, meditation app, or weather app. Anything too distracting, such as social media and games, is better tucked away in folders. For social media use, the researchers also recommend consciously setting time limits.

“The digital world is full of traps that are set for us. But we can protect ourselves from them. Just as we hide the candy in the cupboard and put a bowl of apples on the table, we can switch off app notifications to protect ourselves from constant distraction and overload. ‘Out of sight, out of mind’ works in the digital world too,” says Ralph Hertwig, director of the Adaptive Rationality research area at the Max Planck Institute for Human Development.

Questioning information

Just as we look left and right before crossing a street, we should make it a habit to ask specific questions in order to assess online content. These include: Where does the information originally come from? Which sources are cited? Can I find similar content on reputable websites I know? In this way, people can train their own digital competence in judging the truthfulness of information. But Internet services could also help users assess content better, for example with decision trees that appear before you share content and remind you to first check the source and the facts.

In general, however, policymakers should also give greater weight to regulatory measures that ensure users retain control over their digital environment and their personal data, for example through privacy-friendly default settings. And not to be forgotten: the intelligent and self-determined use of digital technologies must become part of both school and adult education. The sooner, the better.

The research team emphasizes that none of the proposed tools is sufficient on its own to evade online manipulation or to prevent the spread of misinformation. However, combining smart cognitive tools and early media-literacy education with policies that limit manipulation can make the online world more democratic and trustworthy, says Stephan Lewandowsky, professor of cognitive science at the University of Bristol, UK.