By Gerd Gigerenzer
Considering the digital world today means confronting questions that intersect with research, policy, and everyday life. As our societies become increasingly shaped by algorithms, platforms, and data-driven systems, the challenge is not just to keep up but to remain truly in control.
Exploring these questions, as my book How to Stay Smart in a Smart World does, reveals that while AI and automated systems excel at narrow, well-defined tasks, they often stumble when it comes to judgement, empathy, and uncertainty.
Algorithms that triumph at chess can misfire in human domains, with significant consequences for people’s lives. For example, self-driving cars may fall prey to the so-called Russian Tank Fallacy, where a system latches onto spurious cues in its training data – in the original anecdote, a classifier reportedly told American from Russian tanks by the brightness of the photographs rather than by the tanks themselves. And despite little evidence that complex black-box algorithms predict human behaviour better than transparent and straightforward ones, such commercial products increasingly guide predictive policing and judicial decisions.
On social media, we are manipulated by engineered rewards. The ‘like’ button keeps us engaged while platforms record and monetise almost everything we do. As users, we – our time and attention – have effectively become the product.
This situation is exacerbated by the ‘privacy paradox’: most people claim to care about their data, yet few are willing to pay for control over it. What is required is not just regulation, but a shift in mindset: a world where the algorithms shaping our lives are transparent, and big tech’s business models place data ownership in the hands of users. Many complex black-box algorithms, such as those used for credit scoring, can be simplified – without loss of accuracy – and thereby made transparent.
‘Digital literacy, and the ability to fact-check is essential not only to navigate the online world, but to shape it’
Critical, too, is the role of digital literacy. The skills that make people digitally literate, including knowing how to fact-check reliably, are essential if we want not only to navigate the complex digital landscape, but also to shape it. Yet studies worldwide show that as many as 90 per cent of digital natives do not know how to judge the trustworthiness of websites, or how to tell a fact from an opinion.
The DSA’s promise for research
The Digital Services Act (DSA) represents a significant step toward building digital spaces that are safer, more transparent, and accountable.
For years, researchers have sought deeper access to the online platforms that wield such a profound influence over public life, democracy, and society. Under Article 40 of the DSA, qualified (‘vetted’) researchers can now request access to sensitive datasets from large online platforms and search engines. These datasets, until recently out of reach, can help researchers examine how misinformation spreads, how elections are influenced, and how online harms and systemic risks develop.
‘Accessing platform data takes time and resources, and research communities must work together to make the most of what is now possible’
The new framework spells out the requirements for researcher vetting, data security, transparency, and independence from commercial interests, raising expectations while also opening the door to more reliable findings.
Still, this shift brings its own difficulties. Applying for and maintaining access to platform data takes time, resources, and care. Research communities will need to commit effort and work together to make the most of what is now possible.
Turning data access into knowledge
In a forward-looking pilot project, the European Commission’s Directorate-General for Communications Networks, Content and Technology (DG Connect) turned to the ERC for feedback on systemic risk research within ERC-funded projects. This initiative brought together ERC-funded researchers, representatives of national Digital Services Coordinators (DSCs), and policy stakeholders to explore new procedures for ‘vetted’ research access to platform data under Article 40.
The pilot project, developed in the ERC’s former Working Group on Open Science, established a bridge between research and policy. Participating ERC scientists gained hands-on experience with the application process and worked closely with DSCs to identify challenges, knowledge gaps, and research opportunities.
A diverse group of researchers and decision-makers discussed results at a recent workshop. The event provided a vital space for direct exchange, collective learning, and dissemination of insights and methods to the broader scientific community, thereby bringing Europe closer to meaningful, data-driven research in the online environment.
Looking ahead
As citizens and researchers, we should neither blindly trust ‘smart’ technology nor succumb to fear. There are real grounds for optimism about our ability to reclaim and shape our collective digital future. Informing, educating, and regulating are the essential pillars of a democratic digital space, and they require renewed commitment and collaborative effort from a research community dedicated to excellence and scientific independence.
Our task is to chart a responsible, evidence-based course for technological development – one that is both smart and wise. In doing so, we work towards a future where our digital world serves everyone, and technology empowers rather than manipulates.
Gerd Gigerenzer is vice-president of the ERC Scientific Council.