The research community is at the forefront of holding platforms accountable, shaping a viable data-access regime and advancing Europe’s digital sovereignty ambitions, Brandi Geurkink writes in her op-ed.
Thirty-five years after the invention of the Web, more than 20 years since the dawn of social media and on the brink of a technological revolution in AI, we still lack scientific understanding of, let alone consensus on, these technologies' impacts on societies, despite clear demand for answers from the European public and other publics around the world. That's because researchers who study everything from online scams to disinformation, addiction, deepfakes, and other challenges that technology poses for society struggle to access crucial data from technology companies.
Platforms like Facebook and TikTok act as gatekeepers to data on things like public content, algorithmic recommendations and user behaviour that make research possible. And they have closed more gates than they have opened, effectively making it impossible for the public to understand how their technologies work, their impacts on individuals and communities, and critically, how they might be made better. Platforms have shut down key tools used by academic researchers, journalists and researchers working in civil society organisations. They have made accessing platform data prohibitively expensive for researchers while creating an open marketplace for that same data to be bought by commercial marketers who use it to sell ads and by other technology companies to train AI models.
In the last few years, Europe has made meaningful progress on digital regulation and reform by passing world-class data protection and AI regulations, as well as the Digital Services Act (DSA) and Digital Markets Act (DMA) to protect European consumers online. The DSA's Article 40 acknowledges that researchers, including academics and researchers working within community organisations, have a right to access platform data, and it requires both regulators and platforms to facilitate that access. This opens up the possibility for groundbreaking, critical independent technology research.
But even with a law on the books, researchers’ access to platform data is not guaranteed. For Article 40 to live up to its promise, we must keep three things in mind:
1. Platforms will fight this tooth and nail.
A critical provision of the DSA, Article 40.12, requires platforms to make public data accessible to researchers. Since that mandate came into effect in 2024, platforms have routinely denied researchers’ applications for access without good reason, released tools that are riddled with bugs and errors, and used other creative means to keep researchers out.
This is a warning of what to expect as other data access provisions of the DSA come into force, such as Article 40.4, which enables researchers to access a much broader subset of non-public platform data.
To heed that warning, regulators within the European Commission need to enforce the data access obligations for large platforms vigorously. Digital Services Coordinators within national regulatory bodies must also prepare for platform recalcitrance. That is especially true in Ireland, whose regulators oversee most of the companies in scope of the law and play a particularly important role in the implementation of Article 40.4.
There are some good developments on this front: in response to collective evidence of obstruction, last week the European Commission fined X €120 million for, among other things, failing to provide researchers with access to public data. There are still numerous open investigations against other platforms subject to the data access requirements under the DSA, and it remains to be seen whether this signal from the European Commission will result in much-needed behaviour change from the platforms.
Without the political will to enforce the law, technology companies will continue to stonewall researchers rather than comply.
2. Rome wasn’t built in a day, and neither will Europe’s data-access regime be.
In other regulated industries, independent researchers and experts are guaranteed access to data to evaluate safety, compliance and public risk. In the pharmaceutical industry, independent scientists analyse full clinical-trial datasets before and after approval. In the aviation industry, accident investigators routinely obtain black-box and maintenance data from airlines. The energy industry relies on third-party verification of emissions, safety logs and sensor data.
To make these regimes work, those disciplines have had to build systems and safeguards for working with third-party, and often sensitive, datasets. Social science researchers studying technology’s impacts on society will need to similarly adapt. During a recent ERC workshop about data access under Article 40, one participant noted how the kinds of requirements that feel “foreign” to those working in the social sciences are commonplace in disciplines like health and environmental science. Social science researchers studying technology need to adopt practices like embedding a data protection officer on research teams and including mandatory data protection and security training in undergraduate education.
Similarly, the research community needs to make sure its members are aware of their rights to access data under Article 40. There is good work already being done on this front. The DSA 40 Collaboratory, led by researchers at the Weizenbaum Institute, helps researchers understand how to apply for data access under Article 40 and supports them throughout the process. The DSA Observatory, housed at the University of Amsterdam, convenes researchers to discuss and study the implementation and enforcement of the DSA. The organisation I lead, the Coalition for Independent Technology Research, includes a task force on data access for researchers running transatlantic initiatives.
3. Remember: the R in R&D stands for 'research'.
As Europe moves to embrace “digital sovereignty” and free itself from its dependency on American Big Tech, we should remember that independent, scientific research can do more than just guide regulation. It can also underpin innovation in the technology industry and open up pathways to build and deploy technology in ways that are positive for humanity.
But that can only happen if researchers are able to answer the questions that really matter to consumers and society as a whole. Big tech companies do not want those questions to be asked, because they do not want to be held accountable for the answers. As we work to create a European digital ecosystem, we need to remain vigilant that it does not simply replicate the existing Big Tech digital ecosystem.

Brandi Geurkink is the Executive Director of the Coalition for Independent Technology Research