Last week, we learned a lot from a leaked cache of Facebook’s internal research: that the company exempted high-profile accounts from its normal rules on what users can post; that a change to its algorithms purportedly designed to bring friends and family together amplified “[m]isinformation, toxicity, and violent content”; and that Instagram, which the company owns, has pushed some teenage girls toward anxiety, depression, and even thoughts of suicide.

This research, disclosed to The Wall Street Journal by a whistleblower who used to work at Facebook, has revived the debate about how to regulate social media platforms. Since the revelations, we’ve seen a wide range of proposals floated, from regulating Facebook’s algorithms, to subjecting the company to external oversight, to reforming Section 230 and enacting federal privacy legislation.

But as legislators and regulators continue to debate the merits of these thorny proposals, there’s another idea that’s already ripe for action: Congress should enable more research and journalism focused on the platforms. 

Social media platforms affect every aspect of society. They shape our public discourse; they have become the battleground of our elections; they are the source of our news and the new medium of our friendships and affiliations; and they often determine who sees an advertisement for a political candidate, a job opening, or a mortgage. They have enabled new and meaningful forms of interaction, but they have also become powerful vectors for discrimination, misinformation, harassment, and hate.

Despite the centrality of these platforms to our lives, we have an incomplete understanding of how they operate and what effect they’re having on our institutions and our society. Not even the platforms themselves have a full picture, because they rely on black-box algorithms whose operations are opaque even to their own engineers. Some of the platforms have offered researchers and the public limited forms of insight into their operations, but these efforts have delivered far less than they promised. It’s now obvious that we need legislation to ensure that researchers and journalists can do the work they need to do.

There are three things in particular that Congress should do.

First, Congress should mandate universal digital ad transparency. Digital advertising is a major avenue for harmful and illegal behavior, especially on platforms that allow advertisers to “microtarget,” or tailor, ads to specific users. Some platforms have begun making the ads they run more visible to the public, but many haven’t, and even those that have haven’t gone far enough. Congress should mandate greater public transparency to enable researchers and journalists to study the online ad ecosystem. A group of computer scientists and other experts has made significant headway in developing the parameters of such a mandate. Their proposal would require platforms to regularly disclose information about the ads they run, including the ads themselves, how they were targeted, how long they ran, and the number of users who saw or engaged with them. The proposal would also require the platforms to disclose the data in a standardized format useful for researchers, and it would fund the development and maintenance of a single public repository for this data.

Second, Congress should create a legal safe harbor to protect independent journalism and research on the platforms. The platforms are increasingly relying on their terms of service as a cudgel against important journalism and research. Recently, for example, Facebook suspended the accounts of two NYU researchers—Laura Edelson and Damon McCoy—claiming that their research into the spread of disinformation on the platform violates the company’s terms of service. (We represent the researchers in their personal capacities.) This hostility to outside investigation has suppressed journalism and research that would help the public better understand how the platforms work and what impact they are having on society. Congress should immunize this kind of public interest research from legal threats by the platforms, so long as the research is conducted in a manner that respects user privacy. 

Third, Congress should mandate researcher access to platform-held data in carefully controlled circumstances. To effectively study some of the most pressing problems on the platforms, researchers cannot depend solely on publicly available information or even information users choose to donate. Some access to platform-held data is necessary, but that access must be tightly regulated to protect user privacy. A proposal by Stanford University professor Nate Persily offers one pathway. Under that proposal, Congress would establish a data-sharing regime that compels major platforms to share data with vetted researchers and immunizes the platforms from civil and criminal liability when they do so.

These proposals would not, on their own, solve the problems of social media, and so it would be a mistake to let them slow the debate about other regulatory measures. Transparency alone will not address discrimination, misinformation, or political polarization on the platforms. That said, better public understanding of the platforms’ pathologies might discipline their operators, and it would help light the path toward more lasting solutions by allowing us to see the problems more clearly.