I’ve spent more than two decades as an engineer and executive building digital platforms for everything from helping people find childcare (Sittercity.com) to supporting President Obama’s 2012 election campaign. Many of the products I’ve worked on were made possible by the protections built into Section 230 of the CDA. While I’ve been a strong supporter of the law in the past, the time has come to reform it. The big platforms (including Facebook and Twitter) have shown that self-regulation isn’t a viable solution unless it is spurred on by real legislative change. While different sides frame the problem differently, the fact is that these companies have developed not just business monopolies but effective monopolies on the flow of information in society. That creates a public interest that an unmodified Section 230 no longer serves.
Since the objections to repealing or revising Section 230 tend to be rooted in free-speech rights, it’s worth pointing out at the outset that these rights, while broad and sacred, are not absolute. Perjury, fraud, libel, assault, certain kinds of pornography, and hate speech are all common exceptions to free speech. At the moment, due in significant part to Section 230, all of these regulated types of speech thrive online.
There is a grain of truth in the platforms’ claim that they should not become the arbiters of truth or, implicitly, of these laws. But rather than effectively vacating those laws, a better option would be to shift enforcement back to where it belongs: in public and in the courts.
How could this be done?
First, require the platforms to better verify their users, even if they don’t make that information generally available to the public. If liability is to be shifted from the platform to the person responsible for the content, it must be straightforward to identify, to a level of legal certainty, who that person is. While some will point out that anonymity is important for users in authoritarian regimes, it may well be that the platforms need different operating plans in different countries (just as they currently have in China) and that anonymity may be causing more harm than it prevents. Advertisers and other sites that support social sign-in should welcome these changes as well, since they would also reduce bots and other fake traffic. This would be a trimmed-down version of the know-your-customer (KYC) requirements in financial services.
Second, the platforms should be required to create a machine-readable, near-real-time public archive of all posts that exceed a certain threshold of views. If people truly desire privacy, they can be given the option to limit the reach of their posts to below that threshold (likely based on Dunbar’s number – the quantity of real social connections a person can maintain – which is around 150). If a person doesn’t accept that limit, it is because they want their post to reach not just their friends but the largest possible audience – effectively, they want to be publishers. Given this, it is reasonable to ask that they agree that their post will be part of the archive and that they will take public responsibility for it.
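To make the threshold rule concrete, here is a minimal sketch of how a platform might decide which posts enter the public archive. Every name, field, and the record format below is invented for illustration; the proposal itself specifies none of these details.

```python
# Hypothetical sketch of the archive rule: posts whose view count exceeds
# a Dunbar-style threshold, and whose authors did not opt to limit reach,
# are emitted as machine-readable archive records.
import json
from dataclasses import dataclass
from typing import Optional

DUNBAR_THRESHOLD = 150  # reach below which a post can stay out of the archive

@dataclass
class Post:
    post_id: str
    author_id: str       # tied to a verified identity, per the first proposal
    text: str
    view_count: int
    limited_reach: bool  # author opted to cap reach below the threshold

def archive_record(post: Post) -> Optional[str]:
    """Return a JSON archive entry if the post crosses the threshold."""
    if post.limited_reach or post.view_count <= DUNBAR_THRESHOLD:
        return None  # private-scale post: never archived
    return json.dumps({
        "post_id": post.post_id,
        "author_id": post.author_id,
        "text": post.text,
        "views": post.view_count,
    })

# A reach-limited post is never archived; a widely viewed one
# produces a public, machine-readable record.
private = Post("p1", "u1", "hello", 10_000, limited_reach=True)
public = Post("p2", "u2", "widely shared claim", 10_000, limited_reach=False)
print(archive_record(private))  # None
print(archive_record(public))   # JSON string
```

The key design point the sketch captures is that archiving is triggered by the author’s choice plus actual reach, not by the content of the post, so the platform never has to judge what the post says.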
This would provide transparency and allow researchers to prove or put to rest claims of algorithmic bias (a key Republican concern). It would also have the incidental benefit of chipping away at these companies’ monopoly status, since the archives would provide troves of AI training data that potential competitors need to create comparable products.
Third, since this will no doubt lead to an explosion of claims, the platforms should be subject to a tax to offset the costs to the judicial system. This would be a straightforward tax on earnings, so those who benefit most would pay the most. These companies should embrace such a tax, since the alternative would be to vastly expand their content-moderation teams, which have already shown they are unequal to the job.
This proposal is relatively modest compared to many others under consideration and attempts to balance the interests of individual people, the platforms, and society. It may not be enough, but it would be an important first step.