The various regulators of the digital economy need strong information-sharing powers, embedded within a clear division of labour, to effectively hold technology companies accountable, UK information commissioner Elizabeth Denham has told MPs and peers.
Addressing the joint Online Safety Bill committee, which was launched in July 2021 to scrutinise the government’s forthcoming online harms legislation, Denham said that when deciding on the duties of each digital regulator, the government should take into consideration how their obligations overlap and interact, and design “information-sharing gateways” accordingly.
“It might sound like an in-the-weeds legal problem, but we need to be able to share information, because from a competition aspect, a content regulation aspect or a data protection aspect we are talking to the same companies, and I think it is important for us to be able to share that information,” Denham told the committee on 23 September 2021, adding that this would ensure that technology companies, “[some] the size of nation states, are not forum shopping, or running one regulator against another and claiming in the privacy interest that they are going to change third-party advertising processes.”
She added that while it is important that digital regulators “act together in concert… we need duties to respect the other regulatory objectives as well as information sharing between the regulators”.
Under the Online Safety Bill, which the government claims will safeguard freedom of expression, increase the accountability of tech giants and protect users from harm online, tech companies will have a statutory ‘duty of care’ to proactively identify, remove and limit the spread of both illegal and legal but harmful content. Companies that fail in this duty could be fined up to 10% of their annual global turnover by online harms regulator Ofcom.
The Digital Regulation Cooperation Forum (DRCF) was formed in July 2020 to strengthen working relationships and cooperation between Ofcom, the Information Commissioner’s Office (ICO), the Financial Conduct Authority (FCA) and the Competition and Markets Authority (CMA). Denham noted, however, that giving these regulators “equivalent powers”, such as the ability to perform compulsory audits, would prevent the practice of “forum shopping” by tech companies.
“Parliament needs to look at the coherence of regulatory regimes… Equivalence in the kind of powers that we need to be able to tackle these large companies is important. I have mentioned audit powers, and again I think that is important for Ofcom,” she said, adding that such equivalence is especially important when regulators are dealing with the same companies across different regulatory regimes.
On the information-gathering powers contained in the Online Safety Bill specifically, which allow Ofcom to compel companies to provide information so their compliance can be assessed, Denham said she would like them “to be bolstered by [compulsory] audit powers” so Ofcom as a regulator can properly “look under the bonnet”.
In response to questions about whether the absence of a pathway for individual complaints about content is an omission in the Online Safety Bill, Denham warned against giving any one body too much to do and advocated a clearer division of labour between regulators, citing the ICO’s own wide-ranging remit as an example.
“The ICO is both an ombudsman, in that we take individual complaints, and a regulator, when we go in and look at whether companies are complying with the Act. We are also an enforcer. We are a little bit of everything. We use the intelligence that we gather through complaints to drive our more systematic investigations and our big actions,” she said. “I think it puts a lot on the shoulders of one organisation to take individual complaints as well as being in charge of oversight and regulating the space of content.
“If individual complaints could come to a different organisation, that might be a way to go, and then Ofcom could learn from the experience of those individuals, but imagine the millions of complaints for take-down requests that might go to an organisation such as Ofcom.”
Denham also said that while regulatory collaboration and cooperation efforts are already underway, it is important to draw “bright lines” between the regulators’ different determinations, so that both the public and private companies are clear about who is making which decisions.
“Obviously, personal data is used in the delivery of content, and personal data is used if you have algorithms that determine the delivery of content. The content regulator and the data protection regulator will be looking at that very carefully,” she said.
“In the work that we have done with delivery of content to children through our age appropriate design code and the work we have done on electoral interference, we looked at analytics and algorithms to deliver content to people that sent them down into profiles and filters, and took them away from the public sphere. There, I think you have an intersection. We do not regulate content, but we regulate the use of data in systems that deliver content.”