EC proposes new directive to improve gig economy work conditions

Millions of people working for gig economy platforms in Europe could be reclassified as workers rather than self-employed, entitling them to a much wider range of rights and workplace protections, under a proposal put forward by the European Commission (EC).

The EC estimates that about 5.5 million people – working for the likes of Uber, Deliveroo, Amazon Mechanical Turk and others – could see their employment status change if the proposal is adopted by member states.

The proposed directive, originally presented by the EC in December 2021 with the aim of improving the working conditions of those working in the gig economy through digital labour platforms, would establish a set of five “control criteria” to determine whether or not a platform is an “employer”.

The criteria are: whether the platform has a role in determining or limiting remuneration; whether it requires workers to follow specific appearance rules, such as wearing a uniform; whether it supervises the performance of work, including by electronic means; whether it restricts people’s freedom to organise their own work; and whether it restricts people’s ability to build their own client base or to work for a third party.

If a platform meets at least two of these criteria, it will legally be presumed to be an employer, and its workers will be automatically reclassified.
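To make the mechanics of this test concrete, here is a minimal, purely illustrative sketch in Python of how a two-of-five threshold works. The criterion names, the dictionary format and the helper function are hypothetical shorthand for the areas of control listed above, not anything defined in the directive itself.

```python
# Illustrative sketch only: a simplified model of the "at least two of five
# control criteria" test summarised above. The identifiers below are
# hypothetical; only the threshold and the five areas of control come from
# the article's description of the draft directive.

CONTROL_CRITERIA = (
    "determines_or_limits_remuneration",
    "imposes_appearance_rules",                  # e.g. requiring a uniform
    "supervises_performance_of_work",            # including by electronic means
    "restricts_freedom_to_organise_work",
    "restricts_client_base_or_third_party_work",
)

def presumed_employer(platform_controls: dict, threshold: int = 2) -> bool:
    """Return True if the platform meets at least `threshold` criteria,
    triggering the (rebuttable) presumption of an employment relationship."""
    met = sum(1 for c in CONTROL_CRITERIA if platform_controls.get(c, False))
    return met >= threshold

# Example: a platform that sets pay and electronically supervises work meets
# two criteria, so the presumption of employment would apply.
example = {
    "determines_or_limits_remuneration": True,
    "supervises_performance_of_work": True,
}
print(presumed_employer(example))  # True
```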

“For those being reclassified as workers, this means the right to a minimum wage (where it exists), collective bargaining, working time and health protection, the right to paid leave or improved access to protection against work accidents, unemployment and sickness benefits, as well as contributory old-age pensions,” said the EC in a press release.

“Platforms will have the right to contest or ‘rebut’ this classification, with the burden of proving that there is no employment relationship resting on them. The clear criteria the Commission proposes will bring the platforms increased legal certainty, reduced litigation costs and it will facilitate business planning.”

Transparency on algorithms

The directive also aims to increase transparency around the platforms’ use of algorithms by giving both workers and self-employed individuals the right to challenge automated decision-making.

Platforms will also need to proactively provide information to workers and their unions about which aspects of their work are monitored, as well as the main parameters these systems use to make decisions.

“These rights will build on and extend existing safeguards in respect of processing of personal data by automated decision-making systems laid down in the General Data Protection Regulation [GDPR] as well as proposed obligations for providers and users of artificial intelligence (AI) systems in terms of transparency and human oversight of certain AI systems in the proposal for an AI Act,” said the explanatory notes for the directive.

It added that although workers already have individual data rights under the GDPR, the proposal would introduce collective rights regarding information and consultation around algorithmic management, giving people greater protection of their data in an employment context.

The EC has said the proposal will also increase transparency around platform work by clarifying platforms’ existing obligations to declare work to national authorities. As part of this particular measure, platforms will be asked to make key information about their activities and the people who work through them available to national authorities.

In response to the proposal, Ludovic Voet, confederal secretary at the European Trade Union Confederation (ETUC), said the directive provided more certainty for workers, who would no longer need to take multinational companies to court over employment contracts.

“The trade union movement can be proud of having made strong demands over the past two years for a presumption of employment relationship and the reversal of the burden of the proof,” he said in a statement. “After having been supported by the European Parliament, these are the options that were deemed to be the most effective by the impact assessment of the directive.

“However, it seems some platforms have been successful in their lobbying, as the directive does still set burdensome criteria to activate the presumption of employment, which could defeat the point of it. In practice, criteria might legitimise subordination of self-employed workers and this would defeat the purpose of the directive. The upcoming negotiations should resolve this problem.”

The European Transport Workers’ Federation (ETF) stressed the need for collective action to ensure the proposal delivers on its mission. “Unions must fight to ensure that platform workers’ rights to social dialogue and collective bargaining are enshrined in European law,” it said.

“What we do not want are global agreements, with some vague declarations. We want clear engagement from Uber, Deliveroo and their cohorts, recognising unions, accepting social dialogue and collective bargaining.”

The proposal must now be discussed by the European Parliament and Council. If it is adopted, member states will have a further two years to transpose the directive into national law.

A UK perspective

In December 2021, UK-based campaign group Worker Info Exchange (WIE) – which was set up to help workers access and gain insight from data collected from them at work – published a report that found there are “woefully inadequate levels of transparency” about the extent of the algorithmic surveillance and automated decision-making that workers are subject to throughout the gig economy.

“Workers are denied access to their personal data outright, are frustrated in their request or are simply given an incomplete return,” it said, adding that existing employment and data protection laws are weakly enforced and do not offer sufficient protection.

“Article 22 protections from unfair automated decision-making [in the GDPR] provide escape options for employers who can claim superficial human review to rubber-stamp unfair machine-made decisions,” said the WIE report.

“The proliferation of profiling, generated by machine learning, can make it exceedingly difficult for workers to ever uncover, understand or test the fairness of automated decision-making relating to workplace fundamentals such as work allocation, performance management and disciplinary action.”

Responding to the proposed directive, WIE director James Farrar said the presumption of employment in particular was a strong aspect. “These [platform] companies have long complained that ‘oh, we’re operating in grey areas, very difficult to understand’, which is just nonsense,” he said. “That confusion has been eliminated for them, I would say.”

But although Farrar described the proposal as a “hugely positive” step forward, he suggested it could go too far in legitimising the practice of a human rubber-stamping automated decisions. “Rather than human reviewers of machine-made decisions, we need a proper human resources-led process where workers have access to due process and a proper appeal,” he said. “The reality is that most platforms have no such function.”

Farrar also criticised the lack of attention given to data portability, which is not mentioned once in the directive.

“They went to the trouble of saying there’ll be room for communication with the platform for workers’ representatives or unions, but haven’t really tackled the whole issue of the right to portability, the right of workers to take the data off the platform, go away and collectivise it in a data trust,” he said. “That right should have been baked into this and it hasn’t been.”

In March 2021, following legal action brought by the App Drivers and Couriers Union (ADCU) on behalf of six Uber drivers, Amsterdam’s District Court ruled that both Uber and Ola must disclose – to different extents – more of the data used to make decisions about drivers’ work and employment.

The court also rejected Uber’s and Ola’s claims that drivers collectively taking action to access their data amounts to an abuse of their individual data access rights, laying the ground for drivers to form their own union-controlled data trust.

“Uber and Ola said this is an abuse of rights, and that the portability and access rights are for the intention of inspecting the data to check its accuracy, not for you to run off and build a data trust,” said Farrar, who is also general secretary of the ADCU.

“The court didn’t agree with Uber and Ola on that – they agreed with us. But this was the opportunity, I think, for the EU to address this issue of data portability. It would also just make it much easier if workers do want to switch platforms, then they can shift their data in a meaningful way.”

On transparency, Farrar said that rather than being applied only to periodic, significant decisions, as suggested by the draft text, it would need to be more constant. “Equally or maybe more important is transparency to the continuous and pervasive decisions determined by a worker’s individual profile that can degrade their opportunity to earn over time due to reduced dispatch activity, for example,” he said.

“We know workers are profiled according to previous performance behaviour and these profiles are used to determine current automated decisions to allocate work. Those profiles change all the time and most workers have no idea what they contain.”

Pointing to a consultation held by the UK’s Department for Digital, Culture, Media and Sport (DCMS) in September 2021, which contained proposals from the government’s Taskforce on Innovation, Growth and Regulatory Reform (TIGRR) to axe Article 22 protections, Farrar said he was also concerned about the divergence between the UK and the EU, and what this meant for platform workers.

“Here’s the EU strengthening protections against algorithmic management, and Britain is going in the opposite direction, chopping away at the already limited protections we have under GDPR,” he said, adding that from an individual worker’s perspective, “they will face additional risk of exploitation and loss of protection”.

Other reforms suggested by the DCMS in the consultation include removing the requirements on organisations to conduct data protection impact assessments, and the introduction of fees for anyone who would like to make a subject access request for the data held about them.

The UK government is due to issue a full response to that consultation in spring 2022.

Source is ComputerWeekly.com
