
Algorithms at the service of ACT – The need for transparency

22 August 2023 (updated 17 September 2023)

In recent weeks, the media have reported on an initiative by ACT (the Portuguese Authority for Working Conditions) to combat job insecurity. The measure had been promised since the beginning of 2023 in a government announcement.

The Minister of Labour, Solidarity and Social Security, Ana Mendes Godinho, announced that the State was preparing an action to have companies regularize the roughly 300,000 fixed-term contracts that had exceeded the legal time limit. During the announcement it was explained that (i) a cross-check between databases had been carried out to determine the number of people whose fixed-term contracts had already run past the legal limit; and (ii) once the Decent Work Agenda was approved, an action drawing on this cross-checking was being prepared so that companies would be invited to regularize these situations.

The initiative was carried out in July, when ACT notified employers to convert employment contracts which, in the authority's view, had to be converted because they had exceeded the respective legal time limits. The decision to send these notifications was based on an analysis resulting from the promised cross-checking of data, which would lead one to believe that the information was reliable and consistent. It was not. On the contrary, the episode exposed the risks of administrative authorities using algorithms for punitive purposes. Let us see why.

The notifications sent by ACT included requests to convert fixed-term employment contracts of workers whom the Social Security services had already registered as having contracts of indefinite duration, of others whose contracts had been terminated years earlier, and even of workers on fixed-term contracts precisely because they had reached retirement. Faced with this reality and its public exposure, ACT began informing some employers that they should disregard the initial notification.

This leads us to ask whether the cross-checking of data and the use of algorithms by inspection authorities should not be subject to elementary principles of transparency. The lack of information on the criteria used places companies at considerable risk when it comes to defending themselves. While some employers can mount a full defense against the accusation or the facts charged with relative ease, for others the difficulty can be high. All the more so because the special procedure for administrative offenses provided for in Article 28 et seq., applicable where the facts of the infraction can be verified exclusively through information collected in a database, simplifies the proceedings and thereby curtails companies' right to an adversarial hearing.

Most current legal scholarship and legislative work approaches the problem of algorithms from the perspective of their use by companies to monitor workers' activity. In the name of transparency, the Decent Work Agenda itself extended employers' duty to inform so that it covers the parameters, criteria, rules and instructions on which algorithms or other artificial intelligence systems are based, whenever those systems affect decision-making on access to and maintenance of employment or on working conditions, including profiling and the monitoring of professional activity.

Antonio Aloisi and Valerio de Stefano influenced legislative processes around the world with their Your Boss Is an Algorithm[1], offering guidance on the levels of protection workers need when employers use algorithms. Perhaps we have reached a stage where we also have to ask whether the inspector is an algorithm. Only then can employers be protected from the abusive use of data cross-checking. First-hand experience has shown that these precautions must be taken in the interests of transparency.

The DCM | Littler Team

________________________________________________________________________________________________________________

[1] Bloomsbury Publishing, 2022. Italian version: Il tuo capo è un algoritmo – Contro il lavoro desumano, Tempi Nuovi, 2020.