Kenya’s Employment and Labour Relations Court on Monday refused the Facebook parent company’s request to remove it from a suit about working conditions for content moderators.

The case, filed by Daniel Motaung, a former Facebook content moderator employed by Sama, alleges that Sama employees suffered psychological injury from repeated exposure to graphic and disturbing content. Motaung’s case claims the companies — Sama, Meta Platforms Inc. and Meta Platforms Ireland Ltd — acted negligently by failing to provide moderators with adequate psychosocial support after exposing them to such content.

The petitioners are also seeking compensation for unpaid overtime and for psychological harm caused by the nature of the job. According to reporting from Time, Sama employees were not always aware of what the job entailed until they started work; Motaung says this was true in his case.

Sama, a training-data company that annotates data for artificial intelligence algorithms, was until January Meta’s outsourcing partner for content moderation. Time recently reported that Sama employees also annotated data that OpenAI, the company behind the viral AI chatbot ChatGPT, used to train its algorithms.

Meta argues that it is not liable because the 12 petitioners, led by Motaung, were contracted by a third party to which content moderation had been outsourced.

According to local media reports, the presiding judge, Justice Jacob Gakeri, ruled against Facebook’s argument that Kenyan courts lack jurisdiction to hear the case because the foreign corporations are neither domiciled nor trading in Kenya. “My finding is that 2nd and 3rd respondent shall not be struck,” the judge said.