Later On

A blog written for those whose interests more or less match mine.

Weeks after PTSD settlement, Facebook moderators ordered to spend more time viewing online child abuse

Sam Biddle reports in The Intercept:

With the ink still drying on their landmark $52 million settlement with Facebook over trauma they suffered working for the company, many outsourced content moderators are now being told that they must view some of the most horrific and disturbing content on the internet for an extra 48 minutes per day, The Intercept has learned.

Following an unprecedented 2018 lawsuit by ex-Facebook content moderator Selena Scola, who said her daily exposure to depictions of rape, murder, and other gruesome acts caused her to develop post-traumatic stress disorder, Facebook agreed in early May to a $52 million settlement, paid out with $1,000 individual minimums to current and former contractors employed by outsourcing firms like Accenture. Following news of the settlement, Facebook spokesperson Drew Pusateri issued a statement reading, “We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.”

Less than a month after this breakthrough, however, Accenture management informed moderation teams that it had renegotiated its contract with Facebook, affecting at least hundreds of North American content workers who would now have to increase their exposure to exactly the sort of extreme content at the heart of the settlement, according to internal company communications reviewed by The Intercept and interviews with multiple affected workers.

The new hours were announced at the tail end of May and beginning of June via emails sent by Accenture management to the firm’s content moderation teams, including those responsible for reviewing Child Exploitation Imagery, or CEI, generally graphic depictions of sexually abused children, and Inappropriate Interactions with Children, or IIC, typically conversations in which adults message minors in an attempt to “groom” them for later sexual abuse or exchange sexually explicit images. The Intercept reviewed multiple versions of this email, apparently based off a template created by Accenture. It refers to the new contract between the two companies as the “Golden SoW,” short for “Statement of Work,” and its wording strongly suggests that stipulations in the renewed contract led to 48-minute increases in the so-called “Safety flows” that handle Facebook posts containing depictions of child abuse.

“For the past year or so, our Safety flows (CEI,IIC) as well as GT have been asked to be productive for 5.5 hours of their day,” reads one email reviewed by The Intercept, referring to “Ground Truth,” a team of outsourced humans tasked with helping train Facebook’s moderation algorithms. “Over the last few weeks the golden sow, Accenture’s contractual agreement with Facebook, was signed. In the contract, it discussed production time and the standard that all agents will be held to.” Accenture moderators, the email continues, “will need to spend 6.3 hours of their day actively in production” — meaning an extra 48 minutes per day viewing the arguably most disturbing possible content found on the internet.

The email then notes that Accenture is “aligning to our global partners as well as our partners in MVW,” a likely reference to Mountain View, California, where, the email suggests, moderators were already viewing such content for 6.3 hours per day. It is understood, the email said, that there could be “one offs every now and then when you are unable to meet the daily expectation of 6.3” hours of exposure, but warned against letting it become a pattern.

Pusateri, the Facebook spokesperson, told The Intercept, “We haven’t increased guidance for production hours with any of our partners,” but did not respond to questions about Accenture’s announcement itself. Accenture spokesperson Sean Conway said only that they had not been instructed to enact any change by Facebook, but would not elaborate or provide an explanation for the internal announcement.

Not only does the increase in child pornography exposure seemingly run afoul of Facebook’s public assurances that it will be “providing [moderators] additional support through this settlement and in the future,” it contradicts research into moderator trauma commissioned by the company itself. A 2015 report from Technology Coalition, an anti-online child exploitation consortium co-founded by Facebook and cited in Scola’s lawsuit, found that “limiting the amount of time employees are exposed to [child sexual abuse material] is key” if employee trauma is to be avoided. “Strong consideration should be given to making select elements of the program (such as counseling) mandatory for exposed employees,” the paper also noted. “This removes any stigma for employees who want to seek help and can increase employee awareness of the subtle, cumulative effects that regular exposure may produce.” The Accenture announcement, however, appears to fall well short of mandatory counseling: “Agents are free to seek out wellness coaches when needed,” the email states. A request for comment sent to Technology Coalition was not returned.

Accenture’s “wellness” program is a contentious issue for Facebook moderators, many of whom say such quasi-therapy is a shoddy stand-in for genuine psychological counseling, despite the best intentions of the “coaches” themselves. Last August, . . .

Continue reading. There’s more.

FWIW, I make a small monthly contribution to The Intercept.

Written by LeisureGuy

18 June 2020 at 11:08 am
