Accenture says signing the document is voluntary. But two current employees told The Verge that they were threatened with being fired if they refused to sign. Accenture said employees are not being asked to disclose disabilities or medical conditions, and it framed the document as a general disclosure that it has been providing new employees for years.
Google also would not answer questions about the prevalence of PTSD among its workforce of moderators. Instead, it issued this statement: Moderators do vital and necessary work to keep digital platforms safer for everyone. Employees said they struggled to afford rent and were dealing with severe mental health problems.
The moment they quit Accenture or get fired, they lose access to all mental health services. One former moderator for Google said she was still experiencing symptoms of PTSD two years after leaving.
From my own interviews with moderators over the past year, it appears to be a significant number. And many other employees develop long-lasting mental health symptoms that stop short of full-blown PTSD, including depression, anxiety, and insomnia. Facebook alone currently faces lawsuits seeking class action status in California and in Ireland. Under the Occupational Safety and Health Act of 1970 (OSHA), employers are required to provide a workplace that is free of hazards that can cause serious harm or death.
The act was designed to acknowledge that most employees cannot avoid unsafe conditions that are created by their employers, said Hugh Baran, a staff attorney with the nonprofit National Employment Law Project. Baran said forcing workers to sign PTSD acknowledgments could make them less likely to sue in the event that they become disabled.
But companies like Accenture would still be liable for harm caused on the job, he said. Meanwhile, employees I spoke with expressed shock that only recently — after they had been doing the job for more than a year — did Accenture acknowledge that the work could scar them deeply and perhaps permanently.
Several workers I spoke with are hoping to become citizens, a feat that has only grown more difficult under the Trump administration. They worry about speaking out — to a manager, to a journalist — for fear it will complicate their immigration efforts. For this reason, I agreed to use pseudonyms for most of the workers in this story.
More than that, though, Peter and other moderators in Austin told me they wanted to live like the full-time Google employees who sometimes visit his office. A higher wage, better health benefits, and more caring managers would alleviate the burdens of the job, they told me.
For most of this year, I thought the same thing Peter did. Bring the moderators in house, pay them as you would pay a police officer or firefighter, and perhaps you could reduce the mental health toll of constant exposure to graphic violence. Then I met a woman who had worked as a content moderator for Google itself. She earned a good salary, nearing the six-figure mark.
There were excellent health benefits and other perks. But none of these privileges would ultimately prevent the disturbing content she saw each day from harming her. She had trouble interacting with children without crying. A psychiatrist diagnosed her with post-traumatic stress disorder.
Daisy Soderberg-Rivkin was working as a paralegal when she spotted a listing online for an open position at Google. The listing said associates would process legal requests to remove links from Google search due to copyright violations, defamation, and other inappropriate content. It stated that associates would also have to review some links containing child abuse imagery. For the most part, videos reported for terrorist content or child exploitation are reviewed by contractors like the ones in Austin.
But Google also hires full-time employees to process legal requests from government entities — and, when required, remove images, videos, and links from web search. Daisy was surprised when, a few months after she applied, a recruiter called her back. Over eight rounds of interviews, Googlers sold her on the positive impact that her work would have. Really, how bad could it be?
Neither, it seems, did Google. Daisy was assigned to review legal requests for content removals that originated in France, as she is fluent in French. To her surprise, the queue began to overflow with violence.
On November 13th, 2015, terrorists who had pledged their loyalty to ISIS killed 130 people and injured hundreds more in Paris and its suburb of Saint-Denis, with the majority dying in a mass shooting during a concert at the Bataclan.
It slows everything down. In July 2016, terrorists connected to ISIS drove a cargo truck into a crowd of people celebrating Bastille Day in the French city of Nice, killing 86 people and wounding hundreds more. Links to graphic photos and videos began to pile up. Managers pressured Daisy to process an ever-higher number of requests, she says.
We need to kill this backlog, they said. In February, I wrote about the lives of Facebook moderators in the United States, focused on a site in Phoenix where workers complained of low pay, dire working conditions, and long-lasting mental health problems from policing the social network.
In June, I wrote a follow-up report about a Facebook site in Tampa, Florida, where a moderator had died after suffering a massive heart attack on the job. By then, I had received messages from employees of other big social platforms explaining that these issues affected their companies as well.
Beginning this summer, I sought out people who had worked as moderators for Google or YouTube to compare their experiences with those I had previously written about. With its large number of internet services, some of which have attracted user bases with more than a billion people, Google requires an army of moderators. But disturbing content can be found nearly everywhere Google allows users to upload it. In October, the company reported that, over the past year, it had removed hundreds of pieces of content per day for containing violent extremism from Blogger, Google Photos, and Google Drive alone.
Even on YouTube, much of the content reviewed by moderators is benign. When no videos are reported in their queues, moderators often sit idle. One Finnish-language moderator told me she had gone two months at her job with nothing at all to do during the day. At most, she might be asked to review a few videos and comments over an eight-hour span. She spent most of her workday browsing the internet, she told me, before quitting last month out of boredom.
Several of them told me they mostly enjoy their work, either because they find the task of removing violent and disturbing videos from Google search and YouTube rewarding or because the assigned tasks are simple and allow them ample time during the day to watch videos or relax. We have fun! Others, though, spoke of muscle cramps, stress eating, and — amid the rising rents in Austin — creeping poverty.
They talked of managers who denied them break time, fired them on flimsy pretexts, and changed their shifts without warning. The workers most deeply affected by the violence expressed a growing anxiety about the side effects of witnessing dozens or more murder scenes per day.
When he leaves his job in Austin, Peter tries to unwind. Over time, this has become more difficult. The action movies he once enjoyed no longer seem fictional to him.
Every gunshot, every death, he experiences as if it might be real. Some of his co-workers cope by using drugs — mostly weed. Since Google first hired Accenture to begin spinning up the VE queue in Texas, he has seen them all become more withdrawn. Now nobody is even wanting to talk to the others. He joined the project the year it began.
At the time, YouTube had come under significant pressure to clean up the platform. Journalists and academics who investigated the service had found a large volume of videos containing hate speech, harassment, misinformation about mass shootings and other tragedies, and content harmful to children.
Many of those videos had been found on YouTube Kids, an app the company had developed in an effort to steer children toward safer material. In response, YouTube CEO Susan Wojcicki announced that the company would expand its global workforce of moderators to 10,000, which it did. Contract content moderators are cheap, making just a little over minimum wage in the United States.
She told me that relying on firms like Accenture helps Google adjust staffing levels more efficiently. If the company is developing a new tool to help catch bad videos, it might need more moderators initially to help train the system.
But afterward, those moderators are no longer needed. Unlike Facebook, Google declined to let me visit any of its sites. Employees sit in a dedicated space known as the production floor, where they work in shifts to process reports.
Daisy found the terrorist material disturbing, but she was even more unsettled by what Google calls child sexual abuse imagery (CSAI). The job listing had promised she would only be reviewing content related to child abuse for an hour or two a week. But in practice, it was a much bigger part of the job. Initially, the company set up a rotation. Daisy might work CSAI for three weeks, then have six weeks of her regular job. But chronic understaffing, combined with high turnover among moderators, meant that she had to review child exploitation cases most weeks, she says.
You talk in your sleep. Her nightmares were getting worse. And she was always, always tired. A roommate came up behind her once and gently poked her, and she instinctively spun around and hit him. One day, Daisy was walking around San Francisco with her friends when she spotted a group of preschool-age children. A caregiver had asked them to hold on to a rope so that they would not stray from the group.
I saw the rope, and I pictured some of the content I saw with children and ropes. And suddenly I stopped, and I was blinking a lot, and my friend had to make sure I was okay. I had to sit down for a second, and I just exploded crying. In the following weeks, Daisy retreated from her friends and roommates. Her job was to remove this content from the internet. To share it with others felt like a betrayal of her mission. Google kept a counselor on staff, but she was made available to the legal removals team at irregular intervals, and her schedule quickly filled up.
Daisy found the counselor warm and sympathetic, but it was hard to get time with her. Because everyone was feeling these effects. When she did successfully make an appointment, the counselor suggested that Daisy begin seeing a private therapist.
Meanwhile, Daisy grew more irritable. She asked the people in her life not to touch her. Every time she looked at children, she imagined someone hurting them. As her mental health declined, Daisy struggled to keep up with the demands that were placed on her. More and more, she cried at work — sometimes in the bathroom, sometimes in front of the building. Other times, she fell asleep at her desk.
Toward the end of that first year, her manager asked to have a conversation. They met inside a conference room, and the manager expressed his concerns: We need you to step up your productivity game. Daisy recalls pushing back: We have emotions, and those emotions are deeply scarred by looking at children being raped all the time, and people getting their heads chopped off. Sometimes, when she thought about her job, she would imagine walking down a dark alley, surrounded by the worst of everything she saw.
It was as if all of the violence and abuse had taken a physical form and assaulted her. Just keep on doing it. A few days later, Daisy told her manager that she intended to take paid medical leave to address the psychological trauma of the past year — one of several on her team who had taken leave as a result of emotional trauma suffered on the job. She thought she might be gone a few weeks, maybe four.

The killings were coming in faster than the Austin office could handle. Even with hundreds of moderators working around the clock in shifts, Accenture struggled to keep up with the incoming videos of brutality.
The violent extremism queue is dominated by videos of Middle Eastern origin, and the company has recruited dozens of Arabic speakers to review them. Many of the workers are recent immigrants who had previously been working as security guards and delivery drivers and heard about the job from a friend.
We needed to start working and making money. Workers I spoke to were initially grateful for the chance to work for a large technology company like Google. While the contractors technically work for Accenture, Google blurs the boundaries in several ways. Among other things, the contractors are given google.com email addresses. I thought about a career. But until orientation, the actual nature of the work in the violent extremism queue remained opaque. Wojcicki promised to reduce their burden to four hours last year, but it never happened.
Accenture denies setting any productivity quotas for workers. Wellness time is set aside for workers to decompress from the rigors of the job — by taking a walk outside, talking to an on-site counselor, or by playing games with co-workers. Close your screen and just go. Google offers its contractors dramatically more downtime than Facebook, which asks its moderators to make do with two short breaks, a lunch break, and just nine minutes per day of wellness time.
Facebook says that with training and coaching, its moderators are viewing content roughly six hours a day. But if two hours of wellness time per day is the ideal, in Austin, it is not the norm.
Four workers told me they were routinely denied break time when the VE queue got particularly busy. Tracking software installed on their computers records each minute of video they watch, with a target of five hours.
The false promise of extended break time in Austin is consistent with the overall picture workers have painted for me at content moderation sites around the world.