YouTube is leaning on its video filtering software as it adjusts its content moderation practices in the wake of the coronavirus. Google, owner of YouTube, the world’s largest video site, announced Monday that it will temporarily increase its reliance on automated content moderation, explaining in a blog post that its goal is to ‘continue to act quickly to remove inappropriate content and protect our ecosystem, while implementing protections in the workplace.’
A YouTube spokeswoman declined to comment on whether the company would scale back content moderation or whether employees could review the videos remotely.
Technology platforms like YouTube and Facebook Inc. depend on tens of thousands of contractors to examine user-generated content flagged by their users or their software as potentially inappropriate. In late 2017, Google said its moderation workforce would exceed 10,000 people. Both companies have faced criticism, including from some of their own employees, for creating a two-tier workforce. Many content moderators are employed through outsourcing companies and do not receive the benefits and advantages of full-time employees.
Last week, Google began allowing more staff to work from home, and then began requiring employees in North America to do so. On March 10, the company said it would compensate contract staff, such as cafeteria workers, for the office closings and began requiring contracted companies to grant paid sick leave.
Facebook, which has required its direct staff in the San Francisco Bay Area and Seattle to work from home, has said that some of its content review work cannot be done remotely for ‘security, privacy and legal reasons.’ The company said over the weekend that it was exploring remote work options for some contract workers.
“For the roles that need to be done in the office, we are reducing the number of people present at specific times, as much as we can, and taking additional steps to limit contact for those in the office,” a Facebook spokesperson said over the weekend.