On Tuesday, the White House convened a listening session with workers, researchers, labor and civil rights leaders, and policymakers on the use of automated technologies by employers to surveil, monitor, evaluate, and manage their workers.
During the session, workers representing diverse sectors of the economy—including call centers, trucking, warehousing, home health care, and app-based ride sharing—shared stories about how the use of these technologies has affected them. Workers raised concerns about health, safety, privacy, fair pay, labor organizing, collective bargaining, and reasonable accommodations for disabilities.
The listening session was attended by officials from the White House Domestic Policy Council (DPC), Office of Science and Technology Policy (OSTP), and Office of Management and Budget (OMB), as well as the Consumer Financial Protection Bureau (CFPB), the Equal Employment Opportunity Commission (EEOC), and the National Labor Relations Board (NLRB).
Tuesday’s listening session followed a Request for Information (RFI) released by the Biden-Harris Administration earlier this month to advance understanding of the design, deployment, prevalence, and impacts of automated technologies that monitor and track workers. Responses to this RFI will be used to inform new policy responses; share relevant research, data, and findings with the public; and amplify best practices among employers, worker organizations, technology vendors, developers, and others in civil society.
The stories workers shared included a warehousing professional who explained that his employer uses automated technology to track how quickly he moves packages—setting quotas he must meet or face discipline, and making him feel like a machine. He and his coworkers face strong pressure to meet those productivity targets at the expense of their health, such as by skipping meals or breaks, yet many are afraid to speak out for fear that their employers will retaliate using the data collected on them.
A home health care professional described how her employer’s electronic verification system makes it harder to give her clients the care they need. The system requires her to clock in and out through a phone system—and because she cannot easily clock in or out while providing care, she has worked for hours at a time that are not counted towards her pay. A driver for app-based transportation services explained how apps continuously track his location, speed, phone use, and more—but provide no transparency about how this data affects drivers’ pay or ride offers.
A package delivery driver described how automated surveillance technologies—including cameras, sensors, and handheld devices—have become widespread, and can trigger disciplinary action despite being unreliable. A call center professional shared that AI-based conversation management tools frequently give inaccurate prompts that she is required to use, preventing her from properly supporting customers. While other workers are often afraid to speak out for fear of retaliation, she felt comfortable expressing her concerns through her labor union, which advocates for data transparency and prevents employers from disciplining or firing workers without cause.