The Consumer Financial Protection Bureau (CFPB) issued guidance to protect workers from employers that hire third-party tracking services to digitally monitor their staff.
Such services produce reports, including background dossiers and surveillance-based, “black box” artificial intelligence or algorithmic scores, that companies use to assess performance and inform decisions about employees.
As companies increasingly deploy invasive tools to assess workers, they must comply with Fair Credit Reporting Act (FCRA) rules, the CFPB stated.
“Workers shouldn’t be subject to unchecked surveillance or have their careers determined by opaque third-party reports without basic protections,” CFPB Director Rohit Chopra said.
“The kind of scoring and profiling we’ve long seen in credit markets is now creeping into employment and other aspects of our lives. Our action today makes clear that longstanding consumer protections apply to these new domains just as they do to traditional credit reports.”
If an adverse action is taken based on a consumer report, the concerned individual must be informed of the details, including the name, address, and phone number of the agency that provided the information.
The FCRA grants consumers the right to request their credit scores, dispute incomplete or inaccurate information, have inaccurate data deleted, obtain a “security freeze” on credit reports, and seek damages from violators.
Why Employers Monitor
An August report by research group Cracked Labs, “Employees as Risk,” stated, “Employees are seen as major risks.” Companies monitor staff to prevent careless security lapses that may culminate in a cyberattack, according to the report.
Software is deployed to analyze large volumes of activity logs and communications data, but in recent years, its purposes have expanded well beyond cybersecurity, Cracked Labs stated.
“The boundaries between information security, the protection of corporate information, fraud and theft prevention and the enforcement of compliance with regulatory requirements and organizational policies are becoming blurred,” Cracked Labs stated.
The results of this intrusive monitoring may lead to arbitrary suspicions of workers, especially when the risk assessment is inaccurate, according to the report.
“When employees or certain groups of employees get inaccurately accused of ‘anomalous’ or otherwise suspicious behavior by an organization’s cybersecurity, compliance or human resource departments or when they get automatically blocked from certain activity, this may lead to Kafkaesque experiences.”
More broadly, the researchers stated, “The current study finds that workplace surveillance has overall damaging consequences for workers’ mental health.”
Meanwhile, the collection of vast troves of employee data by employers puts workers at risk in the event of a data breach, as sensitive information could end up in the hands of criminals. The health care sector was the industry most affected by such breaches, with 809 instances, followed by financial services, professional services, manufacturing, and education.
In its Oct. 24 statement, the CFPB noted that the Fair Credit Reporting Act’s protections, such as worker consent, transparency, and limits on what employers can do with employee data, “are essential in an era where worker data is increasingly commodified and used to make critical employment decisions.”
“By enforcing these rights, the CFPB aims to ensure that workers have control over their personal information and are protected from abuses,” the agency stated. “The CFPB will be working with other federal agencies and state regulators to ensure the responsible use of worker data.”