Toilet break targets and data privacy in the “precision economy”

An AI customer service coach that analyses tone, pitch and word rate to help you communicate more empathetically with frustrated customers. A wristband that tracks your hand movements, correcting you with vibratory nudges when it judges that you are performing a task in an inefficient way. Algorithms that track your facial expression and sleep patterns to detect when you’re running low on energy and deploy a smart coffee delivery drone. These are examples of existing, or recently patented, ways in which companies are seeking to integrate emerging IoT (Internet of Things) sensor technology and machine learning techniques into the workplace, to gain an extraordinary level of insight into, and control over, the behaviours and performance of employees.

While some level of surveillance in the workplace is nothing new, advances in machine learning processing of big data make almost every piece of data worth collecting. And the market for ‘snooptech’ is expected to grow by $3.3bn over the next four years. This vision of the future has been characterised in the RSA’s Four Futures of Work report as the ‘precision economy’: a future workplace of hyper-surveillance and algorithmic optimisation, in which organisations create value by measuring and incentivising virtually every aspect of our working lives. And there are several potential business positives, including the potential to remove subjective bias from management decisions; to set and reward measurable performance targets, fine-tuned by AI; to enhance wellbeing by detecting overwork or unsafe practices; to prevent fraud; and to provide a more holistic, real-time assessment of employee engagement and emerging risks.

However, news stories over the last few months have also highlighted concerns about the implications this shift may have for worker privacy, wellbeing and autonomy. These side effects include increased pressure and anxiety for workers who fear scrutiny against dehumanising productivity expectations; distrust among colleagues; and the discouragement of normal behaviours whose business benefits are difficult to track – such as taking a toilet break, chatting about Love Island or spending time in creative thought. These consequences are already being felt, particularly in emerging sectors such as the gig economy.

The extraordinary level of insight and control that these tools offer comes with a high level of responsibility for business leaders, HR departments and workers themselves. How should a manager respond to a notification flagging an employee as likely to leave the company? What are the appropriate uses of a worker health score based on heart rate, step count and sleep patterns? Companies need to decide how comfortable, and how ready, they are to use these technologies; where responsibility lies for decisions based on these new analyses; and how to articulate this process to workers in a transparent way.

Monitoring by employers could also fall foul of increasingly strict data protection requirements, such as the GDPR. Growing scrutiny has been directed at companies from the customer side, yet our employers likely already hold more confidential information about us than any of the tech giants. An employee is unlikely to have much choice over the finer details of their employment contract, and this imbalance of power makes demonstrating freely given ‘consent’ difficult. Monitoring sensitive data – computer usage, location or biometrics – will require an employer to show that its interest in the data outweighs the employee’s right to privacy, incurring additional compliance costs (for example, a Data Protection Impact Assessment).

Despite these challenges, studies have found that employees are increasingly comfortable with being monitored, especially if they understand the how and why, and that workers acknowledge workplace surveillance and want to be consulted over it. This research highlights two fundamental principles for companies: clarity and transparency. Clarity of purpose and intent: rooted in the culture and ethics of the organisation, with clear strategic definition and limitation of any new data collection before it is implemented. And transparency about what is being collected, why it is justified, and what rights the worker has in response. This should be supported by internal consultation to formulate appropriate data and privacy guidelines and policies.

New technologies are transforming the risk landscape for the workplace. It is essential that employers balance taking advantage of these technologies to boost productivity with protecting worker privacy, wellbeing, autonomy and individualism.

Source: CC News Feed
