May 2018 | Strategy
Amazon recently patented wristbands that track its warehouse workers’ exact location and hand movements in real time. Japanese firm Hitachi is marketing another wearable device, dubbed a “happiness meter,” that closely monitors employees to collect information on mood and engagement. Are these devices going to be a new norm in people analytics, or are they going too far?
Laszlo Bock, former head of Google’s human-resources department, believes they go too far. He says of the trend to collect new data on employees, “It’s going to play out in a bad way before it plays out in a good way.” Wharton professor Cade Massey, who co-directs Wharton’s People Analytics: HR Transformation through Data program, believes, conversely, that the “bad way” may be avoidable.
Massey says that mining these technologies for their full potential without crossing ethical boundaries is a critical challenge, but one that can be met. “Companies need to consider what lines they want to draw, and what lines they should draw, and then figure out how to get as much as possible within those lines.”
These concerns are explicitly addressed in People Analytics. “We look at both sides,” says Massey. “First there are all of the positive possibilities. Then there are the risks, potential problems, and hard questions you have to ask.” Ethical questions are considered throughout the program. Faculty and participants discuss the kinds of data that should be collected, what forms of permission are needed from employees, and how the data can be used to help those employees without them giving up too much privacy.
But privacy concerns, while getting much of the attention, aren’t the only potential downside. “There are two ways that organizations can get it wrong, and the program includes sessions on both,” says Massey. “The first is crossing ethical boundaries, either knowingly or unknowingly. And the second is using the data improperly. Just collecting it isn’t enough — you need to know what to do with it.”
Data are often seen as a perfect counterbalance to intuitive biases. But because people are biased, often in ways they are completely unaware of, their biases inform the ways in which they use data. Massey explains: “When you make predictions based on historical data it is easy to bake in historical biases. If you have never hired a particular demographic, for example, and you use historical data, it’s hard to show that that underrepresented demographic will do well on the job. If historical data are biased, the prediction you make when you use it will also be biased.”
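Massey’s point can be sketched with a toy model. The data and scoring rule below are hypothetical, deliberately naive illustrations, not any company’s actual method: if a prediction is based on historical hire rates and one group was never hired, the model scores that group at zero regardless of merit.

```python
from collections import defaultdict

# Hypothetical historical hiring records: (demographic_group, was_hired).
# Group "B" was never hired -- a past bias, not a reflection of merit.
history = [
    ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", False), ("B", False),
]

def naive_hire_score(records, group):
    """Predict 'likelihood of success' as the group's historical hire rate.
    This bakes past bias directly into future predictions."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for g, was_hired in records:
        total[g] += 1
        hired[g] += int(was_hired)
    return hired[group] / total[group] if total[group] else 0.0

print(naive_hire_score(history, "A"))  # 0.75
print(naive_hire_score(history, "B"))  # 0.0 -- the bias is reproduced, not corrected
```

The underrepresented group can never score well, because the data contain no successes for it to learn from; the prediction simply echoes the historical bias.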
One thing is certain: as new technologies continue to be developed for people analytics, ethical concerns will develop with them. Some companies may find out the hard way that there are limits to how much monitoring their employees will put up with. Massey says that there is plenty of research on intrinsic motivation and incentives, and we can use it to inform decisions on what kinds of data to collect and how to collect them. Ideally, employees should benefit from some of these technologies. “Organizations need to figure out a way to share gains,” says Massey, “not just extraction, but true joint gains.”