By IESE Business School | Sep 15, 2023
A study at Zara shows what companies must do to optimize human-algorithm interactions.
Companies are turning to algorithms for recommendations on everything from which job candidate may be the best fit to which products will tempt their customers. But to get the most out of these advanced analytical tools, companies must remember a key element: the human managers who will use them. The human-algorithm relationship is crucial to the tools’ effectiveness.
To understand this interaction, IESE Business School’s Anna Saez de Tejada Cuenca and UCLA Anderson’s Felipe Caro looked at managers’ use of algorithm-based decision support systems (DSS) for seven clearance markdown campaigns at Zara. The algorithm made recommendations, but it was up to the human managers to adopt or override them.
In a pilot test, Zara found that managers who followed the DSS recommendations increased revenue from clearance sales by almost 6%. That led Zara to roll out the use of DSS across its stores and franchises worldwide.
But then something changed: managers started ignoring the DSS recommendations, sometimes more than half the time. Moreover, they would lower prices when the system recommended keeping them the same, or they would apply more aggressive markdowns than recommended. The result: managers who deviated more often from the DSS advice earned less revenue.
What happened? Human biases kicked in. Specifically, managers were used to receiving weekly reports showing current inventory levels, and they typically made decisions aimed at ensuring that nearly all remaining stock was sold as soon as possible.
The DSS, however, made recommendations with the goal of maximizing overall revenue by the end of clearance sales, which could mean more conservative markdowns in the initial periods. With their attention focused on inventory, managers naturally ignored the algorithm.
Zara was not alone in experiencing this: past research involving a consumer electronics firm found its managers overrode DSS recommendations more than 80% of the time, also for similarly human reasons.