Vic Bageria, CEO and CVO of Xpandretail, discusses the nuances of trusting the technologies that are empowering the digital and machine age.
The digital age is knocking at customers’ doors amid rising fears and uncertainties, and the value of trust in a business cannot be overstated. Today, trust relates not only to a company’s brand, products, services and people, but also to the data and analytics that empower its technology.
As evolution takes its course and machines work in parallel with businesses, there is a clear need for proactive governance of analytics in order to build this trust. Trust is becoming a defining factor in an organisation’s success or failure. It reduces uncertainty and builds resilience; it also shapes reputation, inspires employees and drives customer satisfaction and loyalty, enabling global markets to function.
Data analytics has been a buzzword for years, yet recently C-suite executives have begun questioning the trustworthiness of data, analytics and intelligent automation. While this scepticism is not hindering daily operations, the decision-makers who do use different types of analytics to implement new ways of doing business rarely exploit them to their full capabilities.
Disruptors are leveraging data, sophisticated analytics, robotics and, increasingly, Artificial Intelligence (AI) to create new value propositions and business models. As we experience the shift from humans towards machines, the age of AI is beginning to offer new ways of protecting public trust. According to recent market research, cognitive systems can analyse millions of records and identify patterns to generate deeper insights into a company’s processes, controls and reporting. Algorithms, in turn, can be designed to reduce human biases in decision-making, and blockchain can offer greater data security and new distributed trust models.
The digital age is creating opportunities, but at the same time it is creating new concerns that undermine trust across industries and our community as a whole. For example, constant data breaches, data misuse and inaccuracies are eroding public trust. On top of that, technology-driven disruption may even fuel increased nationalism and protectionism as the market reacts to job losses and redundancies caused by automation. There is also concern that the benefits of digital transformation will not be evenly distributed, worsening disparities between the haves and the have-nots.
The widespread use of AI will make it imperative – and more difficult – to ensure trusted analytics. As growing numbers of organisations implement more complex analytics, machine learning models and automated decisions, regulators are exploring new forms of control over those who accumulate, analyse and use consumer and business data. Now is the time to ask how complex algorithms will be governed to ensure fair treatment and accurate conclusions.
And herein lies the challenge of the double-edged sword. Organisations must embrace new technology while ensuring high, stable levels of trust in an uncertain, fast-changing digital age.
Businesses want the benefits that digital and automation can deliver, but they don’t always trust the underlying analytics that power those machines.
Most organisations already have an idea of what trusted data and analytics should look like in both personal and professional contexts. The main concerns that stand out relate to the validity of the data and analytics, and to whether the data is used in ways they see fit, by people they trust and for purposes they believe are valuable.
Some trust issues are straightforward. If management has experienced unreliable data or poor insights, it is likely to lose trust in the system in use. But as technologies become more complex, the trust issues become more complex too.
There is clearly an opportunity for data and analytics to be proven accurate, effective and secure. But research suggests that the trust gap has as much to do with people’s expectations and perceptions as with the actual performance of the technology or the risks associated with it. Indeed, humans often prioritise their emotional response over their logical one, and more information does not necessarily help people build trust.
Finally, the ends and the means are also critical for trust. For example, is the machine achieving the right outcomes, ethically as well as financially, for all those affected by it? Is it overseen by people who exercise effective control and can manage changes, risks and uncertainties? As machines take on more day-to-day decision-making, who is judging whether these controls are proper and effective?