Picture Credit: UBS Companion Website BETA
With advancements in Natural Language Processing (NLP), face recognition, 3-D face modelling, emotion sensing and affective computing, Conversational Expert-Avatars (CEAs) are about to enter the market. Typically used in client support applications, their expertise is supported by data that is generated with deep learning algorithms or provided by the CEA's owner company. Emotional AI is applied to sustain an emotional bond during the conversation and to build up trust and empathy between the CEA and the client. A whole range of expert-avatars will eventually impact, and possibly disrupt, communicative and advisory activities. We can expect that within 3-5 years the quality of CEAs will match, or possibly exceed, that of humans communicating and advising at an expert level.
In a bold move that could redefine the delivery of advice and the future of work itself, UBS has created a digital clone of its Swiss chief economist, Mr Kalt. Presented at a press conference held in Zurich on July 3, 2018, the event marks the first time that a global bank has created a CEA by cloning a human staff member to answer client questions. This is a pilot project to test customer acceptance, and the bank states that the project, known as 'UBS Companion', is not about replacing its chief economists but about "trying to find the best possible combination of the human and digital touch".

A selected group of UBS clients will see the cloned Mr Kalt on a video screen. The clone is programmed to answer their questions as the real Mr Kalt would. This controlled deployment of IBM's Watson AI technology will not attempt to guess how the real Mr Kalt might respond to a question it has not been trained on. Rather, it will allow the chief investment office's 'house view' to be disseminated to a broader range of clients than might have access to Mr Kalt's human form. "In the future, we could see UBS advisers in thousands of different rooms, all with Mr Kalt available as a digital human," said Mark Fitzgerald, the director of government and enterprise at FaceMe, the New Zealand-based AI company that built the avatar for UBS with IBM. The avatar was created in a half-day photo shoot of Mr Kalt's head using more than 120 HD professional cameras on a special scanning rig. The digital rendering is such that a customer might think they are looking at a video of a real person on a conference call, but the experience is not designed to deceive them, Mr Fitzgerald said.
Characteristics of CEAs and where to apply them
The credibility of a CEA in a specific area of expertise should be reflected by the CEA's 'personality'. For example, a financial CEA should be modelled after a real human expert who is successfully engaged in financial consulting, as UBS did by digitally cloning Mr Kalt. A CEA has to radiate expertise in its specific field, which in turn supports communication at an emotional level and creates a strong bond of empathy. As many psychological studies and management reviews have shown, conversational empathy makes knowledge transfer far more efficient than one-directional forms of transfer. The recipients of the knowledge transfer are individuals or groups of people who engage the CEA in a discussion related to its expertise.
At UBS, the test equipment (video screen, camera, microphone, loudspeakers, local processor and communication equipment) is installed at one of its branch offices in Zurich. Selected clients communicate with the cloned Mr Kalt in the presence of a human advisor, both to enhance the overall quality of the conversation and to observe the client's reaction. A logical next step is a scenario in which the client uses their own equipment and internet access to engage the CEA directly, without a human advisor present. As this is probably how things will develop, the question looms as to what role human advisors will play in the future and how they will be trained for their newly assigned tasks.
Definition and adherence to a code of conduct
A client's first reaction to a CEA might be concern about its trustworthiness and credibility. To support trust in communicating with the CEA, a code of conduct could be assigned to it, similar to the one employees of a corporation must typically adhere to. The following excerpts from UBS's published Code of Conduct and Ethics are applicable:
“Values and ethics: We don’t just follow the laws, rules and regulations in everything we do. We do what is right. We don’t just ask ourselves whether what we’re doing is legal, but whether it fits with
our three UBS Behaviors: integrity, collaboration and challenge….
Client relationships: We look after our clients for the long term, winning their loyalty by earning their trust. We try to anticipate what our clients are going to need before they ask. We go out of our way to give them an exceptional service….
Client confidentiality: Our clients trust us to keep the information they’ve shared with us safe and secure and only use it in the ways we’ve agreed with them. We follow the highest standards of information security to keep our client information confidential and to protect legitimate client privacy rights….”
At UBS, each employee and board member declares in an annual affirmation process that they have read the code and will live up to its content. It appears sensible that a CEA must affirm its adherence to the company's code of conduct as well. This in turn implies that a CEA understands values and ethics in both a societal-regulatory and a business context, anticipates and understands a client's needs, and maintains a client's confidentiality and trust. Any violation can result in significant reputational damage to the company that provides the CEA service.
Contractual issues in setting up a CEA service
Implementing a CEA service requires a design and review process that addresses contractual issues such as the following:
- Relationship between the human expert being cloned and the CEA service provider. What happens if the cloned expert leaves the service provider?
- Relationship between the CEA and the client, including issues of liability, confidentiality, service availability, quality and a policy for changing the cloned expert.
- Ownership of personal data supplied by a client communicating with the CEA, including emotion-sensing profiles and dialogues recorded over time.
- Security of both the transfer and the storage of personal data, including the client's emotion-sensing profiles and dialogues.
- Data quality and integrity of the CEA's knowledge base, to avoid bias in the analytical interpretation of information being processed.
- Rights to erase all personal data if a client decides to terminate the relationship with the company providing the CEA service.
Covering these contractual issues is part of the engineering effort to build and maintain a CEA. The costs of building and operating a CEA are significant as several complex technical issues have to be addressed.
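The last of the contractual issues above, the right to erase all personal data, can be made concrete in code. The following is a minimal sketch, assuming an in-memory store of dialogues and emotion-sensing profiles; the class and method names are illustrative, not part of any actual CEA product:

```python
class ClientDataStore:
    """Hypothetical per-client store of CEA conversation data."""

    def __init__(self):
        # Personal data accumulated over time, keyed by client ID:
        # dialogue transcripts and emotion-sensing readings.
        self._dialogues: dict[str, list[str]] = {}
        self._emotion_profiles: dict[str, list[float]] = {}

    def record_turn(self, client_id: str, utterance: str, valence: float) -> None:
        """Store one conversational turn and its sensed emotional valence."""
        self._dialogues.setdefault(client_id, []).append(utterance)
        self._emotion_profiles.setdefault(client_id, []).append(valence)

    def erase_client(self, client_id: str) -> bool:
        """Delete all personal data when the client terminates the relationship.

        Returns True if any data for this client existed and was removed.
        """
        found = client_id in self._dialogues or client_id in self._emotion_profiles
        self._dialogues.pop(client_id, None)
        self._emotion_profiles.pop(client_id, None)
        return found
```

In a real deployment the same erasure obligation would also have to reach backups, logs and any data shared with the CEA service provider, which is precisely why it belongs in the contract rather than only in the code.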
Components of a CEA
A CEA is an integrated, interdisciplinary digital object combining several technologies. The following provides a high-level inventory of technical components that are required to create and maintain a CEA:
Conversational Engine to support communication between CEA and client:
- Synchronisation of facial and body expressions with speech
- Identification of the client, voice and face recognition

Analysis of a client's personality and expectations:
- Area of knowledge and interest, history of past sessions with the client
- Current relationship with the client, based on client and CEA service provider data

Knowledge base of the CEA:
- Corporate data of the CEA's owner company
- Public data within the realm of the CEA's expertise

Mathematical Engine (deep learning AI) supporting the CEA's knowledge response:
- Algorithms applied, transparency of algorithmic results
- Code of ethics applied to the execution of algorithms

Client emotion sensing:
- Algorithms applied (emotion sensing, voice, eye movement)
- Feedback of acquired data, if so requested by the client

Psychological Engine (emotional AI) supporting the CEA's emotional response:
- Algorithms applied (facial movements, body movement, voice)
- Code of ethics applied to output algorithms, transparency of algorithmic results
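The inventory above can be sketched as a set of cooperating components. The following is a minimal, illustrative sketch of how the knowledge base, client analysis, emotion sensing and conversational engine might fit together for a single question-and-answer turn; all names and the emotion thresholds are assumptions for illustration, not a description of UBS Companion or any real product:

```python
from dataclasses import dataclass, field


@dataclass
class ClientProfile:
    """Analysis of a client's personality and expectations."""
    client_id: str
    areas_of_interest: list[str] = field(default_factory=list)
    session_history: list[str] = field(default_factory=list)  # past topics


@dataclass
class KnowledgeBase:
    """Corporate and public data within the CEA's field of expertise."""
    corporate_docs: dict[str, str] = field(default_factory=dict)
    public_docs: dict[str, str] = field(default_factory=dict)

    def lookup(self, topic: str) -> str:
        # The owner company's data ('house view') takes precedence
        # over public data on the same topic.
        return self.corporate_docs.get(topic) or self.public_docs.get(
            topic, "No answer available.")


@dataclass
class EmotionReading:
    """Output of the client emotion-sensing component."""
    valence: float  # negative..positive, roughly -1.0 to 1.0
    arousal: float  # calm..excited, roughly 0.0 to 1.0


class ConversationalEngine:
    """Ties the components together for one question/answer turn."""

    def __init__(self, knowledge: KnowledgeBase):
        self.knowledge = knowledge

    def respond(self, profile: ClientProfile, question_topic: str,
                emotion: EmotionReading) -> str:
        answer = self.knowledge.lookup(question_topic)
        profile.session_history.append(question_topic)  # dialogue history
        # Emotional AI: soften the delivery if the client appears
        # both negative and agitated (illustrative thresholds).
        if emotion.valence < 0 and emotion.arousal > 0.5:
            return f"I understand this is a concern. {answer}"
        return answer
```

For example, a client asking about a topic covered by the house view, while sensed as stressed, would receive the corporate answer with a softened framing:

```python
kb = KnowledgeBase(corporate_docs={"rates": "Our house view: rates stay flat."})
engine = ConversationalEngine(kb)
profile = ClientProfile(client_id="c-1")
reply = engine.respond(profile, "rates", EmotionReading(valence=-0.4, arousal=0.8))
# reply: "I understand this is a concern. Our house view: rates stay flat."
```

A production CEA would of course replace the dictionary lookup with a deep learning model and the two-number emotion reading with real-time video and voice analysis, but the division of responsibilities stays the same.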
As algorithms and the quality of available data improve, one can expect that CEAs will become a viable, generally available option for knowledge transfer in many areas of expertise. More efficient than one-way knowledge transfers, CEAs are likely to have a major impact across many economic and government sectors, as today's 'talking heads' will be extended to full-body renderings that include body gestures as well. The key to this development is the application of emotions and empathy, as CEAs presented via a digital screen will be indistinguishable from human beings. In addition, first experiences indicate that the trustworthiness of a human expert can be transferred to the expert's digital CEA version, as face and body movement, possibly on a subconscious level, can be replicated from the human original.

Eventually, economics, quality and customer acceptance will dictate the success of a CEA. Being globally available 24/7 in unlimited copies, CEAs will have a major impact in any area where knowledge transfer is part of the value chain. A CEA's level of expertise will eventually exceed what a single human can comprehend. However, the human ability to think across multiple layers of knowledge will remain unique. Once true Artificial General Intelligence (AGI) becomes available, this scenario might change, as CEAs would then be able to cover a broader range of expertise. So far, however, we have a long way to go to reach AGI.