With the advent of ChatGPT, Artificial Intelligence (AI) is suddenly finding its way into the living rooms of “ordinary” citizens. AI has become mainstream. And the whole world is plunging into it. What does this mean for financial services? Will it change the way we invest? What impact will it have on customer contact? And: is our society actually ethically and legally ready for artificial intelligence? In this series, we explore the answers with experts from inside and outside APG.
In part 2: Stefan Ochse, head of Generic Participant Contact (GDC) at APG
There is something ironic about it: while the interview with Stefan Ochse is being recorded via Microsoft Teams, a new artificial intelligence feature conjures up everything we are saying in real time as text on the screen. Speech to text. And although, shortly afterwards, the conversation continues at length about the opportunities the technology offers for client contact, the system still occasionally “types” quite a few words we really didn’t say. Microsoft has some fine-tuning to do. Nevertheless, one thing is certain: it is impressive what AI is already capable of. And if it is up to Stefan Ochse, who heads APG’s Client Contact Center, it is going to make working in client contact a lot easier. “But we shouldn’t go too fast either,” he believes.
You are not yet convinced of the potential of AI?
“I get incredibly excited about the technology and what it can mean for our work - and what it already means, by the way. But we do tend to think of AI, and more specifically ChatGPT, as a computer that thinks for itself. Sure, the technology goes beyond what we know so far, but ultimately, on the front end, the thing does exactly what it was programmed to do. I don’t think we should forget that. But it certainly does have potential.”
You have been seeing that potential in the world of client contact for some time with AI-like applications, such as chatbots. With the advent of ChatGPT, what’s different now?
“Integration. At APG, we already use a number of tools that you can put under the AI umbrella. For example, we have a speech-to-text application that types out conversations between employee and participant. We also use a tool - Next Best Question - that can estimate fairly accurately what the participant’s next question will be based on certain factors, so that they can be helped more quickly. And there are numerous other programs that we use to help people with their issues. Very smart AI, as you see in ChatGPT, can link all those resources together. In this way you can, for example, develop a chat or voice bot that knows who is making contact, has an indication of what the call is about and presents certain options directly to the participant. That way you are already focusing very directly on the customer demand. If that works well and you have that AI infrastructure in place, it is a wonderful way to influence the participant experience positively. So ‘one size fits one’, where you serve each participant according to his or her specific needs, is getting closer and closer.”
At the same time, one can also wonder to what extent an increase in the use of artificial tools contributes to trust in large organizations. Don’t people just need personal contact?
“Of course. And I am convinced that customer contact will always need a human factor. Especially in the pension sector. Pension is very complex for many people and it is about money. There will always be a need for real human contact to explain things and to dispel concerns or distrust. But look at it this way: AI actually expands the possibilities of customer contact and can ensure that people are helped faster and more specifically with their questions. It enables us to determine even more precisely what the information needs of our target groups are. And when it comes to coaching, the possibilities are also endless. ChatGPT, for example, is already very good at summarizing conversations recorded via a speech-to-text tool. That is work that was normally done by employees.”
Does that also mean that people should worry that AI will eventually take over more and more tasks?
“You shouldn’t see it that way. AI is an additional tool. And it can be of tremendous added value in the entire chain of client contact. The fact that tasks can be taken over means that you have to look at your work differently. Take the conversation coach on our floor: he will not only have to watch what happens in a conversation between participant and employee, but also what happens in the system. So understanding that technology, and focusing on it, becomes more a part of the job.”
The question is whether everyone is ready for that.
“We don’t want the use of AI to have a negative impact on the work of our colleagues. That is why I think it’s very cool that we are part of a study by Maastricht University on the impact of AI on people’s well-being. They are going to study this at three places, including APG. In March we will start implementing it and together with the researchers we will take a closer look at this new way of coaching. It’s great that we are taking a more in-depth look at the impact of these kinds of new technologies.”