IWD: When Machines Meet Emotions – AI in Financial Education and DEI
Can AI support financial education? Should it? Can it help us make better financial decisions? Is it helping to make company boards more diverse?
By Ms Millennial Money, Sarah Penney
Last month, we celebrated International Women’s Day and this year’s theme, #AccelerateAction, by bringing together a group of passionate, knowledgeable and action-oriented women and men to discuss how we can accelerate action for greater retail investor engagement.
The conversation was broad, but for me, the most interesting discussion was on the role of AI in financial education, investor engagement and promoting diversity and inclusion in the corporate world. With experts in AI, financial advice and corporate governance in the room, different perspectives, experiences, hopes and concerns were shared.
A meaty topic with a lot to think about. Here are some of the key points.
Financial adviser Karin Schulte, a strong advocate for financial education, made the valid point that when it comes to our finances, many of us ‘don’t know what we don’t know’. This raises concerns for her about over-reliance on AI in this space.
“From the perspective of financial understanding and AI, you do need to have a level of basic knowledge so you can check [what it’s telling you] … If you don’t have a basic understanding and then you start to rely on tools you can’t necessarily check or have a way to cross-reference, you may not [realise that] you don’t know what you don’t know, and that can be quite a dangerous thing.
“For financial literacy, a lot has got to be done that is outside of AI. AI can be an assisting learning tool, but then you need to have reassurance that what you’re hearing and where it’s coming from is a source that has been reviewed and is at least understood.”
Inderpreet Kaur, Principal – Governance and Secretariat Lead at Barnett Waddingham, then raised an important question about emotional intelligence: how, in the realms of financial education, financial advice and corporate governance, could AI possibly replace the very human aspect of reading each other’s emotions?
“In any field, when we are trying to assess someone through AI, let’s say we have a list of questions built from vast data, do you think there’s a gap? When you speak to someone in person, their emotional quotient, body language, and tonality provide valuable information. For instance, if you’re talking to someone who says, ‘I’ve got £20,000 I want to invest,’ you might assess their risk appetite better face to face. How does AI capture that in the financial services industry?”
Grace Almendras Castillo, CEO and founder of Gifftid, an AI growth platform for SMEs, explained that this is actually possible with AI. The key is in the programming and how the AI is taught to interpret human emotions (and by whom!).
“The prompt I would use is to say I’m speaking with Mr. or Madame blah blah blah, and you provide as much information about the person’s profile as you can. It will return an evaluation of the emotion, tonality, and the message.”
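To make that concrete, here is a rough sketch (in Python) of the kind of prompt Grace is describing – profiling the person as fully as possible and asking for an evaluation of emotion, tonality and message. The client details, field names and wording below are my own illustration, not her actual prompt or the requirements of any particular AI tool.

```python
# Illustrative sketch only: the profile fields and prompt wording are assumptions,
# not Grace's actual prompt or a specific platform's API.

client_profile = {
    "name": "Ms A. Example",  # hypothetical client
    "age_range": "30-40",
    "goal": "invest £20,000 for the medium term",
    "statement": "I've got £20,000 I want to invest, but markets make me nervous.",
}

# Build a prompt that gives the AI as much of the person's profile as possible,
# then asks it to evaluate the emotion, tonality and underlying message.
prompt = (
    "You are assisting a financial adviser.\n"
    f"I am speaking with {client_profile['name']}, aged {client_profile['age_range']}, "
    f"whose stated goal is to {client_profile['goal']}.\n"
    f"They said: \"{client_profile['statement']}\"\n"
    "Please return an evaluation of the emotion, the tonality and the underlying message, "
    "and note anything that might indicate their risk appetite."
)

print(prompt)  # this string would then be passed to whichever AI model is in use
```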
We rounded off by thinking about the role of AI in diversity, equity and inclusion, questioning whether it can play a part in inclusive hiring – particularly on boards, where diversity of thought is imperative for good governance. Fiona Hathorn, CEO of WB Directors and portfolio NED, shared her experience.
“I think what’s interesting, in terms of the human factor in the ED&I space, is the job applications that don’t reveal your name and take thousands [of applications] in via an algorithm. They’re fairer for society.

“They allow more people and more diversity to get in, but there is a degree of people that get left behind. And I think it’s the same as training within companies. If all training is done by a chatbot with an algorithm, you’re not picking up and supporting people in general. It’s always a balance, and I think it’s the same with boards.

“As an organisation we support people applying for boards – understanding your value, what your campaign is. There are lots of board directors who are really talented and get very frustrated with the process because they never get on the shortlist. So again, there’s always a human factor.

“There’s a role for everything, and it’s just about how you balance the human factor in writing the algorithm.”
Contributing to the debate, TEA founder and CEO Sheryl Cuisia added an important point about whether AI needs to be emotional at all. Perhaps it doesn’t. For some, the lack of emotion is exactly why AI is useful: a simple, cookie-cutter response to a question allows you to layer on your own interpretation.
Ultimately though, there was general agreement with Fiona’s point that we need a balance. For successful diversity and inclusion, and to empower good financial decision-making, we need to find the right balance between AI and emotional intelligence.
The challenge is defining where to strike that balance. AI will play a role in #AcceleratingAction towards inclusive financial education and investor engagement, but human and robot must work hand in hand.