An AI solution to financial advisors' cognitive bias?

Image: A smiling white man in a suit with money flying around him.
Canva's AI text-to-image tool created the above image using the same prompt that led SoFi to start its "Face of Finance" campaign to un-bias AI image generators.

Many financial advisors acknowledge that cognitive biases influence them in their practices, but few tools exist to combat them. Researchers say artificial intelligence might offer a solution.

A paper recently published in the Journal of Business Research identified 11 different cognitive biases financial advisors exhibit when making recommendations to their clients. Among the 21 advisors interviewed by an international team of academics from four universities, the most common areas of partiality were types that the researchers classified as confirmation, affinity and priming bias. 

"[Confirmation bias] involves financial planners or investors holding preconceived beliefs about an investment," researchers said. "These are reinforced when the information arising from the scenario seems to be aligned with their thought processes. Such information may be true or false, but the financial advisors consider it to be true and to use it in support of their decisions."

Advisors exhibit affinity bias when they favor someone who shares interests or experiences similar to their own. Priming bias, meanwhile, occurs when exposure to one stimulus shapes an advisor's reaction to a later one.

This bias can lead advisors to make incorrect assumptions about financial markets, according to the researchers. "If the stock market once crashed due to a sudden spike in stock purchasing, financial planners may tend to relate stock market crashes with such spikes in the future," researchers said. "However, stock markets may crash due to a number of other reasons."

The remaining eight biases observed include familiarity, unconscious, overconfidence, framing, self-serving, belief, anchoring and embodied bias.

After 25 years of education and work experience in finance, "some of my opinions seem to be hardwired into me," said Sean Lovison, a financial advisor at WJL Financial Advisors in Moorestown, New Jersey.

"I often find myself seeing confirmation by seeking out information that confirms my pre-existing beliefs and potentially ignoring information that contradicts them," he said.

Researchers found that advisors generally agreed that AI, "being totally based on data and calculations," would be valuable in helping them avoid such cognitive biases, despite potential fears of being replaced by the technology.

Familiarity bias can be another common issue for time-crunched advisors, according to Sindhu Joseph, the founder of San Francisco-based wealth management AI firm CogniCor. 

Where a busy advisor might otherwise fall back on the same handful of investment vehicles they commonly recommend to clients, AI can present advisors with a broader array of potential options and help them evaluate the best investment for a specific client's needs, according to Joseph.

Advisor bias extends beyond investment decisions, said Sarah Fallaw, the president of DataPoints, an Atlanta-area behavioral finance firm. The same biases can also color client interactions.

By using a "narrow" client discovery process, advisors may only collect basic demographic information, causing them to offer guidance that ignores the client's "personality- or behavioral-based characteristics like attitudes, beliefs and values," Fallaw said.

"Confirmation bias may lead advisors to look for information that supports what they think about a client and ignore information that isn't aligned with that view," Fallaw said. "Advisors must watch for this bias, especially considering a client's demographic characteristics: unknowingly trying to confirm a spouse's 'typical' gender role related to investing decisions or stereotypical money-related beliefs based on a client's ethnicity."

Client context can be important not only for advisors to avoid making biased assumptions about a client, but also for the advisor to better account for biases that the client themselves may have.

"Both the client biases and advisor biases can, to a certain extent, be overcome by [AI by] providing the relevant information on both contexts and I think AI can go one step beyond which is providing those next best actions," Joseph said.

While the advisors included in the research were generally receptive to incorporating AI into their practice as a means of combating cognitive biases, they also voiced concerns about the biases of AI itself.

Because AI is trained on data from real-life examples, any biases present in the training data can make their way into the AI itself. For example, ChatGPT explains return on investment (ROI) differently when asked to define the term for women and men.

While the male-oriented response readily uses gendered pronouns – "he invests" – the female-oriented response uses non-gendered, second-person pronouns – "you invest."

Developers can take a couple of different approaches to debiasing AI, Joseph said.

First, they can strip from the training data any information that is not relevant to the decisions the model makes, so that factors like gender or age do not unduly influence its output. Second, developers can generate "synthetic data" to train the AI on.

"If there is a huge population for one type of category and there is much less data for another type of category, then you supplement artificially created, synthetic data of that population, and then make it a balanced training set," Joseph said.

SoFi recently announced an effort to produce a synthetic data set that will be used to train image-generating AI on a more balanced set of photos.

According to the company, image-generating AI tools display a gender bias in their output. When asked to create realistic images of people who are good with money, only 2% of the thousands of images depicted women.

While AI can be incredibly useful in helping advisors avoid cognitive biases in their practice, it is critical that programmers work to mitigate biases in the AI itself, according to experts. In the absence of regulatory standards, users must rely on AI companies to remove biases from their tools without any oversight.

"If some of the regulators could step in and say like, 'The decisions in these categories should demonstrate these kinds of aspects, and only then it will it qualify as an unbiased AI system,' that would become really interesting," Joseph said. "Otherwise, you are putting too much faith in the solution providers to build in these kinds of things, which most people don't do."
