TECHNOLOGY
Opinion
The dangers of brokers using large language models
Ifthikar Mohamed is director at WIS Mortgages and Insurance Services
Artificial intelligence
(AI) has made some
serious inroads over
the past year or two,
thanks largely to
large language models
(LLMs) like ChatGPT.
Many brokers in the mortgage
industry are harnessing the power
of LLMs to streamline operations
and enhance service delivery. AI can
process data at speeds unmatchable
by humans, drastically improving
decision-making efficiency and
accuracy. However, tools like ChatGPT
have inherent risks attached to them.
Use #1: Analysing
financial documents
One of the primary applications of AI in
brokerage is to quickly sift through
reams of financial data, ranging from
bank statements to credit files.
AI models are trained to identify
patterns and key information that can
help with assessing an individual's
financial health and an appropriate level
of credit. This allows brokers to make
informed decisions faster, enabling
them to advise clients more effectively.
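By way of rough illustration, the sketch below shows how a broker's back-office tool might ask an LLM for an affordability summary of statement data that has already been anonymised. It assumes the OpenAI Python client; the model name, prompt wording and the idea of pre-anonymised rows are illustrative assumptions, not a recommendation of any particular provider or workflow.

# Illustrative sketch only: summarise pre-anonymised statement rows with an LLM.
# Assumes the OpenAI Python client and an API key in the environment; the model
# name and prompt wording are placeholders rather than recommendations.
from openai import OpenAI

client = OpenAI()

def summarise_affordability(anonymised_rows: list[str]) -> str:
    # The rows should already have names, account numbers and other
    # identifiers stripped before anything leaves the broker's systems.
    prompt = (
        "You are assisting a mortgage broker. From the transaction rows below, "
        "summarise monthly income, committed outgoings and any patterns "
        "relevant to affordability.\n\n" + "\n".join(anonymised_rows)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content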
Use #2: Automating
client communication
LLMs are adept at generating various
forms of communication, from emails
to detailed client reports. Brokers use
these models to draft personalised
messages and detailed explanations of
financial advice for their clients.
By automating these tasks, brokers
can save time and reduce human
error, ensuring that communications
are both accurate and consistent.
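As a simple sketch of keeping personal details out of the model altogether, a generated rationale can be merged into a locally held template so that names and figures never leave the broker's own systems; the field names and wording below are hypothetical.

# Hypothetical sketch: merge a generated rationale into a locally held
# template so the client's personal details never leave the broker's systems.
from string import Template

DRAFT_TEMPLATE = Template(
    "Dear $client_name,\n\n"
    "Following our review, we recommend $product because $rationale.\n"
    "The indicative monthly payment is £$monthly_payment.\n\n"
    "Kind regards,\n$broker_name"
)

def build_client_draft(rationale: str, details: dict) -> str:
    return DRAFT_TEMPLATE.substitute(rationale=rationale, **details)

draft = build_client_draft(
    "it offers a five-year fixed rate within your stated budget",
    {
        "client_name": "A. Client",
        "product": "a five-year fixed-rate mortgage",
        "monthly_payment": "1,250",
        "broker_name": "Your Broker",
    },
)
print(draft)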
Use #3: Enhancing
client interactions
Beyond analysing data and automating
communications, AI tools are also
used to enhance direct interactions
with clients. AI chatbots, for example,
can handle initial client inquiries,
schedule appointments, and provide
instant responses to basic queries.
This helps manage client
expectations and improves
accessibility, allowing brokers to focus
on complex and nuanced client needs.
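As a deliberately simplified illustration of this kind of triage, the snippet below answers a couple of basic queries instantly and hands everything else to the broker; the keywords and canned answers are invented, and a production chatbot would use an LLM or intent model rather than keyword matching.

# Deliberately simplified triage sketch: canned answers for basic queries,
# escalation to the broker for everything else. Keywords and responses are
# invented examples; a production chatbot would use an LLM or intent model.
CANNED_ANSWERS = {
    "opening hours": "We are available Monday to Friday, 9am to 5pm.",
    "documents": "Please have photo ID, proof of address and three months of payslips ready.",
}

def triage(message: str) -> str:
    lowered = message.lower()
    for keyword, answer in CANNED_ANSWERS.items():
        if keyword in lowered:
            return answer
    return "Thanks for your message - a broker will be in touch shortly."

print(triage("What documents do I need before my appointment?"))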
Use #4: Risk assessment
and management
AI models help brokers identify
and manage risks. By analysing
historical data and current market
conditions, AI can predict potential
risks and recommend precautionary
measures. This proactive approach
to risk management protects
the client's interests
while also supporting brokers in
maintaining compliance with
regulatory standards.
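A deliberately simple, non-AI sketch of the sort of precautionary check such a tool might automate is shown below; the income multiple and debt-to-income threshold are illustrative figures only, not lending criteria or regulatory limits.

# Illustrative precautionary checks of the kind an AI-assisted tool might run.
# The 4.5x income multiple and 45% debt-to-income threshold are invented
# illustrative figures, not lending criteria or regulatory limits.
def flag_affordability_risks(annual_income: float,
                             loan_amount: float,
                             monthly_debt_payments: float) -> list[str]:
    warnings = []
    if loan_amount > 4.5 * annual_income:
        warnings.append("Loan exceeds 4.5x annual income")
    if monthly_debt_payments * 12 > 0.45 * annual_income:
        warnings.append("Existing debt exceeds 45% of annual income")
    return warnings

print(flag_affordability_risks(annual_income=40_000,
                               loan_amount=200_000,
                               monthly_debt_payments=1_600))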
Ethical implications
Despite these benefits, when brokers
use AI tools like ChatGPT to analyse
personal bank statements or credit
files, they risk exposing sensitive
information. The main issue lies in
how data is processed and stored. AI
models require access to large datasets
to learn and make predictions. This
data, when transmitted to or processed
by third-party AI providers, could be
accessed by unauthorised parties.
Beyond this, the nature of
AI-generated outputs, such as
communications detailing reasons for
financial advice, might inadvertently
reveal personal financial details that
should remain confidential. This
compromises client trust and may
also violate the General Data Protection
Regulation (GDPR).
Under GDPR, data controllers are
required to implement appropriate
technical and organisational measures
to ensure that data processing is
performed in compliance with its
requirements. The use of AI by
brokers must be heavily
scrutinised under these guidelines.
Brokers must ensure that any AI
tool they use complies with GDPR
principles, such as data minimisation,
where only the necessary amount of
data should be processed, and integrity
and confidentiality, ensuring that
personal data is processed securely.
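As a minimal sketch of data minimisation in practice, the snippet below masks obvious account identifiers before any statement text is passed to a third-party service; the patterns are simplified examples and nothing like a complete personal-data filter.

# Minimal data-minimisation sketch: mask obvious account identifiers before
# any statement text is passed to a third-party AI service. The patterns are
# simplified examples, not a complete personal-data filter.
import re

REDACTIONS = [
    (re.compile(r"\b\d{2}-\d{2}-\d{2}\b"), "[SORT CODE]"),       # UK sort codes
    (re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{11,30}\b"), "[IBAN]"), # rough IBAN shape
    (re.compile(r"\b\d{8}\b"), "[ACCOUNT NO]"),                  # 8-digit account numbers
]

def minimise(text: str) -> str:
    for pattern, replacement in REDACTIONS:
        text = pattern.sub(replacement, text)
    return text

print(minimise("Transfer from 12-34-56 12345678 ref rent"))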
The ethical use of AI also comes
into question. Should AI have access
to personal financial information?
How transparent are these processes
to the clients? Brokers must address
these questions to maintain ethical
standards and client trust.
Specialist solutions
Specialised private platforms that cater
exclusively to financial professionals
can provide a solution. These
typically have robust data protection
measures in place. By using these types
of services, brokers can leverage the
power of AI while adhering to legal
and ethical standards.
These platforms can facilitate the
secure processing of sensitive data
and provide brokers with the peace of
mind that compliance is maintained.
They offer a controlled environment
where AI’s capabilities can be
harnessed without compromising
client confidentiality or violating
regulatory requirements.
While AI presents significant
opportunities, the risks must be
weighed carefully.
To help with this, brokers should
consider specialised private AI
platforms, ensuring that they
deliver superior service without
compromising client confidentiality
or regulatory compliance. ●