Fintech firm Padua predicts ASIC could eventually use artificial intelligence in its compliance and audit checks of financial advisers as the use of AI applications expands in the financial advice industry.
Padua co-CEO Matthew Esler says the firm thinks that, potentially “down the track”, AI could be used to check whether advisory firms are meeting regulatory requirements “…and to check for compliance in advice documents, and in other supporting materials such as file notes.”
He says AI is increasingly being used in advice firms around Australia, and that every stage of the financial advice process, particularly data collection, is likely to be affected by emerging AI applications.
“Many advisers are now using natural language AI in their day-to-day operations. Within the advice process, AI can also be used to record minutes of meetings and in the creation of fact find information and file notes.”
Esler says an important consideration for advisers is ensuring that AI-generated file notes and fact-find information comply with Australian regulations and laws.
“There’s a real risk for financial advice firms inadvertently providing recommendations in the information gathering stage which would necessitate an advice document within five days.”
Esler says the firm expects ASIC will be monitoring this.
“Firms too will have to manage their use of AI through a combination of robust technical measures, comprehensive understanding of the regulatory environment and continuous oversight.”
Esler says large language models can perform routine data analytics to validate client datasets, ensuring the ‘client story’ remains viable and consistent throughout the advice process.
He says this can save advisers significant time in checking and correcting client data.
Esler says Padua’s tools can help advisers to produce records of advice, statements of advice, product comparisons and best interests duty statements “…within minutes.”
He notes, however, that scrutiny is essential.
“As some licensees have already experienced, AI does not come without its risks: comments made by advisers during this process are recorded, and the potential for advice to be provided in these meetings requires caution. There’s also the concern around ‘deep fakes’ – impersonating another person using AI – with fraud a major challenge for advisers, and for every person online.”
He says that to overcome these challenges, it is important that advice firms have a plan and give consideration to implementing robust data security and privacy measures, following best practices and regulatory guidelines.