KEY TAKE OUTS
- Advances in technology have allowed previously unadvised investors to receive real-time, cost-effective digital financial advice.
- With its wider reach and larger target audience, digital advice brings an increased risk of disputes, including large-scale class actions.
- A digital adviser is exposed to risk both as to the substantive advice provided and as to the underlying algorithm, in respect of cyber-security and calculation error.
- ASIC recommends a series of preventative measures including regular testing and monitoring undertaken by appropriately qualified experts in both financial advice and technology.
Technological advancements and innovation are having a profound effect on the financial services industry. Fintech is driving change. Financial advice is a case in point, with digital financial advice, or robo-advice as it is otherwise known, changing the landscape.
The benefits of digital financial advice are well documented, the main drivers being time and cost. It is said that by using sophisticated algorithms, digital advice providers can give real-time advice to a mass, previously untapped, market.
To provide an idea of the anticipated scale of that market, digital advice is aimed at the 80% of Australian adults who do not currently access professional financial advice. That is, around 12 million people. It’s a classic case of economies of scale. Such is the expected breadth of digital advice, and its anticipated impact on the lives of most Australians, that on 30 August 2016 ASIC released Regulatory Guide 255: Providing digital financial product advice to retail clients (RG 255), targeted specifically at digital advice providers.
However, as with everything, for every yin there is a yang and every benefit brings a risk.
Like human advisers, digital advice providers are subject to the requirements of legislation such as the Corporations Act, the ASIC Act and myriad ASIC Regulatory Guides. In addition, digital advisers have obligations in tort and are bound by contract.
This article seeks to address some of the potential disputes which may arise having regard to the guidance provided by RG 255.
General obligations applying to digital advice licensees
All digital advisers must hold an AFS licence or act as an authorised representative of an AFS licensee and satisfy the general obligations set out in ss 912A and 912B of the Corporations Act. Those obligations include doing all things necessary to ensure the financial services covered by the AFS licence are provided efficiently, honestly and fairly, as well as to comply with financial services laws.
A digital advice licensee should have an internal dispute resolution system which accords with the relevant rules and regulations and also be a member of an external dispute resolution scheme (see Regulatory Guide 165: Licensing: Internal and external dispute resolution for guidance as to requirements and process).
Regular monitoring and testing
RG 255 states that a digital advice licensee should regularly monitor and test both the advice generated and the algorithms underpinning the advice platform. The adviser must engage people who:
- can review the advice generated to ensure it is appropriate and legally compliant; and
- understand the rationale, risks and rules behind the algorithms underpinning the digital advice.
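RG 255 does not prescribe how that testing should be carried out, but one common approach is a regression suite: a set of client profiles whose expected outputs have been signed off by people qualified in both financial advice and the algorithm's rules, re-run against the advice engine on a schedule. The sketch below is illustrative only; the advice function, its rules and the profile fields are assumptions, not anything drawn from RG 255.

```python
# Illustrative sketch only: a regression harness for a hypothetical advice
# engine. The engine, profiles and expected outputs are assumptions for
# demonstration; a real suite would hold expert-reviewed cases.

def advice_engine(profile):
    """Stand-in for the production algorithm: maps a client profile
    to a recommended growth-asset allocation (percentage)."""
    # Simple invented rule: higher risk tolerance means a higher
    # growth allocation, wound back near retirement age.
    base = {"low": 30, "medium": 50, "high": 70}[profile["risk_tolerance"]]
    if profile["age"] >= 60:
        base -= 20
    return max(base, 0)

# Expert-reviewed cases: each expected output has been signed off by a
# reviewer qualified in both financial advice and the algorithm's rules.
REVIEWED_CASES = [
    ({"age": 30, "risk_tolerance": "high"}, 70),
    ({"age": 65, "risk_tolerance": "medium"}, 30),
    ({"age": 62, "risk_tolerance": "low"}, 10),
]

def run_regression():
    """Return (profile, expected, actual) for any mismatches;
    an empty list means the engine still matches the reviewed advice."""
    failures = []
    for profile, expected in REVIEWED_CASES:
        actual = advice_engine(profile)
        if actual != expected:
            failures.append((profile, expected, actual))
    return failures

failures = run_regression()
print("OK" if not failures else f"{len(failures)} regression failure(s)")
```

A failure in such a suite flags either a defect introduced into the algorithm or a deliberate change that has not yet been reviewed; either way, it is a prompt for human review before further advice is generated.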
The risks to the digital adviser appear twofold: first, as to the substantive advice provided and, second, as to the underlying algorithm. In the context of a potential dispute, there is no doubt that the two are related. If the advice being generated is inappropriate, this may be directly attributable to the rules and rationale of the algorithm.
A further piece of guidance in RG 255 is that a digital advice licensee should have sufficient technological resources to maintain client records and data integrity, protect confidential and other information and meet current and anticipated future operational needs, including in relation to system capacity. This includes having adequate business continuity, backup and disaster recovery plans for any systems that support the delivery of digital advice to clients.
Adequate risk management systems are especially important when one considers the ramifications of malicious attacks and hacking of the algorithms. Without regular testing, such an attack may go unnoticed meaning the advice being given would be compromised.
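One way a licensee might detect that kind of tampering between scheduled reviews is an integrity check: take a cryptographic digest of the approved rule set when it is signed off, then re-hash the live rules on each run and compare. A minimal sketch, assuming the rules can be serialised as JSON; the rule names here are invented for illustration:

```python
# Illustrative sketch only: detecting tampering with an algorithm's rule
# set by comparing a SHA-256 digest against a known-good baseline.
import hashlib
import json

def fingerprint(rules: dict) -> str:
    """Deterministic SHA-256 digest of the rule set (keys sorted so
    logically identical rules always produce the same hash)."""
    canonical = json.dumps(rules, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

# Baseline recorded when the rules were last reviewed and approved.
approved_rules = {
    "growth_allocation": {"low": 30, "medium": 50, "high": 70},
    "retirement_age_reduction": 20,
}
baseline = fingerprint(approved_rules)

# On each scheduled check, re-hash the live rules and compare.
live_rules = dict(approved_rules)           # unmodified: check passes
assert fingerprint(live_rules) == baseline

live_rules["retirement_age_reduction"] = 0  # simulated malicious change
tampered = fingerprint(live_rules) != baseline
print("tampering detected" if tampered else "rules intact")
```

A hash check of this kind only detects unauthorised change; it says nothing about whether the approved rules themselves are sound, which is why it complements rather than replaces the substantive testing described above.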
From a disputes perspective, the causes of action would be the same as those generally available to the recipient of advice from a human adviser, for example, breach of contract, negligence or misleading or deceptive conduct.
However, the underlying algorithm and risk of defects caused either by malicious attack or simply due to an error in the reasoning process present an intriguing position.
Taking negligence as an example, if a digital adviser failed to adequately protect its algorithm, or failed to adequately analyse the precise set of rules which would be applied to the information provided by the person seeking the advice, then this may give rise to a claim in negligence.
The question is how far does this go? What is “adequate protection”; is it cyber-security similar to that employed by the intelligence services, or security systems similar to those used by major law firms and the big banks? Is an IRAP assessment required, with certification from the Australian Signals Directorate? Given that the provision of digital advice is in its infancy, the position is not entirely clear.
The importance of security measures cannot be overstated. Given the anticipated scale of digital advice, the effect of someone hacking the system and altering the algorithm could be vast, including class actions by multiple parties.
There is also a genuine risk that people wishing to manipulate the market for their personal gain could see a large-scale digital adviser as the perfect target. If thousands of people are receiving advice on any given day, the ability to influence a platform to recommend investments in a particular entity could lead to artificial inflation of that entity’s share price.
Such an effect would not only give rise to claims by recipients of the advice but also by other investors who would have overpaid for certain securities and suffered loss when the market corrected. If the digital adviser is held not to have taken adequate steps to protect the system, it could be held liable for any loss attributable to the error, albeit such loss may be apportionable under the relevant legislation, assuming the hacker could even be identified.
Moreover, as we have seen with the recently filed class action in the US concerning the Australian banks’ alleged manipulation of the Bank Bill Swap Rate (BBSW), claims may be commenced in foreign jurisdictions.
The Corporations Act’s best interests obligations
RG 255 also refers to the best interests obligations in the Corporations Act which must be satisfied by any adviser giving personal advice. For example, prior to giving advice the adviser is required to have identified the objectives, financial situation and needs of the client and, where it was reasonably apparent that such information was incomplete or inaccurate, to have made reasonable inquiries to obtain complete and accurate information.
A human adviser may do this by conversation. However, a digital advice platform will need to have a series of questions designed specifically to identify the objectives and needs of the client and to notice missing or incorrect information, followed by a series of questions in order to correct any deficiencies.
Whilst it may be a defence that a person seeking advice deliberately provided false answers, or failed to take proper care when completing the forms, unless the framework for identifying inconsistencies and errors is adequate, there is a clear risk that the obligations will not be satisfied. Failure to act in the best interests of the client is a breach of the Act and can give rise to regulatory interest in the form of proceedings commenced by ASIC, banning orders and civil claims by recipients of the advice.
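In practice, such a framework amounts to validation logic run over the questionnaire answers before any advice is generated. The sketch below is a minimal illustration; the required fields, the consistency rule and its wording are assumptions for demonstration, not a statement of what the best interests duty requires.

```python
# Illustrative sketch only: checks a digital advice platform might run
# over questionnaire answers to flag missing or inconsistent information
# before advice is generated. Field names and rules are assumptions.

REQUIRED_FIELDS = ["age", "annual_income", "monthly_savings", "objective"]

def validate_answers(answers: dict) -> list:
    """Return human-readable issues; an empty list means the answers
    can proceed to the advice stage."""
    issues = []
    for field in REQUIRED_FIELDS:
        if answers.get(field) in (None, ""):
            issues.append(f"missing answer: {field}")
    income = answers.get("annual_income")
    savings = answers.get("monthly_savings")
    # Internal consistency: stated savings cannot exceed stated income.
    if income is not None and savings is not None and savings * 12 > income:
        issues.append("stated savings exceed stated income; please re-check")
    return issues

issues = validate_answers(
    {"age": 45, "annual_income": 60000, "monthly_savings": 6000,
     "objective": "retirement"}
)
for issue in issues:
    print(issue)  # the platform would ask follow-up questions here
```

The point of surfacing each issue to the client, rather than silently proceeding, is that it mirrors the “reasonable inquiries” a human adviser would make when information appears incomplete or inaccurate.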
It is also important to recognise that digital advice is not suitable for everyone. Providers need to acknowledge this and have a series of introductory questions designed specifically to ensure that those for whom digital advice is not suitable are not able to proceed further within the tool.
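Those introductory questions can be implemented as a simple gate evaluated before the main questionnaire opens. The criteria below are invented for illustration; what actually makes digital advice unsuitable will depend on the scope of the particular tool.

```python
# Illustrative sketch only: an introductory triage gate that stops users
# for whom digital advice is unsuitable from proceeding into the tool.
# The criteria are invented for demonstration, not drawn from RG 255.

SUPPORTED_OBJECTIVES = {"savings", "retirement", "investment"}

def suitable_for_digital_advice(intro: dict) -> bool:
    """Run before the main questionnaire; False means the user is
    referred elsewhere rather than allowed to continue."""
    # Circumstances this hypothetical tool is not built to handle.
    if intro.get("has_smsf") or intro.get("in_bankruptcy"):
        return False
    # Only straightforward objectives within the tool's scope proceed.
    return intro.get("objective") in SUPPORTED_OBJECTIVES

if not suitable_for_digital_advice({"objective": "estate_planning"}):
    print("Digital advice is not suitable for your circumstances.")
```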
Put simply, it is not just a case of having a sophisticated algorithm to provide the financial advice; the elements surrounding it need to be equally well considered and implemented. Failure to do so could have profoundly damaging consequences, both here and abroad.