I was not an early adopter of ChatGPT because my older brother knows the answers to most of my questions and he has been just a phone call away. Don’t tell him, but I have now downloaded the AI app and use it instead of calling him.
I am sure you will be appalled to learn that even I, a lawyer, didn’t read its terms and conditions before downloading it onto my mobile phone. I am not naive — I do not think for a nanosecond that ChatGPT, a free AI solution, keeps my queries private. I assume that everything I enter could be available publicly. I might enter, “What is the best pizza in Toronto,” but I will not enter, “My client has the following problem … what is the solution?”
The rapid adoption of AI technology across the Canadian financial services industry has significant legal implications for financial advisors, their firms and clients. As the technology grows increasingly ubiquitous, it’s worth noting where advisory firms stand on it and the potential legal risks associated with the use of AI.
Each of the dealer firms has its own privacy and AI policies, which respect the fact that client information must be kept private. Some dealers pay for licensed AI solutions. Some independent advisors are buying off-the-shelf solutions.
Many dealer policies dictate that you cannot text clients or use WhatsApp for client discussions. These channels are not protected by two-factor authentication and do not protect your clients’ privacy. Communications on these applications are also not subject to scrutiny by your supervisors, which is your dealer’s obligation.
For the same reasons, you cannot use ChatGPT for clients. There is no promise that client information remains private, even if you do not mention the client’s name. Do not use it to answer client questions and certainly do not enter any client information into it.
Clients pay us because we are experts in our respective fields. ChatGPT is not a resource we should be relying on to develop or support the advice we give clients. We need to go to the source of information, and apply our expertise, to best serve our clients’ needs.
Some companies pay for private AI solutions. My firm licenses an enterprise solution which ensures that both searches and findings remain private. These solutions are expensive though — not every dealer in the financial services industry licenses an AI enterprise solution.
If you plan to research an option for your practice, first read the relevant portions of your dealer’s policy manual to make sure you are permitted to pay for and use independent technology. Determine whether you need your dealer’s approval before using any network solution. Then ask your dealer if they have recommendations. (This could also apply to other software solutions you license for other purposes, like planning.)
If you are considering a solution that your dealer is unfamiliar with, you will likely need to send the details to your dealer so they can review and approve it. Read the terms and conditions of the solution to ensure that the information you enter into it is private and that it is not used for any purpose by the company providing the solution.
What about AI note-taking solutions?
One of the most common uses of AI technology is the recording and transcription of conversations. Obviously, this saves advisors significant time and energy. Again though, there are risks to consider.
- The technology isn’t always accurate. If a client says: “I am not worried about fluctuations in the market,” and the word “not” is lost due to the client’s voice or accent, the transcript can read: “I am worried about fluctuations in the market.”
- Transcripts fail to reflect body language or tone of voice. In recorded proceedings, litigation lawyers will comment on the record when there is body language or a tone of voice that should be noted. We say something like: “Let the record show that opposing counsel has raised his voice and is yelling at my client,” for example. Advisors are not trained for this. It would also be awkward. Imagine interrupting a client discussion to say: “Note that the client just winked and smiled at me, indicating sarcasm.”
- Sometimes we forget we are being recorded. Discussions can trail into private information that you do not necessarily want your compliance officer (or a regulator or judge) to read. Advisors may have close friendships, or may be a family member of the client, and conversations reflect that. For example, you might not want your compliance officer to know about your own marital problems.
Check the transcript immediately after a call, and correct inaccuracies. Remember my five Cs of documentation: correct, current, complete, consistent, contemporaneous.
My preference is to use a transcript to prepare notes (and whatever else you need to do) and then delete it. Tell your client that this is what you’re doing.
If a client issues a complaint and sues you, transcripts (if they have been retained) and your notes will be included in the documents you are obliged to provide to the regulator and opposing lawyers.
While advances in AI have provided us with fantastic tools, we need to appreciate their limitations. Ensure their use protects clients and adheres to your dealer’s policies.
Advisors, beware if you are choosing to use AI. Or do you prefer to just call my brother?