This content is updated regularly; please refer back to https://bcfsa.ca to ensure that you are relying on the most up-to-date resources.
What You Should Know When Using Artificial Intelligence
Artificial intelligence (“AI”) has been in use for decades. It powers our online searches, online shopping, and ride-hailing apps. But a new wave of generative AI tools has garnered attention.
Generative AI apps are changing the way individuals and businesses approach their work, find inspiration for new ideas, create images, and develop content. As the software and algorithms behind these apps become more sophisticated and are exposed to more data, they get better at performing tasks.
AI apps can be useful, but relying on them to provide services that you are licensed to provide is risky. This article looks at the pitfalls associated with AI tools and what real estate licensees should consider when using them.
A Brief Look Into What AI Apps Can Do
Apps like ChatGPT are chatbots – they are built on language models and respond to text prompts entered by users. They can perform a multitude of tasks, including the following:
- Have human-like conversations;
- Answer questions;
- Write reports;
- Quickly sift through and summarize documents containing hundreds of pages;
- Develop computer code; and
- Translate between languages.
Other apps can use text prompts to generate graphics, logos, images, illustrations, and more. The technology is rapidly growing, which means the tasks AI apps can perform are proliferating.
The Risks of Using AI Apps
As a real estate licensee, you have duties to your clients. These include maintaining the confidentiality of client information, and acting in their best interest. Using AI apps to help you with professional tasks can compromise these responsibilities.
AI Apps Can Hold onto Personal Data
AI apps require the user to input data in order to generate a response. A key risk is that the data you input could include personal information; if that information were leaked, it could harm your client, or damage your professional reputation and the reputation of B.C.’s real estate industry.
The information you enter into an AI app may become part of its data set – AI chatbots are usually designed to retain information, because they generate responses based on the data they have “learned”. It is nearly impossible to verify whether an app will “unlearn” or wipe the information you have fed it once your task is complete. Another risk that stems from AI apps retaining data is that you cannot control how that data will be used in the future. Apps may use your data in unintended or unanticipated ways to perform tasks for other users.
AI Tools Are Not Always Accurate
Another key risk of relying on AI apps to perform tasks is that the output can contain errors. The output is only as good as the input – the apps draw on online content related to your text prompts, but also on information you and other users have entered. If that information is flawed, biased, or inaccurate, the final product created by these apps is likely to be, too.
A Rising Need for Regulatory Oversight
The rapid growth of AI technology and digitalization across the financial services sector has prompted responses from regulators and governments. The privacy of data collected by AI chatbots is also a growing concern for many privacy authorities in Canada and globally. The Office of the Superintendent of Financial Institutions (“OSFI”) recognizes that machine learning provides new opportunities and innovation for financial sector organizations, but specific safeguards need to be in place to ensure the safety and soundness of the sector.
BCFSA is also watching the generative AI space and how financial organizations in B.C. use these tools, in order to better determine how to responsibly manage the associated risks.
How to Protect Yourself When Using AI Apps
Be cautious when using AI apps, just as you would with any online platform, such as social media or search engines.
Consider the following tips if you do plan to use AI apps:
- Only use the tools to receive general responses – do not include sensitive, business, or personal information (even when it has been anonymized) that could compromise your or your clients’ privacy.
- When you receive a response, do not take it at face value. Be diligent and do your own research to fact-check the results.
- Be aware that the responses you receive could be based on information that is confidential, copyrighted, sensitive, or biased.
- You are the licensed professional, not the AI chatbot – your clients rely on you to provide them with accurate and trustworthy information.
- Remember that you have an obligation to act in the best interests of your client and keep their information confidential. If you are considering the use of AI, you should discuss this with your client.