Artificial Intelligence (AI) tools for NDIS Service Providers: A best practice guide.

[Image Description: A yellow chatbot with pink eyes on a digital face peers at a laptop while working, floating against a dark blue background.]

If you haven’t already delved into experimenting with AI tools for your business (after getting ChatGPT to roast your Instagram feed, of course), are you even embracing our new technological overlords? Joking. But while there are significant benefits to automating or streamlining elements of your NDIS business with AI, there are a few privacy, security and intellectual property pointers to be aware of before you go too deep.

We’ve teased out some of the most important concerns for NDIS Service Providers using these services. Full disclosure: AI has helped us put this list together.

But as part of our own internal quality and safeguarding processes, any outputs generated using AI are checked for relevance, accuracy and a genuinely human perspective. Because while AI can do a lot of the grunt work, there is no replacement for solid human judgement (yet).

What to consider when using AI tools as an NDIS Service Provider

Understand data privacy and security obligations

Kinora has explored online data security during our scams and online safety campaign this year, but when it comes to using AI tools, data privacy and security should be front of mind for all providers. In case you missed it, here’s a refresher on Cybersecurity for NDIS Businesses.

  • Data minimisation means leaving any personal or sensitive information (contact details, identifying details, medical information) about clients, staff or, really, anyone, including yourself, out of the inputs you give AI tools. An input is the prompt, or what you’re asking the AI tool to produce for you. Keeping inputs general makes the output more generic, but also safer and potentially useful for more than one person, for example, creating templates for questionnaires or forms that can be used with several clients. There’s a rough sketch of what this can look like in practice after this list.

  • Encryption should be used whenever data feeds any customised AI functions in your business, for example, lead generation or auto-responders on your website.

  • Access control is the term used to describe how you manage who on your staff has access to which tools and systems within your business. By keeping access to AI tools tight, meaning only a select number of people have access, you can ensure that the staff members who do use them are well trained and know how to use outputs appropriately.
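
For providers with a bit of technical support in-house, here’s a minimal, purely illustrative Python sketch of the data minimisation idea: scrubbing obvious personal details out of a prompt before it goes anywhere near a public AI tool. The function name, patterns and example details below are assumptions made for the illustration; this is not a complete de-identification solution and it won’t catch everything.

    import re

    # Illustrative only: remove obvious personal details from a prompt
    # before it is pasted into (or sent to) a public AI tool.
    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
    PHONE = re.compile(r"\+?\d[\d\s-]{7,}\d")

    def minimise(prompt: str, known_names: list[str]) -> str:
        """Replace emails, phone numbers and known client names with placeholders."""
        cleaned = EMAIL.sub("[email removed]", prompt)
        cleaned = PHONE.sub("[phone removed]", cleaned)
        for name in known_names:
            cleaned = re.sub(re.escape(name), "[client]", cleaned, flags=re.IGNORECASE)
        return cleaned

    # Hypothetical prompt someone might be tempted to type in as-is.
    prompt = (
        "Draft a plan review letter for Jane Citizen, phone 0400 000 000, "
        "email jane@example.com, about her weekly physiotherapy supports."
    )
    print(minimise(prompt, known_names=["Jane Citizen"]))
    # -> Draft a plan review letter for [client], phone [phone removed],
    #    email [email removed], about her weekly physiotherapy supports.

The point isn’t the code itself; it’s the habit it represents: strip identifying details before a prompt leaves your systems, then add them back into the finished template yourself.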

Legal compliance

To be compliant with the Australian Privacy Principles (APPs) under the Privacy Act 1988, NDIS service providers in Australia (really, any company) must declare any areas of their business where client data is used with AI and obtain clients’ informed consent before it happens.

Part of this compliance is having clear and publicly available data governance policies that outline the company’s use of AI tools, especially in relation to data handling, storage, processing and disposal.

This gives Australian consumers choice and control over where and how their information is accessed and used, both digitally and for commercial purposes.

Intellectual Property Ownership

Each AI tool will have different terms of service and use; check them to make sure that any content you produce with an AI tool is actually yours and not owned by the AI technology provider. To avoid this risk, and the potential loss of company information or knowledge via public AI tools, businesses can customise AI models that are trained on internal data and contained within the business. This protects company IP and keeps the use of sensitive data under your control.

Human editors and quality checkers

Every ‘output’ from an AI tool really needs to be checked for quality and relevance by a human before it is let out into the world. Keep the audience for the output in mind, make sure details and information are up to date, confirm that no sensitive personal or company data is being shared, and apply considerations like values, empathy and compassion. Try to avoid over-reliance on AI outputs within your business, as you may start to lose the subtle nuances in your communication and business processes that make interacting with a human what it is, especially in an industry like the NDIS.

How has AI changed the way you operate your NDIS business?

We’d love to hear your stories in the Providers Only community of Kinora.
