
Developing Your AI Strategy for Non-Profits

by Ellie Leftley September 18, 2024

Artificial Intelligence promises to revolutionise how we interact with technology. But with the revolution comes a whole host of questions for non-profits including:

  • Where to start? What do you need to consider?
  • How do you ensure you have a strategy to benefit from AI?
  • How do you protect your organisation in changing times and mitigate the risks that come with AI?

Carolyn Brown, CIO of the British Medical Association (BMA), joined us to share the strategic approach the BMA took to introducing AI into their organisation.

Throughout the webinar, delegates had the opportunity to put their questions to Carolyn, and knowing that many of these answers will be valuable to others in the NFP sector, we have compiled the responses below.

If you are looking for more information on understanding how to develop an AI strategy for your non-profit, you can access the full webinar recording by completing the form on this page, or by contacting us at hello@hartsquare.co.uk.

Balancing AI benefits against potential costs

Q. As organisations in the non-profit sector, especially for those of us with charitable purposes to work towards improving society, how do we balance the benefits of using AI with the negative environmental impacts we’re becoming aware that using it has?

Carolyn Brown: The environmental impact of generative AI processing is ever increasing. The only ways to mitigate it are to ensure that alternative sources of energy are used increasingly and progressively, and to apply AI itself to reduce environmental impacts in new ways. From a societal perspective, we have a responsibility to question (and validate, to the extent possible) any “truths” generated by GenAI. Social media misinformation is a prime example of this.

Q. What role do you believe non-profits can play in guiding the responsible use of AI in society?

Carolyn Brown: Each organisation has to OWN its responsible use of AI, but for NFP organisations the focus has to be on efficiencies that return more value to the “customer” (in for-profit organisations this is also true, but it’s more about generating higher profits). Sharing best practice and learnings with other NFPs is something we already do well, and this applies equally to learnings around the responsible use of AI. Share on LinkedIn and other engagement channels.

Framework & tools

Q. What frameworks or models can non-profits adopt to implement AI responsibly and effectively in their operations?

Carolyn Brown: There are many adoption models (and these have to evolve as people learn), but standard models are offered by the GenAI providers themselves (see, for example, Google’s AI Adoption Framework or Microsoft’s equivalent, both easily available). Adoption models are also available for targeted contexts (e.g. Azure).

Q. What AI tools are available to support organisational productivity and customer/stakeholder engagement?

Carolyn Brown: There are many vendors out there offering tools for staff to use (from Google, Microsoft and others), as well as tools embedded within applications (chatbots etc.). Then there are models with intelligent algorithms: some prebuilt, and some that can be built from scratch.
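To illustrate the “built from scratch” end of the spectrum Carolyn mentions (as opposed to a vendor’s prebuilt model or an embedded chatbot), here is a minimal sketch of training a tiny classifier on an organisation’s own data with scikit-learn. The features, labels and renewal scenario are invented for illustration and are not from the webinar.

```python
# A hypothetical example: predicting membership renewal from engagement data.
# All numbers below are made up purely to show the mechanics.
from sklearn.linear_model import LogisticRegression

# Toy features per member: [emails_opened, events_attended]
X = [[0, 0], [1, 0], [5, 2], [8, 3]]
y = [0, 0, 1, 1]  # 1 = renewed membership (hypothetical label)

# "From scratch" here means the model is fitted to your data,
# rather than consumed ready-made from a vendor's API.
model = LogisticRegression().fit(X, y)

# A highly engaged member resembles the positive examples.
print(model.predict([[6, 2]]))
```

A prebuilt model would replace the `fit` step with a call to a vendor service; the trade-off is control and data ownership versus effort and maintenance.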

Q. AI utilities are being quickly embedded into data analytics platforms such as Google Analytics, Tableau and Power BI. How do you think this will support better and more accurate data storytelling, and what strategies can be put in place to mitigate biases in AI algorithms and promote equitable representation in data storytelling?

Carolyn Brown: AI tools can tell the story in more versatile ways than the limited relational data reports we are used to. Much of this is exploratory and needs targeted data analytics: often you don’t know the question you want to ask until you see various “cuts” of the data and the story starts to unfold. Tools will reveal their sources, and sources must be validated wherever bias could creep in; where sources cannot be validated, validity is not guaranteed. Within organisations, building out stories can start from small nuclei. Know your sources, or add caveats.
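The exploratory “cuts” Carolyn describes can be sketched with pandas: the same data tells a different story depending on how it is grouped, and a simple representation check can flag under-represented segments before any storytelling begins. The dataset and the flagging threshold here are hypothetical, chosen only to show the idea.

```python
import pandas as pd

# Hypothetical engagement data -- invented for illustration.
df = pd.DataFrame({
    "region":    ["North", "North", "South", "South", "South"],
    "channel":   ["email", "web",   "email", "web",   "web"],
    "responses": [40, 10, 5, 30, 15],
})

# One cut: responses by region -- a story about geography.
by_region = df.groupby("region")["responses"].sum()

# Another cut: by channel -- a different story from the same data.
by_channel = df.groupby("channel")["responses"].sum()

# A crude representation check: flag any group whose share falls well
# below an even split, as a prompt to validate the source or add caveats.
share = by_region / by_region.sum()
under_represented = share[share < 0.5 / len(share)].index.tolist()

print(by_region.to_dict())   # {'North': 50, 'South': 50}
print(by_channel.to_dict())  # {'email': 45, 'web': 55}
print(under_represented)     # []
```

This is not a bias audit, just a reminder that each grouping is a choice; the caveats belong next to whichever cut ends up in the final story.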

Practical applications

Q. How do you ensure AI initiatives are aligned with the values and mission of a not-for-profit organisation?

Carolyn Brown: This generally boils down to the ethical use of AI, balanced against the need for efficiencies that make the most of income for the benefit of the organisation’s “customers”. A few recommended steps: create a Charter for AI use that expresses the NFP’s intentions and responsibilities, provide awareness training and links to learning tools, and get staff to sign up to ethical and responsible use before they are able to use the tools.

Q. Do you have any specific recommendations for how to use ChatGPT/Copilot to support productivity across a range of roles (e.g. communications, marketing, administration)?

Carolyn Brown: The best way to approach this is to show people what the tools can do and get them to offer their own use cases (get representatives from each of these lines of business). There are many possibilities, but being specific to the end-user context makes the tool more relevant.