Harnessing AI’s power, and managing the responsibility, with Diligent’s Nonie Dalton
Diligent and BoardEffect have written extensively about artificial intelligence (AI) over the years, from how nonprofit organisations can benefit, to expert tips for AI implementation, to guidance on AI governance. Nonie Dalton, VP of product management for BoardEffect and Diligent Community, carries the conversation into the new year.
Dalton oversees the Diligent solutions that are designed to meet the needs of nonprofits and public service organisations, empowering mission-driven leaders to streamline processes, increase transparency and engage with their communities.
“When we were at the fall Diligent Users Conference, people were saying that they were thinking about using AI or that they were using it already,” Dalton says. Indeed, this technology has arrived in an age when “nobody’s budgets are increasing, nobody’s got more time, they’re not getting huge staff increases or anything like that.”
With its promises of increased efficiency and speed, AI opens doors of opportunity for resource-limited, time-crunched boards and leaders. But behind those doors, risks lurk that may have damaging consequences for an organisation’s mission, reputation and the vulnerable populations it serves.
In short, with great power comes great responsibility, and mission-driven organisations today need to take note of both.
Dalton works at the intersection of AI’s potential and its risks. She sat down with us to share what her team is working on and thinking about in terms of AI development and governance in 2024 and beyond. Highlights from our conversation follow.
Streamlining board administration and decision-making
By automating everyday repetitive tasks such as meeting scheduling, report generation and data entry, AI can save mission-driven organisations and volunteer boards invaluable amounts of time, giving them the space to focus on the human element, such as building relationships and communities, and solving problems.
How do these capabilities play out in products like BoardEffect and Community?
“We look at how we can make things faster and more streamlined for both the admins and clerks and leadership teams putting board materials together and the board members consuming the information,” Dalton explained.
Diligent Community, for example, includes a feature that summarises agenda items “because there are often 40 pages of attachments that are associated to these items.”
Looking ahead, Dalton continued, “How do we make it efficient for board members to digest large board packages? We’re looking at an AI assistant for board members that really helps them distill the key points they need to be aware of.”
Turning data into insight
Another pain point — and AI opportunity — involves the troves of data that volunteer boards deal with, often across different sources.
“How does one person reasonably get good insight across all of it so they can look at it holistically?” Dalton asked. “We have initiatives in Diligent Community linking strategic goals within organisations and across all the stuff related to them. What work links to what, and which risks are being created in the meantime? This is so board members can be prepared—asking the right questions and probing to make sure they pick up on the right bits of information.”
The Diligent Community team is also looking at using AI to search the evolving policies that increasingly govern public education at the state and local levels. “What are other school districts in my state proposing? What are they doing in terms of certain policies? This work is about getting insights you’re able to share, going beyond just a semantic search, where you still have to review a bunch of documents, to being able to get summarised points.”
She sees regulatory change management, “being able to know what’s changed, what’s impactful,” as a big area of potential for AI as well.
“We definitely look for opportunities where we can deliver value for our customers,” Dalton declared. “We don’t want to do AI just for the sake of doing AI.”
Being prepared for anything and everything
Some other areas of exploration include:
- Deploying prompt-powered robots to make existing workflows and tools more efficient
- Using predictive risk intelligence to identify emerging threats and vulnerabilities
- Analysing board engagement, from attendance to survey participation to involvement with the meeting materials themselves. “What parts of the board package are people looking at?”
Board members also want to know where they’re going to get the most questions, whether in a meeting or from key stakeholders.
“They might prep for a whole bunch of things, but it’s always something else that will totally sideswipe the conversation into a different direction. So, knowing where they should spend time is always important.”
This focus on preparation needs to extend to AI governance itself, Dalton said.
“I think the most important thing to remember from a governance perspective is that you have to have a policy and a stance around AI within your organisation.
“If you haven’t said anything like, ‘Is it okay for people to use ChatGPT to craft a marketing email?’ now is the time to do so. You need to have a stance on what’s permissible within your organisation and what’s not.”
It’s also critical to keep a human in the loop when deciding when, where and how to implement AI tools.
“We have legal, security, product, engineering and our executive leadership look at all the different projects to make sure that they are okay to go forward with and that we’re not going to cause any risks,” Dalton said.
“If you don’t have a proper review and a proper risk assessment on these things, and you’re allowing various projects to go forward,” she continued, “or you implement a solution that could be detrimental, at the end of the day, the board is accountable.”
Ready to put the power of AI, and digital governance, to work in your mission-driven organisation? Schedule a BoardEffect demo today.