CEO Update LIVE: Embracing and controlling technology

Panelists at emerging tech forum say AI requires new policies to govern its use, but staff also need to be encouraged to experiment.

Fostering a culture of innovation and harnessing artificial intelligence (AI) while managing its risks were topics at the recent CEO Update LIVE: Emerging Technology forum in Washington, D.C. 

Among the biggest issues with AI is the need to create clear policies regarding its use by association staff, volunteers, conference presenters and contractors — and update the policies frequently as the technology rapidly changes, experts said. 

“It’s absolutely critical to have an employee usage policy that says what you can and can’t do and what you can and can’t put in an AI tool,” said panelist Jeff Tenenbaum, managing partner at Tenenbaum Law Group. 

“We’ve been asked to review some of these policies,” he said. “They’re very much evolving, and they’re all over the place.” 

“And it’s not just employee usage,” Tenenbaum said. “If you want to require that speakers or authors disclose if they’ve used generative AI in creating a PowerPoint presentation or an article, you need to put that into your speaker and author agreements.”  

The same goes for any agreements with researchers, independent contractors and even board members, he said.  

Tenenbaum was part of a panel moderated by Information Technology Industry Council CEO Jason Oxman on the human, legal and ethical implications of AI. The other panelists were Tracye Weeks, managing director of strategy and advisory at Nonprofit HR, and Jeff De Cagna, executive advisor at Foresight First. It was one of two panels at the event, held at the headquarters of the National Association of Home Builders on Dec. 13, 2023. Keynote speaker Noelle Russell, global AI solutions lead at Accenture, kicked things off.  

Getting the board on board 

Another critical consideration spanning law, ethics and people is involving board members in the creation of organizational AI policies, said De Cagna.

“Boards as a matter of fiduciary responsibility need to be involved in this conversation from the very beginning because they are the ones who will have legal exposure when problems occur,” De Cagna said. “We need policies, but the policies will only go so far if the board itself does not understand what’s going into the policy development process and what the outcomes are going to be within the organization.” 

Weeks said associations need to focus on including existing staff in AI solutions, rather than just replacing humans with technology. She noted that HR executives are still learning how to use AI themselves. 

“It’s less about replacement and more about enhancement of positions. It’s about making us more innovative in organizations, reskilling and upskilling positions. But then, what new positions does that create?” Weeks said. “So, it’s twofold. We’re trying to embrace AI and at the same time, we’re trying to remain skeptical enough to make sure that there are some acceptable use parameters in the organization.” 

One key will be making sure employees communicate about what generative AI tools they are using. 

“It will be impossible for you to track who’s using Claude, who’s using Bard, who’s using ChatGPT,” Weeks said, referring to three AI chatbots. “If you’re encouraging innovation and creativity in your organization, you have to make sure you have a culture where people self-report.” 

The risks of AI use and its rapidly changing nature will put a premium on ensuring compliance with policies, Tenenbaum said. Those risks include copyright infringement, data security lapses and the use of incorrect information, including on certification exams. An exam error that results in personal harm could lead to a lawsuit, he said.  

“Just having a policy is never enough,” he said. “You need to write it, you need to update it, you need to distribute it, you need to train on it and then you need to enforce it.”  

Tapping staff skills 

The other panel discussed how to leverage emerging technologies in association management. The panelists were Tatia Davenport, CEO of the California Association of School Business Officials; Mark Dorsey, CEO of the Construction Specifications Institute; Guillermo Ortiz de Zárate, chief innovation and information officer at the National Council of Architectural Registration Boards; Chantal Almonord, chief information and engagement officer at ISPOR — The Professional Society for Health Economics and Outcomes Research; and Juan Sanchez, chief information officer at Inteleos. 

Brittany Carter, president and CEO of CEO Update, moderated. 

Panelists said the ability to respond to innovations starts with people and culture. 

“We’ve lived through the internet, mobile and now AI,” said Sanchez. “And if your organizations are still wondering why you’re not adopting or haven’t been able to squeeze all the value out of internet and mobile, and we’re still talking about data being bad, or people not knowing things, it’s your culture. It’s not the technology’s fault.  

“AI isn’t going to be any different,” he said. “So, look at your budgets. If you’re cutting upskilling and training budgets, stop. Start upskilling everyone. Everyone. Not just the chiefs, not just directors, but every single person in your organization, because that is ultimately the engine that’s going to adopt all this information, all of this technology, and be able to leverage it and make our organizations better. And they need to be better.” 

Ortiz de Zárate said many employees, not just the tech staff, can move technology forward. In addition to his role at NCARB, Ortiz de Zárate is president of Lineup, a volunteer management software company he and his staff created at NCARB. 

“It starts with being curious about the people in your organization,” he said. “Understanding who may have curiosity and the intention to do something with it and then giving them the space to try in a way that is intentional. The way we’ve been doing it is, we identify people that have those curiosities and then give them company time to try things with the hope that, after they’ve done some work, they can then share with the rest of us and teach. 

“We’ve been doing that for over 14 years, and it has resulted in a lot of the technologies that we have ended up adopting as an organization, and then at some point also abandoning. Because part of it is that it’s OK to try something that doesn’t work,” Ortiz de Zárate said. “It’s rewarding for them, and you discover new skills that they have.” 

A seat at the table for tech 

Almonord, of ISPOR, said technology leaders, whether chief information officers (CIOs) or directors of technology, need a close relationship with CEOs and a seat at the table with volunteer leaders. 

“It’s important to align board initiatives and strategies with tech capabilities,” she said. “It shouldn’t be an afterthought where you plan out this huge strategy, it’s going to take three to five years, you’re going to do all these things, but your infrastructure doesn’t support that, your data is not complete enough. Having the tech leader at the table can help decide, ‘Can we do it?’ or ‘What order should we do it in?’ and ‘What are the prerequisites that we need to put in place?’” 

“Seeing that the CIO or your tech leader is involved in the strategic decisions or contributing to the strategic decisions flows down,” Almonord said. “And it helps build that culture in which technology’s going to be the thread throughout the organization. No matter the initiative, the department, the business unit, there’s a need for technology and it’s an enterprise-wide thing that should be considered at the strategic level.”