How to Build an AI-Ready Culture

Generative AI redefines what’s possible, but only if your culture is ready to evolve with it.

Generative AI can be applied almost anywhere that information is consumed or created, so its potential to improve business productivity seems vast. Practically every organization is considering how best to use it, whether that’s for speeding up routine tasks like drafting emails and summarizing reports or enhancing more complex, industry-specific workflows.

Unlike many enterprise technologies, however, adopting generative AI is not just about implementation; it’s about culture.

Generative AI’s capabilities are non-deterministic and forever shifting, so it rewards experimentation and knowledge sharing. It doesn’t always give the same answer, or the right one, requiring new ways of thinking about the role of computers at work. Its success is a product of its accessibility—it runs on the cloud and operates via natural language—but that same ease of use creates a high risk of shadow IT. And if all that weren’t enough, there are the inevitable fears that the technology could steal jobs, breeding resistance to using it in the first place—or at least to admitting to usage.

So how do you create a culture that will allow your team to thrive in the era of generative AI? A recent roundtable event hosted in London by WIRED Consulting and the global STEM workforce consultancy SThree explored this important question—and revealed its urgency. “STEM professionals are losing up to six hours each week due to insufficient AI support,” says Rakesh Patel, SThree’s managing director for the UK, France, and Belgium. A report by SThree based on a survey of 2,500 STEM professionals globally found that almost half believed their company lagged behind competitors on AI. Patel believes that “culture is the key ingredient people most often forget when they’re trying to put their business at the front of the pack.”

Here’s what we learned from the roundtable about how to get it right…

Set the ethos at the top

A good AI culture should incentivize adoption and experimentation while mitigating risks. It should have a clear sense of aims and intentions, promote transparency and encourage collaboration and openness.

Culture starts at the top, which means the CEO and the board need to embrace and lead the change. “Otherwise, what ends up happening is you have pockets of change remaining at pilot status and not moving across the whole organization,” says Sana Khareghani, professor of practice in AI at King’s College London and a trustee at the Institute for the Future of Work.

While leaders don’t need to be AI experts, setting this culture requires some knowledge of the technology. The problem is, this knowledge is often lacking, and it can be hard for C-suite figures to admit as much. “They’re too senior to ask questions, and even the people who report into them are too senior to ask questions,” says Khareghani. The result can be layers of decision-making based on little understanding. Indeed, 48 percent of respondents to SThree’s survey believe their organization’s leadership fails to grasp the potential productivity benefits of AI.

The solution? “It’s about creating a safe space for the most senior people to ask what they would deem dumb questions,” says Khareghani. Bring in an outside expert if necessary to help answer these, drawing on industry-relevant examples.

Address concerns head on

With leadership on board, you need to convince the rest of your team. While STEM professionals often have a strong desire to use AI, don’t be surprised if the broader pool of employees isn’t immediately enamored of the idea of integrating generative AI into their roles. You may see it as a productivity boon; they may see it as a threat to their jobs.

To address this, be upfront about what you’re hoping to achieve with the tech. If you’re not planning on making redundancies, but instead upskilling your staff so they can spend more time on the tasks AI can’t do, then tell them.

You should also address fears around wider societal impacts. Tessa Clarke, cofounder and CEO of local sharing app Olio, says she was all-in on generative AI, but noticed a disconnect with her team. “One of the most important things that I did to really help get everyone on the journey was a very deeply personal presentation at a company call,” she says. She talked through her assessment of the potential risks and opportunities, and why she chose to embrace AI. When people questioned potential impacts on the company’s environmental mission, Clarke did the calculations on emissions and water use so they could feel comfortable using the tools. As she puts it: “We had to meet people where they were.”

Embrace the power of play

An initial “play” period will help familiarize people with the technology. You could set a task such as composing a song with an AI music generator, or making a skit with an AI video tool. But play is also vital for innovation: the people best placed to figure out where AI can offer the most value in your business specifically are the people who work for you.

The problem is that creative freedom isn’t always an intuitive part of an organization’s culture. Company policies, especially around information security, can put a dampener on innovation.

Creating a safe sandbox for exploration lets people take some initial steps without risk. So too do policies that clearly define what is and isn’t acceptable, and indicate which tools are approved. Crucially, don’t let fear of failure thwart experimentation. “The ability to fail has to be part of the AI adoption journey,” says Khareghani. “If you’re putting everything on the success of a single pilot, I would worry.”

Break down the walls between tech teams and non-tech teams

While generative AI can be used by almost anyone, you probably still want IT to take the lead on designing policies, approving tools, and training staff. They will likely also be the ones taking on more technical—and impactful—tasks, such as designing bespoke AI tools based on generative AI models, or fine-tuning models for domain-specific solutions. However, it’s usually non-IT functions that best understand the problems that actually need solving. Fostering collaboration between the two is therefore critical.

One approach is to embed IT personnel inside different departments. Michelle Ibbotson, head of architecture at XPO Logistics, says that “we have seen quite a bit of success embedding our IT teams, especially in our back-office functions,” noting that she spent useful time herself working within the finance team.

This kind of collaboration also promotes inclusivity and makes sure everyone is brought along for the journey, argues Pablo Bawdekar, an independent chief architect previously with Aspen and QBE Insurance. “If you’ve got this team of geniuses, and they’re seen as being geniuses—the spotlight’s on them, and everyone loves them—then I think that actually builds a silo and kills, rather than enhances, the true culture of the organization.”

Integrating tech across functions may be uncomfortable at first, like a collision of two different languages, says Khareghani. But the general-purpose nature of generative AI requires it. “Traditionally, tech has been separated from the rest of the business, and usually it’s just about getting your kit to work,” she says. “But now, more and more, these technologies are part of every business line, and the effects of their adoption should be part of the organizational strategy from the outset.”

Now read SThree’s report “How the STEM world works: Navigating the new era of AI and trust”