Build a Diverse Team to Solve the AI Riddle
‘Hire English majors’ is common advice, but the practicalities of hiring, training, and integrating diverse team members deserve just as much attention.
The term artificial intelligence describes algorithms that run on powerful computers to perform complex tasks, and computer scientists are indeed the most skilled at writing such algorithms. Yet systems designed by narrowly focused technical experts, such as computer scientists, engineers, and mathematicians, can produce disappointing results, because each expert sees every problem through the lens of his or her own discipline. Mathematicians, for instance, tend to reach for statistics no matter what the problem is.
While it’s natural to assume that computer scientists play the lead role in AI development, not every problem lends itself to such obvious solutions. Systems that actually get the job done are in fact built by better-rounded teams. A diverse approach can maximize a project’s chance of success.
When our team at Principal set out to create an AI-based decision support tool for financial analysts, we found that the essential ingredient was diversity — in our case, having enough English majors on the team.
It’s All About Text
Our goal was to craft an AI system that sifts through financial reports and news bulletins and flags the most urgent items, so that the critical information requiring human attention rises to the top. This digital triage would sharpen the analysts’ awareness of the most relevant market conditions, giving their judgments a strong, factual foundation and making them more effective.
Converting the written word into the mathematical forms that a machine can process is no simple task. The very act of breaking down words can destroy meaning, which makes natural language processing tricky. The most commonly used AI software packages begin their analysis by jettisoning stop words such as the, was, and for, under the theory that these most common words in our language tend to add little information to a sentence. Then the process of lemmatization reduces words to their base forms, stripping away tense, mood, gender, and so on: studying and studied become study; went becomes go.
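As an illustration, here is a minimal sketch of this kind of preprocessing. It uses the open-source NLTK library as a stand-in for whichever package a team might adopt, and the helper function and sample sentence are purely illustrative rather than drawn from our system.

# A minimal sketch of stop-word removal and lemmatization, using the
# open-source NLTK library as one common (assumed) choice of package.
import nltk
from nltk.corpus import stopwords
from nltk.stem import WordNetLemmatizer

# One-time downloads of the word lists the library relies on.
nltk.download("stopwords", quiet=True)
nltk.download("wordnet", quiet=True)
nltk.download("omw-1.4", quiet=True)

def compact(sentence):
    """Drop stop words, then reduce the remaining words to their base forms."""
    stop_words = set(stopwords.words("english"))
    lemmatizer = WordNetLemmatizer()
    tokens = [t for t in sentence.lower().split() if t.isalpha()]
    return [
        lemmatizer.lemmatize(tok, pos="v")  # "studying" -> "study", "went" -> "go"
        for tok in tokens
        if tok not in stop_words
    ]

# Illustrative sentence, not taken from our data:
print(compact("The analyst was studying the reports and went searching for context"))
# -> ['analyst', 'study', 'report', 'go', 'search', 'context']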
The machine has an easier time processing this compacted expression with statistics, but largely because what remains is devoid of nuance and, ultimately, of meaning. That trade-off is why an AI textual analysis project benefits from a more diverse development team with linguists on board: people who know there has to be a better way of processing language.
Assembling a Truly Diverse Team
How does one build a team to tackle textual analytics? Parsing meaning is a fundamentally human exercise. It draws on the STEM disciplines (science, technology, engineering, and math), including statistics, yes, but also on linguistics, cognitive psychology, sociology, and ethnography. It involves multiple disciplines in collaboration, rather than one or two disciplines in isolation.
An effective diverse team needs experts from each of the non-STEM disciplines to contribute to the understanding of text and how we as humans use words to communicate. (For the sake of convenience, we’ll use “English majors” as shorthand for all of the non-STEM disciplines.) These domain experts will notice the subtle distinctions between U.S. English and British English and how grammatical rules change depending on which stylebook is used; they’ll discern subtleties of rhythm and word choice along with echoes of a poet’s phrasing or a novelist’s style. In short, these English majors contribute a valuable perspective that engineers may not naturally share.
Setting Common Standards for Effective Collaboration
Building an advanced AI system takes more deliberate effort than just hiring top-tier computer science talent and setting them off to do their thing. And it isn’t enough to toss a few liberal arts grads into the mix and expect sudden results.
All team members need to speak the same language. To work methodically together toward the larger goal, engineers and hires from the “soft” sciences alike must share a frame of reference. That means training the nonengineers so that their contributions demonstrate the same rigor required in the scientific disciplines. Bringing the English majors up to the standards of engineers is an intensive process but an essential one, teaching them the language of engineering and providing the structure needed for success.
English majors were not, of course, expected to use all of this training to do the job of data scientists; rather, they were expected to work within a shared framework with their engineer colleagues. Those engineers were already well versed, in theory, in the subjects covered in project management, analytics, and systems engineering curricula. But managers were surprised to discover that, even within the same discipline, what’s taught varies quite a bit from school to school. So the engineers went through the same battery of training as the English majors.
At Principal, every team member obtained Project Management Professional certification, which involved intensive study of the process of initiating, executing, and monitoring complex projects, plus at least six months’ experience leading a project. Next, to bring nonengineers up to speed on the process of converting data into insight (our project’s primary goal), all team members completed the Certified Analytics Professional program offered by INFORMS. Finally, we brought in top AI experts to conduct systems engineering training to standardize the team’s approach to completing the series of highly complex projects needed to build the AI tool.
Applying standardized training requirements across the board ensures that everyone stands on an equal footing. Nobody gets special treatment, and there’s no playing favorites. Everyone understands that the English majors are as critical to the language problem as the engineers are to the STEM problem. Success requires both.
Of course, it was a lot harder for our English majors to tackle these foreign topics, which is why we never left them to fend for themselves as they engaged in these increasingly difficult tasks. We always paired them up with engineers who walked them through the most challenging concepts and guided them through the process.
Ours is a team-based effort — and it’s not for everyone. But we learned in the hiring process how to spot driven team players.
Spot the Fighters
Prospective candidates were told early on what they were getting into. A thorough screening process involving a series of three interviews made clear the level of effort required. The winning candidates were most excited about the opportunity to branch out and not just learn but master skills in a new field.
During interviews, it’s common to encounter a job prospect eager to rattle off a list of skills, certifications, and abilities, but you want the fighter: the candidate who may not have the exact skill set being sought but who shows the willingness to master it. Because AI work is full of ambiguity, you want someone who has demonstrated an ability to adapt to it; look for backgrounds that show continuous learning and curiosity. Executives may want to caution their HR departments against automatically discarding applications from liberal arts grads for even highly technical positions.
One red flag to avoid is the highly qualified individual who flies solo, saying “I did this” rather than “We did this.” This kind of candidate is not helpful in a team-based project.
Build a Better AI Future
All of our team’s hard work paid off with the development of a functional AI system that offers a glimpse of what future AI might look like. As AI algorithms become more powerful, the need for narrow specialization and technical expertise in the machines’ operators will diminish.
Augmented intelligence systems bridge connections between AI and the humans using them, absorbing the available data and scientific knowledge on a given subject and providing recommendations for the human operator. In medicine, for example, these systems have already proved as good as, or better than, human doctors in diagnosing certain forms of disease. While doctors won’t be replaced with iPhone apps anytime soon, these AI systems will enhance the judgment of human doctors, picking up on rare or subtle symptoms that may have been mentioned only in the footnotes of an obscure medical journal. An AI system that understands text can read everything, and it never forgets.
As we transition toward an intelligent enterprise that uses augmented intelligence to boost the abilities of skilled human operators, humans’ raw technical knowledge will diminish in importance. Judgment will prove critical, and a well-rounded education essential.
In the process of building a diverse AI team, we learned just how important it is to find well-rounded candidates. While “best of the best” teams composed of members with the same high-level experience, background, and education can run into roadblocks, we saw firsthand how a diverse team succeeds. Smart companies will diversify their talent pools to get ahead of the competition.