by Michael D. Watkins Published 22 November 2023 in Leadership • 9 min read
Iconoclast: n. A person who attacks or criticizes cherished beliefs or institutions.
This ongoing column is intended to challenge the status quo. Not always giving answers, not always right, but smartly challenging our own assumptions, actions, organizations, and institutions.
The furor around OpenAI CEO Sam Altman’s firing and eventual reinstatement has brought into sharp focus the potential risks of allowing the leaders of the firms developing AI to make critical decisions about the future of a technology that may pose an existential threat to humanity.
Notably, the board that fired Altman was made up of people who had no financial stake in the company’s success and were dedicated to balancing the potential benefits of AI against the risks.*
Altman is a controversial figure. Given his leading role in making decisions about the future of AI, his goals and choices have a significant impact, and so his capacity for good judgment, and that of other leaders of AI companies, matters a great deal.
So, I was particularly concerned when I read an account that Altman is a survivalist. He is quoted in The New Yorker as saying to the founders of one of his companies, “After a Dutch lab modified the H5N1 bird-flu virus five years ago, making it super contagious, the chance of a lethal synthetic virus being released in the next 20 years became, well, non-zero. The other most popular scenario would be that AI attacks us and nations fight with nukes over scarce resources… I try not to think about it too much… but I have guns, gold, potassium iodide, antibiotics, batteries, water, gas masks from the Israeli Defense Force, and a big patch of land in Big Sur I can fly to.”
If true, this raises questions for me about Altman’s fitness to make hugely consequential decisions about the future of AI. Interestingly, OpenAI’s interim CEO Emmett Shear appears to have quite a different view of the right tradeoff between safety and speed, posting on X in September, “I specifically say I’m in favor of slowing down, which is sort of like pausing except it’s slowing down.”
Think about the future of AI as putting a potential weapon of mass destruction into the hands of people like Sam Altman. The consequences of the decisions he and others will make are orders of magnitude greater than the usual decisions made by tech CEOs. This should make us all very concerned about whether these people have the psychological fitness and wisdom to do the right things. This is especially true given the immense financial returns and power that can be accrued by pushing forward regardless of the risks.
A recent summary of research on “dark” personality traits in CEOs provides important perspectives and raises serious concerns about who is making critical decisions about AI. These traits, which include excessive risk-taking, narcissism, Machiavellianism, and abusiveness, are described in the research as “dark” not because they are “evil” in a moral sense, but because of their negative impact on (often many) others through overconfident risk-taking, pervasive self-serving manipulation, and a lack of empathy for the effects of their decisions.
These qualities have been studied extensively over the past two decades to understand their impact on organizational outcomes. This research has established that dark personality traits can have both positive and negative effects on company performance. The specific traits that have been explored include:
Overconfident risk-taking – This trait can drive a CEO to push their company to achieve high levels of performance, but it can also lead to unrealistic expectations, overcommitment of resources, and sometimes a disregard for the welfare of employees or other stakeholders. CEO hubris can fuel a stubborn pursuit of visionary goals and, with it, the potential to achieve significant breakthroughs. But it can also blind a leader to the realities of the market and the organization’s capabilities, and result in failure.
Narcissism – CEOs with narcissistic traits may be more inclined to take bold actions, make big bets on innovative projects, and set visionary goals for their companies. Their self-confidence can be infectious, potentially inspiring employees and attracting investors. But these traits can also lead to risky decision-making, resistance to feedback, and volatile company management.
Machiavellianism – CEOs with Machiavellian traits may be adept at navigating corporate politics and outmaneuvering competitors. They might excel in negotiations and strategic partnerships because they can manipulate situations to their advantage. On the downside, this could create a toxic work environment and lead to unethical business practices.
Abusiveness – This describes leaders who regularly engage in behaviors intended to dominate, belittle, or otherwise cause distress to subordinates or colleagues.
In the context of “normal” technology firms, where innovation and rapid growth are highly valued, these qualities can result in positive business outcomes. A CEO’s willingness to take big risks, for instance, may help persuade stakeholders to invest in an unproven technology. Narcissistic CEOs may excel in projecting a strong image of the company, attracting talent and investment. Machiavellian CEOs may know when and how to “bend the rules” to achieve their goals. Abusiveness is tolerated because of the magnitude of the business results these CEOs achieve.
It is essential to recognize that while these traits can contribute to positive outcomes, they also carry substantial risks. The success of CEOs with these traits in technology firms may depend on their ability to balance their more extreme tendencies with sound business judgment and the input of their management teams and advisors. The research highlights the double-edged sword of dark personality traits in leadership and suggests their impact is nuanced and dependent on the broader context within which they operate.
The potential benefits of a dark-personality CEO in a tech firm include superior outcomes for the firm and financial returns for shareholders. The downsides include failures and lost investments.
But the companies developing AI are not “normal.” Again, thinking about AI as a potential weapon of mass destruction can be instructive when considering what kinds of personalities we want to make decisions about their development and deployment.
When it comes to nuclear, chemical, and biological weapons, this is not a hypothetical question. The US military has done a great deal of work to develop criteria and assessments to determine who should, and should not, have the keys to trigger Armageddon.
Consider the psychological fitness test below. Based solely on the first three sections, this assessment would eliminate CEOs with dark personalities from making decisions with potentially cataclysmic impacts. To put it another way, it may be less of an issue for dark CEOs to lead normal tech firms, but it is a huge problem to trust them with decisions concerning artificial intelligence.
How can we prevent the potentially catastrophic consequences of AI being used in the wrong way by the wrong people? In a saner world, decisions about how far, and how fast, to go with AI would be subject to oversight by lighter and wiser personalities. With so much at stake, leaders of these firms who are found to be making dark decisions should be barred from working at AI companies.
Note: This is not an actual instrument but illustrates the criteria used. In addition, it is set up as a self-assessment, while in practice, the people who access and control weapons of mass destruction are assessed by trained psychologists.
* Altman has been rehired as CEO of OpenAI with a new board. The best account so far of what led to his ousting is here; it is worth reading, as it provides more background on the safety-versus-speed debate within the company and on Altman’s apparent efforts to press for speed and stifle criticism.
Professor of Leadership and Organizational Change at IMD
Michael D Watkins is Professor of Leadership and Organizational Change at IMD, and author of The First 90 Days, Master Your Next Move, Predictable Surprises, and 12 other books on leadership and negotiation. His book The Six Disciplines of Strategic Thinking explores how executives can learn to think strategically and lead their organizations into the future. A Thinkers50-ranked management influencer and recognized expert in his field, his work features in HBR Guides and HBR’s 10 Must Reads on leadership, teams, strategic initiatives, and new managers. Over the past 20 years, he has used his First 90 Days® methodology to help leaders make successful transitions, both in his teaching at IMD, INSEAD, and Harvard Business School, where he gained his PhD in decision sciences, and through his private consultancy practice, Genesis Advisers. At IMD, he directs the First 90 Days open program for leaders taking on challenging new roles and co-directs the Transition to Business Leadership (TBL) executive program for future enterprise leaders, as well as the Program for Executive Development.