Make no mistake about it. The unauthorized release of the private data of millions of Facebook users is not only the worst crisis in the company’s checkered history; it is even more damaging for the tech industry as a whole. At the root of the crisis is a highly disturbing pattern. Unless this pattern is recognized for what it is, and ultimately curbed, we’ll just lurch from one crisis to another.
Indeed, in his testimony before Congress in April 2018, Facebook founder and CEO Mark Zuckerberg said that the crisis was due to his and the company’s failure to see the “big picture.” Although I deal with it throughout, this book is not primarily about Facebook. It’s about a larger pattern that the crisis reveals about the tech industry in general.
Five components are key. Each is critical in its own right, but acting together, they spell disaster. They’re guaranteed to bring an organization and its leaders to their demise. Although I discuss each in turn, in reality they operate in tandem. I also explore them in depth throughout:
1. Too much early success is actually detrimental to long-term survival and prosperity. It makes one complacent and thereby blind to the fact that serious problems lurking within one’s basic business model need to be addressed sooner rather than later.
2. The fact that one has weathered early crises also blinds one to the need to start building a serious program in crisis management in order to be prepared for major crises later on that can’t be easily dismissed.
3. The smug assumption that, compared to technology, management is easy, if not trivial, prevents one from taking management seriously. I explore this later in terms of the phenomenon known as Splitting, which is responsible for dividing the world sharply into “good guys” versus “bad guys.” It’s the basis for demonizing those we hold in contempt.
4. The best crisis-prepared companies take immediate responsibility for their crises. They don’t issue meaningless apologies that only make the initial crises worse.
5. And, finally, The Technological Mindset blinds its proponents to the fact that all technologies are abused and misused in ways not envisioned by their creators. Worse, it seriously hampers one from considering that all technologies come with serious downsides, and therefore from taking appropriate preventative actions to mitigate their worst effects.
The Problems with Too Much Early Success
If a company is too successful from the start, e.g., its technology performs as intended and thus fulfills and even exceeds expectations, it breeds the mistaken belief that it will last forever. The bigger the initial success, and the longer it lasts, the greater the feeling that one has found “the golden goose that lays the legendary golden egg.” In short, the founder/company have found the magic formula for success that should not be tampered with in any way.
In the case of Facebook, early success allowed it to coast longer than it should have on a faulty business model. In a recent PBS Newshour interview, Sheryl Sandberg, Facebook’s Chief Operating Officer, said as much. From the very beginning, Facebook collected personal data from its users, which it then sold to third parties for profit without any serious repercussions. Indeed, it hoodwinked customers into parting with their personal data with the overly simplistic slogan and promise of “being connected with the world.”
The moral: beware of early success for it blinds one to future problems. Indeed, why think about problems when one is such a great success from the beginning?
The eminent professor and business consultant Peter Drucker captured it best of all when he called the phenomenon “the failure of success.” Nothing fails more than quick and easy success.
The Problem with Weathering One’s Initial Problems
Nonetheless, along with its early success, problems soon appeared. Cyberbullying was among the first of Facebook’s problems to garner serious attention. So was the growing evidence that the more young people used it, the more depressed, isolated, lonely, and insecure they felt. In brief, they couldn’t live up to all of the idealized portraits of others with which they were constantly bombarded.
Paradoxically, because Facebook weathered its initial early crises, it wrongly took this as a sign that it didn’t need to prepare for future ones that would not be so easily handled. The situation was made worse by the fact that it weathered even more serious crises, such as its being used as a platform by foreign governments to spread dis- and misinformation, and worse, hate speech. This only strengthened the feeling that it didn’t need a serious program in crisis management that would have helped prepare it for future, much more severe and devastating crises. The fact that it was used—more accurately, abused and misused—in ways that were not intended should have alerted it to the fact that it was a sitting duck for more serious abuses lurking in its initial business model.
Thus, Facebook and other tech companies were not prepared for the havoc that resulted when a truly serious crisis that couldn’t be ignored finally occurred. The fact that the personal information of over 50 million Facebook users—a figure later revised upward to 87 million—was used without their knowledge and permission resulted in a backlash not only against the company, but against tech companies in general. The backlash was in fact so great that it caused a significant drop in the value of tech stocks, affecting not only the market as a whole, but also many millions of people who were not users of Facebook.
Management Is Easy!
In addition, other less visible factors were also lurking in Facebook’s largely taken-for-granted belief system. Unfortunately, it’s an integral part of the tech community as well. It’s nothing less than the smug assumption that compared to technology, management is easy, if not trivial.
When I went to engineering school some years ago (I have a BS, MS, and PhD in engineering, all from UC Berkeley), it was common to divide the world into “hard” versus “soft” subjects. Science and engineering were “hard,” not just because they were difficult to learn and master, but because fundamentally they involved “objective, verifiable knowledge about the physical world, i.e., ‘hard indisputable facts.’” In contrast, because they were riddled through and through with “subjective unverifiable judgments and opinions,” the humanities and social sciences were irrefutably “soft.” It followed that what they had to teach was mostly obvious and trite so one didn’t need to waste one’s time studying them.
Chief among the “soft subjects” was management. After all, how difficult was it to manage an organization? Once the objectives were stated—“make x amount of dollars by time y”—then people either got on board or you got others who would. This of course assumed that one’s objectives were clearly known from the start and not changing and evolving, which they constantly are.
Sadly, far too many engineers and scientists are naïve at best when it comes to the social world. It’s never just about making x dollars by time y, but also about not causing irreparable harm to the most vulnerable members of society, and of course the environment. This view also ignores the fact that one of the most difficult and important tasks of management is not only getting the buy-in of employees, but retaining them.
Although we’ve made some progress, prevailing attitudes are still largely the same. Management is “easy and soft,” and technology is “hard in every sense and all important.”
No wonder Facebook and the other tech companies don’t take crisis management seriously and plan ahead for the worst. First of all, they really don’t understand what crisis management is. It’s not about being reactive and issuing platitudinous apologies after the fact. Truly effective crisis management is proactive. It not only consists of “thinking the unthinkable,” but being prepared for it so that one knows what steps to take to limit damage and assume rightful responsibility when the worst occurs.
What the Best Crisis-Prepared Organizations Do and What the Unprepared Fail to Do
The best crisis-prepared organizations not only admit their mistakes immediately, but indicate clearly that they are prepared to take decisive steps to lower the chances of future mishaps. More often than not, this involves deep changes in a company’s business model, leadership, structure, and culture. Without this, all the platitudinous apologies in the world only make the original crisis worse. For this and other reasons that I explore, I don’t believe that Mark Zuckerberg is the right person to head a major organization. He obviously has the requisite skills to invent an important technology, but not the maturity and social skills that are necessary to manage a complex organization. Some have called for even more drastic steps, arguing that it’s time for Facebook to go out of business entirely and be superseded by a new kind of social media company that is socially responsible from day one. 1
Second, true crisis management is not about preparing for a single crisis, but about being prepared for a whole range—a system—of crises. As one of the field’s principal founders, having worked in it for over 36 years, my colleagues and I have found repeatedly that no crisis is ever a single crisis. Instead, every crisis typically sets off a chain reaction of other crises.
Thus, Facebook’s breach of the personal data of millions of users by Cambridge Analytica was not only the result of its flawed business model of selling data to third parties for profit; it also triggered a cascade of other crises. First, the initial crisis almost irreparably damaged Facebook’s brand. How could one trust Facebook ever again? And trust is the cornerstone of any business. Without it, no business can survive, let alone prosper.
If this weren’t bad enough, Facebook sent a shock wave throughout the entire tech world. Clear calls for government oversight and regulation to protect against the downsides of technology, including privacy breaches, were sounded as never before. This rattled the stock market, causing not only the value of Facebook’s stock to drop substa...