Leading in the Age of Super-Transparency

Thanks to social media and an increasing flood of data, the capacity to generate causes and controversies almost instantly has become the new norm in today’s “super-transparent society.” Most business leaders have not yet come to grips with the new reality — and what it means for their organizations.

When Martha Payne, a 9-year-old student in Argyll, Scotland, started a blog in April 2012, she had no idea of the stir she would soon cause. The lunches her school offered offended her youthful sense of justice, and she saw no reason to keep her thoughts to herself. So she began blogging under the name “Veg” (short for “veritas ex gustu,” which means “truth from tasting”). With tech support from her dad, Martha photographed and rated the school lunches and posted her reviews to a blog she christened “NeverSeconds.”

Soon Martha was adding new material regularly. The small portions were an early concern. “I’d have enjoyed more than 1 croquet[te],” she wrote, in a post from the first month. “I’m a growing kid and I need to concentrate all afternoon and I can’t do it on one croquette. Do any of you think that you could?”

Readers were supportive. “My toddler eats more than that,” one observed. Other blog posts questioned the food’s nutritional value, using words like “pathetic,” “rubbish,” and “disgraceful.” When celebrity chef Jamie Oliver tweeted in support of Martha’s project, newspapers picked up the story. Riding a wave of publicity, NeverSeconds logged 2 million hits in its first six weeks. Martha donated to a nonprofit that provides free school meals in poor parts of the world.

But the initiative soon screeched to a halt. As Martha explained in a blog entry titled “Goodbye,” her head teacher had removed her from math class, escorted her to the office, and told her to stop taking photos of school lunches. In a separate entry, Martha’s father noted that Martha’s charity efforts, which had raised nearly £2,000 at the time, would end, and thanked the school for being supportive. The decision to shut down NeverSeconds, he explained, came from the local area council.

However, the story didn’t end there — a firestorm ensued. Within 24 hours, there were 2,416 new reader comments on NeverSeconds, most expressing outrage at the local council that had issued the directive. The local council’s website was bombarded with negative comments, and critics launched a petition drive to save the blog. NeverSeconds logged another 1 million hits. The Twitter hashtag #MyLunchforMartha trended, and a dedicated Facebook page amassed comments. Half a world away, Wired magazine published a story on its website headlined “9-Year-Old Who Changed School Lunches Silenced by Politicians.”

The local council quickly reversed its decision. The council explained that, rather than trying to stifle discussion, it had merely been trying to be sensitive to the feelings of cafeteria workers. Just as the clampdown had been roundly criticized, the news of the reversal was greeted with celebration. Martha appeared on television with Jamie Oliver, and her fundraising and campaign for better lunches resumed.

The pattern that underlies this story has become common. Every day, images or events with the potential to incite passions get captured digitally, get posted to the Web, and “go viral.” In Martha’s case, it started with her lunch photos, but it could just as easily have been an audio recording of a rude customer service rep on a phone call (as happened when a customer called Comcast in 2014) or a video capture of police seeming overly aggressive in arresting a man for alleged jaywalking (as happened in Austin, Texas, not too long ago).1 With social media, people share their experiences with friends or followers, who then share with more people. Within a short space of time, incidents can strike a nerve with a nascent virtual community. And some unsuspecting party — such as the council in the area where Martha Payne lives — must respond to a dicey new problem.

Evocative images and events have always propelled causes and controversies, but not always from such obscure, unexpected, or geographically remote sources — or with such speed. In the past, controversies brewed when people came together and, if they gained enough momentum, were transmitted by a limited number of media outlets that also served as gatekeepers. Today’s controversies, by contrast, spring to life in myriad, overlapping online communities and get distributed via networks of unaccountable independent agents sharing information in real time. The capacity to generate causes and controversies almost instantly is perhaps the most salient aspect of what we call the “super-transparent society,” which has rapidly become a new norm. Because this shift is so recent, most people, especially leaders of organizations, have not yet come to grips with how much the world has changed or with the benefits and risks of living and leading in an era of super-transparency.

Our research explores the causes of these changes, the nature of the new reality that results, and the implications for organizations and their leaders. (See “About the Research.”) In aggregate, the changes amount to a fundamental shift in what is commonly known and knowable that invalidates some assumptions and practices we’ve previously relied upon. Executives need to understand this shift and how it changes the rules with regard to competitive markets, their company’s relationships with customers, the broader political contexts in which they operate, and beyond.

From Data Puddles to Data Floods

From childhood, most of us know that puddles of water form on the ground after it rains. Consequently, we understand how puddles and the water they contain behave. We know, for example, that water can be moved between puddles, but that it does not move by itself. We can move water by creating a channel between puddles or by using a bucket or a cup to move water from one puddle to another. We can even splash water from one to another, intentionally or not. But until the puddle dries up, our most basic assumption about puddles of water echoes a law of physics: Water in a puddle tends to remain in that puddle, provided that no other action is taken to change it.

Information used to behave similarly. Historically, only the people in Martha’s hometown in Scotland paid attention to the local goings-on, and information tended to remain within “information puddles.” When information moved beyond a particular puddle, it was due to deliberate action: Some identifiable person or organization moved it. If you wanted to be certain that information didn’t move, you built a barrier that kept the information contained (within an organization, say). Over the years, leaders have invested in building firewalls to restrict the flow of information. To this day, prevailing assumptions about puddles inform our understanding of how information behaves.

What, exactly, has changed? First and foremost, the amount of information. The volume of new digital data created every year is increasing exponentially. Individuals are the source of most of this data; 75% of all digital data is now created by consumers, much of it via the handheld devices we carry around with us.2 Cisco Systems Inc., a technology company that knows a thing or two about data traffic, forecasts that by 2019 mobile data traffic will total 24.2 exabytes (an exabyte is equal to 1 billion gigabytes) per month, after expanding at a compound annual growth rate of 57%.3
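
To get a feel for how quickly a 57% compound annual growth rate piles up, consider a rough back-of-the-envelope calculation. The sketch below is illustrative only: the 2014 baseline of roughly 2.5 exabytes per month is an assumed starting point, not a figure reported in this article, while the 24.2-exabyte forecast and the 57% growth rate come from the Cisco report cited above.

```python
# Illustrative sketch of compound growth in mobile data traffic.
# Assumed baseline: ~2.5 exabytes/month in 2014 (for illustration only).
baseline_2014 = 2.5   # exabytes per month (assumption)
cagr = 0.57           # 57% compound annual growth rate (from the Cisco forecast)

traffic = baseline_2014
for year in range(2015, 2020):
    traffic *= 1 + cagr
    print(f"{year}: {traffic:.1f} exabytes/month")

# After five years of 57% growth, 2.5 * 1.57**5 is roughly 24 exabytes/month,
# in line with the 24.2-exabyte figure cited in the text.
```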

Put simply, our information puddles have overflowed and become floods. In a flood, water doesn’t behave in ways that are easy to understand. It doesn’t stay in place and is difficult to contain. Although there can still be boundaries separating reservoirs, the increasing pressure behind them makes them more prone than ever to leak.

In many parts of the world, most people have smartphones in their possession most of the time. These devices not only generate huge volumes of new data; they also represent new channels of information flow that bypass intentionally constructed barriers. However much your company has invested in firewalls, a quick photo and transmittal of the information displayed on a computer screen (perhaps by a disgruntled or not very security-minded employee or contractor) can render such safeguards useless. Well-meaning and conscientious employees can also fall prey to tricks. For example, at a French company we know, a junior employee received an email requesting that important information be sent to an external address. She knew enough not to respond. But five minutes later, she received a phone call from someone who identified himself as a “vice president” (who spoke perfect French with exactly the right regional accent), asking her for information so the company could keep a customer happy. She complied. But it turned out to be a clever ploy and competitive information went out into the world, bypassing the company’s digital protections.

Your success in maintaining the integrity of your data depends on your ability to imagine all of the different ways information might flow in a flood. In late 2014, when protestors in Hong Kong feared that the government might shut off cellular networks, they started using FireChat, a smartphone app that connects mobile phones in a “mesh network” (a phone-to-phone relay that can route information without using cellphone towers).4 Wander into a Brookstone store and ask about its video camera pen; although it looks like a normal pen, it’s actually a video capture device that connects to a USB port and can easily dispatch data to the cloud. When direct routes to the Internet are blocked, people can use smart pens, watches, or USB sticks, as Edward Snowden did when he carried an estimated 1.7 million classified documents out of the U.S. National Security Agency, one of the most secure organizations on earth.5 SanDisk offers a 128GB micro SD card for less than $60;6 a two-terabyte card isn’t far off. The technology just keeps getting more sophisticated and cheaper.

Excitable Networks

Last winter, the New York Times described how an off-the-cuff message sent into the cloud by a public relations executive just before boarding an 11-hour flight cost her her job.7 People interpreted her tweet as racist (although, as the Times writer pointed out, it could be interpreted otherwise) and shared it via Twitter, provoking a storm of criticism. Similar examples are common in the super-transparent world: awkward jokes or out-of-context comments, captured and transmitted, can produce unexpectedly immense reactions as information moves from an information puddle into a chaotic flood. We refer to this information flow as “amplification.”

Amplification describes the tendency of certain images, stories, or other forms of information to resonate and travel widely. A precondition seems to be interconnectedness and overlapping networks of a certain density. The interactive nature of connections and the fact that posted information induces multifaceted reactions — a provocative post on Facebook might draw others into an argument, for example — cause information to feed back upon itself. People are drawn in not only by the original message but also by reactions to it (and reactions to reactions).
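
One simple way to see why dense, overlapping networks amplify content is to model sharing as a branching process: each person who sees an item reshares it with some probability, exposing their own followers, who may reshare in turn. The sketch below is a toy model offered only for intuition; the reshare probabilities and follower counts are assumptions, not measurements from our research.

```python
import random

def simulate_cascade(reshare_prob, avg_followers, seed_audience, max_steps=8):
    """Toy branching-process model of a share cascade.

    Each viewer reshares with probability `reshare_prob`, exposing
    `avg_followers` new people. When reshare_prob * avg_followers exceeds 1,
    each wave of viewers tends to recruit a larger wave: the feedback
    loop the text calls amplification.
    """
    total_views = seed_audience
    current = seed_audience
    for _ in range(max_steps):
        resharers = sum(1 for _ in range(current) if random.random() < reshare_prob)
        current = resharers * avg_followers
        total_views += current
        if current == 0:
            break
    return total_views

random.seed(1)
print(simulate_cascade(0.005, 150, 100))  # below the threshold: usually fizzles
print(simulate_cascade(0.02, 150, 100))   # above the threshold: often snowballs
```

The threshold behavior (below roughly one new exposure per viewer, a cascade dies out; above it, the cascade snowballs) is one reason an item can sit unnoticed for days and then suddenly explode.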

But connections and interactions don’t fully explain the phenomenon. Certain events seem to have greater capacity to stir passions. An obvious injustice willfully committed, captured in a video and posted for the world to see, is a classic and recurrent motif within viral causes. As the NeverSeconds story illustrates, certain events have compelling plotlines that can be highlighted by the way the events are presented or framed. In May 2013, a Reuters photographer captured an image of a woman, Ceyda Sungur, wearing a red dress and carrying a white bag just as she was being pepper sprayed by Turkish riot police; the “lady in red” photo became, as the online publication the Verge put it, “the symbol of Turkey’s unrest” for reasons that seem largely aesthetic:

With her stance relaxed and face downturned, Sungur, through Orsal’s lens, is the epitome of passive resistance … [The police officer’s] gas mask and crouched stance seem almost comically disproportionate to his target. With a barricade of shields framing the action with ominous uniformity, she stands alone and absorbs the spray.8

The composition of this visual information, and the way the photographer’s craft frames and captures it, lends itself to amplification. It is a powerful form of artistry, akin to poetry or moviemaking.9 Inexpensive but sophisticated tools for information capture and editing, such as Adobe’s Photoshop and Apple’s iMovie, make it more likely that someone “out there” will cast events that involve your organization in a surprising, passion-inciting plotline.

Agents of deliberate amplification work within the cloud, reinforcing and repeating items or interpretations of information. “Shamers” use all sorts of techniques to keep a furor from dying down. When the previously mentioned PR executive pursued other jobs, shamers took to social media again to continue criticizing her. Even bystanders can be swept into plots that get amplified; a joke whispered privately to a colleague can be tweeted or posted by someone who overhears it.10

Although amplification is often negative, there can also be positive effects. Sometimes enhanced transparency and amplification mean that injustices that might have remained hidden get called out and punished. Sometimes the shamed person or organization deserves it, and the credibility of the information is beyond dispute. Transparency is often a good thing, and the word carries favorable connotations for this exact reason.

And of course “going viral” can in some cases be harnessed for marketing purposes. But even for marketers, it’s unsettlingly easy to underestimate or misjudge the chaotic behavior of flows within a flood: You can launch messages into the flood, but you cannot control where they go or what others do with them. For example, SodaStream, an Israeli company that sells a countertop device that turns water into seltzer, launched a marketing campaign in 2013 extolling the environmental virtues of its reusable bottles — only to find itself at the center of criticism for locating one of its production facilities in the disputed West Bank.11

Trails of “Digital Exhaust”

Dan Geer, a computer security and risk management expert, points out that individuals and organizations inadvertently reveal things about themselves in many ways.12 People generate huge amounts of data as by-products of everyday behaviors, such as Web searching and social media posting. Even if you have not revealed things about yourself, odds are that someone else has: posted your photo on Facebook, say, and tagged it with your name. What’s more, your appearance can be identified at a distance with pattern recognition software, using databases gleaned from social media. Your way of walking, detectable using the accelerometer in your smartphone, can also identify you. And if you can be identified, you can be tracked, revealing where you are and where you’ve been. Other data you generate divulges a lot about what you’re doing. Individuals and organizations produce a voluminous, mostly involuntary, “digital exhaust” that reveals much more about them than they think it does.

Add “big data” to this mix and marketers, policy analysts, and others can conclude or predict things from this digital exhaust that they could not in the past. It’s possible to predict flu outbreaks from patterns in the symptoms people search for on Google, and for a company to know a young woman is pregnant before she has told family members (to cite two well-publicized examples).13 The ability to crunch data cheaply and rapidly leads to even greater transparency. Cross-referencing one dataset with another — “putting two and two together” — allows analysts to discover things about you and your organization that you have not disclosed. Researchers have shown, for example, how supposedly anonymized data about customer purchases can be de-anonymized by cross-referencing.14
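
As a schematic illustration of “putting two and two together,” the sketch below joins a nominally anonymized purchase log against a second dataset that carries names. All of the data, field names, and matching rules are invented for demonstration; the point is simply that a few quasi-identifiers shared across datasets can be enough to re-identify a person, which is the essence of the cross-referencing research cited above.

```python
# Toy de-anonymization by cross-referencing (all data invented).
# The "anonymized" purchase log carries no names, but zip code, birth year,
# and store visited together act as a quasi-identifier that can be matched
# against a second dataset that does include names.

anonymized_purchases = [
    {"zip": "02139", "birth_year": 1984, "store": "Pharmacy A", "item": "prenatal vitamins"},
    {"zip": "02139", "birth_year": 1991, "store": "Grocery B", "item": "coffee"},
]

named_profiles = [
    {"name": "A. Smith", "zip": "02139", "birth_year": 1984, "checked_in_at": "Pharmacy A"},
    {"name": "B. Jones", "zip": "10001", "birth_year": 1984, "checked_in_at": "Pharmacy A"},
]

for purchase in anonymized_purchases:
    matches = [
        p["name"] for p in named_profiles
        if (p["zip"], p["birth_year"], p["checked_in_at"])
        == (purchase["zip"], purchase["birth_year"], purchase["store"])
    ]
    if len(matches) == 1:  # a unique match re-identifies the "anonymous" record
        print(f'{matches[0]} likely bought {purchase["item"]}')
```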

You might think this capability is limited to sophisticated analysts, but it goes further. Motivated individuals with modest skills and a moderately powerful computer can deduce a lot. Moreover, the so-called “power of the crowd” enables ordinary people to accomplish sophisticated feats of transparency-producing analysis.15 In February 2009, a masked figure posted a video on YouTube that showed him abusing a cat. The video prompted collaborative detective work by cat sympathizers, who cross-referenced the YouTube video with others and with photos on Facebook, noting similarities in carpets, walls, and flags. Using a process worthy of crime scene investigators, they identified the masked figure and reported him to police. “Dusty the cat” was rescued and his abuser cited for cruelty to animals.16

In addition, the much-vaunted “Internet of things” has already invaded our homes and businesses, and unbeknownst to many people, it has the potential to transmit lots of data about what we do or what we say. For example, in a two-hour Internet scan, HD Moore, chief research officer at Rapid7 LLC, a Boston-based Internet security company, found 5,000 “wide-open” corporate boardrooms, equipped with videoconferencing equipment with inadequate security.17 IP-enabled heating systems can tell people who know how to read them whether you’re home or away — and let’s not forget the data generated by cars.18 In our pursuit of convenience through network-connected devices, we are creating streams of bundled personal data that can make us more and more transparent.

The Rise of Cyber-Snoopers

Whatever one’s views about the impact of WikiLeaks, the organization is committed, as a matter of first principle, to setting information free. The vast majority of the emails that hackers stole from Sony and that were recently released by WikiLeaks contained no evidence of injustices, only information that has attracted mostly prurient interest (for example, rude comments movie execs have made about movie stars). Nevertheless, some advocates argue that there shouldn’t be secrets — that information needs to be free.

The past few years have seen the rise of what some refer to as “hacktivists,” entities that take up causes and use computer skills in ways that are sometimes borderline and sometimes outright illegal. In recent years, hacktivists operating as a loose community under the umbrella name “Anonymous” have compelled law enforcement organizations to reopen cases and companies to change their ways of doing business. They do this by disclosing information Anonymous obtains by unnamed means and sometimes by threatening to release the names and addresses of allegedly guilty insiders. Anonymous has no stable membership, and “members” generally don’t know each other except by screen names. However, like many cause-motivated communities, Anonymous groups coalesce when needed.

Anonymous has targeted many kinds of organizations, even hospitals, whose leaders have taken actions someone did not like, often with impressive effect.19 In 2011, for example, Anonymous went after HBGary, an IT security firm. Aaron Barr, the then-head of the company’s federal division, had attempted to infiltrate Anonymous, which had attacked MasterCard and Visa websites (reportedly because the companies had stopped processing donations to WikiLeaks).20 But Barr may have overreached when he told a Financial Times reporter that he was closing in on his prey. Soon after publication of the Financial Times article, Anonymous took control of the HBGary Federal website, stole and leaked company emails, and deleted company data; the group also took over Barr’s Twitter account.21

The people involved were eventually caught, and they were not master criminals. Two were in their 20s, and three were teenagers. They had access only to modest computers and tools, and had succeeded not only with online skills but also through “social engineering” — a low-tech version of a well-known con in which outsiders convince people within companies to disclose information that should be kept private. Some of the problem, then, was simply bad judgment or carelessness on the part of employees. Cybercriminals, often equipped with some expertise and resources, add to this troubling mix. Unlike other types of hackers, criminals don’t brag about their exploits — they keep them secret so they can use the techniques again.

Managing in a Super-Transparent World

Our research suggests several steps that managers can take to meet the new expectations for transparency.

Examine your assumptions about how you can keep information contained. You can no longer count on information boundaries, either those that occur naturally or ones that you construct. Systematically identify unrealistic assumptions — the points at which you might be unwisely assuming that you can contain information. You might be able to keep some secrets, but it will be more expensive and difficult than it was in the past. And there’s always a risk that your efforts won’t work.

Review your strategy for dealing with vulnerability to unintended transparency. To what extent does your organization’s strategy depend on containing information flows? If the answer is “a lot,” consider moving to strategies that are not sensitive to unexpected information flows and your ability to keep secrets: strategies that are based on what your company can uniquely do as opposed to what it knows. Many organizations can successfully reduce their degree of vulnerability.

Review your organization’s operations for issues that might be problematic if revealed. How many companies have been embarrassed in recent years by revelations about their operations in a distant country that also surprised leaders back at headquarters? The fact is that many companies are vulnerable to attackers who have a specific axe to grind. Consider hiring external supply chain, public relations, or security consultants to investigate your operations and fix problems preemptively. Identify managers and management practices that might be problematic. Rather than hunting for potential leakers, which might cause people to speculate that you have something to hide, the goal should be to identify questionable behaviors or practices that might prompt leaking.

Assume that others will put out information about your organization for their own reasons and that you won’t be able to prevent it. The “Streisand effect” describes what happened when singer Barbra Streisand used lawyers to try to get an aerial photo of her house deleted from a photo collection on the Web — and in the process brought more publicity to the photo. When the prime minister of Turkey tried to “rip out the roots” of Twitter to prevent sharing of recordings of phone conversations with his son, the tech community almost immediately created workarounds, and traffic on Twitter within Turkey increased.22

Managing your image has become a new game, and being prepared to respond quickly, especially to information that is incorrect, is a big part of it. You won’t be able to stop malicious falsehoods from ripping through social media. Fringe media outlets, riding the sensation, may even ignore your corrections to keep a great plotline alive. But as you deposit accurate information into the cyberspace “record” and provide responsible people with facts that can be verified, you’ll eventually put the brakes on irresponsible claims. Most companies are much too slow to respond.

Recognize that new information flows change what people consider to be fair. When information that was not previously accessible becomes easily accessible, something changes: People often feel that the information now should be accessible. And when information flows in new ways, how people look at business activities changes. One software company manager told us that the discounts his company had previously offered only to prospective customers became problematic once existing customers learned about them; the existing customers became indignant and demanded similar discounts.

There’s a parallel here to what happened to the music industry in the late 1990s, when the bits and bytes that record companies owned flowed freely across the Web. Viewing themselves as victims, record companies took legal action against downloaders — only to discover that they themselves had become the bad guys in many people’s eyes. Because digital content could be easily downloaded, many people begrudged those who tried to limit free downloads. In the new information era, it came to be seen as unfair to ask consumers to pay $20 for a complete CD when many people wanted just one or two songs. Apple’s iTunes single-song pricing model was a response to this new customer view of fairness.

In many ways, today’s emerging super-transparent reality is the music industry’s late-1990s problem writ large. It will force changes in the way a broad set of companies operate. It goes beyond music flowing in surprising and uncontrollable ways and extends to the contents of our lives, captured by personal, portable digital data devices. In addition to smartphones, increasingly it includes an array of other devices (smart pens, watches, wearables, etc.). And just as the music business never regained its balance, we, too, will increasingly find our realities as individuals, organizations, and managers permanently changed.

In a super-transparent world, it’s important to be on the lookout for shifts in what your customers and the broader public consider reasonable. You won’t anticipate all of them. Expect to be caught wrong-footed when customers or the public suddenly see something that your organization does in a surprisingly different light. Shifts in how people interpret what you do as a manager — and how your organization behaves — will require you to make changes. Even if you can’t be ready for all the challenges, you’ll be better off if you begin preparing now.

References

1. E. Lee, “Comcast Customer Service Call Goes Viral: Company ‘Embarrassed’ by Rep’s Treatment of Customer,” July 16, 2014, www.usmagazine.com; and P. Mejia, “Viral Video Shows Brusque, Forceful Arrests in Austin, Texas,” November 18, 2015, www.newsweek.com.

2. “Big Data Gets Personal,” MIT Technology Review report, May 2013, www.technologyreview.com.

3. “Cisco Visual Networking Index: Global Mobile Data Traffic Update 2014-2019 White Paper,” February 3, 2015, www.cisco.com.

4. N. Cohen, “Hong Kong Protests Propel a Phone-to-Phone App,” New York Times, October 5, 2014, www.nytimes.com.

5. L. Franceschi-Bicchierai, “Snowden Stole Secret NSA Documents with a Flash Drive,” June 13, 2013, http://mashable.com.

6. A. Cunningham, “SanDisk’s 128GB MicroSD card is the Biggest, Tiniest Storage You Can Buy,” February 25, 2014, http://arstechnica.com.

7. J. Ronson, “How One Stupid Tweet Blew Up Justine Sacco’s Life,” New York Times Magazine, February 12, 2015.

8. A. Toor, “How a ‘Lady in Red’ Became the Symbol of Turkey’s Unrest,” June 7, 2013, www.theverge.com.

9. See L. Devin and R.D. Austin, “The Soul of Design: Harnessing the Power of Plot to Create Extraordinary Products” (Stanford, California: Stanford University Press, 2012).

10. Ronson, “One Stupid Tweet.”

11. L. Abramson, “SodaStream Criticized for West Bank Plant,” February 10, 2013, www.npr.org.

12. D. Geer, “We Are All Intelligence Officers Now” (presentation at the RSA Conference, San Francisco, California, February 28, 2014).

13. B.E. Hernandez, “Google Knows Where the Flu Outbreaks Are,” January 10, 2013, www.nbcbayarea.com; and K. Hill, “How Target Figured Out a Teen Girl Was Pregnant Before Her Father Did,” February 16, 2012, www.forbes.com.

14. J. Bohannon, “Credit Card Study Blows Holes in Anonymity,” Science 347, no. 6221 (January 30, 2015): 468.

15. For research insights into why this works, see L.B. Jeppesen and K.R. Lakhani, “Marginality and Problem-Solving Effectiveness in Broadcast Search,” Organization Science 21, no. 5 (September 2010): 1016-1033.

16. “Kenny Glenn Case/Dusty the Cat,” September 21, 2011, http://knowyourmeme.com.

17. R. Vamosi, “Corporate Video Conferencing Systems Fail Secure Implementation,” January 26, 2012, www.securityweek.com; and Ms. Smith, “Hacks to Turn Your Wireless IP Surveillance Cameras Against You,” April 14, 2013, www.networkworld.com.

18. D. Takahashi, “Hello Dave. I Control Your Thermostat. Google’s Nest Gets Hacked,” August 10, 2014, http://venturebeat.com; and H. Kelly, “The Five Scariest Hacks We Saw Last Week,” August 5, 2013, www.cnn.com.

19. M.B. Farrell and P. Wen, “Hacker Group Anonymous Targets Children’s Hospital,” Boston Globe, April 24, 2014.

20. R. Mackey, “‘Operation Payback’ Attacks Target MasterCard and PayPal Sites to Avenge WikiLeaks,” December 8, 2010, http://thelede.blogs.nytimes.com.

21. P. Bright, “Anonymous Speaks: The Inside Story of the HBGary Hack,” February 15, 2011, http://arstechnica.com.

22. C. Letsch, “Turkey Twitter Users Flout Erdogan Ban on Micro-Blogging Site,” March 21, 2014, www.theguardian.com.

i. D.M. Upton and S. Creese, “The Danger From Within,” Harvard Business Review 92, no. 9 (September 2014): 94-101.

ii. J.A. Winnefeld Jr., C. Kirchhoff, and D.M. Upton, “Cybersecurity’s Human Factor: Lessons From the Pentagon,” Harvard Business Review 93, no. 9 (September 2015): 86-95.
