Of all the good technological innovations that were supposedly going to rescue the news business from the bad technological innovations that had laid it low, perhaps none was more highly touted than Apple’s iPad.
Unfortunately, the iPad has proved to be a huge disappointment for news publishers. The reason, according to Shira Ovide of Bloomberg Businessweek, is that though people like their iPads, they love their smartphones. Sales of the iPad peaked at 71 million in 2013 and slid to about 44 million last year. Meanwhile, about 1.5 billion smartphones were sold in 2017. Against that backdrop, iPad sales are barely a rounding error.
Over the weekend Apple removed software from its Chinese App Store that enabled iPhone users to get around censorship laws in that country. The action was widely portrayed as a blow to those working for freedom and human rights in China. And it seemed especially tawdry following as it did the recent death of Nobel Peace Prize winner Liu Xiaobo while in Chinese custody.
But I would argue that Apple did the right thing. My intention is not to write a love letter to Apple, whose leadership, I’m sure, was motivated more by commerce than by conscience. Nevertheless, Apple’s decision was a welcome example of Americans’ dealing with the world as it is rather than as they wish it to be. Our values are not everyone’s values.
Typical of Apple’s critics is New York Times technology columnist Farhad Manjoo, who couldn’t understand why Apple would back down so quickly after successfully fighting the FBI’s demand last year to provide a software key to a terrorist’s iPhone — and, thus, to all other iPhones as well.
“When Apple took a public stand for its users’ liberty and privacy, the American government blinked,” Manjoo wrote. “Yet in China over the weekend, when faced with a broad demand by the Chinese internet authority, it was Apple that blinked.” Yes. But what Manjoo was describing was not situational ethics on Apple’s part. Rather, it was the difference between the United States, a free country ruled by laws, and China, a repressive authoritarian state. In fact, as Manjoo conceded later in his column, Apple would likely have accomplished nothing by pushing back against Chinese officials.
China may show little respect for the rights of its citizens, but it is part of the world community. It makes sense to ban interactions with pariah regimes such as North Korea and Syria, and to prohibit companies from doing business in China in a way that leads to the direct persecution of citizens (something that could in fact arise from Apple’s plan to build a data center in China) or that involves prison labor. But we have no more right to impose our vision of free speech on China than, say, Canada does to insist that we adopt its immigration policies as a condition of doing business.
Besides, even most Western democracies do not have as expansive a view of free speech as we do — yet no one seems to find it outrageous that we accommodate ourselves to their laws when doing business overseas. In the early days of the commercial web, Yahoo was fined $15 million for violating French hate-speech laws that prohibited the display and sale of Nazi memorabilia. Such laws would be regarded in the United States as an outrage against the First Amendment. But of course Europe has a history with hate speech that, so far, we have been fortunate to avoid.
More recently, Google has had to contend with “the right to be forgotten,” as European Union countries — again led by France — have passed laws requiring that certain types of private information be removed from the internet. To comply, Google has set up an “EU Privacy Removal” form that lets users fill out a questionnaire about offending material.
As an online columnist for The Guardian from 2007 to ’11, I had to contend with British libel laws several times. My editors told me that some of my media and political commentary had to be toned down even though it wouldn’t have raised an eyebrow in this country. Indeed, at one time it was common for plaintiffs to engage in “libel tourism,” filing suits in the U.K. because they were more likely to win there than in the U.S. Reforms have made that less of an issue, but it is still far easier to win a libel suit in London than in New York. The difference is that, under the First Amendment, speech about public officials and public figures is protected except when it is egregiously and deliberately false.
All of this, I realize, is rather far afield from the oppression and violence experienced by those in China who refuse to conform. These examples do show, however, that American businesses see nothing abnormal about adapting their practices to other countries’ laws and traditions, even on fundamental values like freedom of expression.
In 1940 Sen. Kenneth Wherry, a Nebraska Republican, cast an eye toward China and declared, “With God’s help, we will lift Shanghai up and up, ever up, until it is just like Kansas City.” It was a naive view of American exceptionalism then, and it is expressed today by those who think we can use our economic leverage to bend China to our will.
We can’t, and Apple’s executives recognize that. Despite its repression, China today is freer than it was when Richard Nixon made his historic visit. We can hope that it will be more free in the future. By engaging with the Chinese on their own terms, we might be able, slowly, to help that process along.
I had expected fireworks—or at least strong disagreements—when Internet privacy advocate Jonathan Zittrain and former CIA director John Deutch debated the impasse between Apple and the FBI over a locked iPhone used by one of the San Bernardino shooters.
Instead, the two men offered nuance and a rough if imperfect consensus over how much access we should have to technologies that allow us to encrypt our personal data in ways that place it beyond the government’s reach.
“Many other paths to data are available. We are exuding data all over the place,” said Zittrain, a professor at Harvard Law School and the author of The Future of the Internet—And How to Stop It. “The FBI has chosen this case … in large part, I think, because there is so little privacy interest on the other side.”
Deutch, an emeritus professor at MIT, sought to draw a distinction between law enforcement and terrorism investigations such as the San Bernardino case. Authorities say they need to know what was on the phone used by the late Syed Rizwan Farook because it might reveal the identities of accomplices who are planning future attacks.
“There’s a big difference between law enforcement and national security,” Deutch said. Law enforcement, he explained, is about “catching bad people,” whereas the aim of national security is “to avoid a catastrophe.” There is a public interest in requiring cooperation from companies such as Apple in a national-security investigation, he said, with the courts setting boundaries for when such cooperation should be compelled.
If that sounds like disagreement, it was so polite and mildly worded as to create barely a stir. Indeed, people in the packed hall at Harvard’s Kennedy School on Monday evening—most of them Apple partisans, I suspect—seemed to appreciate a discussion that focused more on the fine points of technology and the law than on broad proclamations about Internet freedom versus the threat of terrorism.
Then, too, technology is changing so rapidly that the points raised during the debate may soon be obsolete.
Apple has been ordered to write software that will enable government investigators to gain access to Farook’s data; the company has filed an appeal seeking to overturn that order. But as Zittrain noted, Apple executives say they will soon offer encryption software to consumers that will make it impossible for anyone—even Apple itself—to break in. Such software, Zittrain added, is already available from various sources, which means that even if it were legally banned, it could still be used.
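The property Zittrain describes can be illustrated with a toy sketch. The one-time pad below is emphatically not what Apple ships (real devices use AES with hardware-bound keys), but it shows the principle at stake: when the key exists only on the user’s device, there is simply nothing for a vendor to hand over in response to a court order.

```python
import secrets

# Toy one-time pad: a hedged illustration of user-held-key encryption,
# not a real-world cipher. The names and messages here are invented.

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with the corresponding key byte."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # lives only on the user's device

ciphertext = encrypt(message, key)
assert decrypt(ciphertext, key) == message
# Without the key, every plaintext of the same length is equally likely,
# so no third party -- vendor or government -- can recover the message.
```

The design point is that the inability to decrypt is a mathematical property of the scheme, not a policy choice the company could reverse under legal pressure.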
And that changes the nature of what’s at stake. As the San Bernardino case has played out, I’ve been more sympathetic to the government than to Apple. Why shouldn’t a corporation be required to comply with a court order to provide information in a terrorism investigation? And if it’s extraordinary to demand that Apple write software so that the phone can be accessed, what of it? That’s simply a consequence of Apple’s engineering decisions.
As Deutch put it: “That’s not really the essential point. It’s a minor part of the issue.”
On the other hand, I’m as uncomfortable as anyone with the idea that Apple and other companies could be forbidden to offer encryption so strong that even they would lack the means to bypass it. Requiring companies to build in a so-called back door would open the way not just to legitimate investigations but to privacy breaches and fraud, and would hand yet another tool to authoritarian governments seeking to repress dissent.
Zittrain and Deutch talked about what role Congress and the courts might play in finding the proper balance between privacy and security. I asked them whether those institutions could have any role at all in a world in which no one but the end user would be able to bypass the encryption settings.
Zittrain responded that we have never lived in an era when every bit of data is accessible to a government investigator with a warrant. Even so, he said, there will continue to be vast amounts of personal data available to investigators despite the existence of strong encryption. “There’s a whole constellation of data points out there,” he said, calling it “an embarrassment of riches.”
I found Deutch’s response more intriguing, reflecting as it did both his cloak-and-dagger days at the CIA and his long career in science.
“I don’t believe that phones irrevocably go dark,” he said, explaining that he believes Apple and other companies will retain the ability to unlock encrypted devices regardless of what they publicly proclaim. He also offered what he called “a suspicious paranoid point: all of these phones are made in China.” Would the Chinese government really allow the manufacture of technology that it couldn’t somehow access?
With technology changing so rapidly, Zittrain said, the current dispute between Apple and the FBI is “a bellwether rather than the case of the century.”
This time, in other words, Apple says that it won’t. Next time, it may say that it can’t.
The stakes in the raging battle over ad-blocking software are high — but they’re not quite what you might think.
On the surface, it all seems straightforward enough. In one corner are executives at struggling news organizations who want to be sure that visitors to their websites actually see the ads. Thus did the Washington Post recently experiment with blocking the ad-blockers, a development first reported by BuzzFeed.
“Many people already receive our journalism for free online, with digital advertising paying only a portion of the cost,” a Post spokesperson was quoted as saying. “Without income via subscriptions or advertising, we are unable to deliver the journalism that people coming to our site expect from us.”
In the other corner are users who are sick and tired of popups, pop-unders, scroll-across-the-screeners and other obtrusive ads that invade their privacy by tracking their interests and that, in some cases, carry spyware or malware.
“What is unlikely to fly as a long-term strategy is begging readers to load all of the 50 or so trackers and ad-loaders and popups and banners, each of which might make a publisher three cents per thousand clicks, if they are lucky,” writes Mathew Ingram at Fortune. “That business is in a death spiral, and yelling about ad blockers isn’t going to change that.”
In fact, the ad-blocking controversy is anything but a simple morality play. Nor is it a coincidence that the issue has reached a frenzied peak thanks to Apple’s decision to include ad-blocking in its iOS 9 software for iPhones and iPads. Because the real battle is being fought not on the Internet but in the boardrooms of the giant tech companies that want to control your online experience.
Nilay Patel, editor-in-chief of The Verge, explained it last week. Essentially, it comes down to this: publishers that rely on web advertising are helping to drive revenue to Apple’s archenemy, Google, which controls much of the infrastructure for online ads. Block those ads and those publishers are more likely to run into the warm embrace of Apple, whose new Apple News platform provides a nice, safe, closed environment with ads that can’t be blocked. And Apple gets a 30 percent cut.
Facebook offers a similar service, the still-aborning Instant Articles, which allows publishers to post their content directly inside Facebook’s all-powerful newsfeed. As with Apple News, Facebook takes a cut of the action from the unblockable ads that will be displayed. It’s such an attractive proposition that the same Washington Post that’s trying to block the ad-blockers announced Tuesday that it will also publish 100 percent of its content to Facebook. Patel writes:
So it’s Apple vs. Google vs. Facebook, all with their own revenue platforms. Google has the web, Facebook has its app, and Apple has the iPhone. This is the newest and biggest war in tech going today.
And the collateral damage of that war — of Apple going after Google’s revenue platform — is going to include the web, and in particular any small publisher on the web that can’t invest in proprietary platform distribution, native advertising, and the type of media wining-and-dining it takes to secure favorable distribution deals on proprietary platforms. It is going to be a bloodbath of independent media.
As a matter of principle, I refuse to use ad-blocking software — but I turned on AdBlock while researching this article just to see what would happen. As anyone could have told me, sites loaded more quickly and with fewer distractions. ESPN.com, which is so bogged down with ad-related bloatware that it’s become virtually unreadable, was zippier than I’ve ever seen it. A small hyperlocal site that I often visit suddenly appeared ad-free, simply because the site relies on an external ad-server business that AdBlock intercepted.
Interestingly enough, Marco Arment, the creator of the best-selling ad-blocking program Peace, pulled the software from Apple’s App Store almost as soon as it was released last week. “Achieving this much success with Peace just doesn’t feel good, which I didn’t anticipate, but probably should have,” he wrote on his blog. “Ad blockers come with an important asterisk: while they do benefit a ton of people in major ways, they also hurt some, including many who don’t deserve the hit.”
By acting as he did, Arment may have pointed the way to a possible solution. Because the problems ad-blockers are designed to solve are real, and they run a lot deeper than mere inconvenience. As Dan Gillmor recently wrote in Slate, “The advertising and tracking industries, abetted by telecommunications carriers, are investing in all kinds of technologies aimed at thwarting users’ wishes to retain some control over their online activities.”
So why not come up with a different kind of blocker — a piece of software that informs you when you’re about to access a website that fails to follow some agreed-upon list of best practices regarding privacy and user experience?
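As a rough sketch of how such a tool might work, here is a hypothetical checker that grades a page report against an agreed-upon list of practices rather than stripping ads outright. Every name, field, and threshold below is invented for illustration; a real tool would need a community-maintained standard.

```python
# Hypothetical "best-practices blocker": warn the user when a page fails
# an agreed-upon checklist instead of silently blocking its ads.
# All thresholds and field names here are assumptions, not a real spec.

BEST_PRACTICES = {
    "max_trackers": 5,          # cap on third-party trackers per page
    "no_autoplay_audio": True,  # ads must not auto-play sound
    "honors_do_not_track": True,
}

def evaluate_page(report: dict) -> list:
    """Return a list of best-practice violations for one page report."""
    violations = []
    if report.get("trackers", 0) > BEST_PRACTICES["max_trackers"]:
        violations.append("too many third-party trackers")
    if report.get("autoplay_audio") and BEST_PRACTICES["no_autoplay_audio"]:
        violations.append("ads auto-play audio")
    if not report.get("honors_do_not_track") and BEST_PRACTICES["honors_do_not_track"]:
        violations.append("ignores Do Not Track")
    return violations

# An invented report for a tracker-heavy page:
page = {"trackers": 48, "autoplay_audio": True, "honors_do_not_track": False}
print(evaluate_page(page))
```

A clean report would come back with an empty list, and the browser would load the page normally; a page like the one above would trigger a warning before any ads are shown.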
Such an arrangement may be the best way to preserve independent media on the open web. Users would be able to protect themselves from abusive adware without freeloading. And web publishers who see their traffic drop might decide it’s time to change their ways.
Because I’m working on a book that deals in part with how Amazon founder and chief executive Jeff Bezos is transforming The Washington Post, I read The New York Times’ account of Amazon’s brutal workplace environment with great interest.
Reporters Jodi Kantor and David Streitfeld portray a company in which high-ranking employees are regularly reduced to tears, in which everyone is encouraged to drop anonymous dimes on one another, and in which a culture of 80-hour-plus work weeks is so ingrained that nothing — not even serious health problems — must be allowed to interfere.
This story is still playing out, but I have a few preliminary observations.
First, very little in the Times story will surprise anyone who read Brad Stone’s 2013 book “The Everything Store: Jeff Bezos and the Age of Amazon.” Stone goes into great detail about what a difficult place Amazon is to work. A key difference is that Stone, unlike Kantor and Streitfeld, is at least somewhat sympathetic to Bezos and understands that he and his team have built something truly remarkable.
Second, the Times article did not convince me that the culture of Amazon is uniquely awful. If you’ve read Walter Isaacson’s biography of Steve Jobs, you know that the upper reaches of Apple could be pretty hellish back when Jobs was ranting and raving. Occasionally you hear stories along similar lines about other tech companies. Would you want to run afoul of Mark Zuckerberg, Larry Ellison or Steve Ballmer? We’re also talking here about a special kind of white-collar, highly educated hell among people who could easily leave and work elsewhere. How about working as a clerk at Wal-Mart? Or as a farm laborer in California?
Third, some of the details in the Times article are being disputed. Nick Ciubotariu, a high-ranking engineer at Amazon, has written a long response to the Times article defending his company. It’s a mixed bag that will provide fodder for Amazon’s critics and defenders alike. Some of it is mind-bending, such as this: “No one is ‘quizzed’ — the quiz is totally, 100% voluntary.” Huh?
Some of it, though, is worth pondering. Ciubotariu, a newish employee (he’s been there 18 months), writes that he has heard the Amazon culture has improved in recent years, and he accuses the Times of relying on old stories from former employees. That has some resonance, as Stone in “The Everything Store” describes Bezos’ halting efforts to curb some of his excesses.
But Ciubotariu also offers specific denials of some of the Times’ assertions, including the most toxic one of all — that a certain number of employees are fired every year as a deliberate management practice. Here’s how the Times puts it: “Losers leave or are fired in annual cullings of the staff — ‘purposeful Darwinism,’ one former Amazon human resources director said.”
Here’s Ciubotariu: “There is no ‘culling of the staff’ annually. That’s just not true. No one would be here if that actually took place and it was a thing.”
At Re/code, Peter Kafka reports that Bezos himself has responded in a memo to his employees, urging them to read both the Times story and Ciubotariu’s response. Bezos writes in part:
The [Times] article goes further than reporting isolated anecdotes. It claims that our intentional approach is to create a soulless, dystopian workplace where no fun is had and no laughter heard. Again, I don’t recognize this Amazon and I very much hope you don’t, either.
I am sure that we haven’t heard the last word.
Photo (cc) by Luke Dorny and published under a Creative Commons license. Some rights reserved.
Remember when the iPad was going to save the news business? How did that work out? But if the redemptive qualities of tablets turned out to be overblown, they are nevertheless a compelling platform for consuming all kinds of text and multimedia material, including news.
This morning I spent way too much time with The Washington Post’s new iOS app, which is detailed at the Nieman Journalism Lab by Shan Wang. It is beautiful, with large pictures and highly readable type. I was already a fan of what the Post is now calling “Washington Post Classic.” But this is better.
So do I have a complaint? Of course. The Classic app is more complete; it includes local news (no, I have no connection to the Washington area, but it’s nice to be able to look in on occasion), whereas the new app is aimed at “national, international audiences.”
And both apps rely more on viral content than the print edition, a sluggish version of which is included in Classic.
Quibbles aside, this is a great step forward, and evidence of the breakthroughs that are possible with technology billionaire Jeff Bezos in charge. In fact, the new app is a version of one that was released last fall for the Amazon Fire. So it’s also heartening to see that Bezos isn’t leveraging his ownership of the Post entirely to Amazon’s advantage.
Another paper with a billionaire owner has taken a different approach. Several months ago John Henry’s Boston Globe mothballed its iOS replica edition — that is, an edition based on images of the print paper — and replaced it with an app that is still print-centric but faster and easier to use. It was developed by miLibris, a French company.
The first few iterations were buggy, but it’s gotten better. In general, I’m not a fan of looking at the print edition on a screen. But I find that the Globe’s website is slow enough on my aging iPad that I often turn to the app just so I can zoom through the paper more quickly, even if I’m missing out on video and other Web extras.
One big bug that still needs to be squashed: When you try to tweet a story, the app generates a link that goes not to the story but, rather, to the App Store so that you can download the app. Which, of course, you already have.
Finally, it’s worth noting that the Boston Herald has a pretty nice iOS app, developed by DoApp of Minneapolis. It’s based on tiles, so it’s fast and simple to use. It’s so superior to the Herald’s creaky website that I wish there were a Web version.
Do apps for individual news organizations even matter? We are, after all, entering the age of Apple News and Facebook Instant Articles.
My provisional answer is that the news organizations should both experiment with and push back against the drive toward distributed content. It’s fine for news executives to cut deals with the likes of Tim Cook and Mark Zuckerberg. But it would be a huge mistake if, in the process, they let their own platforms wither.
Recently I had a chance to interview three smart people about the future of local journalism:
Josh Stearns, director of journalism and sustainability at the Geraldine R. Dodge Foundation, who is studying six digital startups in New Jersey and New York. (You can see my full interview with Stearns by clicking here.)
Meg Heckman, a University of New Hampshire journalism professor whose master’s thesis at Northeastern University was on the role of women at digital startups — and why women are more likely to be involved in hyperlocal sites than in larger national projects.
Tim Coco, the president and general manager of WHAV Radio in Haverhill, a mostly online community station (it also has a weak AM signal) for which Coco is seeking a low-power FM license.
I don’t get to make videos that often, but I wanted to scrape some of the rust off my skills for the benefit of my graduate students, who are currently making their own videos. My philosophy is that every journalist needs to know how to make a decent video with the tools at hand — in my case, an iPhone 5S, a portable tripod that I bought five years ago for less than $20, and iMovie ’11, also known as iMovie 9. (The newer iMovie 10 strikes me as slow and kludgy, but maybe I just need a faster computer.)
The one luxury I indulged in was a Røde lapel mic (known in the trade as a lav mic), which I bought for well under $100 just before I started this project. It made a huge difference — the audio is of far better quality, with much less interference from outside noise, than in previous videos I’ve made.
What I should have done, but didn’t, was use a better app than Apple’s built-in Camera so that I could lock in brightness and contrast. That way I could have avoided the sudden shifts from dark to light and back that mar my interview with Stearns.
Still, it’s useful to know that you can shoot a decent video without spending many hundreds of dollars on a professional camera and Final Cut Pro. I think there’s a tendency at journalism schools to believe that we’re selling our students short if they don’t get to use the latest and greatest technology. And yes, they should have a chance to use the good stuff. But they also need to know that many news organizations, especially smaller ones, expect their journalists to make do with what’s available.
I haven’t said anything yet about the nude photos of Jennifer Lawrence and other celebrities that got hacked and distributed. But we’re going to talk about it on “Beat the Press,” so I’ve been thinking about it.
To me, the big thing is that the women were using iCloud, Apple’s private backup service. If they had posted their photos to some allegedly private area of Facebook, I guess I’d be snickering right along with some of the others and saying, “Well, what did you expect?” But what the hackers did in this case was identical to sitting in a car outside your house, breaking into your WiFi and looking at what’s on your computer. We all know it can happen, but it’s not the sort of thing that anyone prepares for.
It’s yet another reminder that nothing online is secure.
Toward the end of The Innovator’s Dilemma, Clayton Christensen’s influential 1997 book about why good companies sometimes fail, he writes, “I have found that many of life’s most useful insights are often quite simple.”
Indeed, the fundamental ideas at the heart of his book are so blindingly self-evident that, in retrospect, it is hard to imagine it took a Harvard Business School professor to describe them for the first time. And that poses a problem for Jill Lepore, a Harvard historian who recently wrote a scathingly critical essay about Christensen’s theories for the New Yorker titled “The Disruption Machine.” Call it the Skeptic’s Dilemma.
Christensen offers reams of data and graphs to support his claims, but his argument is easy to understand. Companies generally succeed by improving their products, upgrading their technology, and listening to their customers — processes that are at the heart of what Christensen calls “sustaining innovations.” What destroys some of those companies are “disruptive innovations” — crude, cheap at first, attacking from below, and gradually (or not) moving up the food chain. The “innovator’s dilemma” is that companies sometimes fail not in spite of doing everything right, but because they did everything right.
Some examples of this phenomenon make it easy to understand. Kodak, focusing its efforts on improving photographic film and paper, paid no attention to digital technology (invented by one of its own engineers), which at first could not compete on quality but which later swallowed the entire industry. Manufacturers of mainframe computers like IBM could not be bothered with the minicomputer market developed by companies like Digital Equipment Corporation; and DEC, in turn, failed to adapt to the personal computer revolution led by the likes of Apple and, yes, IBM. (Christensen shows how the success of the IBM PC actually validates his ideas: the company set up a separate, autonomous division, far from the mothership, to develop its once-ubiquitous personal computer.)
Christensen has applied his theories to journalism as well. In 2012 he wrote a long essay for Nieman Reports in collaboration with David Skok, a Canadian journalist who was then a Nieman Fellow and is now the digital adviser to Boston Globe editor Brian McGrory, and James Allworth, a regular contributor to the Harvard Business Review. In the essay, titled “Breaking News,” they describe how Time magazine began in the 1920s as a cheaply produced aggregator, full of “rip-and-read copy from the day’s major publications,” and gradually moved up the journalistic chain by hiring reporters and producing original reportage. Today, they note, websites like the Huffington Post and BuzzFeed, which began as little more than aggregators, have begun “their march up the value network” in much the same way as Time some 90 years ago.
And though Christensen, Skok, and Allworth don’t say it explicitly, Time magazine, once a disruptive innovator and long since ensconced as a crown jewel of the quality press, is now on the ropes — cast out of the Time Warner empire, as David Carr describes it in the New York Times, with little hope of long-term survival.
Lepore pursues two approaches in her attempted takedown of Christensen. The first is to look at The Innovator’s Dilemma as a cultural critic would, arguing that Christensen popularized a concept — “disruption” — that resonates in an era when we are all fearful of our place in an uncertain, rapidly changing economy. In the face of that uncertainty, notions such as disruption offer a possible way out, provided you can find a way to be the disruptor. She writes:
The idea of innovation is the idea of progress stripped of the aspirations of the Enlightenment, scrubbed clean of the horrors of the twentieth century, and relieved of its critics. Disruptive innovation goes further, holding out the hope of salvation against the very damnation it describes: disrupt, and you will be saved.
The second approach Lepore pursues is more daring, as she takes the fight from her turf — history and culture — to Christensen’s. According to Lepore, Christensen made some key mistakes. The disk-drive companies that were supposedly done in by disruptive innovators eating away at their businesses from below actually did quite well, she writes. And she claims that his analysis of the steel industry is flawed by his failure to take into account the effects of labor strife. “Christensen’s sources are often dubious and his logic questionable,” Lepore argues.
But Lepore saves her real venom for the dubious effects she says the cult of disruption has had on society, from financial services (“it led to a global financial crisis”) to higher education (she partly blames a book Christensen co-authored, The Innovative University, for the rise of massive open online courses, or MOOCs, of which she takes a dim view) to journalism (one of several fields, she writes, with “obligations that lie outside the realm of earnings”).
Christensen has not yet written a response; perhaps he will, perhaps he won’t. But in an interview with Drake Bennett of Bloomberg Businessweek, he asserts that it was hardly his fault if the term “disruption” has become overused and misunderstood:
I was delighted that somebody with her standing would join me in trying to bring discipline and understanding around a very useful theory. I’ve been trying to do it for 20 years. And then in a stunning reversal, she starts instead to try to discredit Clay Christensen, in a really mean way. And mean is fine, but in order to discredit me, Jill had to break all of the rules of scholarship that she accused me of breaking — in just egregious ways, truly egregious ways.
As for the “egregious” behavior of which he accuses Lepore, Christensen is especially worked up that she read The Innovator’s Dilemma, published 17 years ago, yet seems not to have read any of his subsequent books — books in which he says he continued to develop and refine his theories about disruptive innovation. He defends his data. And he explains his prediction that Apple’s iPhone would fail (a prediction mocked by Lepore) by saying that he initially thought it was a sustaining innovation that built on less expensive smartphones. Only later, he says, did he realize that it was a disruptive innovation aimed at laptops — less capable than laptops, but also cheaper and easier to carry.
“I just missed that,” he tells Bennett. “And it really helped me with the theory, because I had to figure out: Who are you disrupting?”
Christensen also refers to Lepore as “Jill” so many times that Bennett finally asks him if he knows her. His response: “I’ve never met her in my life.”
CHRISTENSEN’S DESCRIPTION of how his understanding of the iPhone evolved demonstrates a weakness of disruption theory: It’s far easier to explain the rise and fall of companies in terms of sustaining and disruptive innovations after the fact, when you can pick them apart and make them the subject of case studies.