For reasons that I can’t quite grasp, there seems to be an irresistible urge on the part of news entrepreneurs to think regionally rather than locally. Maybe a regional focus makes fundraising easier. Maybe folks think it makes little sense to build out a digital infrastructure for a project that serves just one community.
The latest to argue for a regional approach is Christopher Baxter, the executive director and editor-in-chief of Spotlight PA, which produces investigative reporting in Pennsylvania. Writing in Nieman Reports, Baxter says his site uses a “hub-and-spoke” model to provide statewide stories to local news organizations, which in turn feed local stories back to the hub. He writes:
This “hub-and-spoke” model using statewide entities like Spotlight PA, VTDigger, Mississippi Today, Mountain State Spotlight, and many others provides a ready pathway to scale coverage to local cities and towns without building new organizations in every location. The hub provides the organizational support and wide distribution platform, maintaining a focus on Capitol and statewide stories, while the spokes focus on local stories, always with an eye toward what might be of interest to a statewide audience.
So far, so good. But then he adds: “To be clear, this approach won’t replace the heyday of local journalism, when every town council meeting, zoning meeting, and school board meeting was covered.” And yet that’s what’s desperately needed — and it’s exactly what’s being provided by the local projects I mention above.
Back in 2015, I interviewed Anne Galloway, the founder of VTDigger, a statewide site based in Vermont’s capital, Montpelier. At that time Digger was just beginning to expand into local coverage in Chittenden County, where Burlington is located, and Windham County, in the southern part of the state.
Digger has grown considerably since then. But in perusing the site, it seems clear that it’s stuck mainly to its original mission of providing first-rate investigative coverage of statewide issues, while occasionally branching out into local stories like the recent newspaper battle in Charlotte.
That’s as it should be. But real local journalism of the sort that covers “every town council meeting, zoning meeting and school board meeting,” as Baxter puts it, is perhaps the greatest unmet need today. Let’s let the regionals do what they do best — and keep pushing for local coverage of community life.
The horrifying execution of journalist James Foley raises an uncomfortable if familiar question: Is there anything to be gained by watching the video of his beheading at the hands of an ISIS terrorist?
It’s a question that I explored 12 years ago, when Wall Street Journal reporter Daniel Pearl was similarly murdered. I searched for the video online and found it at a website whose sick operators presented such fare for the entertainment of their disturbed viewers. I shared it with my friends at The Boston Phoenix, who — to my surprise — published several small black-and-white stills of Pearl’s beheading and provided a link to the full video. “This is the single most gruesome, horrible, despicable, and horrifying thing I’ve ever seen,” the Phoenix’s outraged publisher, Stephen Mindich, wrote in an accompanying editorial.
The Phoenix’s actions created a national controversy. I defended Mindich and editor Peter Kadzis, first in the Phoenix, later in Nieman Reports. (At the time I had left the paper to write my first book, though I continued to contribute freelance pieces. My departure turned out to be temporary. And Kadzis, my editor then, is also my editor now: he is the senior editor of WGBH News.) I wrote in the Nieman piece:
Daniel Pearl didn’t seek martyrdom, but martyrdom found him. The three-and-a-half-minute video shows us the true face of evil, an evil that manifested itself unambiguously last September 11…. We turn away from such evil at our peril.
I stand by what I wrote then, but I haven’t watched the execution of Jim Foley. In contrast to the Daniel Pearl footage, the Foley video is bright and clear, in high definition. I’ve watched a bit of it, listened to him speak while kneeling in the desert; but that was all I could handle.
Boston Globe columnist Jeff Jacoby takes a different view, writing, “The intrepid and compassionate reporter from New Hampshire didn’t travel to Syria to sanitize and downplay the horror occurring there. He went to document and expose it.”
I don’t disagree. But it should be a matter of choice. Gawker, among the first media outlets to post a link to the video, made sure its readers knew that what they would see if they clicked was “extremely disturbing.” By contrast, the New York Post and the Daily News published front-page images of Foley (I’ve linked to a Washington Post story, not the actual images) just before his beheading — in the New York Post’s case, barely a nanosecond before.
It’s a fine line, but I’d say Gawker was on the right side of it, and the New York tabloids were not.
At the time of his capture, Foley was freelancing for GlobalPost, the Boston-based international news organization. GlobalPost co-founder and chief executive Phil Balboni, in a tribute published in the Globe, wrote:
For those of us who knew Jim, the road ahead will be particularly long and trying. As a lifelong journalist, the path forward for me will be rooted in a renewed and profound respect for a profession that for Jim was not a job, but a calling.
We’ve learned a lot since the execution of Daniel Pearl. One of the things we’ve learned is that bearing witness does not necessarily lead to a good result. Years of war in Iraq and Afghanistan have not created a safer world.
Do we have a right to view the James Foley video? Of course. Twitter, a private company that has become a virtual public utility, is heading down a dangerous road by banning images from the video. Should we watch the video as a way of witnessing unspeakable evil, as Jeff Jacoby argues? That, I would suggest, should be up to each of us.
Above all, we should honor the bravery and sacrifice of journalists like Daniel Pearl and James Foley, who take risks most of us can scarcely imagine. Let’s keep the Foley family in our thoughts, and celebrate the safe return of Peter Theo Curtis. And let’s offer whatever good thoughts we can for Steven Sotloff, a fellow hostage of Foley’s who was threatened with death last week.
Toward the end of The Innovator’s Dilemma, Clayton Christensen’s influential 1997 book about why good companies sometimes fail, he writes, “I have found that many of life’s most useful insights are often quite simple.”
Indeed, the fundamental ideas at the heart of his book are so blindingly self-evident that, in retrospect, it is hard to imagine it took a Harvard Business School professor to describe them for the first time. And that poses a problem for Jill Lepore, a Harvard historian who recently wrote a scathingly critical essay about Christensen’s theories for the New Yorker titled “The Disruption Machine.” Call it the Skeptic’s Dilemma.
Christensen offers reams of data and graphs to support his claims, but his argument is easy to understand. Companies generally succeed by improving their products, upgrading their technology, and listening to their customers — processes that are at the heart of what Christensen calls “sustaining innovations.” What destroys some of those companies are “disruptive innovations” — crude, cheap at first, attacking from below, and gradually (or not) moving up the food chain. The “innovator’s dilemma” is that companies sometimes fail not in spite of doing everything right, but because they did everything right.
Some examples of this phenomenon make it easy to understand. Kodak, focusing its efforts on improving photographic film and paper, paid no attention to digital technology (invented by one of its own engineers), which at first could not compete on quality but which later swallowed the entire industry. Manufacturers of mainframe computers like IBM could not be bothered with the minicomputer market developed by companies like Digital Equipment Corporation; and DEC, in turn, failed to adapt to the personal computer revolution led by the likes of Apple and, yes, IBM. (Christensen shows how the success of the IBM PC actually validates his ideas: the company set up a separate, autonomous division, far from the mothership, to develop its once-ubiquitous personal computer.)
Christensen has applied his theories to journalism as well. In 2012 he wrote a long essay for Nieman Reports in collaboration with David Skok, a Canadian journalist who was then a Nieman Fellow and is now the digital adviser to Boston Globe editor Brian McGrory, and James Allworth, a regular contributor to the Harvard Business Review. In the essay, titled “Breaking News,” they describe how Time magazine began in the 1920s as a cheaply produced aggregator, full of “rip-and-read copy from the day’s major publications,” and gradually moved up the journalistic chain by hiring reporters and producing original reportage. Today, they note, websites like the Huffington Post and BuzzFeed, which began as little more than aggregators, have begun “their march up the value network” in much the same way as Time some 90 years ago.
And though Christensen, Skok, and Allworth don’t say it explicitly, Time magazine, once a disruptive innovator and long since ensconced as a crown jewel of the quality press, is now on the ropes — cast out of the Time Warner empire, as David Carr describes it in the New York Times, with little hope of long-term survival.
Lepore pursues two approaches in her attempted takedown of Christensen. The first is to look at The Innovator’s Dilemma as a cultural critic would, arguing that Christensen popularized a concept — “disruption” — that resonates in an era when we are all fearful of our place in an uncertain, rapidly changing economy. In the face of that uncertainty, notions such as disruption offer a possible way out, provided you can find a way to be the disruptor. She writes:
The idea of innovation is the idea of progress stripped of the aspirations of the Enlightenment, scrubbed clean of the horrors of the twentieth century, and relieved of its critics. Disruptive innovation goes further, holding out the hope of salvation against the very damnation it describes: disrupt, and you will be saved.
The second approach Lepore pursues is more daring, as she takes the fight from her turf — history and culture — to Christensen’s. According to Lepore, Christensen made some key mistakes. The disk-drive companies that were supposedly done in by disruptive innovators eating away at their businesses from below actually did quite well, she writes. And she claims that his analysis of the steel industry is flawed by his failure to take into account the effects of labor strife. “Christensen’s sources are often dubious and his logic questionable,” Lepore argues.
But Lepore saves her real venom for the dubious effects she says the cult of disruption has had on society, from financial services (“it led to a global financial crisis”) to higher education (she partly blames a book Christensen co-authored, The Innovative University, for the rise of massive open online courses, or MOOCs, of which she takes a dim view) to journalism (one of several fields, she writes, with “obligations that lie outside the realm of earnings”).
Christensen has not yet written a response; perhaps he will, perhaps he won’t. But in an interview with Drake Bennett of Bloomberg Businessweek, he asserts that it was hardly his fault if the term “disruption” has become overused and misunderstood:
I was delighted that somebody with her standing would join me in trying to bring discipline and understanding around a very useful theory. I’ve been trying to do it for 20 years. And then in a stunning reversal, she starts instead to try to discredit Clay Christensen, in a really mean way. And mean is fine, but in order to discredit me, Jill had to break all of the rules of scholarship that she accused me of breaking — in just egregious ways, truly egregious ways.
As for the “egregious” behavior of which he accuses Lepore, Christensen is especially worked up that she read The Innovator’s Dilemma, published 17 years ago, yet seems not to have read any of his subsequent books — books in which he says he continued to develop and refine his theories about disruptive innovation. He defends his data. And he explains his prediction that Apple’s iPhone would fail (a prediction mocked by Lepore) by saying that he initially thought it was a sustaining innovation that built on less expensive smartphones. Only later, he says, did he realize that it was a disruptive innovation aimed at laptops — less capable than laptops, but also cheaper and easier to carry.
“I just missed that,” he tells Bennett. “And it really helped me with the theory, because I had to figure out: Who are you disrupting?”
Christensen also refers to Lepore as “Jill” so many times that Bennett finally asks him if he knows her. His response: “I’ve never met her in my life.”
CHRISTENSEN’S DESCRIPTION of how his understanding of the iPhone evolved demonstrates a weakness of disruption theory: It’s far easier to explain the rise and fall of companies in terms of sustaining and disruptive innovations after the fact, when you can pick them apart and make them the subject of case studies.
Five years ago Clay Shirky wrote an eloquent blog post titled “Newspapers and Thinking the Unthinkable.” His essential argument was that we were only at the very beginning of trying to figure out new models for journalism following the cataclysmic changes wrought by the Internet — like Europeans in the decades immediately following the invention of Gutenberg’s press. Along with a subsequent talk he gave at Harvard’s Shorenstein Center, Shirky helped me frame the ideas that form the foundation of “The Wired City,” my book about online community journalism.
Now Shirky has written a rant. In “Nostalgia and Newspapers,” posted on Tuesday, the New York University professor and author wants us to know that we’re not getting it fast enough — that print is dead, and anything that diverts us from the hard work of figuring out what’s next is a dangerous distraction. His targets range from Aaron Kushner and his alleged apologists to journalism-school professors who are supposedly letting their students get away with thinking that print can somehow be saved.
As always, Shirky offers a lot to think about, as he did at a recent panel discussion at WGBH. I don’t take issue with the overarching arguments he makes in “Nostalgia and Newspapers.” But I do want to offer a countervailing view on some of the particulars.
1. Good journalism schools are not print-centric: Shirky writes that he “exploded” when he was recently asked by an NYU student, in front of the class, “So how do we save print?” I assume Shirky is exaggerating his reaction for effect. It wasn’t a terrible question, and in any case there was no reason for him to embarrass a student in front of her classmates. I’m sure he didn’t.
More important, Shirky takes the view that students haven’t given up on print because no one had given it to them straight until he came along to tell them otherwise. He writes that he told the students that “print was in terminal decline and that everyone in the class needed to understand this if they were thinking of journalism as a major or a profession.” And he attributed their nostalgic views to “Adults lying to them.”
Now, I find it hard to believe that Shirky’s take on the decline of print was novel to journalism students at a progressive institution like NYU. And from what I’ve seen from my own small perch within academia, all of us are looking well beyond print. In the new issue of Nieman Reports, Jon Marcus surveys changes in journalism education (including the media innovation program for graduate students headed by my Northeastern colleague Jeff Howe that will begin this fall). Citing a recent survey by Poynter, Marcus writes that, in many cases, j-schools are actually ahead of professional newsrooms in pushing for digital change:
A recent Poynter survey — which some argue demonstrates that educators are outpacing editors in their approaches to digital innovation — underlines the divide between j-schools and newsrooms. Educators are more likely than professional journalists to believe it’s important for journalism graduates to have multimedia skills, for instance, according to the survey Poynter released in April. They are more likely to think it’s crucial for j-school grads to understand HTML and other computer languages, and how to shoot and edit video and photos, record audio, tell stories with visuals, and write for different platforms.
Could we be doing better? No doubt. But we’re already doing a lot.
2. Aaron Kushner might have been on to something. OK, I’m pushing it here. There’s no doubt that Kushner’s moves after he bought the Orange County Register in 2012 have blown up in his face — the hiring spree, the launching of new daily newspapers in Long Beach and Los Angeles, the emphasis on print. Earlier this month, it all seemed to be coming to a very bad end, though Kushner himself says he simply needs time to retrench.
But Kushner’s ideas may not have been entirely beyond the realm of reality. Over the past several decades, great newspapers have been laid low by debt-addled chains trying to squeeze every last drop of profit out of them. This long-term disinvestment has had at least as harmful an effect on the news business as the Internet-driven loss of advertising revenues. Yes, Kushner’s love of print seems — well, odd, although it’s also true that newspapers continue to derive most of their shrinking advertising revenues from print. But investing in growth, even without a clear plan (or, rather, even with an ever-changing plan), strikes me as exactly what we ought to hope news(paper) companies will do. After all, that’s what Jeff Bezos is doing at The Washington Post and John Henry at The Boston Globe. And that’s not to say there won’t be layoffs and downsizing along the way.
Shirky also mocks Ryan Chittum of the Columbia Journalism Review and Ken Doctor, a newspaper analyst and blogger who writes for the Nieman Journalism Lab, writing that they “wrote puff pieces for Kushner, because they couldn’t bear to treat him like the snake-oil salesman he is.” (Shirky does concede that Chittum offered some qualifications.)
Chittum recently disagreed with me merely for writing that he had “hailed their [Kushner’s and his business partner Eric Spitz’s] print-centric approach.” It will be interesting to see whether and how he and Doctor respond to Shirky. I’ll be watching. Chittum has already posted this.
In any case, I hardly think it was “terrible” (Shirky’s description) for Chittum and Doctor to play down their doubts given that Kushner, a smart, seemingly well-funded outsider, claimed to have a better way.
Post-publication updates. After this commentary was published at WGBH News on Wednesday, the reactions, as expected, started rolling in. First up: Chittum, who apologized for his F-bomb, though not the sentiment behind it.
by the way, apologies to @cshirky for the f u thing. the “lying” line in that nasty piece, but that’s no excuse. #breathe
My two favorite stories about Jill Abramson both speak to her insistence on holding The New York Times to account. Those stories may help explain why she was removed as executive editor on Wednesday.
The first pertains to investor Steven Rattner, a friend of publisher Arthur Sulzberger Jr. who was being investigated by the Securities and Exchange Commission over a kickback scheme involving the New York State pension fund. (In November 2010 Rattner paid a $6.2 million settlement and accepted a two-year ban on some of his trading activities.)
According to The New Yorker’s Ken Auletta, Abramson — then the managing editor, serving as Bill Keller’s number two — didn’t hesitate to green-light a front-page investigative report on Rattner, the Sulzberger connection be damned. “What better test is there for an editor than how they handle the publisher’s best friend?” Auletta quoted an unnamed Times source as saying.
To Sulzberger’s credit, the incident didn’t prevent him from naming Abramson to succeed Keller in 2011. But what may have created an irreparable breach was a second, similar story. In 2012, Sulzberger chose Mark Thompson, the former director general of the BBC, to become chief executive officer of the New York Times Co. Before Thompson could begin, Abramson dispatched one of the Times’ top investigative reporters to look into whether Thompson had any role in the child-sex-abuse scandal whirling around Jimmy Savile, a once-popular TV host.
Both Thompson and Sulzberger were angry, reports Gabriel Sherman in New York magazine. A source was quoted as saying of Sulzberger: “He was livid, in a very passive aggressive way. These were a set of headaches Jill had created for Arthur.”
Now the Times’ internal top cop is off the beat. And Thompson, presumably, has a freer hand to enact his agenda — an agenda that is said to include, among other things, more online video and more native advertising, the term of art used to describe what used to be disparagingly referred to as “advertorials.”
Abramson’s successor and former number two, Dean Baquet, is now the paper’s first African-American executive editor, a not-insignificant milestone on a par with Abramson’s being the first woman. He is said to be a fine editor and a popular choice with the newsroom.
But given that Sulzberger’s own son recently wrote a report arguing that the Times isn’t moving quickly enough on the digital front, it might seem strange that Abramson’s successor would be someone regarded as even less digitally savvy than she. The likely explanation is that Thompson sees himself as the paper’s chief digital officer. Certainly Thompson does not lack for confidence. Less than a year ago he supposedly told a Times executive, “I could be the editor of the New York Times,” according to an article by Joe Hagan in New York magazine.
I don’t mean to play down any of the other reasons that have been given for Abramson’s abrupt and brutal dismissal. There is the matter of her brusque demeanor, described in detail last year by Dylan Byers of Politico. At the time I dismissed it as anonymously sourced sexism, but Byers is deservedly taking a victory lap this week.
Another factor was her complaints about making less money than Bill Keller did when he was editor, a story Ken Auletta broke within hours of Abramson’s dismissal. Auletta reported that Abramson even learned she made less than a male deputy managing editor when she was managing editor. The Times has denied all, although in language that makes it hard to figure out what, precisely, it is denying.
And then there is the incident that may have precipitated the final crisis — her reported attempts to hire Janine Gibson away from The Guardian to serve as a co-managing editor for digital without bothering to inform Baquet. Certainly that’s the angle that the Times’ David Carr and Ravi Somaiya play up in their own coverage of Abramson’s dismissal. (Other accounts say Gibson would have been a deputy managing editor, and thus presumably less of a threat to Baquet’s authority.)
“I think what it says to us is there is still enormous challenges for women out there, for women who assume those key and influential roles in journalism,” Melissa Ludtke, a pioneering sports journalist and former editor of Nieman Reports, told Politico’s Anna Palmer.
I think it’s more complicated than that. It is nevertheless a fact that in the past few years Sulzberger has fired two of the highest-ranking women in the newspaper business — first Janet Robinson, creating the vacancy that Mark Thompson later filled, and now Abramson.
In addressing the staff Wednesday, Sulzberger referred to “an issue with management in the newsroom.” That’s not good enough. And it’s not the kind of accountability Abramson pushed for in covering the powerful institution that she worked for. I hope we’ll learn more in the days ahead.
Harvard Business School professor Clayton Christensen has an important article in The Boston Globe today on the disruptive changes coming to higher education, arguing that the fading away of MOOCs (massive open online courses) will amount to nothing more than a temporary reprieve for the old way of doing things.
Ultimately, Christensen and his co-author Michelle Weise argue, college and university administrators will have to deal with “disruptive innovations” coming from the outside as they find that their high and increasing costs are unsustainable.
But what I find at least as interesting as Christensen’s views on education is connecting the dots between him and the Globe. Consider:
In the fall of 2012, Christensen and two co-authors — David Skok and James Allworth — wrote the cover story for Nieman Reports, “Breaking News,” on the challenges facing the news business in a time of disruptive innovation.
Last October, John Henry, shortly after completing his purchase of the Globe, wrote a piece for his new paper outlining his vision — and citing Christensen’s oft-repeated mantra that business leaders should think in terms of “jobs to be done.”
In an exchange of emails with Boston magazine earlier this year, Henry expressed admiration for Christensen and Skok, adding, “I’m not sure it is necessarily up to the disrupted to be disruptive as a strategy, but virtually everything these days is subject to disruption.”
Given that context, Christensen’s appearance in today’s Globe would appear to be a side effect of the “jobs to be done” thinking that has already permeated John Henry’s news organization.
I’m pretty excited about this. Nine years ago Andrew Solomon, winner of the National Book Award, blurbed my book on dwarfism, “Little People.” He also interviewed me at the 2003 Little People of America conference for his next project — a book about families whose children were different from their parents, whether they be disabled, gay or suffering from mental illness, to name just a few examples.
Naturally, I’m trying to figure out how this might benefit “Little People.” Although it’s officially out of print, I sell a high-quality self-published paperback. (You can read about how that came about in a piece I wrote for Nieman Reports.) So far I’ve taken a few small steps: I’ve removed the free online edition (except for the Introduction and Chapter One) and made it easier to buy a copy. As you can see in the right-hand column, I’ve pumped up its presence on Media Nation. And I’m going to try Google ads again, at least through Christmas.
Anyone have any other ideas? Are there any independent bookstores in the area that would be interested in carrying it?
I’ll be speaking at the National Writers Union’s annual book party this Sunday, Jan. 22, which is being held from 2 to 5 p.m. in Central Square. Details here. My subject will be the new world of self-publishing, which I wrote about recently for Nieman Reports. Hope to see you there.
I’ve got an essay in the new issue of Nieman Reports on how technology enabled me to revive “Little People,” my 2003 memoir on raising a daughter with dwarfism — online at first, and then later as a print-on-demand paperback.
Today is the 100th birthday of Marshall McLuhan, the Canadian scholar who forever changed the way we think about media and their effects on the human psyche.
Last week I sat down for a conversation with Len Edgerly, host of “The Kindle Chronicles,” on what McLuhan would think about the Kindle, the iPad, and what effects e-readers would have on our perception of text, reading and linearity. The interview grew out of my recent review of Douglas Coupland’s McLuhan biography for Nieman Reports.
Len and I had great fun, and I hope you’ll have a chance to give it a listen.