Hoyt gets results

New York Times public editor Clark Hoyt, writing on Sunday:

Susan Chira, the foreign editor, … acknowledged that the paper had used “excessive shorthand” when referring to Al Qaeda in Mesopotamia. “We’ve been sloppy,” she said. She and other editors started worrying about it, Chira said, when the American military began an operation in mid-June against what it said were strongholds of Al Qaeda in Mesopotamia.

On Thursday, she and her deputy, Ethan Bronner, circulated a memo with guidelines on how to distinguish Al Qaeda in Mesopotamia from bin Laden’s Al Qaeda.

New York Times reporter David Sanger, writing today:

Officials say that Mr. Gates has been quietly pressing for a pullback that could roughly halve the number of combat brigades now patrolling the most violent sections of Baghdad and surrounding provinces by early next year. The remaining combat units would then take up a far more limited mission of training, protecting Iraq’s borders and preventing the use of Iraq as a sanctuary by Al Qaeda in Mesopotamia, a Sunni Arab extremist group that claims to have an affiliation with Osama bin Laden’s network, though the precise relationship is unknown.

Brown unbound

Rachel Sklar has a good interview in the Huffington Post with Aaron Brown, who’s finally free from his CNN contract and, thus, can (1) talk and (2) look for a job. Brown is one of my favorite TV journalists, but, unfortunately, he comes across as diffident about returning to the trenches. Then again, Brown tends to sound diffident about everything, which is one of his quirky charms.

Wikipedia, probability and community

I’m sure I’m far from the only person to have a conflicted relationship with Wikipedia. Yes, I know that at least one prominent study found that the user-produced encyclopedia is about as accurate as the venerable Britannica. I also know that Wikipedia can be mind-bogglingly wrong — sometimes only for a few minutes or a few days, until an adult (I’m talking about maturity, not age) undoes someone else’s vandalism. But that’s not much consolation if you’re the victim of that bad information.

I tell my students that Wikipedia can be a great starting point, but that they should use it to find more-authoritative sources of information, not cite it in their papers. As for me, well, I’ve been known to link to Wikipedia articles, but I try to be careful, and I try to keep it to a minimum.

This week’s New York Times Magazine includes a worthwhile story by Jonathan Dee on the emergence of Wikipedia as a news source. Dee reports on a small army of activists (one is just 16) who jump in with summaries of major news events even as they are unfolding. These activists come across as admirably dedicated to the idea of fair, neutral content; many look for vandalism after a major news event takes place, such as the death of the Rev. Jerry Falwell, a favorite target of those who opposed his homophobic, right-wing views. (Yes, if I wrote that on Wikipedia, someone would edit those descriptions out.) But I would still have a nagging sense that something might be very wrong.

So what is the real difference between Wikipedia and a more traditional encyclopedia such as the Britannica? It’s not just the notion that anonymous and pseudonymous amateurs write and edit Wikipedia articles, whereas Britannica relies on experts. That’s certainly part of it, although if that were the entire explanation, Wikipedia would be worthless.

The more important difference is the idea of community-based, bottom-up verification (Wikipedia) versus authority-based, top-down verification (Britannica). Each has its purpose. The question is why the community-based model works — or at least works often enough that Wikipedia is a worthwhile stop on anyone’s research quest.

To that end, I want to mention a couple of ideas I’ve run across recently that help explain Wikipedia. The first is a 2006 book by Wired editor Chris Anderson, “The Long Tail,” in which he suggests that the accuracy of Wikipedia rests on probability theory rather than direct verification. The more widely read a Wikipedia article is, the more likely it is to be edited and re-edited, and thus to be more accurate and comprehensive than even a Britannica article. But you never know. Anderson writes (I’m quoting from the book, but this blog post captures the same idea):

Wikipedia, like Google and the collective wisdom of millions of blogs, operates on the alien logic of probabilistic statistics — a matter of likelihood rather than certainty. But our brains aren’t wired to think in terms of statistics and probability. We want to know whether an encyclopedia entry is right or wrong. We want to know that there’s a wise hand (ideally human) guiding Google’s results. We want to trust what we read.

When professionals — editors, academics, journalists — are running the show, we at least know that it’s someone’s job to look out for such things as accuracy. But now we’re depending more and more on systems where nobody’s in charge; the intelligence is simply “emergent,” which is to say that it appears to arise spontaneously from the number-crunching. These probabilistic systems aren’t perfect, but they are statistically optimized to excel over time and large numbers. They’re designed to “scale,” or improve with size. And a little slop at the microscale is the price of such efficiency at the macroscale.

Anderson is no Wikipedia triumphalist. He also writes: “[Y]ou need to take any single result with a grain of salt. Wikipedia should be the first source of information, not the last. It should be a site for information exploration, not the definitive source of facts.”

Right now I’m reading “Convergence Culture” (2006), by Henry Jenkins, director of the Comparative Media Studies Program at MIT. Jenkins, like Anderson, offers some insight into that clichéd phrase “the wisdom of the crowd,” and why it often works. Jenkins quotes the philosopher Pierre Lévy, who has said of the Internet, “No one knows everything, everyone knows something, all knowledge resides in humanity.” Jenkins continues:

Lévy draws a distinction between shared knowledge, information that is believed to be true and held in common by the entire group, and collective intelligence, the sum total of information held individually by the members of the group that can be accessed in response to a specific question. He explains: “The knowledge of a thinking community is no longer a shared knowledge for it is now impossible for a single human being, or even a group of people, to master all knowledge, all skills. It is fundamentally collective knowledge, impossible to gather together into a single creature.” Only certain things are known by all — the things the community needs to sustain its existence and fulfill its goals. Everything else is known by individuals who are on call to share what they know when the occasion arises. But communities must closely scrutinize any information that is going to become part of their shared knowledge, since misinformation can lead to more and more misconceptions as new insight is read against what the group believes to be core knowledge.

Jenkins is writing not about Wikipedia but about an online fan community dedicated to figuring out the winners and losers on CBS’s “Survivor.” But the parallels to Wikipedia are obvious.

I’ve sometimes joked that the madness of the mob must turn into the wisdom of the crowd when you give everyone a laptop. The Jenkins/Lévy model suggests something else — “shared knowledge” defines a mob mentality; “collective intelligence” is the wisdom of the crowd. At its best, that’s what drives Wikipedia.

The 10th annual Muzzle Awards

It seems hard to believe, but today marks the 10th annual edition of the Phoenix Muzzle Awards. In 1998, at the suggestion of Harvey Silverglate, I began compiling an annual Fourth of July roundup of outrages against free speech and civil liberties in New England.

This year, for the second year in a row, Mitt Romney leads the pack. This time it’s for refusing to provide security last September at a Harvard speech by former Iranian president Mohammad Khatami — a routine request, but the then-governor decided to make a grandstanding play instead. If the Boston Police Department had not stepped forward so that Khatami could deliver his address, Romney would have handed the reformist Khatami’s enemies back home a considerable victory.

There’s also some breaking Muzzle news. In the last item, I single out the headmaster of Boston’s English High School, Jose Duarte, for placing longtime substitute teacher Jeffrey Herman on a “do not call” list — retaliation, according to Herman, for his speaking out against the city’s $1.2 million Junior ROTC program. Just yesterday, the ACLU of Massachusetts announced (PDF) that the city would pay a $15,000 settlement to Herman without admitting any wrongdoing on Duarte’s part.

A controversy over a 2006 Muzzle was recently resolved as well. Last year I criticized the Massachusetts State Police for threatening a Leominster political activist named Mary T. Jean for posting on the Web a streaming video of a man being arrested in his home. The video — captured by a “baby cam” in the arrestee’s home — had been posted with his permission, but the state-police troopers somehow saw it as a violation of their rights.

On June 22, the U.S. Court of Appeals for the First Circuit ruled in favor of Jean. As media lawyer Robert Ambrogi reported on his blog:

The court ruled that the First Amendment prevents law enforcement officials from interfering with an individual’s Internet posting of an audio and video recording of an arrest and warrantless search of a private residence, even though the individual had reason to know the recording was made illegally.

The principle here is particularly important, because Jean used her Web site to criticize then-Worcester County district attorney John Conte, and because she claimed the video showed troopers assigned to Conte’s office making a warrantless arrest. This is political speech, pure and simple, and thus deserving of the highest level of First Amendment protection.

Photo of Romney (cc) by MyTwistedLens. Some rights reserved.