Over the weekend Apple removed software from its Chinese App Store that enabled iPhone users to get around censorship laws in that country. The action was widely portrayed as a blow to those working for freedom and human rights in China. And it seemed especially tawdry following as it did the recent death of Nobel Peace Prize winner Liu Xiaobo while in Chinese custody.
But I would argue that Apple did the right thing. My intention is not to write a love letter to Apple, whose leadership, I’m sure, was motivated more by commerce than by conscience. Nevertheless, Apple’s decision was a welcome example of Americans dealing with the world as it is rather than as they wish it to be. Our values are not everyone’s values.
Last week Zack Beauchamp of Vox explained on the public radio program “On the Media” why liberals want to believe in outlandish conspiracies about President Trump. “One expert I spoke to on political misinformation said that conspiracy theories were a weapon of the weak,” he said. “They were a way to understand and make sense out of the world when it doesn’t seem to make sense to you or seems hostile to you.”
Beauchamp was referring specifically to the ridiculous drivel promoted by Louise Mensch, a former British parliamentarian whose disinformation campaign has taken in a few Trump critics who should have known better. (A sample: Trump, Vice President Mike Pence, and House Speaker Paul Ryan were all about to be arrested because of their ties to Russia.)
But I think Beauchamp’s insight is also useful in thinking about a couple of other theories making the rounds among liberals who are trying to explain why a boorish lout like Trump won: his campaign’s use of big data, funded by the shadowy Mercer family, and the proliferation of dubious pro-Trump websites and bot-controlled Twitter accounts.
It’s hard to imagine a less likely viral video sensation than Republican congressman Jim Sensenbrenner of Wisconsin. But there he was last week, all 73 years of him, wagging his finger at a constituent concerned about online privacy and telling her, “Nobody’s got to use the internet.”
Sensenbrenner’s lecture was a clarifying moment in the debate over the future of online privacy and digital democracy. After eight years of the Obama administration, whose telecommunications policies were more often than not in the public interest, President Trump and his Republican allies are rushing headlong into a future that is of, by, and for the telecom companies. It’s a debate that hasn’t gotten nearly as much attention as it should — and that could set the tone for how we communicate with one another for at least a generation.
The web—or, as we used to call it, the World Wide Web—is 25 years old this month. On August 6, 1991, Sir Tim Berners-Lee, who had outlined his idea for the web two years earlier, published the first website. It was, as the Telegraph put it, “a basic text page with hyperlinked words that connected to other pages.”
Those of us who were there at the beginning understood that this was a big deal. Even so, the revolution it launched could not have been imagined. As Virginia Heffernan put it in her recent book Magic and Loss: The Internet as Art, “The Internet is the great masterpiece of human civilization.” And the web provides the road map that makes the internet navigable.
Magic and Loss: The Internet as Art (Simon & Schuster, 272 pages, $26) is an honest-to-God book, with paper, ink, and a binding. (Or so I’ve heard. I downloaded the Kindle version.) Reading it, though, feels more like randomly browsing the web than it does like reading a book.
Look here: An essay on the aesthetics of Instagram and Flickr photography. Click. An argument that closed apps offer a better—yet more elitist—experience than the open web. Click. A discussion of epigrammatic poetry demonstrating that its most influential practitioners would have been right at home on Twitter. (Blaise Pascal’s “Do you wish people to believe good of you? Don’t speak,” published in 1669, takes up just 58 characters.) Click. Newton Minow did all of us a favor by calling television “a vast wasteland,” since it imbued the young medium with the transgressive quality that all great art needs. Click.
But if Heffernan offers us a lot of little ideas, she has a big one as well: that the Internet giveth, and it taketh away. At 46, she isn’t quite a digital native, though she’s certainly more of one than I am. Perhaps more relevant is that she’s been around just long enough to experience the digital revolution in its many forms. The good and bad of life online is clearer to her than it would be to someone 20 years younger.
“The Internet is the great masterpiece of human civilization,” she writes, adding: “As an idea it rivals monotheism.” But even monotheism has its drawbacks. In her chapter on music, for instance, she offers a compelling argument on what has been lost as music was transformed from performers on a stage to tinny, ultracompressed sounds that you listen to on your smartphone. (Click. A diversion into the rise of military headphones in World War II and how returning veterans embraced them as a way to listen to music while tuning out the rest of the family.)
Of course, the assertion that MP3s offer sound quality inferior to the CDs and LPs that preceded them is hardly novel. But Heffernan gives it an unexpected twist, writing that she bought her first iPod around the time of the 9/11 terrorist attacks, and that she welcomed a mechanical tone untethered from the messy reality of how music is actually supposed to sound. (But Norah Jones? Really?) Only later did she realize that she missed “the echo of the chirp of the bassist’s sneakers on the wooden stage as he nervously kicks his foot or the sound of the backup singer’s lungs still metabolizing pot smoke.”
There is more, much more—on the humanistic orientation of technologists like Steve Jobs versus the cold rationality of scientists; on the aesthetic differences between electricity (“the province of the engineer and the rationalist”) and electronics (“the province of the irrationalist, the deconstructionist, the druggie, and the mystic”).
Heffernan ties these disparate strands together in a closing chapter that starts off as annoyingly self-indulgent but ends with a measure of humility and grace. She traces her development as an Episcopalian-turned-Jew-turned-Episcopalian (with detours into something like atheism); as someone who rejected philosophy in favor of literary criticism (she has a Ph.D. in English from Harvard); and as the author of a widely mocked 2013 essay titled “Why I’m a Creationist,” whose ethos (“They say it works even if you don’t believe in it,” she writes, quoting a physicist Twitter friend) remains a guidepost for her.
I started out reading Magic and Loss hoping to glean some ideas that would be useful for my work as a journalist and academic who writes about journalism. What I encountered was an extended meditation on the nature of art and God, on immortality and death. Heffernan has written a book that is by turns frustrating and insightful—and that always aims high.
At Universal Hub, cybah analyzes two Huffington Post articles by Bruce Kushnick, executive director of the New Networks Institute, and concludes that Boston Mayor Marty Walsh may have gotten taken for a ride by Verizon.
Let me cut to the chase: Apparently Verizon’s $300 million deal to provide FiOS broadband service to homes and businesses in Boston is a lot less than it seems. Instead, Verizon may be planning to install wireless transmitters on utility poles around the city for two reasons: (1) it costs a whole lot less and (2) it would allow the company to avoid being regulated as a full-fledged cable provider.
I have not delved into this deeply. This is more of an assignment-desk post: well worth following up by local journalists.
More: Additional context from local media and technology activist Saul Tanenbaum, writing for the Boston Institute for Nonprofit Journalism.