This Time, Irony Might Really Be Dead

After terrorist attacks on New York and Washington shocked people around the world on September 11, 2001, it seemed to many Americans that everything had changed. The government promised a rapid response to counter the unexpected national security threats, and the public rallied behind the president. Some members of the nation’s intelligentsia also viewed the traumatic attack as a cultural turning point. Roger Rosenblatt of Time and Graydon Carter of Vanity Fair suggested that U.S. culture had reached the “end of irony.” These critics (among others) thought 9/11 would cause Americans to return to the serious-mindedness of prior eras, putting an end to the political and cultural frivolousness of the prosperous 1990s. 

Their diagnosis soon appeared premature. Once the shock of the attacks wore off, defenders of the ironic sensibility mocked the pronouncement, and developments in the years that followed proved the skeptics right. Like partisan politics, culture wars, and pop-culture frivolity, the ironic posture of cultural mavens was stalled only briefly by 9/11 and returned stronger than ever. Irony reached a new apex in the form of “hipster culture” during Barack Obama’s first term. Hipsters would ironically wear T-shirts of “uncool” bands and products that they disliked, a sartorial gesture that signaled their wit and hipness to those sophisticated enough to “get the joke.” The ability to mockingly appropriate the lowbrow became a highbrow status marker.

However, with the economic recession and the revival of economic and racial tensions during the Obama years, irony became less compatible with the cultural zeitgeist. Google Trends indicates that searches for “hipster” peaked in the early 2010s and have been in steady decline ever since; by 2015, some outlets covering the hipster movement were declaring it dead. This likely signals a decline in irony’s centrality to our cultural conversation. During the 2010s, grassroots anger on the Right led to the Tea Party movement and popular conspiracy theories about President Obama’s true allegiances. The Left responded not just with “Daily Show”-style satire and mockery, but also with serious-minded (if not entirely successful in the short term) activist movements such as Occupy Wall Street and Black Lives Matter.

Continue reading This Time, Irony Might Really Be Dead

The Garden of Interconnected Delights

This blog has often been critical of the current state of the tech industry. It has critiqued the negative impact that new technological developments are having on modern society. However, today I would like to discuss the upsides of a wired world.

The generic “smart person’s criticism” of the digital revolution goes: “We all know that the Internet is great on balance, but a few major problems need to be addressed.” These articles then discuss only the problems, without explaining what is so obviously “great.” I often suspect that digital critics, perhaps afraid of being perceived as neo-Luddites or knee-jerk reactionaries, shy away from following their beliefs to their logical conclusion: that the negative consequences of the digital revolution may have outweighed its positive impacts for society.

This post’s discussion of the wonders of the Internet is not an attempt to prove why the Internet has been more good than bad. The aggregate impact of our technologies may be negative, and the question of how to weigh that impact is quite complicated. Nevertheless, aspects of our wired world have excited me and filled me with wonder during the Internet era.

Continue reading The Garden of Interconnected Delights

The Silicon Backlash: How the Tech Industry Went From Media Darling to Political Scapegoat

American culture is experiencing an unprecedented backlash against Silicon Valley and the wired world that it has provided us. The tech giants are being slammed by criticism from all sides. On the Left, they are criticized for aiding the spread of anti-Hillary fake news and for fostering income inequality. On the Right, they are criticized for allegedly enforcing “political correctness” and preventing their workers from expressing non-liberal ideas. This has led even parties on the usually anti-regulatory conservative side of the American political spectrum, from relatively mainstream Fox News blowhards to racist alt-right extremists, to come out in favor of regulating digital platforms like utilities.

It’s interesting to consider how we got here. The Internet has existed for most of my adult life, changing the workplace, popular culture, and social life in profound ways. Yet for most of that period there was surprisingly little political commentary or social criticism about the Internet’s transformative effects. When television first emerged as a dominant medium, it received some scorn from cultural critics for being allegedly mindless and lowbrow. The rise of the Internet was an even more profound change than the rise of television – no one was doing financial transactions or finding love over their TV sets, after all – and yet its emergence did not seem to elicit the same amount of cultural angst.

There are some good reasons for this. Television was closed, centralized, and unilateral, with a few networks dominating its early years. The Internet was open, decentralized, and interactive. If someone accused the Internet of being a “vast wasteland” of lowbrow culture and poor-quality writing, you could respond by advising them to create and upload superior content. The Internet quickly became far too diverse and vast to be subject to any such generalizations. Any topics you might be interested in, and millions of others you had no interest in, were being discussed and documented somewhere online. Individual users’ experiences were totally self-customizable. Furthermore, they could participate in online discussions, which seemed to be a major improvement over television, which did not allow you to talk back to it (at least, it was far less responsive when you tried).

Continue reading The Silicon Backlash: How the Tech Industry Went From Media Darling to Political Scapegoat

A Few Thoughts on Bots

Anne Applebaum, the award-winning historian and Washington Post columnist, has been at the forefront of covering the plethora of nefarious Russian-orchestrated cyberattacks that have sought to distort political outcomes around the globe. Long before the 2016 presidential election, she made Americans aware of the Russian government’s online efforts to propagandize and deceive.

Her recent column notes that bad actors have perpetuated substantial online mischief and fraud by exploiting the difficulty we all face in distinguishing humans from automatons on the Internet. Robotic online mobs that pose as humans and target real people represent a threat from “artificial intelligence” that few dystopian science-fiction stories anticipated, but it is a growing real-world problem.

Continue reading A Few Thoughts on Bots

Should the Next US President Be a Supercomputer?

There is a lot of hype right now about the mechanization of the workplace and the extent to which human jobs will be replaced by machines. At the extreme end of the spectrum is Artificial Intelligence: machines that, even if not self-aware, will be intelligent and insightful enough to make decisions and take initiative rather than relying on programmers to give them precise instructions.

It may reflect the professional-class bias of our news media that there seems to be a lot more attention and concern over this issue now that white-collar jobs, in addition to blue-collar jobs, are potentially threatened by new technology. LegalZoom has created anxiety among lawyers that mass-online lawyering operations will put small firms and solo practitioners out of business. Mass-produced online lectures threaten teachers and professors in academia.

Now a provocative article by Michael Linhorst in Politico reveals that some tech-utopians believe that even the most high-profile white-collar executive position in the world, the United States President, could be replaced and improved by computer technology. Linhorst describes the proponents’ ideal as a superhuman supercomputer: “The president would more likely be a computer in a closet somewhere, chugging away at solving our country’s toughest problems. Unlike a human, a robot could take into account vast amounts of data about the possible outcomes of a particular policy. It could foresee pitfalls that would escape a human mind and weigh the options more reliably than any person could—without individual impulses or biases coming into play. We could wind up with an executive branch that works harder, is more efficient and responds better to our needs than any we’ve ever seen.”

Continue reading Should the Next US President Be a Supercomputer?