Kara Swisher Is Sick of Tech People, So She Wrote a Book About Them

In her new memoir, Burn Book, Kara Swisher cites a 2014 profile that dubbed her “Silicon Valley’s Most Feared and Well-Liked Journalist.” She might prefer to downplay the first and emphasize the second. Some people would switch that around. But there is no dispute about Swisher’s impact: When it comes to tech punditry, she’s at the top of the heap.

No tech journalist has built a bigger brand for herself. Her three-decade career is a study in hard work and uncommon confidence. She rose from being a reporter at The Washington Post to The Wall Street Journal’s internet reporter and then, in her biggest leap, to cofounder of the All Things D Conference and website with her revered mentor, tech reviewer Walt Mossberg. In one of their most famous interviews, she and Mossberg moderated a blissfully convivial joint session with lifetime rivals Bill Gates and Steve Jobs in 2007 that brought many in the audience to tears. Swisher and Mossberg left the Journal in 2013 and started the successful Code conference, with Swisher heading its companion news site, Recode. Her interviews can be tough, the most famous being with Mark Zuckerberg in 2010, when he was so rattled by the way Swisher and Mossberg pressed him on privacy that he literally sweated through his hoodie. In addition to interviewing the entire tech CEO pantheon, Swisher has tossed questions at figures in politics and culture—Hillary Clinton, Kim Kardashian, Maria Ressa, and so on. All the while Swisher has broken plenty of news, fueled by her deep sources.

In the past few years, she has mastered the podcast medium with two hits—On With Kara Swisher, an interview show, and Pivot, with business professor Scott Galloway—as well as a coveted stint hosting HBO’s Succession podcast. Swisher also had a short, high-profile run as a New York Times op-ed columnist. She’s played herself on Silicon Valley and The Simpsons. Her current affiliations are with Vox and New York magazine, and she is a permanent panelist on The Chris Wallace Show, a CNN Saturday morning talkfest.

Despite the title, Burn Book is less a scorched-earth exposé than a primer for Swisher newbies and those who want to know the tech world from an insider perspective. On her podcasts she loves to riff on the big trouble she’s courting by revealing the skeletons in tech’s closet, but for her regular listeners there’s little in Burn Book that they won’t have already heard. (She explains that the title is a play on her Mean Girls reputation, a reference to the book of rumors written by the movie’s high school bullies, and that the cover shot of her face with her trademark Ray-Bans, a raging inferno reflected in the lenses, is kind of a joke.) In the memoir, Swisher slashes her way through the tech world like John Wick with a word processor, vanquishing vain CEOs and clueless legacy media bosses and emerging without a scratch. Those humbled bros include Elon Musk, a former pal who’s now a nemesis. But unlike Musk, who Swisher says recently declared her an “asshole,” most of the tech world still, well, likes and fears her. Other journalists dream of interviewing the likes of OpenAI CEO Sam Altman. At one stop on Swisher’s book tour, Altman is slated to interview her.

During my afternoon with Swisher at her house in a tony neighborhood in northwest Washington, DC, she took frequent breaks for fond exchanges with three of her four children, her wife, Amanda Katz (an editor at The Washington Post), and her ex-wife, Megan Smith, a former US chief technology officer, who dropped in. Our conversation, though, was feisty, as we talked about her storied career, why she abandoned the conference business and The New York Times, and how she answers the charge that she’s mean.

Steven Levy: What prompted you to write a memoir?

Kara Swisher: I didn’t want to. Jonathan Karp, the publisher of Simon & Schuster, bugged me for years to write something. I was much more interested in the blogs or the podcasts or whatever. I never really liked writing my books. The process was so slow. And I’d had enough of these [tech] people. I don’t like most of them anymore. I didn’t want to reflect on them. I’m sick of them. They’re sick of me. And Walt Mossberg was supposed to write his memoir, right?

The One Internet Hack That Could Save Everything

The impact on the public sphere has been, to say the least, substantial. In removing so much liability, Section 230 forced a certain sort of business plan into prominence, one based not on uniquely available information from a given service, but on the paid arbitration of access and influence. Thus, we ended up with the deceptively named “advertising” business model—and a whole society thrust into a 24/7 competition for attention. A polarized social media ecosystem. Recommender algorithms that mediate content and optimize for engagement. We have learned that humans are most engaged, at least from an algorithm’s point of view, by rapid-fire emotions related to fight-or-flight responses and other high-stakes interactions. In enabling the privatization of the public square, Section 230 has inadvertently rendered impossible deliberation between citizens who are supposed to be equal before the law. Perverse incentives promote cranky speech, which effectively suppresses thoughtful speech.
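The engagement argument is easier to see in miniature. Below is a deliberately crude sketch, not any real platform’s code: the class, fields, and weights are all invented for illustration. It shows a recommender whose only objective is predicted engagement, and because high-arousal posts reliably generate reactions, an objective like this ends up rewarding exactly the inflammatory content described above.

```python
# Hypothetical sketch of an engagement-only feed ranker.
# All names, fields, and weights here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float     # model's estimate of click-through
    predicted_reactions: float  # likes, angry reacts, replies, shares
    arousal: float              # 0..1, how emotionally charged a classifier thinks it is

def engagement_score(post: Post) -> float:
    # The objective counts only attention. High-arousal, fight-or-flight
    # content scores well because it reliably provokes reactions;
    # nothing here rewards accuracy, civility, or deliberation.
    return post.predicted_clicks + 2.0 * post.predicted_reactions * (1.0 + post.arousal)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Most engaging first: the 24/7 competition for attention.
    return sorted(posts, key=engagement_score, reverse=True)
```

Swap in a different objective and the ranking changes with it; the point is that the incentive lives in the scoring function itself, not in any individual moderation decision.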

And then there is the economic imbalance. Internet platforms that rely on Section 230 tend to harvest personal data for their business goals without appropriate compensation. Even when data ought to be protected or prohibited by copyright or some other method, Section 230 often effectively places the onus on the violated party through the requirement of takedown notices. That switch in the order of events related to liability is comparable to the difference between opt-in and opt-out in privacy. It might seem like a technicality, but it is actually a massive difference that produces substantial harms. For example, workers in information-related industries such as local news have seen stark declines in economic success and prestige. Section 230 makes a world of data dignity functionally impossible.

To date, content moderation has too often been beholden to the quest for attention and engagement, regularly disregarding the stated corporate terms of service. Rules are often bent to maximize engagement through inflammation, which can mean doing harm to personal and societal well-being. The excuse is that this is not censorship, but is it really not? Arbitrary rules, doxing practices, and cancel culture have led to something hard to distinguish from censorship for the sober and well-meaning. At the same time, the amplification of incendiary free speech for bad actors encourages mob rule. All of this takes place under Section 230’s liability shield, which effectively gives tech companies carte blanche for a short-sighted version of self-serving behavior. Disdain for these companies—which found a way to be more than carriers, and yet not publishers—is the only thing everyone in America seems to agree on now.

Trading a known for an unknown is always terrifying, especially for those with the most to lose. Since at least some of Section 230’s network effects were anticipated at its inception, it should have had a sunset clause. It did not. Rather than focusing exclusively on the disruption that axing 26 words would spawn, it is useful to consider potential positive effects. When we imagine a post-230 world, we discover something surprising: a world of hope and renewal worth inhabiting.

In one sense, it’s already happening. Certain companies are taking steps on their own, right now, toward a post-230 future. YouTube, for instance, is diligently building alternative income streams to advertising, and top creators are getting more options for earning. Together, these voluntary moves suggest a different, more publisher-like self-concept. YouTube is ready for the post-230 era, it would seem. (On the other hand, a company like X, which leans hard into 230, has been destroying its value with astonishing velocity.) Plus, there have always been exceptions to Section 230. For instance, if someone enters private information, there are laws to protect it in some cases. That means dating websites, say, have the option of charging fees instead of relying on a 230-style business model. The existence of these exceptions suggests that more examples would appear in a post-230 world.

2054, Part VI: Standoff at Arlington

18:46 April 15, 2054 (GMT‑5)

Arlington National Cemetery

That night in her apartment Julia Hunt ordered in sushi and watched the coverage of Slake’s botched press conference on her living room sofa. Days later, Slake’s panicked responses to the questions about Castro’s death continued to air, and they appeared even worse on the news.

Hunt raised a piece of salmon sashimi between two chopsticks as she read the chyron for the next story: Castro Autopsy Leaked on Common Sense Confirms Foul Play and White House Lies. She dropped the fish onto her lap.

News of the withheld autopsy exploded. On every channel the prime-time anchors flashed printed copies of the report to the camera. They read whole sections aloud, describing the dimensions of the marble-sized mass of cells inexplicably lodged in Castro’s aorta and quoting the transcript of the autopsy itself, in which the chief internist concluded, “This can’t be the same heart.”

Within the hour, Truthers flooded the streets in cities around the country. As Hunt scrolled the channels, a news crew in Lafayette Park was conducting interviews with the growing mass of protesters, one of whom she recognized; it was the man in the wheelchair she’d met on the Metro. She had thought of him often. Now she learned his identity: retired gunnery sergeant Joseph William Sherman III. Beneath his name on the screen were the words Truther Volunteer Organizer. She entered his name in a search engine and learned that he’d lost his legs in the Spratly Islands and that the Chinese nuclear attack on San Diego had killed his wife and three daughters, who’d lived at nearby Camp Pendleton. Hunt could hear in Sherman’s voice how deeply he resented a president who, while alive, flouted constitutional norms by clinging to power for an attempted fourth term and whose successor, Smith, now flouted norms again by withholding an autopsy and refusing to be transparent about his predecessor’s death.

“Point your camera here,” said Sherman, thumbing toward his missing legs. “I sacrificed these for my country, and you’re going to lie to me … you’re going to lie to all of us.” He gestured expansively to a cluster of Truthers who’d placed him at their center, the core of them veterans, wearing old military fatigues adorned with medals that dangled from their chest pockets. “It’s a lie that Smith is the legitimate president when he so clearly had a hand in killing Castro. Is this what America has become? Dreamers drunk on power led by a dictator-president. Lies to the many so long as it gives power to the few.” Sherman held the camera’s focus with his insistent blue eyes.

His tone was so resolved that the correspondent felt compelled to answer him. In a meek voice, she said, “I don’t know.”

“Of course you don’t.” Sherman leaned into the camera. “President Smith,” he began, “you are illegitimate. You will find that everyday Americans—we patriots who demand the truth about your crimes and the excesses of the Dreamers—will not be led by a thief, by someone who stole the presidency. We served our country before, and we’ll serve it again. And don’t even think of trying to place your predecessor in Arlington’s hallowed ground.” Sherman swiveled around, turning his back to the camera, and wheeled himself away.

The news cut to commercial.

Julia Hunt rested her head against the arm of her sofa, her eyes still glued to the screen. Weeks of exhaustion swept over her. While she waited for the program to return, she fell into a black wilderness of sleep. Deep into this sleep, in the early hours of the morning, she began to dream: Here, in the dream, she is asleep in her girlhood bedroom and is woken before dawn by a noise, the sound of something hitting the floor. Her surroundings are familiar, the adobe ranch house in New Mexico where Sarah Hunt had raised her. Wearing her nightgown, she carefully shuts the door behind her and steps into the dark corridor. At its far end a single band of light escapes from the base of another door. She begins to walk down the corridor. The tiles are cool beneath her bare feet. As she draws closer, she can hear what sounds like a struggle.

The Real Reason Steph Curry Is So Damn Good

In looking over the footage for the doc, was there anything that surprised you?

Curry: I guess the one thing that surprised me was how bad my first college game was. Because I tell the story—I’ve told the story all the time. Like you saw, you can hear, “He had 13 turnovers in this game.” And Coach had to make a decision, do I keep playing him or bench him? He could have made or broken my college career at that moment. But it was worse than I remember.

How do you go about forgiving yourself for a bad performance or a bad mistake?

Curry: It’s easier to move on to the next thing as long as you’re not cheating the process. In terms of learning the lessons you need to learn, you need to be honest with yourself, vulnerable with yourself. I know human nature is powerful, the mind is a powerful thing. You can’t be afraid of failure, you can’t be afraid of the negative outcome.

Coogler: I try to do analysis. If I have a failure, I see if there was any points where I could have done better. Was it anywhere where I had an inkling and went against it? Or I didn’t do something I knew I should have done?

I got Panther shot in maybe 100 or 117 days, or something like that. Not all of them 117 days I was efficient. So, after a day when we didn’t really get what we needed, it’s like alright, cool, well, what happened? Did you not have a shot list? Did you not talk to the actors? Some days you might get rained out. You can’t control that. And like Steph’s saying, you gotta be in honesty with yourself like, man, did I do everything I was supposed to do? Could I have been better? But the thing is, you got to get excited and say, “I’ma fix it tomorrow.”

Peyton: So, I’ma be real.

Yeah, please. I hope everyone’s being real.

Peyton: There’s this irrational confidence that both Steph and I have.

Curry: You’re a maverick.

Peyton: In my day-to-day life, I’m constantly examining how I can be a better husband, father, all those things. As a producer, I think the idea is to work as hard as you can to make this thing better. But once that thing is there, to me, it’s beautiful. It’s almost like a baby coming out. No matter the scars or whatever, to me it is beautiful. Because, before, that thing did not exist. So now that it exists, it is beautiful.

Steph has said that faith is an important part of his life. The documentary, in that vein, feels almost spiritual.

Curry: There’s the old saying that I’m not smacking people over the head with the Bible or trying to force anybody to adopt a belief. It’s about identifying, what makes you unique? What do you tap into? That is like a superpower.

Coogler: In a way for me, like, [long, long pause] it’s like, the film is like, constantly in conversation with fate.

Curry: You said “fate”?

Coogler: Fate. Like, I think about, what if that didn’t happen? What if Coach McKillop didn’t leave him in that [first college] game? And didn’t play him in the second game? I think he was signaling to Steph and to Davidson, and to everybody, “I didn’t pick him up for a player he is going to be. I picked him up for right now. [Bangs the table] You don’t sit on a bench man. ‘I’ll put you back in when you ready?’ No. Right now. You’re ready right now, even if you think you not ready, I’m gonna show you that you are.” And that? Well, that changed basketball. Not benching that freshman changed the way we play basketball.

The AI-Powered, Totally Autonomous Future of War Is Here

A fleet of robot ships bobs gently in the warm waters of the Persian Gulf, somewhere between Bahrain and Qatar, maybe 100 miles off the coast of Iran. I am on the nearby deck of a US Coast Guard speedboat, squinting off what I understand is the port side. On this morning in early December 2022, the horizon is dotted with oil tankers and cargo ships and tiny fishing dhows, all shimmering in the heat. As the speedboat zips around the robot fleet, I long for a parasol, or even a cloud.

The robots do not share my pathetic human need for shade, nor do they require any other biological amenities. This is evident in their design. A few resemble typical patrol boats like the one I’m on, but most are smaller, leaner, lower to the water. One looks like a solar-powered kayak. Another looks like a surfboard with a metal sail. Yet another reminds me of a Google Street View car on pontoons.

These machines have mustered here for an exercise run by Task Force 59, a group within the US Navy’s Fifth Fleet. Its focus is robotics and artificial intelligence, two rapidly evolving technologies shaping the future of war. Task Force 59’s mission is to swiftly integrate them into naval operations, which it does by acquiring the latest off-the-shelf tech from private contractors and putting the pieces together into a coherent whole. The exercise in the Gulf has brought together more than a dozen uncrewed platforms—surface vessels, submersibles, aerial drones. They are to be Task Force 59’s distributed eyes and ears: They will watch the ocean’s surface with cameras and radar, listen beneath the water with hydrophones, and run the data they collect through pattern-matching algorithms that sort the oil tankers from the smugglers.
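To make the “pattern matching” concrete, here is a toy sketch of the kind of filter that might sit at the end of such a pipeline. The track fields, thresholds, and scoring below are my assumptions for illustration only, not Task Force 59’s actual software.

```python
# Illustrative only: a rule-based pass over fused sensor tracks that flags
# suspicious small craft for a human analyst. Fields and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class Track:
    vessel_id: str
    avg_speed_knots: float
    transponder_on: bool    # broadcasting an AIS identity?
    loiter_hours: float     # time spent idling outside shipping lanes
    dark_transits: int      # crossings made with the transponder off

def suspicion_score(t: Track) -> float:
    score = 0.0
    if not t.transponder_on:
        score += 2.0                # "dark" vessels draw attention
    if t.loiter_hours > 6.0:
        score += 1.5                # idling near a boundary, unlike a tanker on a schedule
    score += 0.5 * t.dark_transits  # repeated dark runs compound suspicion
    if t.avg_speed_knots > 30.0:
        score += 1.0                # fast small craft, not a laden tanker
    return score

def flag_for_review(tracks: list[Track], threshold: float = 2.5) -> list[Track]:
    # The algorithm only triages; a person decides what, if anything, to do next.
    return [t for t in tracks if suspicion_score(t) >= threshold]
```

The real systems presumably draw on far richer inputs, the cameras, radar, and hydrophones described above, but the division of labor is the same idea: machines watch everything, people look at what gets flagged.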

A fellow human on the speedboat draws my attention to one of the surfboard-style vessels. It abruptly folds its sail down, like a switchblade, and slips beneath the swell. Called a Triton, it can be programmed to do this when its systems sense danger. It seems to me that this disappearing act could prove handy in the real world: A couple of months before this exercise, an Iranian warship seized two autonomous vessels, called Saildrones, which can’t submerge. The Navy had to intervene to get them back.

The Triton could stay down for as long as five days, resurfacing when the coast is clear to charge its batteries and phone home. Fortunately, my speedboat won’t be hanging around that long. It fires up its engine and roars back to the docking bay of a 150-foot-long Coast Guard cutter. I head straight for the upper deck, where I know there’s a stack of bottled water beneath an awning. I size up the heavy machine guns and mortars pointed out to sea as I pass.

The deck cools in the wind as the cutter heads back to base in Manama, Bahrain. During the journey, I fall into conversation with the crew. I’m eager to talk with them about the war in Ukraine and the heavy use of drones there, from hobbyist quadcopters equipped with hand grenades to full-on military systems. I want to ask them about a recent attack on the Russian-occupied naval base in Sevastopol, which involved a number of Ukrainian-built drone boats bearing explosives—and a public crowdfunding campaign to build more. But these conversations will not be possible, says my chaperone, a reservist from the social media company Snap. Because the Fifth Fleet operates in a different region, those on Task Force 59 don’t have much information about what’s going on in Ukraine, she says. Instead, we talk about AI image generators and whether they’ll put artists out of a job, about how civilian society seems to be reaching its own inflection point with artificial intelligence. In truth, we don’t know the half of it yet. It has been just a day since OpenAI launched ChatGPT, the conversational interface that would break the internet.