The Mediterranean Sea runs the length of the 25-mile stretch of coastal land that makes up Gaza; for many people there, it is a lifeline. The poet Mahmoud Darwish understood those waters as the sole possession of Palestinians, writing in his poem “The Strangers’ Picnic”: “I will embrace a wave and say: Take me to the sea again. This is what the fearful do: when a burning star torments them, they go to the sea.”
Wafa Al-Udaini grew up entranced by that same sea. The waters offered relief for overheated bodies, a backdrop for gatherings, fish to grill for dinner. When the Israeli military destroyed clean-water wells and desalination plants, it became essential in a different way: It was a place to wash dishes, do laundry and, occasionally, under extreme duress, even drink.
When Al-Udaini became a journalist, she traced the story of Israel’s occupation of Gaza by its impact on the water and, in turn, on the lives of those who depended on it. This was also her way of ensuring that the story of Palestinians was not told only by the number of civilians killed or bombs dropped.
She wrote about the fishermen who relied on catching anchovies and sardines to sell and feed their families. It was a dangerous profession, she wrote, mainly because of the risk of being fired on by Israeli naval gunners. The men she interviewed told her that the Israeli blockade limited their access to equipment for their boats, leading them to replace their motors with truck engines, which can result in capsizing. And yet many of them returned despite the precarity. As one fisherman told her: “I love the sea and its smell. It’s an indescribable thing.” In another article, she wrote about the reduction of the sea’s water pollution, which had been caused by Israeli airstrikes on sewage networks. She interviewed a woman named Sabah who was elated to go surfing for the first time in years.
Al-Udaini was raised in the city of Deir al-Balah, in the middle of the Gaza Strip, by a large extended family. In 2007, she completed her English studies at Al Aqsa University. She quickly became frustrated by Western media’s coverage of Gaza as a place of violence, extremism and poverty. She resolved to be a voice from within, telling the stories of its pleasures alongside its hardships. She wrote for smaller outlets to start, eventually working her way up to regular bylines in The Guardian and The Middle East Monitor. In 2009, she founded an organization called 16th October to train young writers and activists in Gaza to work with English-language organizations. (The group was named for the day the U.N. began to review a report that found evidence that both the Israeli military and Hamas committed war crimes in the 2008-9 Gaza conflict.)
Al-Udaini documented the way cultural traditions could become peaceful weapons of defiance. In 2018, she covered the Great March of Return, which started out as weekly demonstrations demanding the end of the Israeli blockade and the right of return for refugees and ended up stretching on for over a year. She marveled at the endurance of those who kept attending the protests despite losing limbs after being hit by rubber bullets and tear-gas canisters, describing them as “the image of sumoud,” Arabic for steadfastness. Once, she observed people in an exuberant folk dance called the dabka; another time, she was moved by the sight of elders holding the keys to their former homes that they were forced to leave in 1948.
Al-Udaini became a frequent commentator on Palestine Chronicle TV, which airs on YouTube. She always wore a niqab, even though she felt it risked impeding her ambitions to become a prominent voice in the West. Her eye shadow often matched the fabric of her head scarf, pale pink or shimmery opal, eyes flashing as she delivered her reports. Her voice was often raspy, carrying the strain of someone deprived of sleep and time.
After a 2023 interview with a particularly hostile British journalist was picked up by the Israeli media, Al-Udaini shared with friends and colleagues that she was receiving threats. Eventually she moved to a town outside Deir al-Balah, where she and her husband built a two-story house surrounded by fields and farms. They loved it, even though electricity and running water had been cut off.
On Aug. 7, Al-Udaini published her last article, about outrage over the deaths of Ismael Alghoul and Rami Alrifi, two Al-Jazeera journalists who were killed by an Israeli drone strike while in a car marked “PRESS.” A little over a month later, Al-Udaini’s house was hit by an Israeli airstrike, killing her; her husband; their 5-year-old daughter, Balsam; and their 7-month-old son, Tamim. She is survived by her other two children, Malek and Siraj, who now live with her parents. Their whereabouts are unknown.
Al-Udaini once wrote an article about the catastrophic loss of Gaza’s almond trees. The Israeli military forced farmers to uproot acres of trees — and the rest were damaged by the contaminated water supply. Almond blooms signify spring and form the basis of local dishes and even children’s games. “They grew so well in Palestine that when asked how they are, locals would reply, ‘Almond!’” she wrote. “It was a sign of goodness, health, greatness. No longer.”
Jenna (J) Wortham is a staff writer for the magazine who has written about wellness apps and how the pandemic changed the internet.
In November 1979, less than four months after he was inducted into the Hall of Fame, Willie Mays was banned from Major League Baseball. He was 48, six years into retirement, but he was still one of the most recognizable athletes on the planet. He still appeared on talk shows and network sitcoms. He still got mobbed in restaurants. Kids who weren’t alive to watch him play still practiced his basket catch in the backyard and memorized his iconic numbers: the 660 career home runs; the .301 lifetime batting average; the 12 Gold Gloves in center field; the 24 All-Star game appearances (a feat possible only because M.L.B. hosted two All-Star games per year from 1959 to 1962, and Mays played in all of them).
But he was also deep in debt, which was nothing new for Mays, except now he was no longer being paid like a superstar. For much of the 1960s, Mays was baseball’s highest-paid player; in 1970, his salary was $135,000, or about $1.1 million today. And he always parted easily with his money. He gave it away to kids in his neighborhood after stickball games in the street. He lent it to friends he knew would never pay him back. He paid his housekeeper’s income taxes on top of her salary, even as he was in arrears to the I.R.S. himself. He also liked nice things: cars, clothes, furniture, houses. After his first marriage ended in (costly) divorce, he bought a multilevel home built into the side of a steep slope overlooking the Golden Gate Bridge, then added a spiral staircase running from the living room to the garage level so he could get to his convertible without going outside.
In the fall of 1979, Mays was making $50,000 a year as a good-will ambassador for the New York Mets, the last team he played for, when the Bally’s Park Place Casino Hotel in Atlantic City offered him a 10-year contract for $100,000 per year to spend 10 days a month at the casino as a celebrity greeter. Sign autographs, take pictures, tell stories, play golf with the high rollers. Be Willie Mays. Mays didn’t gamble, or drink, and even in retirement being Willie Mays depleted him, but he was always good at it, and he needed the money.
Today Major League Baseball has a lucrative partnership deal with FanDuel, the sports-betting platform, but in 1979, the baseball commissioner Bowie Kuhn insisted that the entire sport would be tarnished by any kind of paid arrangement between a gambling operation and a baseball legend. Kuhn forced Mays to choose between Bally’s and the Mets, and if he chose Bally’s, he would have to accept banishment from M.L.B. — no employment of any kind, no appearances on the field at Giants games, or the World Series, or the Hall of Fame. If he wanted to attend a game, he’d have to buy a ticket. Choosing Bally’s was a rare act of defiance for Mays, a peacemaker by temperament who hated causing a fuss. But he felt disrespected by Kuhn. “They had no cause to go and dump me like that,” he told a Washington Post reporter who visited him at Bally’s in 1980, a year into the job. “Baseball needs people like me.”
Bally’s put Mays’s picture on its poker chips — along with his jersey number, 24, but no mention of the Giants or any other trademarked imagery — and his casino handlers packed his schedule, making sure to get their money’s worth out of him. Invariably patrons would bring up “the Catch” — Mays’s game-saving over-the-shoulder grab at the Polo Grounds during Game 1 of the 1954 World Series — and he always said the same thing: that he had made several better catches before TV cameras were commonplace, that he knew he had it all the way (notice how he taps his thigh with his glove as he’s sprinting back for the ball) and that the best part of the Catch wasn’t the catch itself but how fast he spun around and fired the ball back to the infield, preventing the tiebreaking run from advancing past third base.
Mays played so hard on the field and heaped so much pressure on himself to perform like a god that his body sometimes had to remind him he was just a man. At least twice in the middle of his career, he collapsed during games from exhaustion and spent the next few days recovering in a hospital bed. It happened again while he was working for Bally’s, onstage in front of 400 kids at a junior high school near Atlantic City. The school principal later told reporters that Mays “just collapsed like a Slinky toy.” This time he was unconscious for 15 minutes before being revived. Everyone thought he was dead. He was rushed to a hospital but then discharged just two hours later, and by the weekend, he was back to posing for photos and regaling gamblers.
Mays got some company in exile in 1983, when the Yankee legend Mickey Mantle accepted a similar job at Claridge Hotel and Casino, prompting Kuhn to ban him too. Twice Mays petitioned Kuhn to reconsider his banishment, but the commissioner didn’t budge. When Kuhn stepped down in 1984, though, his successor, Peter Ueberroth, moved quickly to end what had become an embarrassment for M.L.B., lifting the ban during spring training in 1985. Mays and Mantle, the new commissioner declared, were “exceptions to the current guidelines.” They were, he said, “more a part of baseball than perhaps anyone else.” The time had come to resume treating them like it.
Devin Gordon is a writer based in Massachusetts. He is the author of “So Many Ways to Lose: The Amazin’ True Story of the New York Mets, the Best Worst Team in Sports.”
Legend has it that Shelley Duvall was hosting a party in Houston in 1970 when a couple of her guests, crew members on Robert Altman’s new film, thought it would be good to introduce her to Altman. She was only 20 and had never acted before, but that hardly mattered; the wonder of early Altman is that everyone — even the familiar faces — seemed to be doing it for the first time. Not a lot of people saw the film she was eventually cast in, “Brewster McCloud,” but those of us who did felt as if we were experiencing the shock of the new. Keith Carradine would articulate the feeling four years later when, gazing at Duvall in Altman’s 1974 film, “Thieves Like Us,” he says, “I never seen nobody like you before.”
It was hard to miss the promise in her sweetly alert gaze. She seemed to be ushering us into the new decade, offering a preview of the emotional openness and sexual freedom that potentially lay ahead. After a decade in which we had to spend a little too much time with Julie Andrews and Doris Day, we were now being confronted by the sexual assertiveness of Glenda Jackson in “Women in Love” and Barbara Hershey in “Last Summer” or the slow-burning anger of Carrie Snodgress in “Diary of a Mad Housewife.” Duvall’s distinction lay in couching her assertiveness in something softer, less threatening, that still let you know she could see right through any potential romantic partner and fully expected him not to measure up to much.
After discovering in “Brewster McCloud” that the boy (Bud Cort) she has effortlessly seduced is a virgin, she says, “That means that I’m responsible for you from now on.” Lest that seem a little too soft, she was ready to remind you that she was nobody’s pushover. In the same movie, she concludes the story of an attempted rape by blithely intoning: “I hit him with a lug wrench.” Pauline Kael got it right when she wrote that Duvall “seems able to be herself on the screen in a way that nobody has ever been before.”
Women found something liberating in Duvall. She offered them permission to violate the conventions of how a woman was supposed to look and act: the spider-web eyelashes she sported at the outset, the endlessly long legs landing in checkerboard knee socks and platform sandals in “Nashville.” Her pouty, off-center beauty was perfect for 1970s Hollywood in its attempt to make stars out of actors whose looks were more real.
For a while, it must have seemed as if the sky was the limit. Duvall’s development into a serious actress led to a nuanced performance in Altman’s 1977 film, “3 Women.” Playing an insecure, delusional health care worker, she finally got to show some hard metal, layering it so skillfully with vulnerability that she won best actress at Cannes that year. You might have expected parts like Norma Rae to come her way.
Instead, she got Stanley Kubrick and “The Shining.” Her performance in that film is often described as “iconic”; it’s the one most people know her by. But it amounted to a kind of dead end. According to Duvall, Kubrick cast her because “he said I was great at crying.” That’s mostly what she does in “The Shining”: cry and scream, while Jack Nicholson comes after her with an ax. But who wanted to see Shelley Duvall doing that? If you had been rooting for her since “Brewster McCloud,” you could be forgiven for wanting to shout at the screen, “Shelley, where’s that lug wrench?” In its own way, “The Shining” seemed to betray the early-’70s promise that by the end of the decade women were going to have something better to do than run from thuggish, violent men.
Duvall’s experience during the grueling 56-week “Shining” shoot seems to have done her in. She had nice things to say about Kubrick, but they alternated with darker thoughts: “I will never give that much again. If you want to get into pain and call it art, go ahead, but not with me.” In the ’80s, she played smaller parts before turning to producing the admirable “Faerie Tale Theatre” for television. Then, in the ’90s, she left Hollywood and, in the company of her musician partner, Dan Gilroy, became again what she started out as: a Texas girl. She moved to Blanco, a Hill Country town with a diner moonlighting as a bowling alley and a used-book store heavy on the works of Fred Gipson (author of “Old Yeller”). “I wouldn’t say I became a recluse,” she said. “I just took time out.” The time stretched to 30 years, punctuated by rumors of mental illness brought about by a sensationalized appearance on the “Dr. Phil” show. She seems, in her later years, to have most enjoyed driving around in her Toyota 4Runner, rarely getting out of the car.
Local restaurateurs took care of her, taking food to her vehicle. And fans kept coming to the Hill Country, hoping for “Shelley sightings.” My own took place a dozen years ago, in a restaurant in Johnson City, not far from Blanco. I was finishing my cherry pie when a woman I didn’t recognize came in, little bells dangling from her boots, demanding that she be served “no organ meat.” The eccentricity and assertiveness should have tipped me off. When I asked my waitress who she was, she whispered, “I don’t know, but I hear she used to be a really big movie star.”
Anthony Giardina is the author, most recently, of the plays “The City of Conversation” and “Dan Cody’s Yacht.”
b. 1935/1928
Phil Donahue and Ruth Westheimer
Before the answers to life’s questions fit in our pocket, you used to have to turn a dial. If you were lucky, Phil Donahue would be on, ready to guide you toward enlightenment. In a stroke of deluxe good fortune, Dr. Ruth Westheimer might have stopped by to be the enlightenment. He was the search engine. She was a trusted result.
Donahue hailed from Cleveland. The windshield glasses, increasingly snowy thatch of hair, marble eyes, occasional pair of suspenders and obvious geniality said “card catalog,” “manager of the ’79 Reds,” “Stage Manager in a Chevy Motors production of ‘Our Town.’” Dr. Ruth was Donahue’s antonym, a step stool to his straight ladder. She kept her hair in a butterscotch helmet, fancied a uniform of jacket-blouse-skirt and came to our aid, via Germany, with a voice of crinkled tissue paper. Not even eight years separated them, yet so boyish was he and so seasoned was she that he read as her grandson. (She maybe reached his armpit.) Together and apart, they were public servants, American utilities.
Donahue was a journalist. His forum was the talk show, but some new strain in which the main attraction bypassed celebrities. People — every kind of them — lined up to witness other people being human, to experience Donahue’s radical conduit of edification, identification, curiosity, shock, wonder, outrage, surprise and dispute, all visible in the show’s televisual jackpot: cutaways to us, reacting, taking it all in, nodding, gasping. When a celebrity made it to the “Donahue” stage — Bill Clinton, say, La Toya Jackson, the Judds — they were expected to be human, too, to be accountable for their own humanity. From 1967 to 1996, for more than 6,000 episodes, he permitted us to be accountable to ourselves.
What Donahue knew was that we — women especially — were eager, desperate, to be understood, to learn and learn and learn. We call his job “host” when, really, the way he did it, running that microphone throughout the audience, racing up, down, around, sticking it here then here then over here, was closer to “switchboard operator.” It was “hot dog vendor at Madison Square Garden.” The man got his steps in. He let us do more of the questioning than he did — he would just edit, interpret, clarify. Egalitarianism ruled. Articulation, too. And anybody who needed the mic usually got it.
The show was about both what was on our mind and what had never once crossed it. Atheism. Nazism. Colorism. Childbirth. Prison. Rapists. AIDS. Chippendales, Chernobyl, Cher. Name a fetish, Phil Donahue tried to get to its bottom, sometimes by trying it himself. (Let us never forget the episode when he made his entrance in a long skirt, blouse and pussy bow for one of the show’s many cross-dressing studies.) Now’s the time to add that “Donahue” was a morning talk show. In Philadelphia, he arrived every weekday at 9 a.m., which meant that, in the summers, I could learn about compulsive shopping or shifting gender roles from the same kitchen TV set as my grandmother.
Sex and sexuality were the show’s prime subjects. There was so much that needed confessing, correction, corroboration, an ear lent. For that, Donahue needed an expert. Many times, the expert was Dr. Ruth, a godsend who didn’t land in this country until she was in her late 20s and didn’t land on television until she was in her 50s. Ruth Westheimer arrived to us from Germany, where she started as Karola Ruth Siegel and strapped in as her life corkscrewed, as it mocked fiction. Her family most likely perished in the Auschwitz death camps after she was whisked to the safety of a Swiss children’s home, where she was expected to clean. The twists include sniper training for one of the military outfits that would become the Israel Defense Forces, maiming by cannonball on her 20th birthday, doing research at a Planned Parenthood in Harlem, single motherhood and three husbands. She earned her doctorate from Columbia University, in education, and spent her postdoc researching human sexuality. And because her timing was perfect, she emerged at the dawn of the 1980s, an affable vector of an era’s craze for gnomic sages (Zelda Rubinstein, Linda Hunt, Yoda), masterpiece branding and the nasty.
Hers was the age of Mapplethorpe and Madonna, of Prince, Skinemax and 2 Live Crew. On her radio and television shows, in a raft of books and a Playgirl column and through her promiscuous approach to talk-show appearances, she aimed to purge sex of shame, to promote sexual literacy. Her feline accent and jolly innuendo pitched, among other stuff, the Honda Prelude, Pepsi, Sling TV and Herbal Essences. (“Hey!” she offers to a young elevator passenger. “This is where we get off.”) The instructions for Dr. Ruth’s Game of Good Sex say it can be played by up to four couples; the board is vulval and includes stops at “Yeast Infection,” “Chauvinism” and “Goose Him.”
On “Donahue,” she is direct, explicit, dispelling, humorous, clear, common-sensical, serious, vivid. A professional therapist. It was Donahue who handled the comedy. On one visit in 1987, a caller needs advice about a husband who cheats because he wants to have sex more often than she does. Dr. Ruth tells Donahue that if the caller wants to keep the marriage, and her husband wants to do it all the time, “then what she should do is to masturbate him. And it’s all right for him to masturbate himself also a few times.” The audience is hear-a-pin-drop rapt or maybe just squirmy. So Donahue reaches into his parochial-school-student war chest and pulls out the joke about the teacher who tells third-grade boys, “Don’t play with yourself, or you’ll go blind.” And Donahue raises his hand like a kid at the back of the classroom and asks, “Can I do it till I need glasses?” Westheimer giggles, maybe noticing the large pair on Donahue’s face. This was that day’s cold open.
They were children of salesmen, these two; his father was in the furniture business, hers sold what people in the garment industry call notions. They inherited a salesman’s facility for people and packaging. When a “Donahue” audience member asks Westheimer whether her own husband believes she practices what she preaches, she says this is why she never brings him anywhere. “He would tell you and Phil: ‘Do not listen to her. It’s all talk,’” which cracks the audience up.
But consider what she talked about — and consider how she said it. My favorite Dr. Ruth word was “pleasure.” From a German mouth, the word conveys what it lacks with an American tongue: sensual unfurling. She vowed to speak about sex to mass audiences using the proper terminology. Damn the euphemisms. People waited as long as a year and a half for tickets to “Donahue” so they could damn them, too. But of everything Westheimer pitched, of all the terms she precisely used, pleasure was her most cogent product, a gift she believed we could give to others, a gift she swore we owed ourselves.
I miss the talk show that Donahue reinvented. I miss the way Dr. Ruth talked about sex. It’s fitting somehow that this antidogmatic-yet-priestly Irish Catholic man would, on occasion, join forces with a carnal, lucky-to-be-alive Jew to urge the exploration of our bodies while demonstrating respect, civility, reciprocation. They believed in us, that we were all interesting, that we could be trustworthy panelists in the discourse of being alive. Trauma, triviality, tubal ligation: Let’s talk about it! Fear doesn’t seem to have occurred to them. Or if it did, it was never a deterrent. Boldly they went. And with her encouragement, boldly we came.
Wesley Morris is a critic at large for The New York Times and a staff writer for the magazine.
When James Earl Jones was a small child, he stopped talking. He was born in Arkabutla, Miss., and learned to love its gullies and rises and deep ravines and the loam enriched by the Mississippi River and the way it felt against his bare feet. His parents had left him to his maternal grandparents, and when he was 5 or 6 they moved to Dublin, Mich., reluctantly taking the boy with them. He was devastated. The move opened opportunities for the family, but was a profound rupture for the young Jones. “The move from Mississippi to Michigan was supposed to be a glorious event,” he remembered in his 1993 memoir, “Voices and Silences.” “For me it was a heartbreak.”
Jones would later characterize himself as mute. It’s not that he couldn’t speak, but when he did it was with a terrible stutter that he developed after moving to Michigan. Before strangers, he wouldn’t muster his words. “I talked to my family in basic terms,” he wrote. His grandmother allowed him to skip church, the only place that would force him to struggle through words during those years.
Though he refused to talk with other people, his time in Mississippi taught him to listen. “Out in the country, with few books or strangers, and no such thing as television,” he wrote, “we depended on the stories we knew.” His grandmother told bedtime stories about “women cursed to wear the heads of mules and men who had bellies full of writhing snakes.” His family spoke of his great-grandparents, Wyatt Connolly and Sharlett Jeeter, and the 300 acres of Mississippi land they cultivated.
When Jones was in high school, a poet and former college professor named Donald Crouch persuaded him to break his silence. After the teacher introduced him to the poems of Henry Wadsworth Longfellow, Jones wrote an ode to grapefruit patterned after Longfellow’s “Song of Hiawatha.” In what might have been a bid to make the boy speak, Crouch said the poem was so good that it must have been plagiarized — a notion Jones could only dispel by reading it aloud. It turns out that poetry, like song, can help stutterers overcome disfluency, and in performance Jones found himself able to speak. Jones went on to participate in school competitions and recitals, sometimes reading lines of Edgar Allan Poe’s verse. As he discovered his voice, he gravitated toward anything that required him to use it: public speaking, debate, oratory contests.
After graduating from high school, he began his studies at the University of Michigan as a pre-med student. But medicine was simply a profession any first-generation college student might choose in 1949, not the vehicle for Jones’s self-expression. After struggling with chemistry and physics, he decided to become an English major because it was the closest thing to studying drama. Though he spent four years in college, he left without a degree and served in the military before eventually returning to school to study drama in New York City at the American Theater Wing, supporting himself through odd jobs. (He returned to college and got his degree in 1955.)
To discover the lower frequencies of the Mississippi in his timbre, he spent hours learning to strip away his Southern accent. Years spent in silence gave Jones an ear not just for what people conveyed with words but also for all the accumulated meaning of voice, cadence and articulation. He believed that understanding those details was the key to knowing a character.
One of his earliest stage roles was as Othello in a summer production of Shakespeare’s play in Michigan — a performance that foreshadowed his successful run in the early 1960s as a fixture of the New York Shakespeare Festival (now Shakespeare in the Park). After the director Stanley Kubrick saw him in a production of “The Merchant of Venice,” Jones earned his first film role in Kubrick’s “Dr. Strangelove.”
As his stature grew, Jones would come to embody a strange paradox: He was synonymous with a voice that he wouldn’t allow himself to hear. The risk? If he listened to himself, he might get caught up in his emotions and succumb to that old stutter. Instead he learned to listen for and discover the voices of his characters. In Howard Sackler’s 1968 play “The Great White Hope,” Jones portrayed a lightly fictionalized version of Jack Johnson, the champion boxer whose defeat of several white competitors and public relationships with white women made him a target for racial harassment. Roles like that allowed Jones to tell the multitudinous stories of Black men.
His voice, arguably the most memorable in cinema, was as flexible as it was indelible, the tool he used to transform himself from a poet into an actor. Despite that, he remained a chameleon, embodying the authority of fatherhood in so many guises: the resplendent King Jaffe Joffer in “Coming to America”; the heartbroken, embittered and abusive Troy Maxson in August Wilson’s “Fences”; the stern and upright Mufasa in “The Lion King”; and the foreboding Darth Vader in the “Star Wars” franchise.
Jones never imagined himself as representing that kind of authority, despite the unmistakable baritone that sounded as if it came from the hollowest part of a drum. For him, that was always the point. His discovery of language allowed him to understand in a different way the vital need humans have for hearing and telling stories.
Reginald Dwayne Betts is a contributing writer for the magazine and the founding chief executive of Freedom Reads.
In late July 1961, 14-year-old Paul Auster was hiking with a group of boys from his sleepaway camp when a storm stole over the horizon. Lightning “danced around us like spears,” Auster later recalled, turning “everything a bright ghostly white” and striking a camper named Ralph. Trapped in a meadow, far from the safety of camp, Auster stood watch over Ralph’s blue and cooling body, using his finger to prevent the boy from swallowing his own tongue. “In spite of the mounting evidence,” Auster said, “it never occurred to me that he wasn’t going to come around.”
Death had marked Auster, and not for the final time. He had already mourned the sudden loss of one grandmother (heart attack) and would soon mourn the slow demise of the other to A.L.S., a disease, Auster observed, that seemed to leave its victims with “no hope, no remedy, nothing in front of you but a prolonged march towards disintegration. …” Later there were the deaths of his mother and father; a horrific car crash in which the lives of his second wife, Siri Hustvedt, and their daughter, Sophie, were somehow spared (“We should be dead,” Auster later said); and finally the passing of his 10-month-old granddaughter Ruby, and his son, Daniel, who overdosed in 2022. By then, Auster had long since decided that “the world was capricious and unstable, that the future can be stolen from us at any moment, that the sky is full of lightning bolts that can crash down and kill the young as well as the old, and always, always, the lightning strikes when we are least expecting it.”
Sadness permeated Auster’s work like storm water: In his novels and memoirs — and in a range of collected and uncollected essays — rarely is anyone left dry. “Paul was extremely interested in the idea of the hero who is cast into a new world by grief,” Hustvedt says. “He used that device a lot: the stripped person. The person who has lost their most profound connections to the world. And I’d argue that it goes all the way back to ‘The New York Trilogy.’”
Published in installments in the mid-1980s by a small California press, and in its entirety in 1987 by the British house Faber and Faber, the “Trilogy” is the series that made Auster famous — the three works he is still best known for today. Rereading it, you encounter all the themes that he would return to over the course of his career: the role of chance and dumb luck in human existence; the dark secrets we keep from those around us; the persistent suspicion that somewhere else, somewhere out of view, a different version of ourselves is living a life that both resembles and diverges from ours; and over it all, the pall of tragedy. Introducing Quinn, the character at the center of the first book, “City of Glass,” Auster writes that “he had once been married, had once been a father, and that both his wife and son were now dead.” Grief drives Quinn to do the things he does, even if he’s not always aware of it; and grief nudges Quinn (and eventually a metafictional version of Auster himself) toward epiphany, resolution, truth.
In April 2022, at age 75, now laureled and honored many times over and still producing fiction at a sturdy clip, Auster published a story titled “Worms” in Harper’s Magazine. The protagonist is Sy Baumgartner, an academic still grieving for his wife, Anna. “He gave me the pages to read, and said, ‘Hey, if I get a novel out of this, would you be interested in publishing it?’” remembers Morgan Entrekin, the publisher and C.E.O. of Grove.
Neither the author nor his publisher knew it would be Auster’s last book. As he was preparing a draft of what would become “Baumgartner” for Grove, Auster began to suffer from peculiar but persistent fevers. Soon after, he was diagnosed with lung cancer. “I’m certain Paul didn’t know he was ill when he started the book,” Hustvedt says, “but he definitely knew he was ill when he was finishing it. And I’ve had this uncanny feeling ever since he died that what he was really doing was writing my grief.”
In the book’s most indelible scene, Baumgartner hears the phone ring in his wife’s study, which he has kept preserved exactly as it was when she was still alive. Anna is on the line. She is in the “Great Nowhere,” she reports, “a black space in which nothing is visible, a soundless vacuum of nullity, the oblivion of the void.” She is not in pain, she tells her husband. She is not hungry. She can’t feel anything at all. You sense she has been returned whence she came. Still, Auster writes, “she suspects that he is the one who is sustaining her through this incomprehensible afterlife, this paradoxical state of conscious nonexistence, which must and will come to an end at some point, she feels, but as long as he is alive and still able to think about her, her consciousness will continue to be awakened and reawakened by his thoughts.” If he doesn’t let her go, in other words, Anna won’t truly be gone. She will stay with him.
“Baumgartner” was released in November 2023. In April, Auster entered hospice care and died shortly after at his home in Brooklyn, in his library, his favorite room in the house. A circle was closed. As Auster wrote a decade earlier, in a memoir titled “Winter Journal,” in which he returned to the image of young Ralph lying lightning-struck in a field, “You think it will never happen to you, that it cannot happen to you, that you are the only person in the world to whom none of these things will ever happen.” And then they do.
Matthew Shaer is a contributing writer for the magazine and a founder of the podcast studio Campside Media.
In 1999, when Michaela DePrince was just 4 years old, she saw an image that would change her life: a ballerina in a “glittering pink skirt,” standing en pointe on the cover of a magazine. Born Mabinty Bangura, she was living in the Safe Haven orphanage in Sierra Leone, having been sold there by an uncle after the deaths of both of her parents during the country’s civil war.
At the orphanage, Mabinty stood out. She spoke five languages, having picked them up at the marketplace and with her father, and had an indefatigable curiosity; her neck, clavicle and arms were speckled with tiny light-colored patches, a result of vitiligo. The orphanage ranked the children — 24 girls and three boys — based on which were thought to be the most adoptable; because of her skin condition, Mabinty was ranked last. Her time in the orphanage was harrowing: She suffered physical abuse at the hands of the “aunties” who were in charge and survived an attack by rebel soldiers who murdered her favorite teacher. To make life bearable, she told stories, sang, created dancing games and became best friends with the girl ranked No. 26. She used her imagination as a shield against the place’s cruelties, playing pretend in order to ward off the aunties. “I am a witch,” she told one of them. “I will place a spell on you if you harm me.”
Months into her time at the orphanage, Mabinty was walking in a windstorm when the cover of the magazine blew into her face. “Someday I will dance on my toes like this lady,” she thought. “I will be happy too!” Shortly after, she and her best friend were adopted by Elaine and Charles DePrince, who took them home to Cherry Hill, N.J. At the time, the couple were the parents of three boys and had recently lost two other sons after H.I.V.-infected blood was used in the treatment of their hemophilia. The girls were renamed Michaela and Mia.
In her new home, DePrince repeatedly watched a recording of George Balanchine’s “The Nutcracker.” Just before turning 5, she asked her mother to enroll her in classes at the Rock School for Dance Education. With the determination of someone much older, DePrince committed to ballet — and with it the grueling training, cutthroat competitiveness and painful injuries. She was thrilled to be dancing but was also confused and dismayed by the racism of the classical-dance community, including the mother of a white classmate who said that Black girls should stick to modern or jazz. She encountered the absurd necessity of dyeing “nude” accessories from peach to dark brown to match her skin tone. In the documentary “First Position,” which chronicles a handful of young dancers’ bids to win the prestigious 2010 Youth America Grand Prix ballet competition, DePrince described the prejudice: “There’s a lot of stereotypes saying that if you’re a Black dancer, you have terrible feet, you don’t have extension, you’re too muscular, you’re not graceful enough.” She wouldn’t be moved, though. “I want to be known as a delicate, Black dancer who does classical ballet,” she said.
In her memoir “Taking Flight,” written with her mother, DePrince describes the formative dance technique of “spotting,” or “continuously keeping an eye on a distant point as you turn.” Despite the distractions of youth, there was always that distant point: professional success. A performance at the Grand Prix won her a scholarship to the Jacqueline Kennedy Onassis School at American Ballet Theater in New York. She graduated in 2012 and joined the Dance Theater of Harlem, where she became the company’s youngest principal dancer. In 2013, at 18, she moved to the Netherlands and danced with the Dutch National Ballet for seven years. She reached new audiences after being featured in Beyoncé’s “Lemonade” film in 2016. In 2021, she joined the Boston Ballet as a second soloist and remained there for the next three years. Her dreams came to pass.
Once, when DePrince was about 5, her mother told the anxious Michaela that her vitiligo spots “looked like a sprinkling of pixie dust or glitter.” It’s the job of a ballet dancer to inhabit fairy tales and their archetypes: the witches, ingénues and wretches; the princesses who outwit villains and overcome grueling obstacles. But fairy tales have troubling origins; as in the real world, there are few safe havens. (DePrince died at age 29 from undisclosed causes.)
Throughout her career, DePrince embraced the mythic quality of her biography, crafting a story of uplift and transcendence. In her book and many public appearances, DePrince had the poise of someone anointed with magic dust. She was unfailingly put together, unassailably optimistic. “I would never allow the audience to see how I was feeling,” she wrote of her stoicism. “Even at 13, I believed that the audience should think that the hardest combination of steps was effortless, and a ballerina’s personal trials shouldn’t show on her face.”
Niela Orr is a story editor for the magazine. She wrote about the actor Angus Cloud from “Euphoria” in last year’s The Lives They Lived issue.
People who met Bob Newhart weren’t always sure why he looked so familiar. Women on elevators often told him that he reminded them of their ex-husbands; men sometimes asked him if they served together in the war. Newhart just had that kind of face, an average white Midwestern sort of face, a face that always looked a little disappointed in itself.
For a long time, Newhart felt that way. His childhood home was as impersonal as a boardinghouse, he once said in an interview. More often than not, his father spent his evenings at a neighborhood bar near their house in Chicago; even when he was around, he never said much at meals. (“Maybe now Dad will notice me,” Newhart wrote in his memoir, “I Shouldn’t Even Be Doing This!” describing his thoughts as he walked up to collect one of the three Grammys he received in 1961.) Growing up, Newhart filled a void in himself, he once told a literary magazine, by peopling his imaginary world with characters who would entertain him. “I think that’s true of all comedians,” he said.
Of the family’s four children, Newhart was the weakest student. Law school didn’t work out; he tired of accounting. At 29, when his friends were already married, having children and settling into careers, Newhart was single and living unhappily at home, too broke to pay rent. Somehow he summoned the self-confidence to try something new. He was funny, people always told him. Maybe he could make a living off that.
Newhart used to kill time at work by calling a friend of his in character, pretending to be an average Joe trying to maintain calm in the face of some ludicrous predicament — a pilot, for example, who’d fallen out of his plane. Someone encouraged him to tape those routines, and an executive at Warner Brothers ended up with the recordings. The executive suggested that Newhart record a live performance for an album.
“The Button-Down Mind of Bob Newhart,” released in 1960, made him an overnight star. It was the best-selling album on the Billboard charts for 14 straight weeks. “You knew that album by heart,” says the legendary sitcom producer Chuck Lorre, whose father bought it for him when he was 10. Richard Pryor told Newhart that he stole “The Button-Down Mind” from the record store as a kid. Newhart joked that he was owed some royalties, and Pryor forked over a quarter.
While brash comics like Lenny Bruce and Don Rickles were blasting their way into the new decade, Newhart satirized the strait-laced culture of 1960 using its own conventions, which sometimes meant leaving a lot unsaid. Having held down a number of part-time jobs, Newhart often turned his humor on the workplace, exposing its obligatory and artificial politeness, the unnatural self-control it often demanded in the face of absurdity or incompetence.
He applied that sensibility to routines that invited the audience to eavesdrop on one-sided conversations, bringing a fresh kind of sketch comedy to stand-up. In one, a modern-day adman urges his client, Abe Lincoln, not to cut “four score and seven years ago” from his speech: “Abe,” he says, irritated, “we test-marketed that, and they loved it.” In another, an American submarine commander tells his crew that the press made too much of the boat’s accidental attack on Miami Beach, especially “since it was the off season down there.”
Even after the money started rolling in, Newhart, like so many of his personas, remained prudent and reasonable: He went on to marry and have four children, and he left the itinerant life of stand-up comedy for a more stable career in TV. In 1972, Newhart debuted “The Bob Newhart Show,” in which he played the first psychologist on a sitcom, a role perfectly suited to his gift for showing restraint in the face of folly. When a ventriloquist who has gone to Newhart for help says his dummy would like a few minutes to talk to the doctor alone, Newhart, clearly suppressing his response, pauses, still and straight-faced, for four loaded seconds before offering a hesitant nod — an exercise in understatement that characterizes so much of his comedy.
Newhart never professed to have much faith in psychology or its precepts. Even the most casual viewer of the show, he wrote in his memoir, knew that his patients were “no better off” after a visit to his character’s office. And yet at some point, Newhart saw a psychologist himself, according to his son Tim. Tim and his siblings sometimes wondered what their father talked about in therapy. They knew he loved them, but they could see how hard it was for him to express his feelings. At home, off script, he was remarkably silent and could seem distant.
Eventually, Newhart’s psychologist decided to retire, and he suggested that Newhart join some of his other famous patients for group sessions that they could run mostly by themselves. The plan fell apart quickly, once Newhart got wind of who the other celebrities were. “They’re all crazy!” he told his family.
In his 80s, Newhart accepted his final recurring role, appearing on Lorre’s “The Big Bang Theory” as a version of his perennial character: the last sane man in a world gone mad. Newhart made a career of highlighting other people’s foibles by playing the straight man, but he was also sending up his own extreme reserve, perhaps the lingering relic of a disconnected childhood.
At the end of his life, when his health was failing, he continued to rely on humor during dark moments, said his daughter Jennifer. “You’re either going to laugh or cry,” she said. “I think most of us would rather laugh.”
Susan Dominus is a staff writer for the magazine. She has written about treatment for menopause symptoms, the efficacy of therapy and declining male enrollment at colleges.
Eleanor Coppola met Francis Ford Coppola when she was 26. He was directing his first feature, a 1963 horror film titled “Dementia 13,” and Eleanor was his assistant art director. Months later, she found out she was pregnant. They married the following weekend in Las Vegas. She imagined that they would continue to work together. Instead, she found herself supporting her husband’s career. “My dad was this traditional Italian who had an idea of a wife’s role,” Eleanor’s daughter, Sofia, told me.
Eleanor knew how people saw her because they often told her. “Too small for such a big husband,” a crew member on “Apocalypse Now” told Eleanor, who weighed 99 pounds. Diane Keaton told Eleanor that she was her model for Kay, the quiet WASP who married into the Corleone family in “The Godfather.” And yet it was Eleanor who so sharply recorded her husband in “Hearts of Darkness,” a 1991 documentary about the making of “Apocalypse Now.” Francis’s epic struggle to get his film made — he drove himself to the edge of bankruptcy and the limits of his own ego — appears in stark contrast to Eleanor’s calm, steady narration. “He gathers up his Oscars and throws them out the window,” she wrote in “Notes,” a 1979 collection of diary entries that also served as the film’s voice-over. “The children pick up the pieces.”
In a second memoir, “Notes on a Life,” Eleanor described her “internal war” between being an artist and a wife and mother. While Francis made his films, Eleanor wrote, “I attend to little tasks.” She shopped for children’s shoes, mops, frying pans, kitchen towels, firmer pillows, fresh flowers, groceries, wastebaskets, trash bags, laundry detergent, doormats, shampoo, duplicate keys. She tended to faucets, fridges, sinks, gardens, heaters, landscapers and sick children in hotel rooms where the windows never opened. Often she found herself descending into a depression. She went to psychologists and asked what was wrong with her. Not one, she wrote, diagnosed her as a creative person.
In 1975, Eleanor paid Sofia and her brother, Roman, to watch Sofia’s birth video in Sofia’s bedroom, part of an art exhibit she staged inside the family’s 22-room Victorian home in San Francisco. Back then, when a man won an Oscar, his wife received a miniature version to wear as a necklace charm. In another room, Eleanor removed her husband’s Oscars from a lighted display case and replaced them with her tiny ones, turning an indignity into art. Francis didn’t get it. “He thought I was making fun of him,” she wrote.
On Sofia’s 27th birthday, Eleanor watched Francis mentor Sofia as she prepared to shoot “The Virgin Suicides,” her directorial debut, and felt a “hot, aching jealousy.” Eleanor had graduated from the University of California, Los Angeles, with an art degree. Her own father, a political cartoonist for The Los Angeles Examiner, died when she was 10. She imagined how her life might have unfolded had he been around to mentor her. Sofia’s life was going to be so different. On a trip to Tokyo in 1991, Eleanor observed Sofia, then 19, inspect a portable dishwasher and buy a wide-angle camera instead.
Eleanor didn’t share her frustrations with Sofia until her daughter was older. It’s not as if she guilt-tripped her children. The opposite: She was so present, always trying to keep their lives as normal as possible. As Francis worked on location and the family followed, Eleanor moved Sofia’s bedroom furniture to Tulsa, Okla., hid Easter eggs in Manila, dragged an entire Santa Claus suit to Tokyo. “We would have been messed up if it weren’t for my mom,” Sofia told me.
Eleanor worried about whether she had made an impact on her children. But she did, Sofia said. She made a huge impact. There is Eleanor in Sofia’s films, all of them such carefully constructed worlds, with women gazing out from inside them. Sofia learned from her mother that a film director doesn’t have to yell; that she can be both quiet and strong. One of Sofia’s favorite stories was about how her mother once found herself copying Agnes Martin drawings — just making steady lines over and over and not knowing why. But then she arrived on the set of “Apocalypse Now” and picked up a 16-millimeter film camera, and it turned out she had trained herself to have a steady hand.
The family’s home in Napa had a giant valley oak tree in its front yard. The children swung from its branches, had birthday parties beneath it. In the 1980s, an enormous limb snapped off and grazed the room of Eleanor’s elder son, Gio. Eleanor was looking at that very tree a few years later when she got the call that Gio, 22, had been killed in a boating accident. Eleanor was 50. She had expected the next decade to be a time of freedom. “A time to pick up the threads of my creative life left behind at age 26,” she wrote, adding, “I never expected a knockout blow.”
Years later, the whole damn tree would fall on the house, roots and all. When Eleanor became ill in her 70s with a rare cancer, Sofia remembers her aunt saying that Eleanor was like an oak tree — a grounding force under which the family built their lives. Eleanor declined chemotherapy. She wanted to work instead, making two feature films in her 80s and writing about that choice in a third memoir, which Sofia hopes to have published.
Perhaps what Sofia wants to share most is how much Eleanor taught her about motherhood. That it’s about those simple things: taking your kid to the dentist, taking her to buy shoes. The older Sofia gets, the more she feels as if she understands her mother. Eleanor was there not just when Sofia accepted her first Oscar but also when she became a mother herself. And she remembers the advice Eleanor gave her.
“Get a great babysitter,” Eleanor said, “so you can do your work and not feel guilty.”
Irina Aleksander is a contributing writer for the magazine. She last wrote about Tom Sandoval, a reality star on Bravo’s “Vanderpump Rules.”
The night Frankie Beverly died, his grandson, Brandon Beverly, left the hospital where he had been at his grandfather’s side and drove to the singer’s California home. On a normal night, Brandon would have put his phone on “do not disturb” before going to sleep. But exhausted, he forgot.
At 7 a.m., the incessant vibrating of that phone woke him up. Just an hour after the announcement of Beverly’s death was posted to social media, thousands of fans across the world were pouring out their memories of being touched by Beverly’s music.
In the coming days, some obituaries of the Philadelphia-born singer would note that he had never achieved mainstream success. But Brandon knew that his grandfather never really sought it. Beverly made music for Black people, and Black people loved him for it.
If Black gatherings have an anthem, without question, it’s Beverly’s R&B song “Before I Let Go,” which he wrote for his band, Maze. No matter the region of the country or the age of the crowd, there isn’t a reunion or barbecue or house party or graduation where the revelers at some point don’t line up, as if on command, to do the electric slide as they join Beverly in crooning:
“You know I think the sun rises and shines on you/You know there’s nothin’, nothin’, nothin’ I would not do/Whoa, no/Before I let you go … /Ohhh … /I would never, never, never, never, never, never, never/Never let you go, before I go.”
It’s hard to pinpoint exactly what it is about this song that makes it so special. It wasn’t a megahit when it came out in 1981. But somehow in the years following its release, “Before I Let Go,” with its upbeat, almost giddy tempo and sing-out-loud-inspiring lyrics, became an enduring, cross-generational essential, its opening chords, giving way to Beverly’s whoa whoa hoooaaa, acting as a summons to the dance floor. As my colleague J Wortham said on the “Still Processing” podcast, when the song comes on, “We run toward it, literally and psychically.”
Today, the centrality of “Before I Let Go” in Black America is undeniable. “Is this song the biggest song in our culture?” DJ Envy, a host of “The Breakfast Club,” one of the top hip-hop morning shows in the country, asked the day after Beverly’s death. “It might be No. 1.”
Brandon, of course, knew the song — he first remembers hearing his grandfather sing it onstage when he was 4 — but he didn’t fully realize its cultural power until he started going to parties at Howard University in the early 2010s. At the end of a party, the D.J. would play “Before I Let Go,” immediately shifting the mood among people who were born at least a decade after the song first came out. “Literally, every time I went to Howard, and it never didn’t happen — like, it always happened — they would play ‘Before I Let Go,’” Brandon, who is 32, told me. “I’m like, y’all listen to my grandpa like that?”
It was a testament to how even as musical styles changed, Beverly’s music tapped into something that remained integral to the Black experience: the heartache of having endured so much loss, but also optimism that better days will always come. As the lyrics of another of his hits go, “Joy and pain are like sunshine and rain. … When the world is down on you, love’s somewhere around. … Over and over you can be sure, there will be sorrow but you will endure.”
Beverly “did not care about selling 10 million records,” Brandon says. “He didn’t care about, you know, Billboard this and Billboard that. He was like, ‘Look, if I can make the most beautiful music that I can, and take my blood, sweat and tears and put it in my music for people to love, that’s all I care about.’”
That dedication to expressing his ups and downs through music is what kept Beverly filling outdoor venues and concert halls across the country, where crowds tended to wear white, as Beverly had come to do over the years, while experiencing what always felt more like revival than entertainment.
Brandon told me that Beverly was a private person who didn’t talk much about his musical process. But during the pandemic, when Brandon was spending time with his grandfather, he asked him about his best-known song. Beverly’s response surprised him.
“He was like, ‘Man, I cried so much writing that song,’” Brandon says with a chuckle. “I’m like: ‘This is everyone’s party song. How are you crying when you write something like this?’”
Go back and play it again, Beverly told him. Brandon, like so many of us, had been experiencing the song but not actually listening to it. “And then I realized, Oh, he was telling the girl, ‘Before I let you go, I am sorry,’” Brandon says. In a 2020 Essence magazine interview, Beverly explained that he wrote the song about a woman he still loved, while he was in a relationship with someone else.
The song that evokes sheer happiness at parties across the land is actually a song about heartbreak and love lost. The fact that so many of us failed to realize it gets to why Black Americans so cherished Frankie Beverly: He did not just make you feel a certain way. He made you want to feel a certain way.
Nikole Hannah-Jones is a staff writer for the magazine. In 2020, she won a Pulitzer Prize for her essay about Black Americans and democracy. She is the creator of The 1619 Project.
It was the summer of 1969, and Jerry West couldn’t sleep.
He was one of the greatest basketball players on Earth, an athlete whose grace and drive had transported him from the dirt roads of rural West Virginia to the glitz of Los Angeles arenas.
But he was a loser. There was no polite way to say it, and he wouldn’t hear otherwise. He had just led his team, the Los Angeles Lakers, to the N.B.A. Finals for the sixth time. And for the sixth time, he and his team had lost.
And if that wasn’t cruel enough, he lost each championship to the same opponent: the smug, invincible, hated Boston Celtics. The 1969 series was tied 3-3 when they played the seventh game in Los Angeles. Despite West’s heroics, the Lakers lost, again, and there’s a photograph of him leaving the court in stunned defeat. He’s walking slowly, one foot in front of the other, shoulders squared, his hamstring bound in tape, his eyes sunken sockets — a prisoner on his way back to his cell. In his memoir, “West by West,” he captioned the photo: “Where do I go?”
West played so spectacularly well in that losing series, averaging nearly 38 points per game, that he was awarded the Finals M.V.P. — the first and only time a player on the losing team has been so recognized. As a prize, the league gave him a brand-new Dodge Charger. He fantasized about blowing it up with dynamite.
For any competitive athlete, losing would have been misery. But for Jerry West, a man who measured his childhood in beatings and his career in defeats, it was intolerable. “Every night I went to bed I thought about it,” he wrote of that summer. “Every night. Every goddamn night.”
Growing up poor in West Virginia, West was envious of his classmates’ Christmas presents and resentful of his watery soup dinners. His father, a bully and a tyrant, beat him mercilessly. His mother was a haunted, spectral figure. “I never learned what love was and am still not entirely sure I know today,” he wrote. When West was 13, a revered older brother, David, was killed in the Korean War. Shooting hoops late into the night, West imagined that if he made one more basket, it might bring David back.
West led his high school team to the state championship and could have gone to college anywhere he wanted. He chose West Virginia University, securing his status as a hometown hero in a state that often felt left behind.
In the N.B.A., West became a star who could shoot, pass, play defense and take a punch. He turned the Lakers into a force — but a stoppable force. In 1970, his team reached the Finals for a seventh time, only to lose — again — this time to the New York Knicks. They had another shot in the Finals two years later. West played badly, missing easy shots that he had made all his life. He turned to passing, and finally — finally! — the Lakers prevailed. Jerry West was an N.B.A. champion.
He had a hard time enjoying it. “I played terrible basketball in the Finals,” he said years later. “I was playing so poorly that the team overcame me.”
Here’s how the world saw West: a Hall of Fame athlete with ferocious tenacity and grit who so reliably hit the big shots that he was nicknamed Mr. Clutch.
Here’s how West described himself: “A tormented, defiant figure who carries an angry, emotional chip on his shoulder and has a hole in his heart that nothing can ultimately fill.”
Karen, his wife of 46 years, said that when she and West were introduced, she thought he seemed like “the saddest man she had ever met.”
After his playing days, West became an executive with the Lakers, and he proved a genius in this role, too. He helped bring in Kareem Abdul-Jabbar and Magic Johnson, two of the greatest players ever, and won five championships in the 1980s — including two against the still-hated Celtics. Later he signed Kobe Bryant and Shaquille O’Neal, and the Lakers won another championship in 2000, West’s sixth as an executive. And yet happiness remained out of reach.
The morning after winning the title, a Lakers colleague found him brooding in his silent, dark office, alone but for a dim lamp. “He was not in a good spot,” the colleague remembers.
Losing had made West miserable; now winning did, too. He quit and left the team that summer as he struggled with another bout of depression. “It was not only providing me with zero joy but also affecting — ruining, really — every aspect of my life,” he wrote. The Lakers team he built would win two more championships without him.
If West, with his anguish and his brilliance, had lived in ancient Greece, he might have become a constellation. In midcentury America, he became a logo. In 1969, the N.B.A. replaced its schlocky emblem with a new one featuring a silhouette of West midstride, basketball on his hip, shoulder angled for speed. As a bid for eternal life, it’s a good bet that more people would recognize the N.B.A. logo than they would Cassiopeia.
There he is, forever frozen in motion, sprinting away from his opponents as well as his demons, resplendent with the rarest of talents and kissed with the gift of immortality, but tortured, forever tortured, never able to savor the fruits of his singular life.
Sam Dolnick is a deputy managing editor for The Times. He wrote about Marianne Mantell, a pioneer of spoken-word audio, in last year’s The Lives They Lived issue.
“Every cell remembers where it was touched by the mother,” Naomi Feil says in a clip from a 2007 documentary, viewed millions of times, as she tends to a patient with severe dementia. The patient, an elderly woman, seems entirely cut off from the world, incapable of interaction. Feil sits across from her in a senior-living facility and holds both sides of her face, delicately, like an adoring parent with a young child. Eyes open only to slits, the woman taps one hand weakly on the arm of her chair in a repetitive motion. Feil, a social worker, believed that for patients with advanced dementia, a “desperate need for connection” is locked inside, a need often focused on the distant past. Feil sings a children’s hymn to the woman, who Feil knows was a longtime churchgoer. The woman smacks the chair arm faster, louder. What appeared to be meaningless motion now seems responsive. She clutches Feil’s jacket, pulling her close. Feil sings a second hymn, and the woman, her eyes opening more fully and meeting Feil’s, whispers a phrase with her: “whole world in his hands.”
Feil’s approach to people with Alzheimer’s and other forms of dementia, which she called the Validation Method, aims for this kind of breakthrough and bond, no matter how fleeting. The method builds on her sense that such people are frequently in a state of yearning, even if they appear to be totally shut down or randomly, aggressively acting out. The technique strives to validate and meet their longing — and it takes a radical approach to problematic patient behaviors.
“Let’s say a disoriented old man sees me as a love object and wants to have sex with me,” Feil wrote in an unpublished autobiography. She did not shy away from engaging with the most uncomfortable aspects of caring for patients who are not only disoriented but also disinhibited. She described making sure that her own “judgments and fears” didn’t dictate her reactions. “If he’s patting me on my breasts, I might say, ‘Do you miss her very much?’ using the ambiguous ‘her’ that can mean mother, or wife, or lover, whoever.” Without encouraging the behavior, she suggested, a caregiver could turn errant sexual gestures into a meaningful connection.
In life’s final phases, the mind tends to reel backward, Feil maintained, and this should be honored, not fought. As she developed her approach, beginning in the late 1960s, she stood in opposition to a dominant practice, Reality Orientation, which puts a premium on reminding and correcting patients about current facts, from the day of the week to the truth that a beloved family member is no longer alive. Feil’s ideas have been a crucial force in an evolution away from Reality Orientation in dementia care.
The Validation Method’s stress on acute empathy, its sensitivity to the emotional desperation of patients, may be rooted in Feil’s extreme vulnerability as a child. In 1937, as a Jewish 4-year-old in Nazi Germany, she was hidden by nuns until her mother managed to escape with Feil and her younger sister to the United States, where they joined Feil’s father, who had already fled.
Her father, a psychologist, ran a Jewish home for the elderly in Cleveland. The family lived there, and growing up, Feil, who had no close friends at school, treasured her relationships with the residents. In her autobiography, she recalled sneaking off the grounds repeatedly at night and venturing to a drugstore and soda fountain with her “best friend,” a patient with “velvety eyes” whose teeth were black or missing.
In her early 30s, Feil returned to work at the home. There, the Validation Method started to take shape. In 1982, she founded the Validation Training Institute, and by the 2000s, she was leading at least 80 workshops a year, from China and Japan to the United States and almost every country in Europe. Her eldest child, Vicki de Klerk-Rubin, who now runs the institute, told me about her mother’s fierce drive — but also, toward the end of her life, her cognitive decline and descent into dark suspicions.
De Klerk-Rubin and two of her siblings recounted that as Feil became less competent, and as de Klerk-Rubin shared in leading workshops or led them solo, Feil began accusing her daughter of stealing her life’s work. The allegation inflicted a painful wound.
But de Klerk-Rubin reminded herself that her mother might well be in the grip of the distant past. She might be overtaken by the suspiciousness sewn into her psyche as she hid from the Nazis and then settled in an alien land. Feil had long insisted on being paid for workshops in cash. She kept $20,000 hidden in her house. It was as if her mother always felt, de Klerk-Rubin said, that “you never know when the Nazis are going to come, and you can’t get to the bank — you should have a stash.”
This, the daughter told herself, was Feil’s reality. It made her mother’s accusations easier to bear. It made her mother’s love easier to feel.
Daniel Bergner is a contributing writer for the magazine who previously wrote about alternate ways of treating psychotic disorders. He is the author of “The Mind and the Moon: My Brother’s Story, the Science of Our Brains and the Search for Our Psyches.”
Gena Rowlands
She worried that marriage would ruin her acting career, but the opposite happened. By Ismail Muhammad
On the set of the 1968 film “Faces,” Gena Rowlands was sure she saw the end of her marriage looming. The director — John Cassavetes, a pioneer of American independent cinema — was her husband, and in her memory, he hated what she was doing. “The first day you realize your husband doesn’t love you anymore,” she remembered. “The second day that he never loved you, the third that he hates you … and after that it gets bad.”
Rowlands’s relationship with her husband turned out to be altogether sturdier and richer than she imagined. By the time Cassavetes died in 1989, they had made 10 films together, including “A Woman Under the Influence” (1974) and “Gloria” (1980), each of which earned Rowlands an Oscar nomination for best actress. In Cassavetes, Rowlands found a partner who shared her vision of film as a way to convey the delirious turbulence of experience. And in Rowlands, Cassavetes found an actor who imbued his movies with her instinctive dynamism.
The daughter of a politician father and a stay-at-home mother (who would herself go on to act), Rowlands suffered from various chronic illnesses in childhood. “I wasn’t much of a handful,” she recalled to NPR’s Terry Gross in 2016. “I was lying around and looking pale and reading books and things.” Reading taught her that one life could encompass many lives, that she didn’t have to be one person. Acting, to her, was like reading: an entry point into the multiplicity of fiction and the explorations it made possible.
She did not intend to marry — marriage was, to her, a prelude to a woman’s artistic death. But as she was finishing her studies at the American Academy of Dramatic Arts, she met Cassavetes. He had recently graduated from the academy but returned there to see a play in which she happened to be performing. It was love at first sight, or the closest thing to it. They went for coffee at the Russian Tea Room and wed four months later.
A trim, conventionally beautiful blonde at a time when roles for women had severely narrowed, Rowlands might have been doomed to portray doting wives and alluring love interests in harebrained television shows and cinematic romps. She described one such film, “The High Cost of Loving,” in which she played a pregnant housewife, as a middling “old-fashioned man-woman-style comedy.” But even there her elastic face communicated a sardonic mischievousness that undercut its otherwise bland story of middle-class anxiety.
Roles like that one, along with mortgages on the family home, helped finance the independent projects she pursued with Cassavetes. Their marriage was both a romantic whirlwind and a union of two aesthetically like-minded individuals. “It was more satisfying to do the kind of pictures that we did that tried to examine life, tried to figure out a little, tiny bit of it, perhaps,” she said. “Love, the loss of love, the beginning of love, what you give up for what.”
Rowlands examined these questions by playing women who struggled — or refused — to be traditionally womanly or maternal. Cassavetes’s exploration of female psychology existed alongside her own, one artist’s thought deepening the other’s. “His own personal feeling was that he didn’t know how any woman could not be crazy in this society,” Rowlands recounted. She said that the depth of her husband’s empathy for women’s psychological suffering allowed her to thrive as an actor. Cassavetes’s actor-centered approach gave her the space she needed to take stock of just how restrictive American womanhood could be and the dark places to which it could lead you.
In “A Woman Under the Influence,” Rowlands plays Mabel, the wife of a construction worker (Peter Falk), whose life is defined by an anxious commitment to her husband and children. She is by turns hilarious, endearing and unhinged, her entire body conveying a distress that can be painful to watch. In a scene in which Mabel waits for her children to be dropped off after school, Rowlands paces back and forth, her arms swinging with a corked intensity as she approaches passers-by to ask for the time. You can’t blame the extras for avoiding her gaze. Once she sees her children’s bus approaching, she leaps up and down, raises her arms in victory, suddenly childlike. With her convulsive movement and tendency to speak in something approaching tongues, Mabel is as frightening as Linda Blair’s Regan in “The Exorcist.” The way Rowlands plays her, though, we recognize her malady as entirely of this world, part of a distinctly female experience, and therefore more frightening than demonic possession. Collapsing with the children onto the front porch of the family home, Mabel tells them, “I never did anything in my whole life that was anything except I made you guys.”
For Rowlands, family only deepened her engagement with acting. Her mother, Lady Rowlands, appeared in several of Cassavetes’s films. Her three children — Nick, Zoe and Alexandra — grew up in the house where their parents made and edited their films and went on to make films in which Rowlands appeared. Most notably, she starred as the dementia-afflicted Allie in Nick’s 2004 hit, “The Notebook.” Rowlands confounded her own youthful assumption that motherhood would be the end of her career as an artist. In life, as in her art, she couldn’t help rewriting the rules of what women were allowed to be.
Ismail Muhammad is a story editor for the magazine. He has written about waves of migration to New York, diversity in publishing and the painter Mark Bradford.
Marianne Boesky was in college in Vermont when federal marshals delivered her infamous 51-year-old father, Ivan, to the Lompoc federal prison camp in central California in March 1988 to begin serving his three-year sentence for insider trading. It was a long trip from Burlington — especially on the budget airline People Express — but she made it as often as she could to visit the inmate who once played her Chopin’s nocturnes on the piano of their rambling Westchester estate at bedtime.
Ivan Boesky grew up comfortably, if unglamorously, in Detroit; his father, a Russian Jewish immigrant, owned a chain of local bars with strip shows called the Brass Rail. Ivan never graduated from college and needed five years to get through Detroit College of Law, but he married rich. His wife, Seema Silberstein, was the daughter of a Detroit-based real estate developer who owned the Beverly Hills Hotel. Her father thought Boesky was beneath her but nevertheless set the young couple up with an apartment on Park Avenue, and Boesky started working on Wall Street.
His ascent began in earnest in 1981, with the establishment of the Ivan F. Boesky Corporation. The timing was auspicious. The mergers-and-acquisitions business was just starting to explode with the emergence of junk bonds, which enabled corporate raiders to initiate hostile takeover bids, loading up on debt and then transferring it to the acquired company. Boesky specialized in arbitrage, an esoteric form of investing that involved gambling on these takeovers. Information on possible targets was of great value — the more “inside” the better.
Tall and lean, with neatly parted silver hair, he dressed in the manner of a Gilded Age financier, in black three-piece suits and starched white shirts, the chain of his antique gold watch draped across the vest. He and Seema moved to Westchester, and he started commuting into Manhattan in a chauffeur-driven limousine. He slept no more than a few hours a night and subsisted mostly on black coffee — “vampire’s plasma,” he called it. He famously assured a class of graduating business-school students at Berkeley that “greed is healthy,” inspiring the fictional Gordon Gekko’s line, “greed is good.” He was a man with something to prove: He made a large donation to Harvard in order to gain membership to the Harvard Club. Even as he cultivated his celebrity, he seemed to know that his days as the king of Wall Street were numbered. “I can’t predict my demise,” he told The Washington Post in 1985. “But I suspect it will occur abruptly.”
Within a year, he would be threatened with charges by the Securities and Exchange Commission and the United States Attorney’s office for the Southern District of New York — then overseen by an ambitious prosecutor named Rudy Giuliani. His lawyers worked out a secret deal: In exchange for a reduced charge, lenient sentence and $100 million in fines and repaid profits, Boesky would help build cases against other suspected white-collar criminals, most notably, Michael Milken. He received his instructions for his undercover work from a pack of lawyers and investigators in an Upper East Side hotel suite. Disciplined as ever, he didn’t touch the elaborate buffet. As the room grew stuffy, men started taking off their jackets and rolling up their sleeves; Boesky didn’t so much as loosen his tie.
He wasn’t authorized to tell Seema until just a few days before his secret deal was announced and splashed across front pages around the country. By now, their marriage had curdled — she had come to see him as cold and “transactional” — but they had four children and so had remained together. She sought advice from therapists about whether to change her name so that her children wouldn’t have to be “Boeskys” all of their lives but was counseled that this might only invite more feelings of shame.
Boesky pleaded guilty to a single count of securities fraud and served 18 months at Lompoc, tending to the prison camp’s farmland, lifting weights and growing out his hair and beard. The government, in a somewhat questionable effort to avoid a market crash, had allowed him to secretly sell hundreds of millions of dollars’ worth of stock over the weeks before his deal was announced, but Seema controlled their finances now. After his release, she filed for divorce, and he asked for $1 million in annual alimony and, she recalls, a “beautiful home on a mountaintop.” He ended up with a $2.5 million house in La Jolla, Calif., and $180,000 a year.
Boesky kept a low profile for the remainder of his life and never worked again. Over the years, he stayed especially close to Marianne, who went on to law school and is now a prominent New York gallerist. She remembers a very different man from the one whose image is fixed in America’s collective imagination: To her, he was a voracious reader and autodidact who had taught himself five languages. She had been in attendance at his “greed is healthy” speech — she was in high school then — and had heard it as being hungry for life, not simply money. It irked her that her father was still known as greed incarnate, even after the emergence of a new and seemingly more rapacious breed of financial criminals like Bernie Madoff and Sam Bankman-Fried. Milken had rehabilitated his name, bankrolling various charities while lobbying tirelessly for a presidential pardon, an effort that finally paid off with President Trump in 2020. Her father had pleaded guilty to far fewer charges and had been given a much shorter sentence. He, too, was a major philanthropist when he had the means during his Wall Street prime. Where was his pardon?
She tried repeatedly to persuade him to write a memoir that would round out the picture and complicate his legacy. “This is the rise and fall of the American dream,” she said. “He came from nothing and had the curiosity and capacity to learn about everything.” But Boesky refused. The public would be free to remember him however it wanted.
Jonathan Mahler is a staff writer for the magazine. He wrote about the director William Friedkin in last year’s The Lives They Lived issue.
“Unlike any woman in my family or anyone I’d ever actually known, I was going to become — something, anything, whatever that meant,” Hettie Jones once wrote. She was, from her earliest memories, an ambitious, daring person who shrugged off the conventions binding young women in 1950s Queens. When she left New York for college in Virginia, she was Hettie Cohen, and she had never set foot in the house of a non-Jewish person; but, she figured, the farther from home the better.
After college, she moved back to the city to attend graduate school at Columbia. Eventually, she grew interested in the downtown arts scene, took a job at a jazz magazine and dropped out. She soon met a young Black writer named LeRoi Jones. They fell in love and married in 1958, when interracial marriage was still an incendiary proposition: She was never again welcome in her parents’ home.
The Joneses fell in with the New Bohemia scene in Greenwich Village. At night, they went to hear poetry at the cafes and jazz at a club in Cooper Square called the Five Spot, where artists like Thelonious Monk and Wayne Shorter were regulars. “It seemed as if another, new language had been offered me,” she wrote in her memoir, “How I Became Hettie Jones.” The Joneses’ apartment became a hub for fellow artists. Hettie later joked that the entire avant-garde scene fit in her living room. The couple started a magazine, Yūgen, and published work by their friends — Jack Kerouac, Frank O’Hara, Allen Ginsberg, Diane di Prima and others. The dark tights, flat shoes and black turtlenecks Hettie and her friends wore became the signature of a countercultural style that designers soon tried to emulate. One day she realized that the mannequins in a store window were dressed to look like them. “It felt odd to have so prompted the culture,” she wrote, “to have many other women want to seem to be you, whatever they thought you were.”
The women of the Beat movement were often treated as second-class artists — sexual or domestic handmaidens to the real artists. As her husband’s career ascended, Hettie transcribed and edited his writing, helped run the publishing house they founded, worked as the subscriptions manager for The Partisan Review and cared for their two daughters. The family was frequently broke. It was Hettie who worked to pay the bills and sorted it out when there wasn’t enough money for food. To her frustration, her own writing mostly stalled. Referring to her women friends’ thwarted aspirations, she wrote, “There’s an old kitchen way to say what we did: You bury your talent in a napkin.”
The Joneses’ life was nevertheless abundant. Throughout their 20s, they threw boozy, amphetamine-fueled parties in their apartment and participated in the frank sexuality of a social world in which “everyone was hot and mixed up.” LeRoi, to Hettie’s aggravation, conducted open affairs and had a child with di Prima. But when he thought that Hettie had taken a lover, he was outraged. “This man is crazy!” she wrote, incredulous at the double standard.
By 1965, escalating racial violence exerted unsustainable pressure on the Joneses’ already fractured marriage. LeRoi had become a strident critic of white racism, but his own marriage to a white woman drew accusations of hypocrisy from his cadre of Black artists and activists. In the wake of Malcolm X’s assassination, LeRoi changed his name to Amiri Baraka, left Hettie and their daughters and moved uptown to Harlem. Jones was bereft but eventually reconciled herself to the decision. “Well, it was a necessary consolidation of identity,” she told the writer Joyce Johnson with characteristic grit. In the years that followed, she refused to criticize Baraka publicly.
To be a young Jewish single mother with two Black daughters was precarious, but she pressed forward, famously making anyone who spent time in their Cooper Square apartment feel like family. “She was a joyful person,” Kellie Jones, Hettie’s older daughter, told me. “She just did the hard things, and she was committed and purposeful.”
The end of Jones’s time as “the writer’s wife” also marked the beginning of her own prolific writing years, during which she published some 20 books of memoir, poetry and children’s fiction. Realizing that younger generations knew nothing of the women who made up the Beat scene, she wrote her lyrical 1990 memoir, “How I Became Hettie Jones.” Jones found a subject that was both political and intimate: her inner life and the extraordinary trajectory of her own becoming. The poet came, at last, out from under the napkin in 1998, when she was 64: Her first collection, “Drive,” won the Norma Farber First Book Award from the Poetry Society of America. She followed that with two more collections.
Kellie Jones told me about a photograph of Hettie at her desk in college, sitting before a typewriter in the evening. A gooseneck lamp illuminates her face. It’s clear the camera has caught her coming up with a sentence, and she looks serious and focused, vibrant in her black turtleneck. “Sitting there, writing herself into existence,” Kellie said. “She was one of the most determined people I ever knew.”
Jordan Kisner is a contributing writer for the magazine and the author of the essay collection “Thin Places.”
One afternoon in the mid-1960s at a U.S. Public Health Service clinic in San Francisco, Peter Buxtun heard an unnerving story. An Alabama doctor had been reprimanded by the federal government for giving a shot of penicillin — the standard of care — to a man with late-stage syphilis. The doctor, government officials complained, had tainted one of its subjects in a syphilis study.
A doctor in trouble for helping a sick man? Research in which patients are not treated for an infectious disease, which they may pass on to their sexual partners? Buxtun, who was in his late 20s, knew plenty about syphilis. As a venereal-disease investigator, he spoke to men about the risk of V.D. — he was known as the best interviewer in the office — and encouraged them and their sexual partners to be tested.
Buxtun contacted the Communicable Disease Center (later known as the Centers for Disease Control and Prevention), which was overseeing the study run by another federal agency. In response, he received a manila envelope on “The Tuskegee Study of Untreated Syphilis in the Negro Male.” The study, underway since 1932, was designed to track the untreated progression of syphilis.
The researchers had initially recruited roughly 600 male subjects: around 400 with tertiary-stage syphilis; another 200 in the control group who tested negative. The government persuaded the men to participate by offering hot meals during medical-exam days, free health care for minor ailments and burial insurance. The subjects were all Black men from Macon County and the surrounding region in Alabama. Many were sharecroppers and poor.
To Buxtun’s shock, the researchers never told the men they had syphilis. Instead, they said they had “bad blood,” a colloquial phrase that could refer to numerous ailments. Some of the subjects underwent painful spinal taps so doctors could learn if the infection had spread to their nervous system. No matter how sick they grew, despite the availability of penicillin beginning in the 1940s, the treatment was the same: tonics and aspirin.
Buxtun headed to the library to read about the “doctors’ trial” at Nuremberg, when Nazi physicians were prosecuted for performing gruesome medical experiments on prisoners. Buxtun’s own parents — his father was Jewish — fled Czechoslovakia in the late 1930s when Buxtun was an infant to escape the Nazis.
In a report to his supervisors, Buxtun outlined his concerns and compared Tuskegee to Nazism. Soon after that, he sent questions about the study to the C.D.C., which summoned him to a meeting in Atlanta. At a long mahogany table in a conference room, Dr. John Cutler, a leader of the study, chastised Buxtun. When Cutler insisted that the men were volunteers, Buxtun read aloud from one of Cutler’s own reports, which stated that the men would never have agreed to participate without the offer of burial expenses.
In some ways, Buxtun was an unlikely whistle-blower. He had never been involved in the study, never known any of the research subjects. He lived 2,000 miles from Tuskegee, in San Francisco, where, among other things, he had a passion for collecting and selling antique military guns and swords. His carefully organized artifacts covered the floors and surfaces of one of his apartments.
He wasn’t the only person to raise questions about the study — its results had appeared in medical journals. But he was the one who kept at it. He shared his concerns with friends and colleagues. He spoke to journalists. After he left his job for law school, he grilled his professors about potential legal actions.
Then, in November 1968, more than six months after the assassination of Dr. Martin Luther King Jr., Buxtun wrote C.D.C. officials once again. “The group is 100% negro,” he noted. “This in itself is political dynamite and subject to wild journalistic misinterpretation.” Months later, the C.D.C. convened a panel of doctors. Only one pushed to treat the men with penicillin.
Finally, one night in 1972, Buxtun chatted with a friend who was a reporter for the Associated Press. She agreed to come to his apartment to comb through his documents. An A.P. editor assigned the idea to a more experienced journalist, Jean Heller, who broke the story.
In coming months and years, a cascade of change followed: Senate hearings, a federal inquiry, more-ethical research practices, a class-action lawsuit, a presidential apology from Bill Clinton in 1997. But much of the damage was done: 128 of the subjects are estimated to have died from syphilis and related medical complications; wives and children of some of the men also got the disease. And Black America had yet another reason to mistrust the medical system.
As for Buxtun, he was not mentioned in the A.P. article and received little fanfare. Some books paid homage to his work, including “Bad Blood,” by James H. Jones, and “Examining Tuskegee,” by Susan M. Reverby. But, notes Carl Elliott, author of “The Occasional Human Sacrifice,” Buxtun never became a household name like other whistle-blowers of the era: Daniel Ellsberg, Karen Silkwood, Frank Serpico.
Not that Buxtun was bitter about it. He wasn’t self-aggrandizing about the fact that he had exposed the most notorious medical-research scandal in the United States. As he once said: Anyone with a moral compass would have done the same. And yet, only Peter Buxtun did.
Maggie Jones is a contributing writer for the magazine. She teaches writing at the University of Pittsburgh and has been a Nieman fellow at Harvard.
Imagine you are a young girl growing up in poverty in New York City in the 1950s. Your father died when you were a few weeks old, and your family has been without a social safety net ever since. But there is a salvation, maybe the only one — rock ’n’ roll. You love Elvis, and the Everly Brothers, and soon discover that you have a voice, too. You, your older sister and two friends, also sisters, start harmonizing on street corners. The group moves on to sock hops and local shows, and then by some small miracle gets to record a couple of songs. They are frothy but forgettable, an escape, however fleeting.
At that point Mary Weiss, her sister, Betty, and the twins Mary Ann and Margie Ganser could have gone back to high school and years later reminisced about “that time we got to make a record.” But one day in the spring of 1964, a former gang member turned songwriter named George Morton pulled over to the side of the road and scribbled out a few lines, lines that turned into a song in search of a female voice. Just not the kind of voice that had been singing about soldier boys and boys so fine, parties and dancing and baby loves. This voice had to allow pain, the kind of pain so unbearable that the song needed a second section in which the anguish was whispered, alluded to, with a chorus of sea gulls weeping in the background. The song found its way to 15-year-old Mary, who poured every bit of the struggle of her life into it. By the end of the summer, “Remember (Walkin’ in the Sand)” was shaking up the sunny, sparkly Top 40 of the Beach Boys and the Supremes.
The song’s success whisked Mary and her group, called the Shangri-Las, out of their Queens neighborhood and into the Oz of show business. Building on “Remember,” they created what can only be called a kind of pop noir, and not just because in a few of their biggest hits, like “Leader of the Pack,” people literally died. Their records had an authentic street sensibility, populated as they were by other-side-of-the-tracks characters — a repentant runaway, more than one noble greaser — who loved hard and suffered hard. While much of the credit for the Shangri-Las’ hits has gone to the creative team behind them, few other singers could have pulled off that material, that attitude. In her lead vocals, Mary turned on the pathos tap and kept it running full. Even in the upbeat “Give Him a Great Big Kiss,” she gave overemoting a good name.
It was all so exciting — sharing a bill with the Beatles, hanging out at James Brown’s house. But it wasn’t long before it seemed the group had traded one grind for another: the overnight bus rides, the come-ons from fans and other musicians. Mary would wake up and not know what state she was in. “You stop being a kid from Queens and what you become is . . . well, a professional teenager,” she said. She feared so much for her safety on the road that she started carrying a gun. “That’s no way to live,” she said.
By the time of the group’s final chart entry, “Past, Present and Future,” in 1966, a haunting monologue set to Beethoven’s “Moonlight Sonata,” the heavy emotionalism of their early hits had given way to a deep resignation. Margie Ganser left to get her high school diploma; the other members kept going until the group dissolved amid industry neglect and legal issues. Their records went out of print almost immediately. Mary Ann Ganser, who struggled with a drug habit, died in 1970. Mary married and divorced and remarried and went on to be a manager at a commercial architecture business. It would be 40 years before she released new music, a solo album called “Dangerous Game.”
The Shangri-Las’ premature exit, when Mary was still a teenager, left them somewhere between footnote and legend. It would have been easy for her to wonder if her singing career had ever happened, given how much of it seemed to have been scrubbed from pop consciousness. Then one night in the late 1970s, she walked into CBGB, the fabled downtown music club in Manhattan, for a rare reunion concert, and found Shangri-Las records on the jukebox. It turned out that groups like the New York Dolls and the Ramones were big fans, having been drawn to the Shangri-Las’ raw energy. “Without the Shangri-Las, there would have been no Ramones,” Joey Ramone told Mary. She would take it a step further: “The Shangri-Las were punk before punk existed.”
More recently, Shangri-Las DNA was all over Amy Winehouse’s classic 2006 album, “Back to Black.” “I love the drama, I love the atmosphere, I love the sound effects,” she said. After Winehouse died in 2011, Mary told the rock historian Greil Marcus: “She could not stand fame any more than I could. . . . I wish I could have helped her.” On some level, though, Mary probably did. In their too-brief career, the Shangri-Las gave voice to lost souls. And listeners who connected to the group’s music might have felt a little less lost because of it.
Rob Hoerburger is the copy chief of the magazine and the author of the novel “Why Do Birds.” He has written for the Lives They Lived issue 19 times.
Among the many indignities of being Pete Rose — and he was, for the last 35 years of his life, eager to tally them — one seemed to strike him as especially cruel: Here was a Cincinnati kid-made-good so revered for his feats on the diamond that they named a street outside the ballpark Pete Rose Way. And yet he was not allowed in the clubhouse. “I mean, that’s the way you treat Al Capone,” Rose told a local TV station shortly before his death, seated before a Warhol silk-screen print of himself.
This was the Pete Rose way: defiant but pleading, self-regarding and self-pitying, semi-persuasive despite his persistent self-sabotage.
As a ballplayer, Rose was a blazing throwback, relentless to the point of mania — the man fans called Charlie Hustle. His uniform was perpetually caked in infield dirt. His Monkees haircut convulsed on impact with each headfirst slide. Sportswriters summoned near-carnal language to describe his dedication. “There’s something almost obscene about watching Pete Rose play baseball,” one account began. “Anything that someone enjoys that much must be illegal or immoral.” He was a serial All-Star across 24 seasons, most of them spent in Cincinnati, and the whirring motor of the Big Red Machine, as the team was known at its 1970s apex. His signature was the backbreaking single, swatted between fielders. His record of 4,256 career hits has stood for nearly four decades, as close to untouchable as Rose once was. (No active big-leaguer has more than 2,300.)
As a pariah, though, Rose was firmly ahead of his time. Allergic to introspection and undaunted by fact, he dissembled and bluffed through the game’s greatest crisis in generations: his banishment from Major League Baseball in August 1989 for gambling on several sports, including his own, while managing the Cincinnati Reds. Rose publicly denied betting on baseball until he didn’t, nearly 15 years later, when he had a book coming out. He denied betting on his own team before insisting that any worthy manager should have the conviction to do the same. He denied betting as a player before eventually wink-winking about his years as a “player-manager,” when he both set the lineup and carried a bat.
For most of his exile, Rose’s campaign for reinstatement — and for inclusion in baseball’s museum of greats in Cooperstown, N.Y. — devolved into a kinda-sorta-not-quite-apology tour that even he struggled to keep straight.
“No way did I bet on baseball,” he told Jane Pauley in 1991, after spending five months in jail for gambling-related tax evasion, “and I’ll never admit I did.”
“Yes, I did,” he amended in 2004, “and that was my mistake.”
Locked out of baseball, Rose subsisted on another strength: self-branding. In his reputational prime, he had pushed Wheaties, Kool-Aid, pancake houses in Cincinnati. The night he was banned in 1989, he made time to appear on a shopping channel hawking signed bats and balls.
This form of commerce would become a kind of lifetime sentence, a reel of collectors and trade shows and gawkers approaching him for a modest fee. It was a volume business. After moving to Las Vegas — little about late-stage Rose was subtle — he said he worked 15 days a month, five hours a day, signing his name and posing for pictures, often from a table inside Caesars Palace. At one point, he estimated his daily income from such labor to be $20,000. He also showed up to scribble in Cooperstown on Hall of Fame weekend, uninvited by the sport’s high priests, consorting with old teammates and rivals off-campus until they returned to the official business of being legends and left Rose to putter in a memorabilia shop.
Occasionally, the wider sports world cracked open its doors. In 2015, Fox Sports hired him as a studio analyst, bow-tied and hair-dyed, before he was forced out two years later amid allegations of sexual misconduct in the 1970s. (Rose acknowledged a relationship with a girl who was 16, the age of consent in Ohio, when he was in his 30s.)
Last year, Rose placed the ceremonial first legal sports bet at an Ohio casino. (“Cincinnati Reds to win the World Series!”)
Through it all, the debate over Rose’s bid for enshrinement was both reliable sports-talk filler and a knotty proxy for the nation’s attitudes toward misdeed and absolution. “Should forgiveness,” a Rose biographer, Kostya Kennedy, once asked, “be granted only to the contrite?”
It is too easy to imagine how Rose might have played this now, as the accused in 2024, navigating a post-shame cultural moment. He would pick winners on his hit podcast by day and Dance with the Stars in prime time. He would be the highest-paid spokesman for the betting apps with which virtually every major sports league is now in business. He would find America’s most powerful people celebrating his roguish arc. “Major League Baseball should have allowed him into the Hall of Fame many years ago,” President-elect Trump posted after Rose’s death in September. “Do it now, before his funeral!”
Alas, Rose’s sins date to the pre-post-shame age. His most consistent defense was a flailing whataboutism that could still sound coherent enough: Baseball history is rife with abusers and scoundrels whose plaques stand regardless. And none of them earned the “Hit King” title that Rose had stitched into his shirt collar as an old man. “I don’t think I hurt you,” he said, “I hurt me, and I hurt my family.”
That meditation came partway through a nearly four-hour docuseries released last summer (“Charlie Hustle & the Matter of Pete Rose”) — a slow-rolling study in lessons not learned.
In its closing frames, Rose considered the free market’s appetite for remorse. In the past week, he said, he had scrawled “Sorry I bet on baseball” on 1,000 balls. An order was out for 1,000 more. The demand baffled him.
“People buy that ball,” Rose said. “I don’t know why, but they buy that ball.”
Matt Flegenheimer is a correspondent for The Times. He last wrote for the magazine about Russell Brand, a British comedian and podcaster who has become a celebrated political voice on the American right.
For parents, cartoons are like weather. We can control plenty else in our children’s lives: their legal names and bedroom furniture and the color of the tiny shoelaces on their tiny shoes. But exactly which bright animated characters (SpongeBob, He-Man, Bluey) will colonize our homes and minds and souls during a child’s formative years — this is largely out of our control. We’re at the mercy of mysterious powers. So I consider myself extremely lucky that during my daughter’s preschool years, Janice Burgess was one of those powers.
Burgess created a show called “The Backyardigans.” The title refers to five chubby computer-generated animals: a moose, a hippo, a kangaroo, a penguin and an unclassifiable creature named Uniqua. In each episode, these friends meet up in their adjoining backyards to imagine their way into some deep adventure. They are polar explorers looking for a yeti, or pirates searching for treasure, and their imaginations are so strong that all the yards morph into that make-believe world — until, eventually, they get hungry and go back inside for snack time.
That’s it. That’s the whole show. It may not sound like much, but “The Backyardigans” was special. It was, on multiple levels, absurdly ambitious. The show was a musical, and each episode was scored in a completely different genre: funk, swing, tarantella, klezmer, rockabilly, Dixieland, Bollywood and dozens more. Plots were often drawn from classic film or literature. When the tubby little characters danced, it was not generic cartoon dancing — it was meticulously choreographed and performed by Alvin Ailey-trained dancers, before being translated into C.G.I.
The resulting show was a bizarre combination of childish delight and high culture. One episode, a quest through the jungle, is written in the style of a Gilbert and Sullivan operetta. (“Don’t worry, you fellas! I brought my umbrella, and this is specifically why: It’s big, it’s terrific, and it’s scientifically proven to keep us all dry!”) And that ambition — the show’s outlandish attempt to contain the whole world — came straight from its creator.
Janice Burgess studied art history at Brandeis before working her way up through menial jobs (craft services, travel logistics) in kids’ TV. Eventually, she was hired as an executive at Nickelodeon, where she worked on such hits as “Little Bill” and “Blue’s Clues.” But her signature triumph, which debuted on Nick Jr. in 2004, was “The Backyardigans” — a show that united her wide-ranging passions. “She really brought all her loves in,” Jonny Belt, the show’s art director, told me. “It was all Janice in there.”
Burgess’s colleagues speak of her with affectionate awe. She was a human encyclopedia: She could casually explain the different styles of Victorian carriages or quote the lyrics to “Guys and Dolls” or glance at a mock-up of a Japanese temple and tell you that one of the chairs was from the wrong period. “Today we use A.I. to look something up,” Belt said. “We would always just call Janice.” She also loved action movies and made sure to stuff “The Backyardigans” with chases and capers and twists.
Burgess herself came up with the frankly ridiculous idea to write every episode in a different musical genre. To execute it, she hired Evan Lurie, a celebrated pianist in New York’s downtown music scene.
“I remember the first week of recording,” Lurie told me. “We did a reggae episode first. And then the second one was country swing. And I began to realize: Oh, we’re really taking this all the way.” It became a kind of parlor game, pairing obscure genres with unexpected settings. An episode set in ancient Greece? Bossa nova. A trip to Mars? Kenyan highlife. Pizza delivery among the ancient Maya? College-football-style fight songs (drum rolls, cymbals, brass). “It got a little out of hand,” Lurie said, laughing.
“The Backyardigans” was maddeningly labor-intensive. There were long nights and early mornings. Budgets and software programs were pushed to their limits. “She wanted whatever her name was on to be as excellent as it could be,” said Brown Johnson, the founder of Nick Jr.
The kid at the center of “The Backyardigans” was Burgess herself. She grew up, literally, in Mr. Rogers’s neighborhood: Squirrel Hill, in Pittsburgh. (Fred Rogers lived nearby.) “In a way,” she once told an interviewer, “the entire adventure happens in Pittsburgh.” When it came time to cast “The Backyardigans,” Burgess chose a diverse group of children and encouraged them to speak and sing like themselves — not like what they’d heard on TV. It worked beautifully. Despite all the C.G.I., “The Backyardigans” radiates humanity. “Janice was a very complex person,” Belt told me, “but also really unable to pretend to be anybody else. She was just all out there. It’s disarming how true to herself she was.” And that’s what Burgess showed us: Even when we’re pretending, we are always authentically ourselves.
Sam Anderson is a staff writer for the magazine and the host and reporter behind the podcast “Animal.”
Angela Bofill sang, with equal élan, from the crest and canyon of her contralto. The singer and songwriter was admired for her three-and-a-half-octave range but beloved because she pulled from an even bigger quiver of emotion. A Latina from Harlem and the Bronx who began writing songs as a child, Bofill was a vocal prodigy who also played piano and viola. She was a scholarly and ambitious teenager, a standout among ruthlessly vetted standouts in elite ensembles like the New York All City High School Chorus and the Dance Theater of Harlem chorus. Bofill, who was of Puerto Rican and Cuban descent, also sang in a Latin R&B band. This affiliation led to a record deal. In her early 20s, she signed with a jazz-fusion label and began recording songs she had completed as homework while attending Hunter College High School and the Manhattan School of Music. It was a period of intense creativity.
On her debut album, “Angie,” in 1978, Bofill made stunning tonal choices, which stretch from pale apology to shuddering requests for faith. The album is a euphonic achievement, though only a relative few appreciated Bofill’s majesty. She quickly became successful in Black spaces and on Black radio. But because of the racial segregation of everything from music discovery to marketing to magazines to the aisles of record stores, winning in the larger and more lucrative world of pop was like hitting the lottery.
Bofill’s signature song, the ballad “I Try,” appears on her second album, “Angel of the Night,” from 1979. In the song, as on the smoldering cover photo, it’s twilight, and she’s entreating an ex-lover. Her lyrics are atremble with diaristic despair. The last three verses are nearly identically written, almost identically sung, and the tension comes from the plush pockets of air within lines: “You know … that I … tried … to be with you/You know … that I … wanted … to see it through.” In those rests is the promise that Bofill’s tone might switch from plain-spoken to pleading. But it never does. The effect is pure incantation. Even as she calls out, “I tried and I tried and I tried and I tried,” Bofill is blue but does not beg. This was also her approach to mainstream attention — she intended to excel, but she didn’t grovel for the industry’s approval.
Bofill’s voice, charisma and lush looks did not go unnoticed by Clive Davis, president of Arista Records, who signed her in 1981. Many label executives were in search of Black woman singers to take pop music to new heights. Davis was trying to produce a suite of stars, including the virtuosic Bofill — and then he signed Whitney Houston in 1983. It eventually became clear that, in the process of developing Houston, the label had decided to leave the brilliant Bofill behind. (Arista put out two more Bofill records, but neither cracked the Billboard 200.) Later, Bofill told a reporter: “When I started out in this business, I had a lot of creative input. That became limited in certain situations.”
Bofill did try again with other albums on other labels and enjoyed some success. She sang in venues packed with besotted fans who crooned songs back to her verbatim. She also created a warm space for listeners like me to feel what we felt. When I was an anxious overachieving teenager with a body that received more attention than I knew how to handle, Bofill came into my life radiating big-sister wisdom and empathy. I saw myself in the curvaceous singer with the moody palette. In the mid-1980s she married, moved to Northern California — not far from Oakland, where I was blasting her songs from the radio — and had a daughter. In the ’90s she toured, released two albums and did some background work for Diana Ross and others; in the ’00s, she appeared in plays. In 2006, Bofill had the first in a series of strokes that paralyzed half of her body and left her speech profoundly impaired. She spent three years in rehab. Bofill released a live album soon after she began her recovery and a compilation project in 2014. It was to be her last.
In 2021, I interviewed Bofill for my podcast, “Black Girl Songbook.” Bofill’s representative reminded us that she had difficulty speaking. I asked her if it was her idea to do less jazzy material and to switch over to sounds that were more likely to be played on mainstream radio stations. My informed opinion was that she had been overlooked. That she was just a soulful square peg that had been shoved, by crass executives, into a pop circle. But that was my teenage heart speaking, the innocent, who cried to “I Try.”
She made it clear that she wasn’t anyone’s victim, not now nor back then. Bofill was the girl, after all, who got into all the prestigious schools and Saturday conservatory programs and joined a band, and made a group with her sister Sandra called the Puerto Rican Supremes. No one made her do anything. She always wanted to know more, create more, sing more and be more.
I asked if she still sang.
“Well,” Bofill said, “only remember one song.”
“What is it?”
“ ‘Happy Birthday.’ ”
This was life. Not the business or politics or even the art of it. But the truth. Bofill, who mirrored our joys and devastations, was now singing only what she could. And after the taping ended, she told me that she enjoyed singing the birthday song to her grandchildren.
“I remember titles of songs,” Bofill said, meaning her own. “But after the stroke, 2006, the first stroke, I don’t sing no more.” She said she didn’t remember anything.
It was twilight. And as I told her in real time: The fans remember, Ms. Bofill. We remember every single note you sang.
Danyel Smith is a contributing writer based in Los Angeles and the author of “Shine Bright: A Very Personal History of Black Women in Pop.”
In 1972, Edward Stone was working as a physics professor at the California Institute of Technology when he was asked to become the lead scientist for a new, hugely ambitious NASA mission to the solar system’s outer planets. The mission involved two identical ships to be launched in 1977 on a trajectory to Jupiter and Saturn; after that, one ship might also go on to Uranus and Neptune. Eleven different teams would be responsible for the 11 science instruments on each spacecraft. Though Stone’s specialty was cosmic rays, the subatomic particles that emanate from exploding stars, his main job would be to manage a potentially unruly scientific corps of nearly 100 people.
Internal tensions are inevitable on a big space project. A ship might require a certain flight path or orientation, for instance, to ensure optimal viewing conditions for its cameras, but at the expense of some other scientific reading (an ultraviolet sensor, say). With the probes swooping past planets at upward of 30,000 miles per hour, observation opportunities would be brief, the trade-offs unpleasant. What Stone came to realize, he later told an oral historian of Voyager, as the project eventually came to be known, is that “if you decide to do this versus that, because you can’t do both, you’re basically deciding that this team gets to make a discovery and that one doesn’t.”
Stone’s first demonstration of organizational acumen came during an early meeting of the science team. His sharp intellect — “Whenever I talked to him,” his colleague Alan Cummings recalls, “I felt like I was talking to the smartest person I’d ever met” — did not preclude a sense of humility. He was, according to his daughter Janet Stone, “a true lifelong learner” who could devote extraordinary amounts of time to anything necessary for his job. Warren Keller, from NASA headquarters, once said that at the first meeting, when each of the scientists — the primary investigators, or P.I.s — stood up to explain their Voyager instruments, “Ed Stone knew more about every one of their instruments than the P.I.s themselves knew. This man was tremendous.”
His next move, to keep 11 teams from focusing only on their own agendas, was ingenious: He plucked out members to form “working groups” that hewed to scientific interest rather than scientific instrument — atmospheric science, for example, and planetary moons. These new teams could discuss common goals and how certain instruments or trajectories might yield superior results.
Through it all, Stone pressed his own brand of intellectual empathy on researchers: Advancing science meant considering a competitor’s point of view. And whenever Voyager reached a planet — Jupiter in 1979, Saturn in 1980, Uranus in 1986, Neptune in 1989 — Stone conducted workshops in the backrooms of the Jet Propulsion Laboratory, which ran the mission for NASA in Pasadena, Calif. The science teams presented their new data to help Stone decide what merited an announcement. Ultimately, Stone was judge and jury, yet he weighed every argument and counterargument.
Understanding the internal politics of Voyager might be akin to understanding the inner workings of a championship sports team. What was in their playbooks? How important was the coaching, compared with the players’ natural talents? Such questions would lack salience had the mission ended as a middling scientific effort. Yet it went on to become the farthest-reaching expedition in human history, with Voyager 1 now 15.4 billion miles from Earth and Voyager 2 now 12.8 billion miles away. Stone remained Voyager’s research chief for 50 years, overseeing this once-in-a-generation investigation that overturned many established ideas about the outer planets and their moons and gave us thousands of images that have defined how we see our solar system. Several missions have built on Voyager’s discoveries (one set out two months ago for Jupiter’s icy moon Europa), but Voyager is alone in passing near Uranus and Neptune.
The findings go beyond the planetary. By traveling toward the solar system’s edge, Voyager told us where the influence of our sun wanes (beyond a protective bubble known as the heliosphere) and where a stream of incoming particles (the interstellar wind) picks up. Those discoveries required a dexterous act of management, too — helping Voyager receive steady funding, for more than 20 years, after the ships passed Neptune and turned their cameras off. It helped that Stone had increasing political influence: Late in his career, he also became the director of the Jet Propulsion Lab.
The Voyagers are now ailing: geriatric, glitchy, unsteady. Visiting J.P.L. during a recent crisis, I watched engineers plot how to repair a computer on one of these far-off ships. With luck and effort, the Voyagers might last another five years. On a previous visit, I had met with Stone at Caltech, and his reminiscences drifted to a row of journals near his desk that chronicled every Voyager meeting he had attended. “I think this helped my credibility with the team as I made those hard decisions,” Stone told me. “They saw that I was actually recording what they were saying.” In asking his colleagues to explain their differences, in other words, he grasped one thing from the start: If you seek to lead, you need to be the most attentive listener of all.
Jon Gertner has been writing about science and technology for the magazine since 2003.
He was “Famous” before he was famous — but names have power, and it didn’t take long for reality to catch up. In March 1975, Wally Amos sent out 2,500 invitations to the opening of the first Famous Amos shop, on a somewhat seedy corner of Sunset Boulevard in Los Angeles. There was valet parking and Champagne. Amos printed glossy “headshots” of his new product, luridly bulging with chunks of pecan and chocolate. The conceit was that the former talent agent was promoting his newest client, the Cookie. But it was Amos who was about to become a star.
Raised in Tallahassee, Fla., with a teenage diversion to live with his Aunt Della in Harlem, Amos had already spent over a decade in show business after dropping out of high school. In the mailroom at the William Morris Agency in New York City, he used his lunch hours to practice his typing skills, eventually becoming the company’s first Black agent. He was responsible for signing Simon and Garfunkel. Passed over for a promotion, he set out on his own in 1967, opening a talent-management company in Los Angeles. Along the way, Amos came up with a gimmick: bringing bags of homemade chocolate chip cookies to his meetings. He credited the inspiration to the ones Della used to bake for him. The resulting warm feelings paid off: Among the first investors in Famous Amos were Marvin Gaye and Helen Reddy.
With his beard, Panama hat and flowing, embroidered shirts, Amos was the picture of 1970s fabulosity, even managing to make his ever-present kazoo seem sort of groovy. For a stretch, he was everywhere — in the Macy’s Thanksgiving Day Parade; on the cover of Time, featured as one of the “Hot New Rich”; on “Taxi” as a hallucination of the character Latka.
Shawn Amos, one of Wally’s four children, called his father part of the “Great American Negro Hustler Generation,” alongside such figures as his onetime office neighbor Quincy Jones: “Young Black men, born into Jim Crow America now shaping American culture.” Wally Amos himself resolutely avoided discussing race, even if it’s difficult now not to read sly jokes into the T-shirts and bumper stickers he printed reading “Have a Very Brown Day,” or the event he hosted with Andy Warhol titled “Cookies & Milk with Amos & Andy.” Still, it was no small thing to have the face of an actual Black business owner join the Uncle Bens, Aunt Jemimas and Cream of Wheat men on supermarket shelves. Fifty years later, it’s still hard to think of any Black food personality who has surpassed him in the public consciousness.
At his shops, which quickly proliferated, Amos’s small and irregular cookies were scooped from gold trays onto a butcher’s scale and sold by weight, $3 per pound. In upscale groceries and department stores, they came in brown paper bags. Each was suggestive of a health-food store or gourmet shop, though the cookies didn’t really belong in either. His recipe had in fact come from the back of a bag of Nestlé Toll House chips, albeit with some tweaks. Amos added coconut and pecans, swapped margarine for butter and, crucially, upped the amount of chocolate. “He wanted there to be a chip and a nut in every bite,” said Shawn, who helped his father open the first store when he was 7 and often worked behind the register.
Amos and his cookies were indivisible, right until they became all too divisible. He was a far better showman than businessman. In 1982, Famous Amos took in $12 million in revenue. Only a few years later, the company was deep in the red. In 1985, it was bought for a song by the first in a long series of corporate owners, each of which diminished Amos’s equity and involvement. Eventually he was legally barred from using the very name that had set his destiny.
It became a sort of existential capitalist ghost story: the man who lost his own name. Amos told it often, spinning the details of his failures into an improbable second career as a motivational speaker, author and literacy advocate. “Deprivation of anything which you think is rightfully yours is no more than a detour to a higher plateau,” he wrote in “Man With No Name: Turn Lemons Into Lemonade,” one of 10 books of mixed memoir and inspiration he published. This new identity proved more successful than Amos’s sporadic attempts to get back in the baking game, including with one company cheekily named the Uncle Nonamé Cookie Company.
“It was the great pain of his life,” Shawn said of his father’s losing his company. “But ironically, it was also his next act.” That is, by the end, Amos may have been most famous for no longer being Famous. Great American hustler indeed.
Brett Martin is a writer in New Orleans and the author of “Difficult Men: Behind the Scenes of a Creative Revolution.”
No one knows how Rosa was orphaned in September 1999, in Santa Cruz County, Calif. She was nameless then, just 4 weeks old and tiny, a pair of big black eyes and a five-pound ball of tawny fluff. Where was her mother? Waylaid by stormy seas? Mortally wounded by a great-white-shark bite? Entangled in fishing nets? Floating face down somewhere, her brain infected by parasites?
There are so many ways for a sea-otter pup to be separated from its mother, and yet until it reaches 6 months, its survival in the wild is impossible without one. A pup’s fur coat is so buoyant that Rosa couldn’t dive for food on her own without popping back to the surface like a beach ball. Such a little otter needs a mother for milk and comfort and the training required to gain essential skills like fur grooming and bivalve cracking. And if that baby grows tired or cold, she needs to be pulled up to dry off and sleep on the lifeboat of her mother’s belly.
After a beachgoer spotted the stranded baby otter, Rosa ended up at the Monterey Bay Aquarium, where the mission goes beyond just public exhibitions; aquarium scientists and policy experts work on all kinds of conservation issues. It was there, inside a building visited by two million people a year, that this motherless otter became a sort of super-auntie for the struggling population of Southern sea otters, which are listed as threatened under the Endangered Species Act.
As many as 300,000 sea otters once populated the coastline of the North Pacific rim, stretching from Hokkaido, Japan, to Baja California. By the late 1800s, fur traders had hunted them to near extinction. The 3,000 or so Southern sea otters alive in California today are descended from several dozen that, protected by the rough waters and rugged cliffs of Big Sur, survived the maritime fur trade.
The U.S. Fish and Wildlife Service is currently studying the feasibility of reintroducing sea otters to their historic range along the coasts of Northern California and Oregon. By eating creatures that destroy algae, like sea urchins, otters help maintain towering kelp forests and thick sea-grass beds. Those underwater algal worlds absorb and store carbon, create habitats for fish and prevent coastline erosion from rising seas and extreme storms.
For about two years, the aquarium staff tried repeatedly to release Rosa back into the Pacific Ocean, but by then she was overly familiar with humans. She would swim right up to kayaks. One day she scared a group of divers in Monterey Bay when she yanked at their gear and gnawed on their wet suits. Scientists concluded that Rosa was a recidivist, not fit for release.
Around the same time, staff members first succeeded in pairing an orphaned pup with a female otter inside the aquarium. Such long-term surrogacy doesn’t happen in the wild, where motherhood is too physiologically taxing: A mother otter must nearly double her daily caloric intake just to keep herself and her own baby alive. But scientists began to wonder if pups raised by surrogates would be more likely to thrive when released back into the ocean. Maybe they wouldn’t be plagued by the kind of hyperactive human orientation that drew Rosa to those divers.
Thus Rosa became one of the founding members of the aquarium’s otter-surrogacy program, the first of its kind. Despite never having had offspring of her own, she was an instinctive caregiver. Over her lifetime, Rosa fostered 15 pups, most of whom were released back into the ocean. Rosa’s pups went on to parent their own pups, who in turn made more pups.
The aquarium staff describes Rosa’s parenting style as gentle, unflustered, even chill, which is saying a lot for an animal closely related to wolverines and badgers. Some surrogates can appear to be anxious, yanking young otters around, insisting on constant proximity, which can be a lot for a pup with abandonment issues. Rosa let the mother-child bond develop over time. She possessed a tender tolerance for beginners. (She became an even-tempered mentor to the novice biologists and trainers too; she was often the first otter people worked with.) She schooled each adoptee in how to comb and blow air into its fur to maximize fluffiness and repel water. She was always quick to show pups how to dive and snatch red-rock and Dungeness crabs, flip them belly up onto her chest and then gobble them claws first to avoid being pinched. And despite a preference for crab, she would demonstrate how to pry into prickly urchins, mussels, oysters, starfish, clams and snails too. No picky eaters allowed: Survival in the ocean requires a generalist’s palate.
Because of the caregiving prowess of Rosa and her cadre of crab-cracking foster moms, scientists now look to surrogacy as an important aspect of sea-otter-reintroduction efforts. It is hard to move adult sea otters from one place to another; they often don’t make it, in part because they try to swim back to their home territory. But research shows that surrogate-raised otters tend to stick around near where they’re released and survive at rates comparable to their wild-born counterparts. The offspring of Rosa’s adoptees are no doubt out there even now, showing their own pups how to forage in the salty depths and how to keep a tiny baby alive in the vast cold ocean.
Malia Wollan is a contributing writer for the magazine. The last feature she wrote was about freight-train heists.