Navigating Uncharted Territory

As a research manager at Google’s incubator Jigsaw, Jackson alum Beth Goldberg has gotten used to pivoting between disciplines and reinventing her role in a rapidly changing field.

When Beth Goldberg was a graduate student, she envisioned a future career related to her interest in the ethical governance of technology. But her current position as a research program manager at the Google incubator Jigsaw has exceeded even her loftiest expectations.

“This is beyond my wildest dreams,” says Goldberg, who landed the job shortly after completing her joint degree in global affairs and business administration at Yale in 2018. “I didn’t know this job existed when I was in school. Part of what I love about it is that it’s evolving with the threats online, and it has changed dramatically just in the three and a half years that I’ve been in it.”

In her role at Jigsaw, a unit within Google whose mission is to “explore threats to open societies and build technology that inspires scalable solutions,” Goldberg manages a team of interdisciplinary researchers focused on countering violent extremism and disinformation.

The data-gathering process involves interviews with former white supremacists, trolls, and conspiracy theorists. The work is not for the faint of heart.

“We’re working across the full lifecycle from deeply understanding the problem by talking to people, understanding how multidimensional they are, and actually generating some empathy for where these folks came from and how they ended up as extremists, and then working towards preventing more people from following that same pipeline into extremism,” she explains.

Her team conducts ethnographic research and surveys to analyze threats, reviews a range of social science findings, and develops technical tools that aim to mitigate the problem in different ways. One intervention that seems particularly promising is based on inoculation theory—which posits that preemptively explaining why an idea is false can bolster one’s cognitive resilience against future propaganda.

“We asked ourselves, ‘what if we took this and made it internet-ready and made a catchy little video?’ So, I worked with a team of researchers who were really interested in inoculating or pre-bunking against extremist propaganda, and we ended up doing a series of studies now where we’ve pre-bunked those white supremacist and male supremacist narratives.”

The approach is highly effective for the vast majority of people, she said, noting that after inoculation, study participants were less likely to amplify a hateful meme and less likely to trust the person who shared it. Interventions like this work best when conducted early in the radicalization journey, Goldberg explained. “It’s really hard to dislodge an idea once someone has bought into it.”

For Goldberg, knowing that she and her team are developing solutions to the problem helps her stay optimistic despite the difficult subject matter. But it’s not just her small team that has a role to play in addressing the problem of extremism online.

“Jigsaw is an incubator, so we test stuff, we bring it through from an idea to proving the concept, and then the goal is for existing platforms to scale it to help millions of their users,” she said. Other big tech companies, which employ some of the smartest engineers and researchers in the world, can do more to mitigate harmful content through warnings, tips, and pre-bunking, Goldberg adds.

Policymakers also have an important role, Goldberg says, but the rapidly changing nature of technology makes it challenging to draft effective legislation. Consider the example of the auto industry:

“It was decades before we required cars to have seat belts. We had an unspeakable number of road accidents; we didn’t mandate that cars have taillights or mirrors. So, there were all these things that, over many years, after we had terrible atrocities and deaths, then government started to regulate things.”

Goldberg was also a featured speaker at the Yale Cyber Leadership Forum, joining a panel discussion on disinformation and the future of democracy, where she shared her thoughts on deepfakes and how artificial intelligence tools can help detect them.

“I think we’re maybe in a similar situation where we’re starting to really catalog online harms. For example, the leaked Facebook Files illustrated the impact that Facebook is having on teens’ mental health, or the role it played in the genocide in Myanmar,” Goldberg said. “But we don’t yet have those seat belts for the internet.”

While Jigsaw doesn’t interface directly with lawmakers, Goldberg sees herself as a bridge between tech and other sectors, namely civil society and academia. Her Jackson education offered a solid foundation for the interdisciplinary nature of her job.

“On a day-to-day basis, I’ll jump from a call with data scientists and a physicist to a call with psychologists, to a call with a bunch of lawyers and policy people, and I have to code-switch between all these different disciplines. Then on my own team, we have statisticians and ethnographers and survey methodologists that all use totally different research methods to triangulate at these problems,” Goldberg explains. “I’m really grateful that Jackson let me be a generalist and learn the basics of all those different approaches.”

Jackson’s required statistics class, which offered training in experiment design, was foundational to her work. Being comfortable with ambiguity is also an important job skill she honed at Jackson. Senior Fellow Eric Braverman’s Ethical Decision-Making class simulation exercises and Stan McChrystal’s leadership course both pushed students to “navigate uncharted territory as decision-makers,” Goldberg says.

Recently, Goldberg has watched her global affairs and technology worlds collide. Project Shield, a product Jigsaw developed years ago to help defend websites against distributed denial-of-service attacks, is now being used by journalists and NGOs in Ukraine to fend off Russian cyber intrusions. It’s gratifying to see how her company’s work is helping people in real time, albeit in grim circumstances.

“We’re helping people be more resilient and savvier and in charge of their own online journeys. We’re never going to totally eliminate the supply of disinformation or hate speech or whatever, we’re going to have to live in a universe where that exists in the public square both on and offline, but how can we make ourselves more resilient to it? That’s the type of research that I feel really excited about.”