In this blog post, Yale global affairs major Isabella Panico ’26 shares her summer research: creating a digital safety curriculum designed to meet the urgent need for youth-centered, emotionally intelligent, and structurally aware online education.

This summer, I worked to co-create Internet Street Smarts for Teens (ISS Teens) for Cyber Collective, a U.S.–based NGO that has long focused on on-the-ground awareness and implementation of digital safety programming for general audiences. While reaching broad populations remains crucial, ISS Teens marked an intentional step toward engaging adolescents directly. It was designed as the beginning of a broader effort to treat digital safety as civic education, grounded in the belief that young people deserve to understand the systems shaping their lives and to demand accountability from those who design them.

Too often, digital education reduces safety to rules: don’t overshare, be kind, think before you click. Rules like these assume that the systems being used are neutral and that responsibility rests with the individual. But that is far from the case. In the online world, harm is encoded into our tools. Teaching young people to follow rules set by platforms only reinforces the illusion that safety is their burden alone.

From the beginning, ISS Teens followed the participatory design ethos that defines Cyber Collective. We partnered with Headstream Innovation to create a space where young people could co-design a program that reflected their lived realities. Our work was shaped by the generosity and insights of Emma Leiken (TikTok), Natalie Shoup (End Violence), and Charlton McIlwain (NYU Steinhardt), whose guidance helped us ask not only what harms exist, but how education can legitimize the experiences of those most impacted.

In workshops across New York City, I led students through the hidden mechanics of the platforms they use every day. We traced how algorithms reward inflammatory content, why companies profit from data extraction, and how harm is designed rather than accidental. By the end of our sessions, students were not only more confident in protecting themselves—confidence rising from 81% to 96%—but also more critical of the systems themselves. “I learned why people do the things they do online.” “A lot of it is because of algorithms.” “How companies sort users’ information and use it against us.” Feedback like this made me realize that the program’s impact extended well beyond safety skills; these realizations could empower young people to drive cascading change in our political systems as well.

For me, creating ISS Teens was about far more than teaching protective strategies. It was about legitimizing the idea that young people deserve more than survival tactics in a rigged system. They deserve to see the architecture of their digital lives as constructed and therefore contestable. That agency is the first step toward accountability—the refusal to let companies automate processes, deploy technologies, and restructure daily life without scrutiny. And they deserve to be empowered with the confidence to use their voices productively.

This work also revealed an intergenerational dimension. Young people are often the go-to tech troubleshooters in their families, bridging younger siblings and elders alike. By treating them as leaders in digital literacy, we plant knowledge that ripples outward across generations—reminding all of us that technology must be questioned and reshaped collectively.

At Yale, I’ve dedicated my time to studying how technology transforms societies and warfare. In class, I analyze how disinformation corrodes democracies, how surveillance suppresses dissent, and how authoritarian regimes weaponize digital platforms. But this summer reminded me that the same dynamics are lived long before they reach the level of state power. They begin in the confusion of a teenager whose reputation is destroyed in a group chat, or in the admission of an eleven-year-old who tells me they could not live without Instagram. These small moments form the foundation of the larger problem.

The more we cultivate the habit of questioning technology at its most intimate scale, the more we may hesitate before embracing its most destructive forms. If a teenager can recognize that algorithms exploit outrage, perhaps policymakers can pause before deploying AI in war. If young people can learn to see that digital safety is political, maybe society can begin to treat technology not as inevitable, but as intentional.

This summer at Cyber Collective, co-creating ISS Teens was my way of beginning that work. To me, it went far beyond digital safety—it was about cultivating in young people the civic instinct to question, to pause, to expect more. Because digital resilience is not just survival, but the insistence on seeing and reshaping the systems that govern us.