What a Year of Building an AI-Powered Video Library Taught Us About Scaling Empathy

Collage of different people representing the different lived experiences captured by the Social Lens Library

At the end of 2025, I gave a keynote at a leading nonprofit’s AI summit on using AI to achieve more inclusive and meaningful segmentation. It was an opportunity to reflect on everything we learned as we built the Social Lens Library, using AI to collect over 20,000 videos of real people sharing their everyday lives, needs, and perceptions.

We started this project because we saw a gap. Deep ethnographic video research has long been available to large organizations with big budgets. Nonprofits and smaller organizations rarely get access to that level of insight. We wanted to change that.

I started the year barely able to write a high-quality prompt. By the end, I was using AI coding tools to build our own analysis tools and custom video repositories. The learning curve was steep, but the payoff was real.

I wanted to share some of my key takeaways from hands-on experimentation with top LLMs like ChatGPT, Gemini, and Claude, as well as insights-specific tools like Dovetail and CoLoop.
Spoiler alert: humans are still very much needed.

AI scales pattern recognition. It does not scale interpretation.
AI can analyze hundreds of videos simultaneously, surfacing patterns across demographics and lived experiences that would take months to find manually. But it needs clear direction. We give AI detailed context about our frameworks and objectives. We require evidence for every pattern, with links to original sources. And we trained it to differentiate universal themes from meaningful differences.
Deciding what a pattern means, whether it matters, and what to do about it still requires human judgment.

AI’s big unlock is understanding when people are ready to act.
A year of experimentation taught us this: AI can unlock a more precise understanding of when people are ready to act (make a decision, change a behavior, purchase) and what will move them (messages, products, services), at scale and speed. We trained AI to pinpoint the signals and conditions that make people more receptive and likely to act. That is the foundation of the Lens Predictive Empathy framework:
WHEN: The moment someone becomes ready to act
WHY: The emotion driving them in that moment
HOW: The message or offer that will resonate

AI finds the patterns. Humans decide what to do with them. Together, they show you when to reach someone and what will actually spark action.

AI enables inclusion as a design choice.
AI does not just add speed. It adds reach. Traditional qualitative research caps at 20 to 50 participants, which limits who gets included. When AI helps you scale to 500, you can extend into communities that research typically misses. But scale alone is not inclusion. It requires intentional recruitment, culturally adaptive questions, and ongoing monitoring of who is still not showing up. AI makes scale possible. Humans decide who gets included. Given the pressure to reduce community-specific efforts amid DEI backlash, this may be the diversity hack we need most right now.

Make empathy actionable to scale.
Empathy does not scale through technology alone. It scales when organizations commit to the full process. Four layers that work together to build empathy at scale:

  1. Foundation: Diverse, representative data. Human-designed.
  2. Cognitive empathy: AI identifies patterns at scale.
  3. Affective empathy: Humans interpret meaning and implications.
  4. Active empathy: Organizations act on what they learn.

The teams that succeed move from interesting patterns to organizational action. The real value is not finding the pattern. It is when the pattern changes what happens next.

Closing thought
Last year taught me that AI is not a shortcut. It is an amplifier. It can help you move faster and go deeper. But the decisions that matter most (who gets included, what insights mean, and what action to take) remain human. AI, in the right hands with the right intentions, can be an amazing way to be more inclusive and innovative in helping under-resourced communities. That is what makes this work worth doing.

