Designing AI with Humanity at Heart: Lessons from WICSEC 2025

At WICSEC 2025, we had the privilege of leading a session alongside Mike Leach (Leach Consulting Group), Greg Tipping (TipCo), Katie Morgan (Protech Solutions), and Nisha Garimalla (Protech Solutions).
Bringing together consulting, technology, policy, and design gave us a conversation that balanced practicality with vision. We knew those perspectives were needed now more than ever. AI can push us toward solutions quickly, but the real question is: faster toward what?
Without human-centered guidance, speed alone can lead us astray; if we don't center people, we risk solving the wrong problems more efficiently. Our session invited participants to consider a different approach, one that starts with listening, equity, and trust.
Key Learnings from Our Session
1. Empathize: Listen First
Human-centered design begins with empathy — listening to stories, feelings, and lived experiences.
- AI pitfall: Jumping straight to the data.
- Correction: Start with people. AI can help surface patterns later, but it can’t replace human listening.
2. Define: Frame the Right Problem
A clear definition keeps families, not systems, at the center.
- AI pitfall: Optimizing for system efficiency instead of human outcomes.
- Correction: Define success around dignity, fairness, and trust. Then ask: Should AI even be part of the solution?
3. Ideate: Generate Possibilities
Here, AI can spark creativity and broaden the idea space.
- AI pitfall: Letting AI narrow ideas too quickly.
- Correction: Use AI as a brainstorming partner, but let people choose what aligns with values.
4. Prototype: Make It Real
Prototypes help us see how solutions work in practice.
- AI pitfall: Treating AI outputs as “finished.”
- Correction: Prototype with usability, clarity, and accessibility in mind before scaling.
5. Test: Ask Hard Questions
Testing reveals whether solutions actually build trust.
- AI pitfall: Measuring only accuracy or efficiency.
- Correction: Always test for fairness, transparency, and inclusion. A tool that works technically but erodes trust isn't a success. (See the sketch below for one small example of this kind of check.)
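To make that last correction concrete, here is a minimal sketch of what testing beyond accuracy can look like. It uses a hypothetical case-prioritization model and made-up records (none of this comes from our session or any real system); the point is simply that group-level outcome rates deserve the same attention as a single overall accuracy figure.

```python
from collections import defaultdict

# Hypothetical outputs from a case-prioritization model (illustrative only).
# Each record: (group label, did the model flag the case for outreach?, did outreach actually help?)
records = [
    ("group_a", True,  True), ("group_a", True,  False), ("group_a", False, False),
    ("group_b", True,  True), ("group_b", False, True),  ("group_b", False, False),
]

# Overall accuracy: the number teams usually stop at.
accuracy = sum(flag == outcome for _, flag, outcome in records) / len(records)

# Selection rate per group: how often each group gets flagged for outreach.
flags_by_group = defaultdict(list)
for group, flag, _ in records:
    flags_by_group[group].append(flag)
selection_rates = {g: sum(v) / len(v) for g, v in flags_by_group.items()}

# Demographic-parity gap: a large gap is a prompt for human review, not an automatic verdict.
gap = max(selection_rates.values()) - min(selection_rates.values())

print(f"accuracy: {accuracy:.2f}")
print(f"selection rates by group: {selection_rates}")
print(f"demographic-parity gap: {gap:.2f}")
```

A check this small won't settle questions of transparency or inclusion, which need qualitative methods and the people affected in the room, but it does keep a single accuracy number from standing in for "fairness."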
The Big Takeaway
AI can make us faster. But without people at the center, speed just takes us in the wrong direction more quickly.
So use these three guiding principles as your compass:
- People first, technology second.
- Explore widely before committing to AI.
- Always test for trust, fairness, and inclusion.
Child support has always been about families. Technology, no matter how advanced, must be held to that same standard, because at the end of the day, AI will never replace the human heart. But designed with care, it can help us serve families better.
If your agency is exploring AI readiness or wants to dive deeper into human-centered design thinking, please feel free to reach out. We’d be glad to answer questions and help you build a roadmap of possibilities.