Navigating AI Risk in Every Organization

In this episode of Human in the Loop, we sit down with Meredith from Applied Information Sciences to explore what it truly takes to bring AI into organizations responsibly. From securing sensitive data to navigating shadow AI, she shares practical guidance shaped by years of working with highly regulated industries. Together, we unpack the evolving role of AI policies, the importance of human‑in‑the‑loop safeguards, and how emerging agentic systems are reshaping expectations for security and governance. It’s a grounded, insightful conversation for anyone looking to innovate with AI while keeping trust and safety at the center.

What You Will Learn:

  • Trustworthy AI is crucial for responsible innovation.

  • Data security and governance are often underestimated risks.

  • Zero Trust principles are essential in AI security.

  • AI amplifies existing cybersecurity risks.

  • Organizations must address shadow AI usage.

  • Policies around AI usage are lacking in many organizations.

  • Everyone in the organization shares responsibility for AI governance.

  • Human oversight is necessary in AI decision-making.

  • Experimentation with AI tools can lead to valuable insights.

  • Curiosity and willingness to try new AI tools are vital.

Guest bio:
Meredith Dost is Vice President of Infrastructure and Security Solutions at Applied Information Sciences (AIS), where she leads AI-first delivery across cloud solutions and cybersecurity. She helps regulated organizations modernize and scale with an AI-powered services model that pairs automation, intelligent operations, and agent-assisted workflows to improve speed, resilience, and governance. Meredith is known for translating complex risk into clear executive decisions and building high-performing teams that deliver measurable outcomes in high-stakes environments.

Enjoy,

Chris Huntingford 👉 LinkedIn | X | YouTube

Ioana Tanase 👉 LinkedIn

Keegan Chambers 👉 LinkedIn

Meredith Dost 👉 LinkedIn
