Smarter Compliance Through Transparent AI Deployment - with Miranda Jones of Emprise Bank

Unknown Source July 09, 2025 20 min
artificial-intelligence generative-ai investment
19 Companies
34 Key Quotes
3 Topics
4 Insights

🎯 Summary


This 19-minute podcast episode features Miranda Jones, Senior Vice President and Data & AI Strategy Leader at Emprise Bank, discussing the critical need for a deliberate, collaborative, and transparent approach to deploying AI within regulated financial services, particularly concerning compliance and operational workflows. The central narrative emphasizes shifting focus from a “data gold rush” mentality to prioritizing high-quality, explainable outcomes built in partnership with frontline employees.

1. Focus Area

The discussion centers on AI deployment strategy in regulated financial services, specifically focusing on:

  • The necessity of co-designing AI systems with frontline teams (Subject Matter Experts or SMEs).
  • The strategic value of Explainable AI (XAI) in building trust and accelerating adoption, rather than viewing it as a compliance hurdle.
  • Cultural shifts required to move beyond measuring excessive data points toward defining and measuring meaningful Key Performance Indicators (KPIs) aligned with core business goals.
  • The strategic benefit of slowing down initial scoping to build reusable processes and shared ownership.

2. Key Technical Insights

  • Explainability as an Accelerator: Integrating explainability from Day 1 speeds up final execution because it allows frontline staff to confidently use and communicate model decisions to customers and regulators, preventing adoption failure due to lack of trust or understanding.
  • Challenging Data Bias through Diversity: Including diverse voices from frontline teams in the design process helps challenge potential biases in data selection, ensuring that the variables analyzed reflect true underlying patterns rather than narrow, biased customer lenses.
  • Focusing on True Outcome Correlation: Teams often measure too many data points (a byproduct of the “data gold rush”). The technical focus should shift to identifying and measuring only the 4-5 data points truly indicative of progress toward the defined business outcome.
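The outcome-correlation insight above can be sketched in code: given several candidate metrics, rank them by the strength of their correlation with the defined business outcome and keep only the top few. This is purely illustrative — the metric names and data below are invented, and real feature screening would also have to account for confounding, bias, and non-linear effects discussed elsewhere in the episode.

```python
# Illustrative sketch (hypothetical data): rank candidate metrics by
# |Pearson correlation| with a business outcome and keep the top k,
# mirroring the "measure only the few truly indicative data points" idea.
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def top_indicators(candidates, outcome, k=5):
    """Rank candidate metrics by |correlation| with the outcome; keep top k."""
    ranked = sorted(
        ((name, pearson(series, outcome)) for name, series in candidates.items()),
        key=lambda nr: abs(nr[1]),
        reverse=True,
    )
    return ranked[:k]

# Toy data: three candidate metrics and one outcome series.
candidates = {
    "time_to_resolution": [4, 3, 5, 2, 1],
    "touches_per_case":   [9, 7, 8, 4, 3],
    "day_of_week":        [1, 3, 5, 2, 4],  # likely noise
}
outcome = [0.2, 0.4, 0.1, 0.7, 0.9]  # e.g. a customer-satisfaction score

for name, r in top_indicators(candidates, outcome, k=2):
    print(f"{name}: r={r:+.2f}")
```

In this toy run the two operational metrics correlate strongly (negatively) with the outcome while the noise column drops out — the screening step that lets a team focus on a handful of meaningful KPIs instead of everything the "data gold rush" produced.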

3. Business/Investment Angle

  • Employee Experience Drives Customer Experience: If employees cannot understand or explain the AI-driven decisions they are tasked with executing, the system will fail organizationally. Investing in employee education and model transparency is the best path to superior customer experience.
  • Strategic Value of Deliberate Pacing: The long-term advantage of slowing down to scope problems properly is the creation of reusable processes and the avoidance of reactive rework required when oversight, data dependencies, and usage are not documented upfront.
  • Data Quality over Data Quantity: The industry is moving past the notion that “all data is the new oil.” Banks must be thoughtful about which data truly correlates with desired outcomes, recognizing that not all data is created equal.

4. Notable Companies/People

  • Miranda Jones (Emprise Bank): Guest expert providing practical insights from a data science strategy leadership role within a regulated bank.
  • Emprise Bank: The institutional context for the discussion, representing a regulated financial services environment.
  • Searce: Sponsor of the special series on scaling AI.

5. Future Implications

The conversation suggests the industry is moving toward a phase of mature, responsible AI adoption where:

  1. Collaboration is mandatory: Data science teams must deeply integrate with SMEs to bridge the gap between technical capability and operational reality.
  2. Transparency is non-negotiable: Explainability will become a standard requirement, not an optional add-on, for successful deployment in high-stakes fields like finance.
  3. Pragmatism over Hype: There is a growing acceptance that manual or paper-based processes may be strategically smarter than immediate, poorly scoped automation, especially when dealing with sensitive customer data and strict regulations.

6. Target Audience

This episode is highly valuable for AI/ML Leaders, Data Science Managers, Compliance Officers, and Technology Strategy Executives within highly regulated industries (especially Financial Services). It is also relevant for FinTech vendors seeking to understand the practical adoption hurdles faced by their banking clients.

🏢 Companies Mentioned

Yoshua Bengio ✅ ai_research/pioneer
Searce ✅ general_sponsor
Goldman Sachs ✅ unknown
PSO Terrick ✅ unknown
Emprise Bank ✅ unknown
Miranda Jones ✅ unknown
Emerj AI Research ✅ unknown
Matthew D ✅ unknown
Business Podcast ✅ unknown
Raytheon 🔥 big_tech/defense_ai_user

💬 Key Insights

"Finally, slower initial development focused on scoping problems and defining meaningful KPIs ultimately leads to reusable frameworks and faster higher-quality deployment over time."
Impact Score: 10
"Secondly, explainability accelerates adoption. Investing in transparent understandable AI models up front reduces rework and enables employees to confidently communicate decisions to customers and regulators."
Impact Score: 10
"First, co-design with the front line in mind. All AI and compliance requires input from people closest to the work to surface hidden variables and build models that reflect real customer needs."
Impact Score: 10
"if we can slow down and we can say, we can articulate, as we're doing, as we can communicate back to our partners in the business, how well our model learned, what it is using in its decision making or predictions that enable our teams to make decisions where they are more comfortable, they can articulate outcomes, and they don't necessarily have to ask us to do that for them because they are partners in that model that's been built"
Impact Score: 10
"employee experience might be the best path to customer experience because if the employee can't explain it, then who will? If the employee can't understand it, how can possibly the regulator understand it, can the customer understand it?"
Impact Score: 10
"I think explainability is a core component of how we train and deliver models. So I would argue that in fact helps. It may feel like we are slowing down at the beginning of a project, but ultimately, the execution of it is faster because in the end, we can more easily communicate what is this model using as decision factors."
Impact Score: 10

📊 Topics

#artificialintelligence 46 #investment 1 #generativeai 1

🧠 Key Takeaways

💡 Analyze what the underlying patterns really are, keeping a broad perspective
💡 Abide by regulatory requirements
💡 Avoid targeting particular segments in ways that may harm them
💡 Move faster than most industries where possible, but don't go pedal to the metal in such a heavily regulated industry

🤖 Processed with true analysis

Generated: October 05, 2025 at 03:37 AM