This case study explores the critical role of data quality management in large-scale industrial operations, using Chevron's drilling and completion functions as a focal point. Despite the growing reliance on artificial intelligence (AI) and advanced analytics, poor data quality, marked by errors, inconsistencies, and fragmentation, poses significant risks to efficiency, safety, and decision-making. Nikki Chang, a product manager at Chevron, is tasked with improving data integrity within the company's digital platforms, facing challenges that range from legacy systems to cultural resistance. The case highlights the economic and operational impact of data quality, the complexities of driving organizational change, and the strategies needed to secure leadership buy-in for long-term improvements. It encourages participants to explore practical solutions for shifting mindsets, overcoming silos, and ensuring data is treated as a strategic asset rather than an afterthought.
This case study aims to help participants understand the strategic importance of data quality in AI-driven decision-making and large-scale operations. It examines the challenges of managing fragmented, error-prone data in a complex organization like Chevron, highlighting the risks of poor data integrity and the barriers to cultural change. Through the lens of leadership, technology, and governance, participants will develop strategies to secure executive buy-in, implement sustainable data quality initiatives, and leverage AI effectively. The case encourages broader reflection on how these lessons apply across industries where data is a critical asset.
- Data Quality
- Artificial Intelligence
- Chevron
- Oil and Gas
- Machine Learning
- Data Management
- Benchmarking
- Automation
- Operational Efficiency
- Decision-Making
- Organizational Change
- Legacy Systems
- Leadership Buy-In
- Risk Management
- Q3 2025