Editor's Note: The SCM capstone Optimizing Procurement Analytics with Generative AI and Automated Data Visualization was authored by Shen Yeong Loo and Mariana Dias Pennone and supervised by Dr. Thomas Koch ([email protected]). For more information on this research, please contact the thesis supervisor.
A leading multinational pharmaceutical and medical corporation with multi-billion-dollar annual revenue faced mounting challenges in procurement analytics due to the scale and complexity of its operations. The proliferation of more than 200 dashboards created reporting inefficiencies, increased maintenance overhead, and slowed data-driven decision-making. While strategic procurement is vital for optimizing costs, mitigating risks, and ensuring supply continuity, the team’s reliance on static dashboards and fragmented data tables across multiple reporting sites made it difficult to generate timely, actionable insights.
Building a chatbot that understands procurement
Through interviews with key users and stakeholders, several inefficiencies were identified in the existing data acquisition and analysis processes. Routine tasks such as downloading recurring reports, merging datasets, and auditing for consistency consumed excessive time and were highly error-prone. These repetitive activities were recognized as areas where automation and AI could provide meaningful improvements.
Three core analytical needs consistently emerged from conversations with procurement professionals: summary data analysis, trend analysis, and exploratory data analysis (EDA). These use cases served as the foundation for designing a more intelligent, user-friendly solution. Instead of overhauling the entire business intelligence ecosystem, the initiative focused on augmenting existing workflows through a natural language interface. Open-source tools were reviewed and selected to support these goals, emphasizing flexibility, transparency, and scalability.
Unearthing procurement insights with a Generative AI chatbot
The result was a Generative AI-powered chatbot capable of interpreting natural language queries and generating dynamic visualizations — including charts annotated with trendlines, averages, and explanatory comments. This approach significantly reduced dependency on static dashboards and enabled self-service analytics within the procurement function.
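To make the idea concrete, the sketch below shows how a chart annotated with a trendline and an average line can be generated from a structured spend table. It is a minimal illustration in Python using pandas and matplotlib; the spend figures and column names are invented for the example and are not the project's actual code.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical monthly spend data; in practice this would come from the
# procurement reporting tables surfaced by the chatbot.
df = pd.DataFrame({
    "month": pd.date_range("2024-01-01", periods=12, freq="MS"),
    "spend": [1.2, 1.4, 1.1, 1.6, 1.5, 1.7, 1.9, 1.8, 2.0, 2.1, 1.9, 2.2],  # $M
})

# Fit a simple linear trendline over the period and compute the average.
x = np.arange(len(df))
slope, intercept = np.polyfit(x, df["spend"], 1)
trend = slope * x + intercept
avg = df["spend"].mean()

fig, ax = plt.subplots(figsize=(8, 4))
ax.plot(df["month"], df["spend"], marker="o", label="Monthly spend ($M)")
ax.plot(df["month"], trend, linestyle="--", label=f"Trend ({slope:+.2f} $M/month)")
ax.axhline(avg, color="gray", linestyle=":", label=f"Average ({avg:.2f} $M)")
ax.set_title("Procurement spend with trendline and average")
ax.set_xlabel("Month")
ax.set_ylabel("Spend ($M)")
ax.legend()
fig.tight_layout()
fig.savefig("spend_trend.png")
```

In the capstone's setup, code along these lines would be generated by the model in response to a natural language question, rather than written by the analyst.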
The chatbot was trained to understand procurement-related terminology and context, providing answers in a conversational format. Accuracy and contextual alignment were ensured through the integration of Retrieval Augmented Generation (RAG) and prompt engineering strategies, allowing the model to reference company-specific data while maintaining consistency and control.
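The implementation details are not covered here, but the RAG pattern the team describes can be sketched roughly as follows. In this illustration, embed() is a toy stand-in for a real embedding model, llm_complete() is a placeholder for the chat-completion endpoint, and the reference snippets are invented examples of company-specific definitions.

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    """Toy character-frequency embedding; a real system would call an embedding model."""
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def llm_complete(prompt: str) -> str:
    """Placeholder for the chat-completion call; returns a stub so the sketch runs."""
    return f"[model answer based on a {len(prompt)}-character prompt]"

# Company-specific reference snippets (data definitions, KPI glossary entries)
# that would be indexed ahead of time. These examples are invented.
documents = [
    "PO cycle time is measured from requisition approval to purchase order release.",
    "Spend under management excludes intercompany and payroll transactions.",
    "Category 'MRO' covers maintenance, repair and operations materials.",
]
doc_vectors = [embed(d) for d in documents]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k snippets most similar to the query by cosine similarity."""
    q = embed(query)
    scores = [float(np.dot(q, v)) for v in doc_vectors]  # vectors are unit-normalized
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

def answer(query: str) -> str:
    """Ground the model's response in retrieved, company-specific context."""
    context = "\n".join(retrieve(query))
    prompt = (
        "You are a procurement analytics assistant. Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return llm_complete(prompt)

print(answer("How do we define PO cycle time?"))
```

In a production deployment, the placeholders would be replaced by the organization's chosen embedding and LLM services, with access controls applied to what the retrieval step is allowed to return.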
During testing, the chatbot delivered accurate and relevant responses for 96% of queries across the three identified use cases. Response times were reduced from hours to seconds, eliminating the need for manual data wrangling or complex pivot tables. Additional testing in adjacent functions further demonstrated the potential to support more advanced analytical tasks, suggesting broader applicability beyond procurement.
Early user feedback emphasized the shift in how insights were accessed and discussed. By enabling users to engage directly with data through natural language, the solution facilitated faster alignment and improved collaboration during procurement planning and reviews.
Business takeaways: Toward a conversational analytics model
This project reflects a growing trend in enterprise analytics: moving from passive, report-driven processes to interactive, AI-supported dialogue with data. By leveraging large language models’ ability to understand natural language, perform analytical reasoning, and generate code, combined with procurement-specific expertise integrated through prompt engineering, the solution streamlined and sped up routine data analysis.
In practice, this makes LLMs powerful assistants for structured analysis. When paired with robust governance and secure data retrieval methods, they enable non-technical users to interact with complex datasets through intuitive prompts.
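As a rough illustration of what such prompt engineering can look like, the snippet below embeds procurement terminology and guardrails in a reusable template. The wording, KPI definitions, and helper function are hypothetical and not drawn from the project.

```python
# Hypothetical prompt template showing how procurement-specific context and
# guardrails can be embedded via prompt engineering; wording is illustrative only.
SYSTEM_PROMPT = """\
You are an analytics assistant for the procurement organization.
- Use the supplied data extracts only; do not invent figures.
- Report spend in USD millions and savings as a percent of addressable spend.
- 'PPV' means purchase price variance; 'OTIF' means on-time in-full delivery.
- If a question requires data you were not given, say so instead of guessing.
"""

def build_prompt(user_question: str, data_extract: str) -> str:
    """Combine the guardrail prompt, a retrieved data extract, and the user's question."""
    return (
        f"{SYSTEM_PROMPT}\n"
        f"Data extract:\n{data_extract}\n\n"
        f"Question: {user_question}"
    )

print(build_prompt(
    "How did MRO spend trend over the last two quarters?",
    "MRO spend: Q1 $4.2M, Q2 $4.9M",
))
```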
That said, implementation must be approached responsibly. Legal compliance, role-based access controls, and privacy safeguards are critical when deploying generative models in enterprise settings. Misuse or unintentional exposure of sensitive data can occur if appropriate controls are not embedded into the AI architecture.
For organizations exploring the integration of Generative AI into procurement or other operational domains, the key lesson is this: success depends on embedding AI where it reduces friction in existing data workflows and enables faster, more informed decisions. When implemented thoughtfully, AI can streamline insight generation, simplify access to complex datasets, and make analytical tools more accessible to business users — not by adding complexity, but by removing barriers to action.