Transform Your Supply Chain Planning and Marketing Strategies with Google Cloud and SAP Integration
April 3, 2023 | Manju Devadas
Maximizing Forecasting Accuracy: A Guide to Leveraging Google Cloud and Pluto7’s Data Platform
In today’s fast-paced and highly competitive business landscape, companies are increasingly recognizing that traditional demand forecasting methods may not be sufficient to achieve optimal results. Organizations need to leverage three critical components to enhance forecast accuracy, optimize inventory, and improve overall business operations.
Planning in a Box (PiaB), a versatile data platform on Google Cloud, offers a comprehensive solution that incorporates advanced forecasting techniques, empowering businesses to harness the power of machine learning (ML) and external data sources.
Planning in a Box is powered by Google Cloud Cortex Framework, enabling seamless integration of multiple data sources, sophisticated data modeling, and robust machine learning capabilities, leading to superior supply chain outcomes.
Planning in a Box utilizes a wide range of data models to provide a holistic view of the supply chain. These data models are designed to integrate seamlessly with external datasets, such as weather data, Google Trends, and AdTech, to enrich the available insights and drive better decision-making.
Pluto7’s planning platform utilizes Google Cloud Cortex to seamlessly integrate with both SAP and non-SAP data sources, such as CRM, social media, and web analytics. By blending internal and external data sources, including Google AdTech and third-party data, the platform provides real-time planning insights. This enables businesses to optimize their planning processes, streamline workflows, and make data-driven decisions that enhance profitability and accelerate growth.
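As a rough illustration of what this blending can look like in practice, the sketch below joins internal sales history with an external weather table in BigQuery. The project, dataset, table, and column names are placeholders for illustration only, not actual Cortex Framework or Pluto7 schema names.

```python
# Minimal sketch: enriching SAP-style sales history with an external weather
# table in BigQuery. All dataset, table, and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

query = """
SELECT
  s.material_id,
  s.sales_date,
  SUM(s.quantity_sold) AS units_sold,
  AVG(w.avg_temp_c)    AS avg_temp_c,
  SUM(w.precip_mm)     AS precip_mm
FROM `my-gcp-project.sap_reporting.daily_sales` AS s          -- placeholder sales view
LEFT JOIN `my-gcp-project.external_data.daily_weather` AS w   -- placeholder weather table
  ON s.region = w.region AND s.sales_date = w.weather_date
GROUP BY s.material_id, s.sales_date
"""

# Pull the joined internal + external signal into a DataFrame for downstream modeling.
enriched = client.query(query).to_dataframe()
print(enriched.head())
```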
Planning in a Box offers a variety of advanced analytics capabilities that help businesses optimize their supply chain operations.
Planning in a Box leverages Google Cloud external datasets, along with internal datasets, to create a richer, more accurate understanding of consumer behavior and market trends.
SAP Business Technology Platform (BTP) and Google Cloud Marketplace are platforms that enable businesses to deploy and manage cloud-based applications and services.
Planning in a Box is available on both platforms, offering click-to-deploy solutions that simplify and accelerate the deployment process.
The Cortex framework seamlessly integrates Planning in a Box and SAP systems. This enables businesses to leverage their existing SAP infrastructure and data sources, streamlining the implementation process and minimizing the need for additional investments.
With click-to-deploy solutions and seamless integration, businesses can deploy Planning in a Box and start generating actionable insights within 14 days. This rapid deployment allows businesses to capitalize on the platform’s advanced analytics capabilities, driving tangible improvements in supply chain operations and marketing effectiveness.
As a platform on a platform, Planning in a Box offers an innovative approach to harnessing the power of data, providing businesses with the tools and infrastructure needed to drive insights, improve decision-making, and gain a competitive edge in today’s data-driven world.
While the concept of a data platform like Planning in a Box may be relatively new, it has rapidly gained attention for its ability to simplify the data management process, integrate seamlessly with a wide range of data sources, and provide a scalable, secure, and cost-effective solution for organizations of all sizes.
PiaB is purpose-built to excel at a handful of things: transforming SAP data, deploying custom AI/ML models at scale, and leveraging a broad range of Google Cloud tools to improve business use cases. These capabilities are among its main unique selling points.
Built on Google Cloud Platform, PiaB enables businesses to unlock the full potential of their data without the complexities and challenges typically associated with traditional data management systems.
The journey begins with a collaborative workshop between the Pluto7 team and the client’s team. This workshop serves as an essential step in understanding the client’s unique forecasting requirements and challenges.
This initial workshop lays the foundation for the entire data transformation journey. During the workshop, every aspect of the client’s data ecosystem is thoroughly audited and understood. By clearly understanding the current state of the client’s data infrastructure, the Pluto7 team identifies areas for improvement and potential opportunities for innovation.
The workshop also helps the client understand where they currently stand in terms of data management and forecasting capabilities, and provides a roadmap for where they can go next, helping them set realistic expectations and goals for their data transformation journey.
Watch: Tacori’s Supply Chain Data Transformation Journey with Pluto7 and Google Cloud
Once the workshop is completed, the Pluto7 team helps the client set up a data ingestion pipeline, which is a series of processes that collect, prepare, and store data for analysis. This pipeline is crucial to ensure that the ML models have access to accurate, up-to-date information when generating forecasts.
Transferring data from SAP, Salesforce, and other CRM systems to BigQuery has become seamless, thanks to the Google Cloud Cortex Framework. At Pluto7, we have been among the first companies to embed the Cortex Framework into our data platform solutions.
Having implemented hundreds of use cases for SAP data replication to BigQuery using the Cortex Framework, we can say with certainty that Cortex is a revolutionary concept in data integration. It enables faster data movement and rapid access to insights, making it easier to plug external insights into SAP & Salesforce systems.
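For the non-SAP side of the pipeline, a minimal sketch of landing a CRM export into BigQuery, where it can later be joined with Cortex-replicated SAP data, might look like the following. The file name, dataset, and table are illustrative assumptions, not part of the Cortex deliverables.

```python
# Minimal sketch: loading a Salesforce/CRM CSV export into a BigQuery staging
# table. File, dataset, and table names are hypothetical.
import pandas as pd
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")  # hypothetical project ID

# Read the exported opportunities file and parse the date column.
crm_df = pd.read_csv("salesforce_opportunities.csv", parse_dates=["close_date"])

job_config = bigquery.LoadJobConfig(
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,  # full refresh each run
)
job = client.load_table_from_dataframe(
    crm_df, "my-gcp-project.crm_staging.opportunities", job_config=job_config
)
job.result()  # block until the load job completes
```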
With the data ingestion pipeline in place, the next step is to develop a data science workflow tailored to the client’s goals. This workflow serves as a blueprint for designing, building, and evaluating ML models that address the client’s forecasting challenges. The data science workflow typically involves data exploration, feature engineering (wherever required), model selection, training, and evaluation.
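A stripped-down sketch of that kind of workflow is shown below, using scikit-learn with hypothetical file and column names (enriched_sales.csv, units_sold, avg_temp_c, precip_mm); it is an illustration of the steps, not Pluto7’s production pipeline.

```python
# Minimal sketch of a forecasting workflow: lag-feature engineering, a simple
# model, and a time-aware holdout evaluation. Column names are assumptions.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

# Hypothetical enriched demand history exported from BigQuery.
df = pd.read_csv("enriched_sales.csv", parse_dates=["sales_date"]).sort_values("sales_date")

# Feature engineering: lagged demand and a trailing mean as predictors.
df["lag_7"] = df["units_sold"].shift(7)
df["lag_28"] = df["units_sold"].shift(28)
df["trailing_mean_28"] = df["units_sold"].shift(1).rolling(28).mean()
df = df.dropna()

features = ["lag_7", "lag_28", "trailing_mean_28", "avg_temp_c", "precip_mm"]
split = int(len(df) * 0.8)                     # hold out the most recent 20% for evaluation
train, test = df.iloc[:split], df.iloc[split:]

model = GradientBoostingRegressor(random_state=42)
model.fit(train[features], train["units_sold"])

preds = model.predict(test[features])
print("Holdout MAPE:", mean_absolute_percentage_error(test["units_sold"], preds))
```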
Once the data science workflow is established, the Pluto7 ML team creates custom-built models for the client’s specific use case, whether it’s demand forecasting, inventory planning, marketing analytics, or another area.
During the POC phase, the team tests the models on real-world data to demonstrate their effectiveness and validate their performance.
A successful POC can then be scaled across business units, as AB InBev demonstrated: the data science model created to improve the filtration process at one manufacturing unit was replicated across other territories.
Business Challenge: Optimizing the K-Filtration Process
The K Filter plays a crucial role in the brewing process, serving as the final “filtration” step that guarantees the product meets the highest quality standards. AB InBev collaborated with Pluto7 and Google Cloud to employ machine learning to enhance the precision of the filtration process.
Pluto7’s Solution: ML-Enabled Predictive Maintenance
Pluto7 developed a prototype solution enabling AB InBev to optimize the beer filtration process with much greater accuracy, reducing costs, increasing efficiencies, and, perhaps most importantly for beer lovers, ensuring consistent taste. The Pluto7 solution combines TensorFlow, Cloud Machine Learning Engine, Cloud SQL, and BigQuery.
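Purely as an illustration of how a small TensorFlow model for a sensor-driven prediction task might be structured, consider the sketch below. The features, target, and data are hypothetical stand-ins and are not AB InBev’s actual model.

```python
# Illustrative sketch of a small TensorFlow regression model for a filtration
# quality prediction task. Features, target, and data are synthetic placeholders.
import numpy as np
import tensorflow as tf

# Hypothetical sensor features: inlet turbidity, flow rate, pressure drop, run hours.
X = np.random.rand(1000, 4).astype("float32")
# Hypothetical target: outlet turbidity, a proxy for filtration quality.
y = (0.3 * X[:, 0] - 0.1 * X[:, 1] + 0.05 * X[:, 2]
     + np.random.normal(0, 0.01, 1000)).astype("float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.fit(X, y, epochs=10, batch_size=32, validation_split=0.2, verbose=0)
```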
After validating the ML models’ performance in the POC, the Pluto7 team works with the client to scale and deploy the models across their organization. This process involves fine-tuning the models to account for any unique challenges or requirements at different business units and integrating the models into the client’s existing systems and processes.
The final step is to stream the results back to the client’s SAP system, allowing them to make data-driven decisions based on the improved forecasts.
Traditional forecasting methods, such as straight-line, moving average, and linear regression, have limitations in dealing with demand fluctuations and the complexities of increasingly connected supply chain networks [1]. This inability to adapt to variations in demand can contribute to the Bullwhip effect, in which demand fluctuations are amplified up the supply chain by the lead time between order and delivery and by changes in forecasted demand [2].
To address these challenges, an intelligent demand forecasting system based on AI and ML is essential for minimizing the Bullwhip effect and adapting to demand fluctuations.
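A toy example, not drawn from the article’s references, shows one mechanism behind this: a simple moving-average forecast lags a sudden demand shift, so upstream orders react late and overshoot.

```python
# Toy illustration of how a moving-average forecast lags a demand step change,
# one driver of the Bullwhip effect. Demand values are made up.
import numpy as np

demand = np.array([100] * 10 + [150] * 10)      # demand jumps at period 10
window = 4
forecast = np.array([
    demand[max(0, t - window):t].mean() if t > 0 else demand[0]
    for t in range(len(demand))
])

for t in range(8, 16):
    print(f"period {t}: demand={demand[t]}, moving-avg forecast={forecast[t]:.1f}")
# The forecast only catches up several periods after the shift, so each supply
# chain tier that plans from it reacts late and amplifies the swing.
```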
According to leading researchers in demand forecasting, there are three main requirements for improving forecast accuracy while reducing waste or delays [3].
Machine learning models can process and analyze vast amounts of data from various sources, capturing hidden patterns and relationships that traditional forecasting methods might overlook.
By leveraging the power of ML and advanced analytics in demand forecasting and inventory optimization, businesses can create more accurate and adaptive forecasts, which in turn can help to reduce the impact of negative data and downward changes in statistical forecasts [4].
PiaB, with its seamless integration with Google Cloud, is at the forefront of helping customers access better insights by leveraging ML for demand forecasting, inventory management, and other areas. By tapping into the potential of ML and AI, businesses can significantly improve their forecasting capabilities and address the challenges posed by fluctuating demand.
Time Series Models: Time series models, such as ARIMA, the Exponential Smoothing State Space Model (ETS), and Seasonal-Trend decomposition using Loess (STL), are popular in demand forecasting due to their ability to capture trends, seasonality, and noise in historical data (see the sketch after this list).
Regression Models: Regression models, including linear and logistic regression, are widely used for demand forecasting, as they can incorporate various factors that influence demand, such as marketing campaigns, promotions, and economic indicators.
Ensemble Models: Ensemble models, like Random Forest and Gradient Boosting Machines (GBM), improve forecasting accuracy by combining the predictions of multiple base models. This approach helps to reduce the risk of overfitting and leads to more robust predictions.
Deep Learning Models: Deep learning models, such as Long Short-Term Memory (LSTM) and Convolutional Neural Networks (CNN), can capture complex patterns in data and are particularly effective for large-scale, high-dimensional data sets.
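As a minimal sketch of the time-series family above, the following uses statsmodels’ Holt-Winters exponential smoothing (an ETS-style model) on a synthetic weekly demand series; a real engagement would use actual demand history rather than generated data.

```python
# Minimal ETS-style (Holt-Winters) forecast with statsmodels on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic weekly demand with a gentle trend and yearly seasonality.
periods = 156
idx = pd.date_range("2020-01-05", periods=periods, freq="W")
demand = 200 + 0.5 * np.arange(periods) + 30 * np.sin(2 * np.pi * np.arange(periods) / 52)
series = pd.Series(demand, index=idx)

train, test = series[:-26], series[-26:]
model = ExponentialSmoothing(
    train, trend="add", seasonal="add", seasonal_periods=52
).fit()
forecast = model.forecast(26)

mape = (abs(forecast - test) / test).mean() * 100
print(f"26-week holdout MAPE: {mape:.1f}%")
```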
Four key levers in ML are responsible for improving forecasting accuracy.
Well-implemented ML models for demand forecasting typically achieve forecast accuracy of around 80% or higher. This level of accuracy can significantly improve inventory positioning, reduce stockouts and overstocks, and ultimately lead to cost savings and increased customer satisfaction.
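For context on how such a figure is typically computed, one common convention defines forecast accuracy as 1 minus WAPE (weighted absolute percentage error). The numbers below are made up purely to show the arithmetic and are not results from any Pluto7 engagement.

```python
# Worked example of forecast accuracy as 1 - WAPE, using made-up numbers.
import numpy as np

actual = np.array([120, 95, 140, 80, 110], dtype=float)
forecast = np.array([110, 100, 150, 70, 115], dtype=float)

# WAPE weights each item's error by its share of total actual demand.
wape = np.abs(actual - forecast).sum() / actual.sum()
accuracy = 1 - wape
print(f"WAPE: {wape:.1%}, forecast accuracy: {accuracy:.1%}")
# Absolute errors: 10 + 5 + 10 + 10 + 5 = 40; total actual demand = 545.
# WAPE ≈ 7.3%, so forecast accuracy ≈ 92.7%.
```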
Pluto7 is revolutionizing its customers’ data landscape by continuously integrating cutting-edge technologies and machine learning capabilities.
By pushing the boundaries of what’s possible, Pluto7’s Cortex-enabled data platform PiaB is transforming supply chain planning, sales and operations planning, and marketing analytics, enabling businesses to harness the full potential of their data.
With its deep integration with Google Cloud and commitment to innovation, PiaB empowers organizations to make data-driven decisions, improve forecasting accuracy, optimize inventory management, and drive top-notch business performance.