Running simulations: base model vs. lookahead model

I see people posting on the use of “simulations” for planning inventory policies. If you are using a lookahead model (which is typical for most real-world inventory problems), there are two models where simulation can be used:
1. The base model, which can be a simulator or the real world.
2. The lookahead model, which is used in the policy for planning the future to make a decision now.

See the figure below - I use the same notational style for both models, but the lookahead model uses tildes on each variable, which also carries two time subscripts: the point in time at which we are making the decision, and the time period within the lookahead model.

The base model is used to evaluate the policy, and is needed to perform any parameter tuning. It can be based on history or on a simulation of what you think the future might be. When simulating inventory policies, special care is needed because we do not have historical data on market demand – we typically just have sales, which can be “censored” (a topic that has been recognized in the inventory literature for over 60 years). For example, if we run out of product (and there is no back ordering), we lose the sales, which typically means that we do not see (or record) them.

I find it is generally best to run simulations using mathematical models of uncertainty so that we can run many simulations, testing different policies. Stockouts depend on properly simulating the tails of distributions, along with market shifts, price changes and supply chain disruptions. There are, of course, settings where you have no choice but to test your ideas in the field. It is expensive, risky, and slow, but sometimes you just have no choice, especially when you have to capture human behavior.

If your policy requires planning into the future, you really need to be using a stochastic (probabilistic) model of the future which properly captures the tails of distributions.
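The censoring effect is easy to see in a minimal simulation. The sketch below is an illustration, not the author's model: true demand is drawn from an assumed lognormal distribution (so the right tail matters), sales are capped at an illustrative stock level of 120 units, and the average of recorded sales understates average true demand. All parameter values here are assumptions for demonstration.

```python
import random
import statistics

random.seed(42)

def simulate_censored_sales(n_days=10_000, inventory=120):
    """Simulate demand vs. recorded sales when stockouts censor the data.

    True demand is lognormal (mean ~97 units); sales are capped at the
    on-hand inventory, and the lost sales are never recorded."""
    demand = [random.lognormvariate(4.5, 0.4) for _ in range(n_days)]
    sales = [min(d, inventory) for d in demand]  # censoring at the stock level
    return demand, sales

demand, sales = simulate_censored_sales()
print(f"mean true demand   : {statistics.mean(demand):.1f}")
print(f"mean recorded sales: {statistics.mean(sales):.1f}")
print(f"days censored (stockouts): {sum(d > 120 for d in demand) / len(demand):.1%}")
```

A policy tuned on the recorded-sales series alone would systematically under-order, which is why fitting an uncensored demand model first matters.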
With long lead times, you should also plan for the possibility of significant disruptions, which can mean that you also have to capture the decisions you might make in the future. See chapter 19 of: https://lnkd.in/dB99tHtM (tinyurl.com/RLandSO) for an in-depth treatment of direct lookahead policies. #supplychain #inventory Nicolas Vandeput Joannes Vermorel
Data Analytics in Supply Chain Decisions
-
As supply chain managers look towards manufacturing industry verticals that may see volume growth in 2024, in addition to looking at trends in industrial production indexes (which measure physical unit output), it is also useful to look at the Federal Reserve Board’s capacity utilization indexes for these industries. The reason is simple: an industry that is operating at a high level of capacity utilization likely has little upward room for output growth given strains placed on equipment and labor. To illustrate, below are two capacity utilization indexes from the FRB. Thoughts:

• The top index shows seasonally adjusted capacity utilization for plastics & rubber products manufacturing (https://lnkd.in/gPYc_GMb). Capacity utilization fell very sharply starting in Q4 2022 and has yet to recover. Looking at year-over-year industrial production (https://lnkd.in/gB3PaQKu), the decline has been about 5%. Thus, if conditions improve, we could see an increase in output of ~5% as an upper bound (best-case scenario).

• The bottom index shows capacity utilization for nonmetallic mineral product manufacturing (https://lnkd.in/gYe4piEh). Capacity utilization since early 2022 has been running 8-10 percentage points above 2018 and 2019 levels, suggesting a sector with little room for additional growth in output unless additional capacity is added. Thus, while industrial production has remained strong (https://lnkd.in/gtgD-YM4), I see fewer opportunities for output volume to grow in 2024 even if demand manifests (since these factories are likely to begin to encounter supply constraints).

• For anyone interested, here is a link (https://lnkd.in/gy2ukdvr) to all the capacity utilization indexes the FRB publishes.

Implication: augmenting industrial production data measuring output with capacity utilization data can provide a more comprehensive picture of how a manufacturing sector is performing and its potential to increase output if demand conditions were to improve.
More free competitive intelligence data to incorporate into strategic decision making (whether as a trucking manager or sourcing professional trying to anticipate supplier lead times). #supplychain #supplychainmanagement #manufacturing #economics #freight #trucking
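The headroom reasoning above is simple arithmetic: if output is roughly proportional to utilization at fixed capacity, the upper bound on growth is the relative gap between current and "normal" utilization. A hedged sketch, using illustrative utilization rates (not the actual FRB series values):

```python
def output_headroom(current_util, normal_util):
    """Upper-bound fractional output growth if utilization recovers
    from current_util back to normal_util with capacity held fixed.
    Output ~ utilization x capacity, so the headroom is the relative
    gap between the two utilization rates."""
    return (normal_util - current_util) / current_util

# Illustrative numbers only: utilization fell from ~80% to ~76%,
# roughly consistent with the ~5% output decline described above.
print(f"{output_headroom(0.76, 0.80):.1%}")  # 5.3%
```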
-
𝗗𝗮𝘁𝗮 𝗤𝘂𝗮𝗹𝗶𝘁𝘆 𝗶𝘀𝗻'𝘁 𝗮 𝘀𝗶𝗻𝗴𝗹𝗲 𝗰𝗵𝗲𝗰𝗸 – it's a continuous contract enforced across the data layers to avoid breakage.

Think about it. Planes don’t just fall out of the sky when they land. Crashes happen when people miss the little signals that get brushed off or ignored. Same thing with data. Bad data doesn’t shout; it just drifts quietly – until your decisions hit the ground.

When you bake quality checks into every layer and actually use observability tools, you end up with data pipelines that hold up, even when things get messy. That’s how you get data people can trust.

Why does this matter? Bad data costs money → failed ML models, wrong decisions. Good monitoring catches 90% of issues automatically.

→ Raw Materials (Ingestion)
• Inspect at the dock before accepting delivery.
• Check schemas match expectations. Validate formats are correct.
• Monitor stream lag and file completeness. Catch bad data early.
• Cost of fixing? Minimal here, expensive later.
• Spot problems as close to the source as you can.

→ Storage (Raw Layer)
• Verify inventory matches what you ordered.
• Confirm row counts and volumes look normal.
• Detect anomalies: sudden spikes signal upstream issues.
• Track metadata: schema changes, data freshness, partition balance.
• Raw data is your backup plan when things go sideways.

→ Processing (Transformation)
• Quality control during assembly is critical.
• Validate business rules during transformations. Test derived calculations.
• Check for data loss in joins. Monitor deduplication effectiveness.
• Statistical profiling reveals outliers and distribution shifts.
• Most data disasters start right here.

→ Packaging (Cleansed Data)
• Final inspection before shipping to warehouse.
• Ensure master data consistency across all sources.
• Validate privacy rules: PII masked, anonymization works.
• Verify referential integrity and temporal logic.
• Clean doesn’t always mean correct. Keep checking.
→ Distribution (Published Data)
• Quality assurance for customer-facing products.
• Check SLAs: freshness, availability, schema contracts met.
• Monitor aggregation accuracy in data marts.
• ML models: detect feature drift, prediction degradation.
• Dashboards: validate calculations match source data.
• Once data is published, you’re on the hook.

→ Cross-Cutting Layers (Force Multipliers)
• Metadata: rules, lineage, ownership, quality scores
• Monitoring: freshness, volume, anomalies, downtime
• Orchestration: dependencies, retries, SLAs
• Logs: failures, patterns, early warning signs

Honestly, logs are gold. Don’t sleep on them.

What's your job? Design checkpoints, don't firefight data incidents. Quality is built in, not inspected in.

Pipelines just 𝗺𝗼𝘃𝗲 data. Quality 𝗽𝗿𝗼𝘁𝗲𝗰𝘁𝘀 your decisions.

Image Credits: Piotr Czarnas

𝘌𝘷𝘦𝘳𝘺 𝘭𝘢𝘺𝘦𝘳 𝘯𝘦𝘦𝘥𝘴 𝘪𝘯𝘴𝘱𝘦𝘤𝘵𝘪𝘰𝘯. 𝘚𝘬𝘪𝘱 𝘰𝘯𝘦, 𝘳𝘪𝘴𝘬 𝘦𝘷𝘦𝘳𝘺𝘵𝘩𝘪𝘯𝘨 𝘥𝘰𝘸𝘯𝘀𝘵𝘳𝘦𝘢𝘮.
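The ingestion-layer checks above (schema match, volume band) can be sketched in a few lines of stdlib Python. The column names, types, and volume band below are made-up assumptions for illustration, not a real feed's contract:

```python
# Hypothetical contract for an incoming daily shipments feed
EXPECTED_SCHEMA = {"order_id": str, "qty": int, "ship_date": str}
EXPECTED_ROWS = (900, 1100)  # normal daily volume band

def validate_batch(rows):
    """Return a list of data-quality issues found at the ingestion layer.

    Checks (a) every record matches the expected schema, and
    (b) batch volume falls inside the normal band -- a sudden spike
    or drop usually signals an upstream problem."""
    issues = []
    for i, row in enumerate(rows):
        if set(row) != set(EXPECTED_SCHEMA):
            issues.append(f"row {i}: unexpected columns {sorted(row)}")
            continue
        for col, typ in EXPECTED_SCHEMA.items():
            if not isinstance(row[col], typ):
                issues.append(f"row {i}: {col} should be {typ.__name__}")
    lo, hi = EXPECTED_ROWS
    if not lo <= len(rows) <= hi:
        issues.append(f"volume anomaly: {len(rows)} rows outside [{lo}, {hi}]")
    return issues

good = [{"order_id": "A1", "qty": 3, "ship_date": "2024-05-01"}] * 1000
bad = good[:50] + [{"order_id": "A2", "qty": "three", "ship_date": "2024-05-01"}]
print(validate_batch(good))  # []
print(validate_batch(bad))   # type issue on row 50, plus a volume anomaly
```

Rejecting (or quarantining) a batch here is the "inspect at the dock" step: the cost of the fix is minimal at ingestion and grows at every layer downstream.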
-
Inflation isn’t just an economic challenge—it’s a test of agility for businesses. As costs rise and purchasing power shifts, companies that rely on gut instinct risk falling behind. The real winners? Those who use data-driven insights to navigate uncertainty. 1️⃣ Understanding Consumer Behavior: What’s Changing? Inflation reshapes spending habits. Some consumers trade down to budget-friendly options, while others delay non-essential purchases. Businesses must analyze: 🔹 Spending patterns: Are customers shifting to smaller pack sizes or private labels? 🔹 Channel preferences: Is there a surge in online shopping due to better deals? 🔹 Regional variations: Inflation doesn’t hit all demographics equally—hyperlocal data matters. 📊 Example: A retail chain used real-time sales data to spot a shift toward economy brands, allowing it to adjust promotions and retain price-sensitive customers. 2️⃣ Pricing Trends: Data-Backed Decision-Making Raising prices isn’t the only response to inflation. Smart pricing strategies, backed by AI and analytics, can help businesses optimize margins without losing customers. 🔹 Dynamic pricing models: Adjust prices based on demand, competitor moves, and seasonality. 🔹 Price elasticity analysis: Determine how much a price hike impacts sales before making a move. 🔹 Personalized discounts: Use customer data to offer targeted promotions that drive loyalty. 📈 Example: An e-commerce platform analyzed customer behavior and found that small, frequent discounts led to better retention than infrequent deep discounts. 3️⃣ Demand Forecasting & Inventory Optimization Stocking the right products at the right time is critical in an inflationary market. Predictive analytics can help businesses: 🔹 Anticipate demand surges—especially in essential goods. 🔹 Optimize supply chains to reduce excess inventory and prevent stockouts. 🔹 Reduce waste in perishable categories like F&B, where price-sensitive demand fluctuates. 
📦 Example: A leading FMCG brand leveraged AI-driven demand forecasting to prevent overstocking of premium products while ensuring budget-friendly variants were always available. 💡 The Takeaway Inflation isn’t just about rising costs—it’s about shifting consumer priorities. Companies that embrace data-driven decision-making can optimize pricing, fine-tune inventory, and strengthen customer loyalty. 𝑯𝒐𝒘 𝒊𝒔 𝒚𝒐𝒖𝒓 𝒃𝒖𝒔𝒊𝒏𝒆𝒔𝒔 𝒂𝒅𝒂𝒑𝒕𝒊𝒏𝒈 𝒕𝒐 𝒊𝒏𝒇𝒍𝒂𝒕𝒊𝒐𝒏𝒂𝒓𝒚 𝒑𝒓𝒆𝒔𝒔𝒖𝒓𝒆𝒔? 𝑨𝒓𝒆 𝒚𝒐𝒖 𝒖𝒔𝒊𝒏𝒈 𝒅𝒂𝒕𝒂 𝒕𝒐 𝒓𝒆𝒇𝒊𝒏𝒆 𝒚𝒐𝒖𝒓 𝒔𝒕𝒓𝒂𝒕𝒆𝒈𝒚? 𝑳𝒆𝒕’𝒔 𝒅𝒊𝒔𝒄𝒖𝒔𝒔 𝒊𝒏 𝒕𝒉𝒆 𝒄𝒐𝒎𝒎𝒆𝒏𝒕𝒔! #datadrivendecisionmaking #dataanalytics #inflation #inventoryoptimization #demandforecasting #pricingtrends
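The price elasticity analysis mentioned in section 2 can be sketched with the midpoint (arc) formula; the prices and quantities below are illustrative, not from any of the examples above:

```python
def arc_elasticity(p0, p1, q0, q1):
    """Arc (midpoint) price elasticity of demand: % change in quantity
    divided by % change in price, using midpoint bases so the result
    is the same whichever direction the price moves."""
    pct_q = (q1 - q0) / ((q0 + q1) / 2)
    pct_p = (p1 - p0) / ((p0 + p1) / 2)
    return pct_q / pct_p

# Illustrative: a price rise from $10.00 to $11.00 cuts weekly units
# from 1000 to 880. |elasticity| > 1 means demand is price-elastic,
# so the hike likely reduces total revenue.
print(f"elasticity = {arc_elasticity(10.0, 11.0, 1000, 880):.2f}")  # -1.34
```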
-
We developed a tool that takes any city's transit data (GTFS, the General Transit Feed Specification), detects and automatically corrects errors, analyzes the data, and reformats it for planning purposes. GTFS data contains structural errors, such as missing links between trips and stop times; temporal errors, such as non-monotonic arrival and departure times along a trip; and spatial inaccuracies, such as route shapes that do not align with their associated stops. It is also structured for analysis rather than design. We call this new data structure TPFS (Transit Planning Feed Specification), demonstrated in the tool below. Read the white paper on TPFS: https://lnkd.in/gRAHx7MV By Vahed Barzegari Bafghi Interactive-or.net #TransitData
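One of the temporal checks described above (arrival times that go backwards along a trip) can be sketched as follows. This is an illustration, not the tool's actual code; it assumes rows shaped like GTFS stop_times.txt records:

```python
def hhmmss_to_sec(t):
    """GTFS times can exceed 24:00:00 for trips that pass midnight."""
    h, m, s = map(int, t.split(":"))
    return h * 3600 + m * 60 + s

def nonmonotonic_trips(stop_times):
    """Return trip_ids whose arrival times decrease along increasing
    stop_sequence -- one of the temporal errors GTFS feeds often contain.
    stop_times: iterable of dicts with trip_id, stop_sequence,
    arrival_time keys (as in stop_times.txt)."""
    by_trip = {}
    for row in stop_times:
        by_trip.setdefault(row["trip_id"], []).append(row)
    bad = set()
    for trip_id, rows in by_trip.items():
        rows.sort(key=lambda r: int(r["stop_sequence"]))
        secs = [hhmmss_to_sec(r["arrival_time"]) for r in rows]
        if any(b < a for a, b in zip(secs, secs[1:])):
            bad.add(trip_id)
    return bad

rows = [
    {"trip_id": "T1", "stop_sequence": "1", "arrival_time": "08:00:00"},
    {"trip_id": "T1", "stop_sequence": "2", "arrival_time": "07:59:00"},  # backwards
    {"trip_id": "T2", "stop_sequence": "1", "arrival_time": "25:10:00"},  # past midnight, valid
    {"trip_id": "T2", "stop_sequence": "2", "arrival_time": "25:15:00"},
]
print(nonmonotonic_trips(rows))  # {'T1'}
```

Note the over-24-hour times: naive timestamp parsing would flag T2 as an error, which is exactly the kind of GTFS quirk a validator has to handle.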
-
🚀 Just published an in-depth article exploring how AI and predictive analytics are revolutionizing demand forecasting in supply chain management! 🌐 From my years of experience with Sony's in-car navigation and AI-based robotics to the challenges of the coffee industry at MaxiCoffee, I’ve seen firsthand how demand forecasting can make or break supply chain efficiency. 💥 With AI, companies can move beyond traditional methods and leverage vast data—like social media trends and weather forecasts—for unmatched accuracy in predicting demand 📈. This shift to AI-powered forecasting not only optimizes inventory but also boosts customer satisfaction and reduces costs. Early adopters will undoubtedly hold a competitive edge. 👇 Dive into the article for insights and real-world examples! #SupplyChain #AI #PredictiveAnalytics #DemandForecasting #DigitalTransformation #SupplyChainManagement #BabinBusinessConsulting #Innovation #Technology #ArtificialIntelligence
-
🚛💡 How do you spot revenue leaks, fix logistics delays, and keep customers happy – all through data? That’s exactly what I explored in my latest 4-page Supply Chain & Logistics Report. I wanted to go beyond dashboards and uncover insights companies can act on.

Hey 👋 #datafam I'm thrilled to share this 4-page report on supply chain and logistics. I started the project by first understanding the dataset I was working with, then drafted the project objectives, which you can check out here 🔗: https://lnkd.in/dQ_2sSUd

The report is structured into 4 pages: Sales and Demand Overview; Inventory and Production; Logistics and Delivery; and Quality Control and Efficiency. Here’s the breakdown:

📍 Page 1 – Sales & Demand
→ Identified top revenue drivers and seasonal demand shifts.
✅ Recommendation: Focus resources on high-demand products, reposition low-performers.

📍 Page 2 – Inventory & Production
→ Found stockouts in fast-movers and excess in low-demand items.
✅ Recommendation: Use forecasting + JIT practices to balance supply and demand.

📍 Page 3 – Logistics & Delivery
→ Tracked delivery delays and cost inefficiencies in certain routes.
✅ Recommendation: Optimize routes, renegotiate carrier costs, and use hybrid shipping.

📍 Page 4 – Quality & Efficiency
→ Calculated hidden revenue loss from defective products.
✅ Recommendation: Improve early-stage quality checks and automate inspections.

💡 Why this matters: These aren’t just numbers. They’re business decisions waiting to be made – cutting costs, saving time, and boosting customer trust.

👉 If you’re in supply chain, logistics, or retail, you’ll recognize these challenges. This is how data analytics transforms them into growth opportunities.

Tools: Excel, Power Query, DAX, Power Pivot

#DataAnalytics #BusinessIntelligence #Supplychain #Logistics #Dataviz
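The Page 4 hidden revenue-loss calculation can be sketched in a couple of lines; the defect rate, volume, and price below are illustrative assumptions, not the report's actual figures:

```python
def defect_revenue_loss(units_produced, defect_rate, unit_price, rework_cost=0.0):
    """Hidden revenue loss from defects: units that can't be sold at
    full price, plus any rework cost already spent on them."""
    defective_units = units_produced * defect_rate
    return defective_units * (unit_price + rework_cost)

# Illustrative: 50,000 units, 2% defect rate, $25 average selling price
print(f"${defect_revenue_loss(50_000, 0.02, 25.0):,.0f}")  # $25,000
```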
-
Sales and Operations Planning is my favorite domain for data analysts. Here is why analytical skills are in high demand in S&OP teams.

Sales and Operations Planning (S&OP) is all about aligning the company’s sales, marketing, finance, and supply chain functions to ensure that demand and supply are balanced and business goals are met efficiently. S&OP helps organizations make better decisions by integrating data, forecasting, and cross-functional collaboration to optimize resources and meet customer needs.

𝗗𝗮𝘁𝗮 𝗔𝗻𝗮𝗹𝘆𝘀𝘁𝘀 𝗧𝗮𝘀𝗸𝘀 𝗶𝗻 𝗦&𝗢𝗣:

1. 𝗗𝗲𝗺𝗮𝗻𝗱 𝗙𝗼𝗿𝗲𝗰𝗮𝘀𝘁𝗶𝗻𝗴: You’ll analyze historical sales data and market trends to forecast future demand. This helps the business align its marketing campaigns, inventory, and workforce with customer needs.

2. 𝗦𝘂𝗽𝗽𝗹𝘆 𝗖𝗵𝗮𝗶𝗻 𝗢𝗽𝘁𝗶𝗺𝗶𝘇𝗮𝘁𝗶𝗼𝗻: By analyzing data across the supply chain, you’ll identify bottlenecks, inefficiencies, and opportunities to improve operations, ensuring that products are delivered on time and within budget.

3. 𝗦𝗰𝗲𝗻𝗮𝗿𝗶𝗼 𝗔𝗻𝗮𝗹𝘆𝘀𝗶𝘀: S&OP requires constant adjustment. You’ll create "what-if" scenarios to predict the impact of changes in demand, supply, or market conditions, enabling the business teams to make better decisions.

4. 𝗖𝗼𝗹𝗹𝗮𝗯𝗼𝗿𝗮𝘁𝗶𝗼𝗻: You’ll work closely with sales, marketing, finance, and operations teams to ensure everyone is aligned and working towards common goals. Your ability to translate data into recommendations for different stakeholders is important in this cross-functional collaboration.

5. 𝗣𝗲𝗿𝗳𝗼𝗿𝗺𝗮𝗻𝗰𝗲 𝗠𝗼𝗻𝗶𝘁𝗼𝗿𝗶𝗻𝗴: Tracking KPIs and metrics is the baseline for continuous improvement. You’ll provide the data-driven insights that help the business stay on track and adapt to changing conditions.

I worked on the network planning team responsible for the S&OP process of Zalando's fast-growing network and built the first steps of S&OP processes for multiple companies as a consultant at RizonX.
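Tasks 1 and 3 above can be sketched together in a few lines: a simple exponential-smoothing forecast as the baseline, then "what-if" demand-shift scenarios layered on top. The smoothing constant, sales history, and scenario multipliers below are illustrative assumptions:

```python
def exponential_smoothing(history, alpha=0.3):
    """Simple exponential smoothing: each step blends the latest actual
    with the previous forecast. Returns the one-step-ahead forecast
    after processing the full history."""
    forecast = history[0]
    for actual in history[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

# Illustrative monthly unit sales
sales = [120, 132, 128, 141, 150, 147]
base = exponential_smoothing(sales)

# What-if scenarios: demand comes in 10% below, at, or 10% above baseline
for shift in (0.9, 1.0, 1.1):
    print(f"scenario x{shift:.1f}: plan for ~{base * shift:.0f} units")
```

In a real S&OP cycle the scenario outputs would feed the supply review (capacity, inventory targets), not just a printout, but the structure is the same: one baseline forecast, several explicit demand assumptions.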
In every project, data analysts have played an important role in bridging the gaps between the different teams by providing aligned reports, dashboards, and forecasts. Does your company run a structured S&OP process utilizing data analytics to its full potential? Have you worked on S&OP teams or are considering it for your future career? ---------------- ♻️ Share if you find this post useful ➕ Follow for more daily insights on how to grow your career in the data field #dataanalytics #supplychainmanagement #salesandoperationsplanning #demandplanning #careergrowth
-
The most important competence for building a sustainable DTC strategy: Data-Driven Customer Insights. Over the last decade direct-to-consumer marketers have suffered a 15% CAGR in CPM inflation for digital #advertising, according to research by Frederic Fernandez & Associates, dramatically increasing cost per acquisition. #DTC companies hence need to understand their target consumers much better – their path-to-purchase metrics, barriers/drivers/triggers and 4Ps preferences – and design a new omnichannel acquisition strategy. In my view, it's time for DTC companies to build truly immersive and personalized customer acquisition strategies based on data-driven customer insights. Data-driven customer insights are essential in the following 5 marketing areas: 🙋 Understanding Customer Behavior: To create personalized experiences, brands need to understand their customers' behaviors, preferences, and pain points. #Data analytics enables companies to track and analyze customer interactions across all touchpoints, providing deep insights into their journey and decision-making processes. 🎯 Personalization at Scale: Leveraging customer data allows brands to segment their audience and deliver tailored content, offers, and recommendations. This level of #personalization can significantly enhance customer satisfaction and loyalty, as consumers are more likely to engage with content that is relevant to their needs and interests. 📢 Optimizing Marketing Efforts: Data insights help brands to optimize their #marketing strategies and campaigns. By analyzing which tactics are most effective, companies can allocate resources more efficiently and improve their return on investment. ❤️ Enhancing Customer Engagement: Real-time data analysis enables brands to engage with customers at the right moment with the right message. This timely #engagement can drive higher conversion rates and foster a stronger emotional connection with the brand.
📈 Continuous Improvement: Data-driven #insights provide a feedback loop that allows brands to continuously refine their products, services, and customer interactions. This iterative process helps in adapting to changing customer expectations and market trends. By investing in data collection, advanced analytics, and skilled personnel, #DTC companies can create truly immersive and personalized customer experiences that drive engagement and loyalty.
-
Which heatmap methodology makes more sense 3x3 or 5x5? RAW@AI: Great question—choosing between a 3x3 or a 5x5 heatmap is something risk managers often debate. Let's first explore your context briefly: Are you using heatmaps because stakeholders or auditors expect something visual and colorful once a year, or are your executives actively making decisions based on these outputs? If it's the former—simply providing a colorful visual to tick an audit or compliance box—either 3x3 or 5x5 can serve that purpose, though a 3x3 is typically simpler and quicker for stakeholders to grasp. If your risk assessment is used for genuine strategic decisions, however, neither approach is ideal. Here's why: The core issue with qualitative matrices like 3x3 or 5x5 heatmaps is that they suffer from subjectivity and inconsistency. Different people interpret "high" or "medium" differently, and there's little meaningful difference between a risk rated as "4" versus one rated as "3". In practice, these ratings seldom translate directly into actionable insights or inform critical decisions clearly. A better approach—and here's where the evolution of your risk management practice comes into play—is to gradually shift towards more quantitative methods that directly integrate into decision-making. For example, rather than trying to fit a complex reality into arbitrary categories, you could use a decision tree or Monte Carlo simulation to quantitatively evaluate actual outcomes and their probabilities. This approach provides clearer insight into how uncertainties directly affect your objectives and shows decision-makers the real-world implications of their choices. Consider this practical example: Imagine you're evaluating the risk of operational downtime in your supply chain. Rather than rating it as "medium likelihood, high impact" on a heatmap, you could model potential downtime scenarios using Monte Carlo simulations. 
You'd quantify how often downtime might occur, how long it would last, and what its financial impact would be. Decision-makers then receive clear, numeric outputs such as "there's a 40% chance annual losses will exceed $500,000 under our current maintenance schedule." That kind of insight directly informs whether investing more in preventive maintenance is justified. I recall from our previous exchanges that you mentioned the importance of clearly communicating risks to executives and stakeholders. Decision trees, tornado diagrams, and simulations don't just provide clarity—they communicate risk information in the language that executives speak: dollars, timeline impacts, and strategic trade-offs. But I know switching entirely overnight might be challenging. So, perhaps consider a hybrid approach: continue briefly using your heatmap (3x3 for simplicity), while gradually introducing these more quantitative methods on a key project or decision. Over time, stakeholders will start experiencing firsthand the value of more precise and actionable data.
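The downtime example above can be sketched as a small Monte Carlo simulation. The sketch below is illustrative, not a calibrated model: it assumes downtime events arrive as a Poisson process, outage durations are exponential, and the event rate, mean duration, and cost per hour are made-up parameters.

```python
import random

random.seed(0)

def downtime_loss_simulation(n_trials=50_000,
                             events_per_year=2.0,      # assumed mean events/yr
                             mean_hours=18.0,          # assumed mean outage length
                             cost_per_hour=15_000.0):  # assumed lost margin + penalties
    """Monte Carlo of annual downtime losses: Poisson event counts
    (simulated via exponential inter-arrival times), exponential
    outage durations, constant cost per hour of downtime."""
    losses = []
    for _ in range(n_trials):
        # count events whose arrival times fall within one year
        n_events, t = 0, random.expovariate(events_per_year)
        while t < 1.0:
            n_events += 1
            t += random.expovariate(events_per_year)
        loss = sum(random.expovariate(1 / mean_hours) * cost_per_hour
                   for _ in range(n_events))
        losses.append(loss)
    return losses

losses = downtime_loss_simulation()
threshold = 500_000
p_exceed = sum(l > threshold for l in losses) / len(losses)
print(f"P(annual loss > ${threshold:,}) = {p_exceed:.1%}")
print(f"mean annual loss = ${sum(losses) / len(losses):,.0f}")
```

The output is exactly the kind of statement described above ("there's an X% chance annual losses will exceed $500,000"), and rerunning it with a lower event rate or shorter mean outage quantifies what a preventive-maintenance investment would have to achieve to pay for itself.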