Time Series Forecasting Models


Summary

Time series forecasting models are techniques used to predict future values based on past data that is ordered in time, helping businesses and researchers anticipate trends and patterns. Recent advancements include neural networks, transformer architectures, and models that combine causal reasoning for deeper insights and more flexible analysis.

  • Explore neural networks: Try using models like LSTMs or graph neural networks to capture complex and longer-range temporal patterns that traditional methods may miss.
  • Balance accuracy and speed: Consider transformer-based or sequence models such as Chronos and Mamba for scenarios where fast, scalable forecasting is important.
  • Integrate causal reasoning: Use causal generative models like DoFlow when you need to answer “what if” questions and understand the impact of interventions in time series data.
  • Kristen Kehrer

    Mavens of Data Podcast Host, [in]structor, Co-Author of Machine Learning Upgrade

    103,130 followers

    Modeling something like time series goes past just throwing features in a model. In the world of time series data, each observation is associated with a specific time point, and part of our goal is to harness the power of temporal dependencies. Enter autoregression and lagging - concepts that tap into the correlation between current and past observations to make forecasts. At its core, autoregression involves modeling a time series as a function of its previous values: the current value relies on its historical counterparts. To dive a bit deeper, we use lagged values as features to predict the next data point. For instance, in a simple autoregressive model of order 1 (AR(1)), we predict the current value as the previous value multiplied by a coefficient; the coefficient determines how strongly the value one period back influences the present one. One popular approach that builds on autoregression is the ARIMA (AutoRegressive Integrated Moving Average) model. ARIMA is a powerful time series forecasting method that combines autoregression, differencing, and moving average components. It's particularly effective for data with trends and seasonality, and it can be fine-tuned through the order of its autoregressive, differencing, and moving average terms to achieve accurate predictions. When I was building ARIMAs for econometric time series forecasting, in addition to the autoregressive lags on the series itself, I was also taught to lag the individual economic variables. If I were building a model for energy consumption of residential homes, the number of housing permits issued each month would be a relevant variable. But if a ton of housing permits are issued in January, you won't see the actual effect until later, when the houses are built and people are actually consuming energy! That variable needed to be lagged by several months.
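    The AR(1) setup described above can be sketched in a few lines. The synthetic series and the simple least-squares fit below are illustrative, not from the post:

    ```python
    # AR(1) sketch: predict x[t] from x[t-1] via one least-squares coefficient.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate an AR(1) series: x[t] = 0.7 * x[t-1] + noise.
    true_phi = 0.7
    x = np.zeros(500)
    for t in range(1, len(x)):
        x[t] = true_phi * x[t - 1] + rng.normal(scale=0.5)

    # Lag the series: the previous value becomes the feature.
    x_prev, x_curr = x[:-1], x[1:]

    # Least-squares estimate of the AR(1) coefficient.
    phi_hat = (x_prev @ x_curr) / (x_prev @ x_prev)

    # One-step-ahead forecast from the last observed value.
    forecast = phi_hat * x[-1]
    print(round(phi_hat, 2))  # estimate lands near the true 0.7
    ```

    Higher-order models (and full ARIMA) add more lags, differencing, and moving-average terms on top of this same idea.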
Another innovative strategy to enhance time series forecasting is the use of neural networks, particularly Recurrent Neural Networks (RNNs) or Long Short-Term Memory (LSTM) networks. RNNs and LSTMs are designed to handle sequential data like time series: they can learn complex patterns and long-term dependencies within the data, making them powerful tools for autoregressive forecasting. The networks are fed past time steps as inputs to predict future values. I used lagging with neural networks too! When I built an hourly model to forecast electric energy consumption, I actually built 24 individual models, one for each hour, each lagged on the previous hour. The energy consumption and weather of the previous hour were very important in predicting what would happen in the next forecasting period. (This model was actually used to determine where electricity should be shifted during peak load times.) Happy forecasting!
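The hour-to-hour lagging described above might look like this as a feature-building step (the consumption and temperature numbers, and the feature names, are invented for illustration):

```python
# Build lagged features: hour t-1's consumption and weather predict hour t.
consumption = [10.2, 11.5, 13.1, 15.4, 14.9]   # kWh per hour (made up)
temperature = [18.0, 19.5, 21.0, 23.5, 22.0]   # degrees C (made up)

# Shift by one step: feature row t holds the previous hour's values.
features = [
    {"consumption_lag1": consumption[t - 1], "temp_lag1": temperature[t - 1]}
    for t in range(1, len(consumption))
]
targets = consumption[1:]

print(features[0])  # {'consumption_lag1': 10.2, 'temp_lag1': 18.0}
```

These feature/target pairs could then feed any downstream model, from linear regression to an LSTM.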

  • Arjun Jain

    Co-Creating Tomorrow’s AI | Research-as-a-Service | Founder, Fast Code AI | Dad to 8-year-old twins

    34,700 followers

    𝗪𝗵𝗮𝘁 𝗶𝗳 𝘆𝗼𝘂 𝗰𝗼𝘂𝗹𝗱 𝗳𝗼𝗿𝗲𝗰𝗮𝘀𝘁 𝘁𝗵𝗲 𝗳𝘂𝘁𝘂𝗿𝗲 𝗯𝘆 𝘁𝗿𝗲𝗮𝘁𝗶𝗻𝗴 𝗻𝘂𝗺𝗯𝗲𝗿𝘀 𝗹𝗶𝗸𝗲 𝘄𝗼𝗿𝗱𝘀? That's exactly what Amazon did with Chronos. They took T5 (yes, the language model) and taught it to "read" time series data. The trick? Tokenize continuous values into ~4096 discrete bins. Suddenly, forecasting becomes next-token prediction. 𝗘𝘃𝗼𝗹𝘂𝘁𝗶𝗼𝗻: 📍 Chronos (Feb 2024) — Original release 📍 Chronos-Bolt (Nov 2024) — ~250× faster inference 📍 Chronos 2.0 (Oct 2025) — Multivariate support 𝗛𝗼𝘄 𝗶𝘁 𝘄𝗼𝗿𝗸𝘀: 🔹 𝘛5 𝘌𝘯𝘤𝘰𝘥𝘦𝘳-𝘋𝘦𝘤𝘰𝘥𝘦𝘳 — Bidirectional encoder captures dependencies; autoregressive decoder generates multi-step forecasts 🔹 𝘛𝘰𝘬𝘦𝘯𝘪𝘻𝘢𝘵𝘪𝘰𝘯 — Mean-scale values → quantize into bins → regression becomes classification. Now you can use all the LLM tricks. 🔹 𝘗𝘳𝘰𝘣𝘢𝘣𝘪𝘭𝘪𝘴𝘵𝘪𝘤 𝘖𝘶𝘵𝘱𝘶𝘵 — Outputs a distribution over bins per timestep. Sample → get prediction intervals with calibrated uncertainty. 🔹 𝘊𝘩𝘳𝘰𝘯𝘰𝘴-𝘉𝘰𝘭𝘵 — One-shot decoding (all future timesteps in one forward pass). ~250× speedup + ~5% accuracy gain via knowledge distillation. 𝗣𝗿𝗲𝘁𝗿𝗮𝗶𝗻𝗶𝗻𝗴: • Large corpus: energy, traffic, economics, weather, web traffic • Heavy augmentation: scaling, jittering, warping 𝗪𝗵𝘆 𝗶𝘁 𝗺𝗮𝘁𝘁𝗲𝗿𝘀: ✅ Bridges time series & NLP—use mature LLM infrastructure ✅ Native probabilistic forecasting ✅ Chronos 2.0: multivariate + cross-variable learning ✅ Multiple sizes (Mini → Large) ✅ Apache-2.0 license 𝗣𝗲𝗿𝘀𝗼𝗻𝗮𝗹 𝗻𝗼𝘁𝗲: Syama Sundar Rangapuram is one of the co-authors on this work. He taught me ML during my grad school days and helped me out more than I can say. Seeing his work shape the field like this — super proud. 🙌 The LLM playbook works for time series. Who knew? #TimeSeries #MachineLearning #Forecasting #AI #FoundationModels
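The tokenization step can be sketched roughly like this (a simplified illustration of the mean-scale-then-quantize idea; the bin count, edges, and scaling here are not the exact Chronos recipe):

```python
# Turn a continuous series into discrete "tokens" so forecasting becomes
# next-token classification, as in the Chronos approach described above.
import numpy as np

def tokenize(series, n_bins=16, low=-5.0, high=5.0):
    series = np.asarray(series, dtype=float)
    scale = np.abs(series).mean() or 1.0        # mean absolute scaling
    scaled = series / scale
    edges = np.linspace(low, high, n_bins - 1)  # uniform bin edges
    tokens = np.digitize(scaled, edges)         # value -> bin index
    return tokens, scale

def detokenize(tokens, scale, n_bins=16, low=-5.0, high=5.0):
    # Map each token back to its bin center and undo the scaling.
    edges = np.linspace(low, high, n_bins - 1)
    centers = np.concatenate([[low], (edges[:-1] + edges[1:]) / 2, [high]])
    return centers[tokens] * scale

tokens, scale = tokenize([10.0, 12.0, 9.0, 11.0])
print(tokens)  # e.g. [9 9 9 9] -- nearby values share a token
```

A language model trained on such token sequences outputs a distribution over bins per timestep, which is where the calibrated prediction intervals come from.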

  • Sione Palu

    Machine Learning Applied Research

    37,872 followers

    Predictive learning on time series, crucial for applications like traffic and weather forecasting, often utilizes spatio-temporal graphs (STGs) to represent complex relationships. Spatio-temporal graph neural networks (STGNNs) effectively capture spatial and temporal dependencies within these graphs, typically by combining graph neural networks (GNNs) for spatial information with recurrent units or convolutions for temporal data. However, STGNNs struggle with long-range temporal dependencies. While self-attention mechanisms (like those in transformers) can address this, they introduce significant computational overhead. Despite recent advances in modeling complex STG dynamics, effectively capturing long-range spatio-temporal dependencies continues to be a major obstacle, hindering substantial performance improvements. Structured state space sequence (S4) models offer an alternative with linear scaling, but traditionally lack input-dependent information selection. The Mamba model, a recent S4 variant, addresses this by incorporating a selection mechanism, demonstrating superior performance over transformers in various sequence data types and showing promise for integration with graph frameworks. Essentially, graphs are used to represent the data, and the challenge lies in efficiently capturing both short and long-term temporal relationships within these graphs. Solutions range from improved attention mechanisms to novel sequence models like Mamba, each attempting to balance performance and computational cost. Building on Mamba's success in capturing long-range dependencies, the authors of [1] introduce SpoT-Mamba, a novel STG forecasting framework designed to overcome the aforementioned challenges. SpoT-Mamba generates node embeddings by scanning various node-specific walk sequences. Based on these node embeddings, it conducts temporal scans to capture long-range spatio-temporal dependencies. 
Experimental results on a real-world traffic forecasting dataset demonstrate that SpoT-Mamba's accuracy is comparable to or better than that of the baselines (STAEformer, PDformer, STID, STNorm, MTGNN, STGCN, DCRNN, etc.). Links to the paper [1] and the #Python GitHub repo [2] are posted in the comments.
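The spatial half of the STGNN recipe described above can be illustrated with a toy graph-convolution step (the 3-node road network and traffic values are invented; this is not the SpoT-Mamba architecture):

```python
# One graph-convolution step: each node's signal is averaged with its
# neighbors', capturing spatial dependence at a single timestep.
import numpy as np

# Adjacency of a 3-node road network, with self-loops on the diagonal.
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
A_norm = A / A.sum(axis=1, keepdims=True)  # row-normalize: average over neighborhood

# Traffic speed at each node for the latest timestep.
x = np.array([60.0, 30.0, 45.0])

# Spatial aggregation: each node now reflects its neighborhood.
x_spatial = A_norm @ x
print(x_spatial)  # each entry is the mean of a node and its neighbors
```

An STGNN then stacks a temporal module (recurrence, convolution, attention, or a state space model like Mamba) on top of repeated spatial steps like this one.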

  • Igor Shuryak, MD, PhD

    Quantitative Radiation Biologist/Oncologist | Machine Learning & Causal Inference Practitioner | Columbia University Professor | 140+ Publications | Advancing Cancer Treatment Through AI & Mathematical Modeling

    4,794 followers

    🔬 DoFlow: When Time-Series Forecasting Meets Causal Reasoning I recently read an interesting paper describing a flow-based generative model that unifies observational prediction with interventional and counterfactual forecasting for time-series data: "DoFlow: Causal Generative Flows for Interventional and Counterfactual Time-Series Prediction" https://lnkd.in/eSdbyGRA * I think this method has lots of potential applications in healthcare and biomedical research! 🎯 The Problem Most time series forecasting models are purely observational - they predict correlations but cannot answer causal questions like: Interventional: "What if we change the treatment dose now?" Counterfactual: "Would a different past treatment have prevented this outcome?" These are exactly the questions clinicians often need answered for precision medicine. 💡 The Innovation DoFlow combines three key components: 1. Continuous Normalizing Flows (CNFs) - Neural ODEs that create invertible mappings between noise and data distributions, enabling likelihood computation 2. Causal DAG Structure - Each variable has its own flow conditioned only on its causal parents, respecting temporal dependencies 3. RNN History Encoding - Captures temporal context for conditioning the flows Invertibility is what makes counterfactuals possible: the model encodes factual observations into latent noise, then decodes under counterfactual conditions to generate "what if" scenarios. 🏥 Cancer Treatment Applications As an example, DoFlow was evaluated on the Bica et al. cancer treatment data set.
🔍 Why It Matters ✅ Generates full system trajectories, not just point estimates ✅ Provides explicit likelihoods for anomaly detection ✅ Handles time-varying treatments in complex causal structures ✅ Scales to real-world multivariate systems ⚠️ DoFlow requires a pre-specified causal DAG - in clinical settings this relies on existing domain knowledge #CausalInference #MachineLearning #PrecisionMedicine #TimeSeriesForecasting #Oncology #HealthcareAI #NeuralODEs #ContinuousNormalizingFlows #DeepLearning #CancerResearch #BiomedicalML #ClinicalAI #CounterfactualReasoning #DynamicTreatmentRegimens #ComputationalBiology
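The encode-then-decode counterfactual trick can be shown with a drastically simplified stand-in for DoFlow's CNFs: a single invertible affine map per variable (the structural equation, parameters, and numbers below are all invented for illustration):

```python
# Toy structural equation: x = slope * parent + sigma * z, invertible in z,
# so we can recover the factual noise and replay it under a new treatment.
def decode(parent, z, slope=2.0, sigma=0.5):
    # Generate an outcome from its causal parent and latent noise.
    return slope * parent + sigma * z

def encode(parent, x, slope=2.0, sigma=0.5):
    # Invert the flow: recover the latent noise behind an observation.
    return (x - slope * parent) / sigma

# Factual world: treatment dose 1.0 produced outcome 2.3.
z_factual = encode(parent=1.0, x=2.3)

# Counterfactual: same patient (same noise), but dose 1.5 instead.
x_cf = decode(parent=1.5, z=z_factual)
print(x_cf)  # ≈ 3.3
```

DoFlow does this with learned, history-conditioned flows per node of the causal DAG, but the abduction-action-prediction pattern is the same.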

  • Thorsten Wuest

    Professor | Director | Author | Learner | Consultant

    6,853 followers

    Check out the brand-new paper "time-series forecasting in #smartmanufacturing systems: An experimental evaluation of the state-of-the-art algorithms" published today in RCIM (with Mojtaba A. Farahani, PhD, Fadi El Kalach, Austin Harper, M. R. McCormick, Ramy Harik, & Thorsten Wuest). This is the third paper in our exploration of time-series applications in manufacturing and thus adds an important piece to the puzzle! The paper reports on the largest empirical study of #timeseries #forecasting algorithms in the #manufacturing domain to date and includes different scenarios to evaluate models using combinations of two problem categories (univariate & multivariate) and two forecasting horizons (short- & long-term). The results show that transformer and MLP-based architectures demonstrated the best performance across different scenarios. For univariate TSF problems, PatchTST emerged as the most robust algorithm, while for multivariate TSF problems, MLP-based architectures like N-HITS and TiDE achieved the best results. The paper is available #openaccess here: https://lnkd.in/eiJ9m_2J And we also prepared a neat #podcast version of the paper - so you have no excuse not to engage with it - be it in the car, at the gym, or on the beach. Check out this ~30min podcast version here: https://lnkd.in/eDursH4Z This would not be possible without the support of the National Science Foundation (NSF) and (which cannot be highlighted strongly enough!) the essential work of many dedicated researchers making data sets publicly available. Without their hard work and generosity, this study would not have been possible. We encourage everybody to follow their lead and make research data available - and a big THANK YOU to all that already do! Eamonn Keogh NASA - National Aeronautics and Space Administration UCI Machine Learning Repository et al. Jeff Winter CESMII McNAIR Center for Aerospace Innovation and Research Dean Bartles Jim Davis Knudsen Institute

  • Christian Martinez

    Finance Transformation Senior Manager at Kraft Heinz | AI in Finance Professor | Conference Speaker | LinkedIn Learning Instructor

    62,668 followers

    Here are 5 machine learning algorithms used for FP&A and #finance time series analysis: ✅ ARIMA/SARIMA: Forecast future revenues and expenses by identifying trends and seasonality. ✅ LSTM: Analyze complex patterns in cash flow or sales data to improve financial planning. ✅ Prophet: Handle unpredictable markets and still make reliable forecasts. ✅ GARCH: Assess and predict market volatility to make more informed investment or budgeting decisions. More detail below ↓ 1. ARIMA (Auto-Regressive Integrated Moving Average) ARIMA helps predict future values by analyzing past data to identify patterns like trends or seasonality. For example, you can use ARIMA to forecast next year’s monthly revenue by recognizing historical trends and seasonal variations, such as higher sales during holiday seasons. 2. LSTM (Long Short-Term Memory) Networks LSTM is an artificial intelligence technique that learns from past data and remembers long-term patterns. It can be used in FP&A to forecast cash flow by identifying recurring inflows and outflows over time, like specific project payments or seasonal cash patterns. 3. SARIMA (Seasonal ARIMA) SARIMA extends ARIMA by incorporating seasonality, making it ideal for forecasting data with regular patterns. For example, you can predict quarterly expenses more accurately if certain quarters have consistently higher costs due to contracts or seasonal demand. 4. Prophet Prophet, developed by Facebook, handles missing data and outliers well, making it useful for complex datasets. To get the code and an example for implementing it, go here: https://lnkd.in/eJKcHzqU You could use Prophet to forecast annual sales even when your data is incomplete or affected by irregular events like economic shifts. 5. GARCH (Generalized Autoregressive Conditional Heteroskedasticity) GARCH models volatility and is great for predicting how much financial data varies over time.
You can apply it in FP&A to assess and predict the volatility of stock prices in your investment portfolio, helping in risk management and budgeting.
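The GARCH(1,1) recursion behind that use case fits in a few lines (the omega/alpha/beta parameters and the return series below are invented for illustration; in practice they are estimated from historical returns):

```python
# GARCH(1,1): today's conditional variance depends on yesterday's squared
# return and yesterday's variance:
#   sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]
def garch_variance(returns, omega=2e-6, alpha=0.1, beta=0.85):
    sigma2 = [omega / (1 - alpha - beta)]  # start at the long-run variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r ** 2 + beta * sigma2[-1])
    return sigma2

# Daily portfolio returns: a calm stretch, then a 5% shock.
returns = [0.001, -0.002, 0.05, 0.001, 0.000]
vols = garch_variance(returns)

# The conditional variance jumps right after the shock, then decays back --
# the volatility clustering that GARCH is designed to capture.
print(vols[3] > vols[2])  # True
```

Reading the fitted variances forward gives the risk forecast you would feed into budgeting or portfolio decisions.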

  • Niels Rogge

    Machine Learning Engineer at ML6 & Hugging Face

    65,828 followers

    TimesFM, the Time Series Foundation Model by Google, is now available in the Hugging Face Transformers library! TimesFM is very similar to a GPT model like ChatGPT, except it is trained to predict the next values in a time series, as opposed to the next text token. The model is pre-trained on a large collection of time series collected by Google. This involves both synthetic data as well as 100 billion actual time points from various domains like Google Trends and Wikipedia page views. After pre-training, we can use it to forecast new values of a given time series, like temperature or sales. This is called "zero-shot prediction" since we don't update the model's weights; we just use it out of the box for making predictions on new, unseen data, much like prompting a large language model (LLM) or vision-language model (VLM). Interestingly, the authors claim that it outperforms models like DeepAR which are explicitly trained on time series. The blog post linked below is a great explainer for understanding how TimesFM works. The idea is to "patchify" a time series into smaller pieces which serve as the "tokens". We've uploaded notebooks on both inference as well as fine-tuning on custom data. Resources: - blog post: https://lnkd.in/eGDmyhxj - docs: https://lnkd.in/ePDmePDR - notebooks: https://lnkd.in/eF-KYBic
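The "patchify" idea mentioned above amounts to slicing the series into fixed-length windows that play the role of tokens (the patch length here is arbitrary; TimesFM's actual patch size and preprocessing differ):

```python
# Slice a 1-D series into fixed-length patches that act as "tokens".
import numpy as np

def patchify(series, patch_len):
    series = np.asarray(series, dtype=float)
    n_patches = len(series) // patch_len            # drop any ragged tail
    return series[: n_patches * patch_len].reshape(n_patches, patch_len)

series = [1, 2, 3, 4, 5, 6, 7, 8, 9]
patches = patchify(series, patch_len=4)
print(patches)  # [[1. 2. 3. 4.]
                #  [5. 6. 7. 8.]]
```

Each patch is then embedded and fed to the transformer exactly as a text token embedding would be.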
