I’ve been using n8n to connect my Learning Record Store (LRS) with real-world user interactions. Right now, when an xAPI statement (learner interaction data) comes in, a workflow scans the LRS for specific data and triggers a robot to dance. Next, I’m layering in Slack messages that respond to specific learner interaction data. It’s a simple way to demonstrate a bigger idea: when we collect granular xAPI data from learning in the flow of work, we can actually do something with it. For example, a customer service simulation could be delivered directly in Slack as a link or interactive chat. The rep completes the scenario right where they work. Each response, such as how they phrase answers, how quickly they respond, and whether they resolve the issue, sends detailed xAPI data to your LRS. That data does not stop there. It could connect with performance data from real customer calls. If those calls show that a rep struggles with empathy or tone, the system can automatically generate a custom simulation to practice that specific skill. After completing it, the rep receives personalized feedback or follow-up practice in Slack based on what the system detected. This could be done in many different ways, such as using GenAI to create adaptive practice, or adding an agent with memory that connects chat data, call insights, and internal systems to deliver coaching that feels timely and contextual. This moves learning from a single event to a continuous, adaptive experience that fits naturally into how people already work. #xAPI #learningdesign #learningintheflowofwork #LRS #GenAI #n8n #instructionaldesign #learninganddevelopment #futureoflearning
Using Data to Create Tailored Learning Experiences
Summary
Using data to create tailored learning experiences means collecting and analyzing information about how people learn, then using those insights to deliver personalized content and practice that fits each learner’s needs. This approach makes learning more relevant and engaging by adapting to individual skills, preferences, and goals.
- Track learner interactions: Collect detailed data on how people engage with training, such as quizzes, discussions, and hands-on activities, to better understand their strengths and areas for growth.
- Personalize learning paths: Use the collected data to offer custom content, feedback, and practice scenarios that address each person’s unique challenges and preferred learning styles.
- Connect learning to outcomes: Link personalized training data to job performance and business results to see how tailored experiences help people grow and contribute to organizational goals.
-
For the past decade, many #learning organizations dreamt of adopting the Netflix model. But it was just a dream, and frankly, that model would have been a bad fit in the corporate learning space anyway. However, something interesting just happened that I believe tech-minded Learning Leaders and LXM/LXP solution providers should pay attention to. Netflix just published a really interesting Tech Blog article on their foundation model for personalized recommendations. It offers valuable insights that can inform the redesign of learner experiences.

Key Takeaways:

Centralized Learner Modeling: Netflix transitioned from multiple specialized models to a unified foundation model, centralizing user preference learning.
* Application in Learning: Develop a centralized learner model that aggregates data from various learning activities, enabling consistent personalization across different courses and modules.

Data-Centric Approach: Emphasizing high-quality, large-scale data over intricate feature engineering, Netflix’s model benefits from end-to-end learning.
* Application in Learning: Prioritize collecting comprehensive learner interaction data (e.g., quiz attempts, forum participation) to inform adaptive learning paths and content recommendations.

Interaction Tokenization: Netflix tokenizes user interactions to capture meaningful sequences, similar to language models.
* Application in Learning: Implement tokenization of learning activities to identify patterns (e.g., common misconceptions, preferred learning sequences) that can guide personalized content delivery.

Scalable Personalization: The foundation model allows for scalable personalization across Netflix’s vast user base.
* Application in Learning: Design learning systems that can scale personalization efforts, accommodating diverse learner profiles and adapting to evolving educational needs.

Interaction tokenization sounded very similar to an LRS (Learning Record Store).
Interaction tokenization is the process of converting user activities (e.g., watching a video, taking a quiz, clicking “next,” participating in a forum, pausing content, revisiting materials) into “tokens”: discrete data units. These tokens form sequences that can be analyzed like language to model and predict learner behavior or preferences. It’s like treating a learner’s journey as a sentence, where:

Each “word” is an interaction (e.g., “view_video,” “attempt_quiz,” “fail_question_2”).
The “sentence” is a learning path.
The model learns from many such “sentences” to predict and personalize future experiences.

The LRS is the source: it captures and stores granular learning data in a structured format using xAPI statements. Tokenization is the next layer: once you have data in the LRS, it transforms these raw interactions into meaningful sequences for:

Personalization
Predictive analytics
Content recommendation
Learner path modeling (like Netflix does)

Really interesting stuff. Give the article a read; link in comments.
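The “journey as a sentence” idea can be made concrete with a small sketch: flatten xAPI-style records from an LRS into per-learner token sequences. This is an illustration under assumed field names and example URIs, not a real LRS schema; production xAPI statements nest actors, verbs, and objects more deeply.

```python
# Sketch of interaction tokenization: simplified LRS records become
# per-learner token sequences ("sentences"). Field names, verbs, and
# activity URIs here are illustrative assumptions.

from collections import defaultdict

def tokenize(statement: dict) -> str:
    """Turn one interaction into a discrete token like 'attempted_quiz-1'."""
    verb = statement["verb"].rsplit("/", 1)[-1]          # e.g. 'attempted'
    activity = statement["activity"].rsplit("/", 1)[-1]  # e.g. 'quiz-1'
    return f"{verb}_{activity}"

def build_sequences(statements: list[dict]) -> dict[str, list[str]]:
    """Group tokens into per-learner sequences, ordered by timestamp."""
    sequences = defaultdict(list)
    for s in sorted(statements, key=lambda s: s["timestamp"]):
        sequences[s["actor"]].append(tokenize(s))
    return dict(sequences)

statements = [
    {"actor": "a@x.com", "verb": "https://ex.com/verbs/viewed",
     "activity": "https://ex.com/activities/video-1", "timestamp": 1},
    {"actor": "a@x.com", "verb": "https://ex.com/verbs/attempted",
     "activity": "https://ex.com/activities/quiz-1", "timestamp": 2},
    {"actor": "a@x.com", "verb": "https://ex.com/verbs/failed",
     "activity": "https://ex.com/activities/question-2", "timestamp": 3},
]
print(build_sequences(statements))
# -> {'a@x.com': ['viewed_video-1', 'attempted_quiz-1', 'failed_question-2']}
```

Sequences like these are exactly what a sequence model (in the Netflix sense) would train on to predict a learner’s next step or flag a common failure pattern.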
-
I was reviewing quarterly reports with a client last month when they asked me a question that stopped me in my tracks: "Scott, we have all this learning data, but I still don't know which programs are actually improving performance." After 12 years as CEO of Continu, I've seen firsthand how organizations struggle with this exact problem. You're collecting mountains of learning data, but traditional analytics only tell you what happened - not why it matters. Here's what we've learned working with thousands of organizations: The real value isn't in completion rates or assessment scores. It's in the connections between those data points that remain invisible without the power of tools like AI. One of our financial services clients was tracking 14 different metrics across their onboarding program. Despite all that data, they couldn't explain why certain regions consistently outperformed others. When we implemented our AI analytics engine, the answer emerged within days: specific learning sequences created knowledge gaps that weren't visible in their traditional reports. This isn't just about better reporting - it's about actionable intelligence: - AI identifies which learning experiences actually drive on-the-job performance - It spots engagement patterns before completion rates drop - It recognizes content effectiveness across different learning styles Most importantly, it connects learning directly to business outcomes - the holy grail for any L&D leader trying to demonstrate ROI. What's your biggest challenge with learning data? Are you getting the insights you need or just more reports to review? #LearningAnalytics #AIinELearning #WorkforceDevelopment #DataDrivenLearning
-
I know everyone is tired of hearing about AI generated content for L&D, but can we talk about it for a moment? We’re on the verge of a massive shift in how learning content is created and delivered - enabling a personalized learning experience. As I mentioned in a post earlier this week, headless applications are paving the way for agents to create custom content that meets associates where they are in their learning journey. I had multiple conversations at DevLearn with vendors who are laying the foundation to make this possible. With Generative AI capabilities, these tools can tailor content based on identified skill and knowledge gaps, attitudes, and persona information, deliver it in preferred channels, and even enhance accessibility while reducing bias. Add in the ability to create custom learning paths that guide associates through exactly the right sequence of content, and the potential impact becomes even greater. But scaling this level of personalization raises questions 🤔: How do we manage and track the countless individual assets being created? How do we ensure the business impact of personalized content is measurable? Traditional LMS platforms may not be equipped for this scale, which raises the question: Do we need entirely new systems to handle personalized learning asset management? How do we ensure proper segmentation and representation? And, perhaps most importantly, should all personalized content even be tracked? These are critical challenges we must address as an industry. Finally, personalized media experiences could further enhance engagement. Tools like Notebook LM already enable the creation of tailored podcasts based on specific information. 
Listen to my podcast on Personalized Learning with GenAI: (https://lnkd.in/eu84XvBG) Imagine combining this type of experience with custom learning paths and chatbot integration to create an ecosystem that motivates associates, supports performance, and drives learning outcomes—all delivered in the channels they prefer. We’re entering an era where scalable, tailored learning solutions—powered by AI—could redefine the L&D landscape. I’m excited to see how these tools evolve and how organizations adapt. What’s your take? Are you starting to see these trends surface in your organization?
-
#Smartworkforce The Future of Learning Is Personalized I still remember a former employee from our marketing team: brilliant and creative, but struggling through our standard Excel training while excelling at everything visual and strategic. She'd sit in those generic workshops, checking her phone, clearly disengaged. It broke my heart to see her potential being wasted on content that didn't speak to how she actually learns. That was a few years back. Today, she spearheads impactful digital marketing initiatives at a startup, having flourished through a personalized learning journey that catered to her unique style - visual learner, strategic thinker, and hands-on explorer. This is why personalized learning isn't just a nice-to-have; it's everything. Think about it: We've accepted for decades that learning happens in neat, identical packages. Same content, same pace, same approach. We've squeezed diverse, brilliant minds into cookie-cutter molds and wondered why engagement plummeted and potential remained untapped. But here's what I've witnessed when we flip that script: When an analyst discovered she learns best through peer discussions rather than solo study sessions, her confidence soared. She went from avoiding team meetings to leading data storytelling workshops. When a developer realized he needed bite-sized, mobile-friendly modules instead of hour-long seminars, he completed his leadership certification in half the expected time. His team noticed the difference immediately. When our client's customer service team got learning paths that adapted to their real challenges - not theoretical ones - customer satisfaction scores jumped 23% in six months. Data-driven personalization isn't about algorithms replacing human insight. It's about technology finally catching up to what great teachers have always known: every learner is unique, and that uniqueness is their superpower. 
Imagine walking into work knowing that your growth journey is designed specifically for you. That it recognizes you're a morning person who loves podcasts, or a night owl who thinks best with hands-on projects. That it understands you're three years into your career but new to leadership, or that you're switching industries and need confidence-building alongside skill-building. This isn't science fiction. This is happening now. And the ripple effects? They're profound. When people feel seen in their learning journey, they don't just acquire skills-they transform. They become advocates for growth, mentors to others, innovators who see possibilities instead of obstacles. The old way taught us to fit the mold. The new way recognizes that breaking the mold is where breakthrough happens. The future of learning is #personalized. The future of work is human. And when we get both right, there's no limit to what we can achieve together.
-
Before you walk into that training, teaching or lecture room, read this and your sessions will never be the same: Most people do not realize that teaching without data is like driving at night without headlights. Have you ever finished a training session and wondered, “Did they really get it?” That’s the difference between guessing — and knowing. In today’s world, data isn’t just for analysts — it’s a facilitator’s best teaching companion. Note the following ways data can completely transform how you teach, train, and inspire: ➡️ Use pre-training surveys and diagnostic quizzes to understand participants’ skill levels, expectations, and learning preferences. This will help you turn generic sessions into personalized learning experiences. ➡️ Even if you do not have a learning management system (LMS) to help you measure engagement in real time, you can use other tools like Kahoot! and Mentimeter to know who’s active, who’s struggling, and where learners drop off. With this insight, you can adapt instantly instead of waiting for post-training feedback. ➡️ Evaluate what truly works, because attendance numbers don’t tell the whole story. Data from assessments, polls, and reflections reveals which activities drive understanding, helping you refine your methods with confidence. ➡️ Use dashboards that show progress and completion rates to illustrate your progress. ➡️ Cultivate a culture of continuous learning using feedback. This is important because with each session’s feedback data, you don’t just teach better — you evolve. Every course becomes smarter, sharper, and more human-centered. Please remember: data doesn’t replace intuition — it amplifies it. Because when we teach with data, we don’t just inform minds — we transform outcomes. How do you currently use data in your teaching or training sessions? #DataAnalytics #TeachingWithData #LearningAndDevelopment #Facilitation #LMS #EducationInnovation
-
Medical education has traditionally followed a one-size-fits-all model, yet clinical competency develops differently for every trainee. A new New England Journal of Medicine perspective highlights how AI-enabled precision education can continuously assess performance, identify skill gaps early, and personalize training across medical school, residency, and lifelong learning. By aggregating learner data, AI can map learning curves, guide targeted simulation and rotations, and provide real-time coaching and feedback. The goal is more consistent training, earlier competency development, and ultimately safer patient care. Read more: https://lnkd.in/gxVmDKK2 Follow Zain Khalpey, MD, PhD, FACS for more on AI & Healthcare. #MedicalEducation #AIinHealthcare #PrecisionEducation #MedEd #AIinMedicine #DigitalHealth #FutureOfMedicine #PhysicianTraining #HealthcareInnovation #MedicalTraining #ClinicalExcellence #HealthTech #AcademicMedicine #LifelongLearning #MedicalAI #HealthcareLeadership #InnovationInMedicine #NextGenMedicine #MedTech #MedicalInnovation
-
"...Digital Personalized Learning (DPL) emerges as a promising and cost-effective alternative for math remediation. DPL leverages Artificial Intelligence (AI) and machine learning to provide students with adaptive instruction tailored to their competency levels, known as "Teaching at the Right Level" (TARL). The basic principle of TARL is to adapt instruction to match students' needs based on their prior knowledge. This adaptation enhances knowledge retention and motivation, while providing a strong foundation for future learning. Adaptive Learning is a promising mechanism to improve student skills and their perceptions about those skills, known as perceived self-efficacy, which is often associated with academic performance, especially in mathematics. DPL also offers pedagogical strategies and regular data for assessment, accessible through various devices with internet access." https://lnkd.in/dM5YBRti
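The core TARL loop the excerpt describes, matching instruction to a student's demonstrated level, can be sketched as a tiny difficulty-adjustment rule. This is a toy illustration of the principle only; the thresholds, levels, and window are invented assumptions, not taken from the cited work.

```python
# Toy sketch of a "Teaching at the Right Level" adaptation step:
# choose the next exercise difficulty from recent accuracy.
# Thresholds (lo, hi) and integer difficulty levels are illustrative.

def next_level(current: int, recent_scores: list[float],
               lo: float = 0.5, hi: float = 0.85) -> int:
    """Step difficulty up, down, or hold based on recent accuracy."""
    if not recent_scores:
        return current            # no evidence yet: stay put
    accuracy = sum(recent_scores) / len(recent_scores)
    if accuracy >= hi:
        return current + 1        # mastered: advance
    if accuracy < lo:
        return max(1, current - 1)  # struggling: remediate
    return current                # in the zone: keep practicing

print(next_level(3, [1, 1, 0.9]))    # -> 4
print(next_level(3, [0, 0.4, 0.5]))  # -> 2
```

Real DPL systems replace this rule with learned models over much richer interaction data, but the design goal is the same: keep each learner working just at the edge of their prior knowledge.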
-
One of the premises of AI for education is the opportunity to create a more engaging and customized learning experience. Today we are introducing a new research experiment, Learn Your Way, which uses generative AI to transform static educational content into an engaging, learner-driven experience. For textbook material, it generates multiple representations based on the source material - from mind maps and audio lessons to immersive text with interactive quizzes. Our recent efficacy study shows this approach can lead to improved learning outcomes on both short- and long-term recall tests. The system is grounded in learning science and powered by our pedagogy-infused family of models, LearnLM, which is now integrated directly into Gemini 2.5 Pro. Try the experience via Google Labs: https://lnkd.in/drGfTZpw Read more about the research on our blog: http://goo.gle/3KqM8i0 And in the technical paper: https://lnkd.in/dZuUeKpa
-
How are you using your program data? If your answer is "reporting" followed by silence, read on. Your program data should serve multiple purposes beyond accountability and reporting. Here are some tips to start using your program data for learning and adaptive management: 1) Assess if you are answering the correct questions. ↪ Is the data you are collecting what you need to learn? For example, will the number of farmers trained tell you what you need to do to improve the intervention? 2) Design learning questions with the program decision-makers. ↪ Bring together the critical program decision-makers and go through an exercise to determine what information they rely on to assess if things are working as expected. 3) Review and redesign your data collection and analysis system to address the learning needs. ↪ Think beyond quantitative data collection methods. Incorporate participatory M&E and qualitative inquiry approaches. 4) Provide evidence on time and in the correct format to the different decision-makers. ↪ Have more than one format for presenting the evidence gathered and ensure it comes at the right time to influence decision-making. 5) Support evidence translation. ↪ Sharing the evidence in written formats is not enough. Consider evidence synthesis and sensemaking activities that help the team understand what the evidence is 'saying.' 6) Set up follow-up systems. ↪ Design systems to track how the evidence is used and how the adaptations affect program outcomes. PS: What would you add to the list? Follow me, Florence Randari, for more tips and resources on learning and adaptive management!