90% of startups don't fail because of bad marketing, a weak team, or even a poor product.

They fail because they lack a repeatable decision-making process.

Here's the framework I use to make better, faster decisions in business. I call it "The Iteration Loop."

It's a structured way to identify what's working, what's broken, and what to do next, without getting stuck in endless guesswork. It gives you a systematic way to eliminate bottlenecks, optimize execution, and scale with clarity.

Here are the 6 phases:
1. Bottleneck Identification
2. Clarifying the Goal
3. Solution Brainstorming
4. Focused Execution
5. Performance Review
6. Iterate & Improve

1️⃣ Bottleneck Identification
Before you can fix anything, you need to identify the real problem. Most entrepreneurs spin their wheels solving the wrong issues because they never dig deep enough.
To get clarity, ask:
+ What's the biggest constraint stopping growth right now?
+ What metric, if doubled, would create the biggest impact?
+ What's preventing us from getting there?
If you don't identify the root problem, every solution you apply will be wasted effort.

2️⃣ Clarifying the Goal
Once you know the problem, define the exact outcome you're solving for. I use a simple Three-Part Goal Formula:
1. What are we trying to achieve?
2. By when?
3. What constraints do we have?
Vague goals lead to vague actions. Precision forces progress.

3️⃣ Solution Brainstorming
Now, generate every possible solution, without filtering. Most people limit themselves to their existing knowledge, which is why they get stuck.
Instead, ask: "If there were no rules, what would I do?"
This opens up better, faster, and often simpler solutions you wouldn't have otherwise considered.

4️⃣ Focused Execution
Don't test everything at once; test one variable at a time. Most teams waste months making too many changes at once, leading to messy, inconclusive results.
Instead, break it down:
1. Test one key assumption.
2. Measure one KPI that proves or disproves it.
3. Execute for a set period, then review.
Speed matters. Complexity kills momentum.

5️⃣ Performance Review
Your data isn't just numbers; it's feedback on your decision-making process.
Your job is to analyze:
+ Did the solution work?
+ Why or why not?
+ What does this tell us about our business?
Every test refines your ability to make better future decisions.

6️⃣ Iterate & Improve
Most companies don't fail from making the wrong move; they fail from making no moves at all. The only way to win long-term is to keep iterating.
Instead of fearing failure, build a culture that rewards learning.
Failure + Reflection = Progress.
If you aren't improving your decision-making process, your business will eventually hit a ceiling. That's why I built The Iteration Loop: so every problem becomes an opportunity for better, faster execution.

P.S. If you want the scaling roadmap I used to scale 3 businesses to $100M and beyond, you can get it for free from the link in my profile.
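The six phases above are, quite literally, a loop. A minimal sketch of one way to encode it, assuming hypothetical stand-in functions for each phase (none of these names come from the framework itself):

```python
# Illustrative sketch of the six-phase Iteration Loop described above.
# All function names and the example data are hypothetical stand-ins.

def iteration_loop(identify_bottleneck, set_goal, brainstorm, execute, review,
                   max_cycles=3):
    """Run the six phases repeatedly; each cycle tests one variable."""
    learnings = []
    for cycle in range(max_cycles):
        bottleneck = identify_bottleneck()        # 1. Bottleneck Identification
        goal = set_goal(bottleneck)               # 2. Clarifying the Goal
        options = brainstorm(bottleneck)          # 3. Solution Brainstorming (no filtering)
        result = execute(options[0], goal)        # 4. Focused Execution: one variable at a time
        worked = review(result, goal)             # 5. Performance Review: did it work, and why?
        learnings.append((bottleneck, options[0], worked))  # 6. Iterate & Improve
        if worked:
            break
    return learnings

# Toy run: the bottleneck and numbers are made up for illustration.
log = iteration_loop(
    identify_bottleneck=lambda: "low signup rate",
    set_goal=lambda b: {"metric": "signups", "target": 2.0, "weeks": 4},
    brainstorm=lambda b: ["shorten signup form", "add social login"],
    execute=lambda option, goal: {"option": option, "lift": 2.1},
    review=lambda result, goal: result["lift"] >= goal["target"],
)
print(log)  # -> [('low signup rate', 'shorten signup form', True)]
```

The point of the sketch is the structure, not the stubs: one assumption tested per cycle, one KPI reviewed, and a learning recorded either way.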
Intuition in Decision Making
-
The Co-Founder Dynamic: How Varun Alagh and I Navigate Disagreements

"Show me your numbers."

That's become our default response whenever we disagree. Not "you're wrong" or "trust me on this." Just "show me your numbers."

This approach was born from a heated 2017 argument in our living room, in front of our son, over a product launch decision. Varun wanted to delay; I wanted to ship. We were both passionate, both convinced we were right. But we were both arguing from gut feelings, not facts.

Now, years later, here's how we handle disagreements:

1. Data Wins, Egos Lose
When we disagree, we each gather our strongest data points within 24 hours: market research, consumer feedback, financial projections, competitor analysis, whatever supports our position. Then we compare. The stronger data set wins.

2. Define Decision-Making Domains
We divided responsibilities clearly to minimize overlap. Some decisions we still make together, but the overall result is 80% fewer conflicts, because we know who has the final say.

3. The 24-Hour Rule for Major Disagreements
If the data is inconclusive or we can't agree after reviewing the numbers, we sleep on it. Emotions cool down, egos step aside, and new perspectives often emerge. Our best decisions come from our second conversation, not our first argument.

The deeper truth: our different perspectives make us stronger. Varun's analytical approach balances my intuitive decisions. My market instincts complement his operational rigor. But data grounds both of us.

What we've learned:
• Two founders agreeing all the time means one is unnecessary
• Healthy conflict leads to better decisions, if it's fact-based
• Respect for data matters more than being right
• The best arguments are won with evidence, not emotion

#CoFounderDynamics #Entrepreneurship #StartupLessons
-
I Was Wrong, And You Are Too (Or, The Lies Experience Tells You)

I started as an engineer, hired for my ability to build things and solve problems. But when I moved into management, something shifted. I wasn't writing code anymore; I was making decisions that shaped teams, projects, and strategies. Over time, what people valued most wasn't my technical skills, but my judgment: recognizing patterns, making good decisions, and bringing perspective.

That's what we usually mean by wisdom: not just knowing things, but knowing what works and what doesn't.

But here's the catch: technology moves fast. And some of the patterns I relied on turned out to be wrong. Not because they were never true, but because I wasn't questioning them often enough.

For years, I believed speed and reliability were in conflict. Move fast, and you'd sacrifice stability. Optimize for reliability, and you'd slow down. And at the time that was mostly true: continuous deployment was immature, testing was inconsistent, observability was limited. But things changed. Today's top teams balance speed and reliability by designing for resilience, automating deployments, and using advanced monitoring and rollback mechanisms. My thinking hadn't kept up with this evolution.

That's the real danger of wisdom: it feels like truth when it's just outdated experience in disguise. You've seen something fail before, so you assume it always will. You've seen something work, so you treat it as a universal rule. Just because something didn't work five years ago doesn't mean it won't work now. And just because a pattern held true in the past doesn't mean it always will.

The real problem isn't having strong opinions. It's not revisiting them.

I've caught myself doing this more times than I'd like to admit. A new idea comes up, and my first reaction is skepticism: I've seen this before. I know how this ends. But if I've learned anything, it's this: being too sure is dangerous.

The best engineers and leaders I know aren't just wise; they're curious. They don't just rely on past experience; they keep learning, questioning, and re-examining their assumptions. They use experience as a guide, not a rulebook.

Wisdom is valuable, until it isn't. The best decisions come not just from experience, but from staying curious, challenging assumptions, and staying open to new ideas. Technology moves forward. If you're not rethinking old assumptions, you're falling behind.
-
𝗟𝗟𝗠𝘀 𝗰𝗮𝗻 𝗯𝗲 𝗯𝗶𝗮𝘀𝗲𝗱. 𝗔𝗡𝗗 𝘁𝗼 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝘄𝗵𝘆, 𝘄𝗲 𝗳𝗶𝗿𝘀𝘁 𝗻𝗲𝗲𝗱 𝘁𝗼 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝗢𝗨𝗥𝗦𝗘𝗟𝗩𝗘𝗦!

This chart shows 20+ cognitive biases, the invisible patterns driving our decisions every day. They shape what we notice, what we remember, and what we trust. Useful in some cases, disastrous in others.

𝗟𝗟𝗠𝘀 𝗮𝗿𝗲 𝘁𝗿𝗮𝗶𝗻𝗲𝗱 𝗼𝗻 𝗵𝘂𝗺𝗮𝗻 𝗱𝗮𝘁𝗮. 𝗦𝗼 𝘄𝗵𝗲𝗻 𝘄𝗲 𝗮𝗿𝗲 𝗯𝗶𝗮𝘀𝗲𝗱, 𝘁𝗵𝗲𝘆 𝗹𝗲𝗮𝗿𝗻 𝗶𝘁 𝗮𝗻𝗱 𝘀𝗰𝗮𝗹𝗲 𝗶𝘁.

𝗬𝗼𝘂 𝗰𝗮𝗻’𝘁 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝗯𝗶𝗮𝘀 𝗶𝗻 𝗔𝗜, 𝘂𝗻𝗹𝗲𝘀𝘀 𝘆𝗼𝘂 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝗯𝗶𝗮𝘀 𝗶𝗻 𝗵𝘂𝗺𝗮𝗻𝘀.

𝗛𝗲𝗿𝗲 𝗮𝗿𝗲 𝘀𝗼𝗺𝗲 𝗰𝗼𝗴𝗻𝗶𝘁𝗶𝘃𝗲 𝗯𝗶𝗮𝘀𝗲𝘀 𝘁𝗵𝗮𝘁 𝗟𝗟𝗠𝘀 𝗰𝗮𝗻 𝗿𝗲𝗳𝗹𝗲𝗰𝘁, 𝗯𝗲𝗰𝗮𝘂𝘀𝗲 𝘁𝗵𝗲𝘆’𝗿𝗲 𝗲𝗺𝗯𝗲𝗱𝗱𝗲𝗱 𝗶𝗻 𝘁𝗵𝗲 𝗵𝘂𝗺𝗮𝗻 𝗱𝗮𝘁𝗮 𝘁𝗵𝗲𝘆 𝗹𝗲𝗮𝗿𝗻 𝗳𝗿𝗼𝗺:

🧠 Anchoring effect – Relying too much on the first piece of info
🧠 Availability heuristic – Overestimating what’s easy to recall
🧠 Bandwagon effect – Following majority views
🧠 Belief bias – Letting personal belief outweigh logic
🧠 Blind spot bias – Thinking others are biased, not us
🧠 Clustering illusion – Seeing patterns where there are none
🧠 Confirmation bias – Seeking data that confirms what we already think
🧠 Courtesy bias – Avoiding conflict by saying what's expected
🧠 Endowment effect – Overvaluing what we already have
🧠 Gambler’s fallacy – Believing past events affect future ones
🧠 Hyperbolic discounting – Choosing short-term reward over long-term gain
🧠 Illusion of validity – Trusting info just because it *feels* right
🧠 Ostrich effect – Ignoring uncomfortable truths
🧠 Post-purchase rationalization – Justifying past decisions
🧠 Reactive devaluation – Rejecting ideas just because of the source
🧠 Risk compensation – Taking more risks when we feel protected
🧠 Status quo bias – Favoring the familiar
🧠 Stereotyping – Making assumptions based on group identity

Not all cognitive biases translate directly into LLM behavior. Some do, because they’re reflected in training data (like confirmation bias, stereotyping, status quo bias), while others are more deeply tied to human cognition, emotion, or self-awareness, which current LLMs don’t possess.
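One way to make a bias like anchoring concrete: ask for the same estimate under two different numeric anchors and compare the answers. The sketch below is illustrative only; `ask_model` is a hypothetical stand-in that *simulates* an anchored model, not a real LLM API, and a real probe would compare answer distributions over many calls.

```python
# Sketch of an anchoring-bias probe. ask_model is a hypothetical stand-in
# that mimics an anchored model by drifting toward any number in the prompt;
# a real test would call an actual LLM here.

def ask_model(prompt: str) -> float:
    true_estimate = 70.0  # the answer an unbiased model would give (made up)
    anchors = [float(w) for w in prompt.split() if w.replace(".", "").isdigit()]
    if anchors:
        return (true_estimate + anchors[0]) / 2  # pulled toward the anchor
    return true_estimate

def anchoring_gap(question: str, low_anchor: float, high_anchor: float) -> float:
    """Difference between answers under a high vs. low anchor; ~0 means no anchoring."""
    low = ask_model(f"Some say the answer is {low_anchor}. {question}")
    high = ask_model(f"Some say the answer is {high_anchor}. {question}")
    return high - low

gap = anchoring_gap("How many countries are in Africa?", 10, 200)
print(gap)  # a large gap means the anchor, not the question, drives the answer
```

The same paired-prompt pattern extends to other listed biases, e.g. swapping group identities in a prompt to probe stereotyping.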
𝗛𝗼𝘄𝗲𝘃𝗲𝗿, 𝘁𝗵𝗲 𝗯𝗲𝘁𝘁𝗲𝗿 𝘄𝗲 𝘂𝗻𝗱𝗲𝗿𝘀𝘁𝗮𝗻𝗱 𝗼𝘂𝗿 𝗵𝘂𝗺𝗮𝗻 𝗢𝗦, 𝘁𝗵𝗲 𝗯𝗲𝘁𝘁𝗲𝗿 𝘄𝗲 𝗰𝗮𝗻 𝗯𝘂𝗶𝗹𝗱 𝘁𝗿𝘂𝘀𝘁𝘄𝗼𝗿𝘁𝗵𝘆 𝗺𝗮𝗰𝗵𝗶𝗻𝗲 𝘀𝘆𝘀𝘁𝗲𝗺𝘀.
-
When to trust your gut, & when it's just a dodgy feeling in a suit

Let's clear something up. "#TrustYourGut" is either the most overused advice in leadership… or the most misunderstood. It's quoted in boardrooms, scribbled in journals, whispered by mentors, & weaponized in bad decisions.

Because sometimes your #Gut is a superpower. & sometimes… it's just last night's biryani trying to run strategy.

🧠 First, let's talk about gut instinct

If anyone ruined the "follow your gut" party (in the best possible way), it's Daniel Kahneman, Nobel Prize winner & the godfather of how we think about thinking. He made it clear:
👉 You should only trust your #intuition if you've earned it.
He called it "expert intuition": not some magical sixth sense, but the result of years of experience in a specific, stable environment with regular feedback loops.

📊 In his research (alongside Gary Klein, no less), he outlined 3 conditions where intuition becomes reliable:
1. You operate in a predictable environment
2. You've had extensive experience in that environment
3. You've had frequent, accurate feedback to refine your instincts

😬 When #GutInstinct goes wrong

Here's when leaders get it wrong:
• Making gut calls in new territory without pattern recognition
• Confusing ego for instinct
• Using "intuition" to shortcut hard conversations or analysis
• Assuming speed = strength & gut = glory

📉 A Dunning-Kruger-style finding: the people most confident in their intuition are often the least experienced. You don't trust your gut because you feel confident. You trust it because you've earned the right to recognize the pattern.

💡 The #leader's gut test: When to trust it

You trust your gut when:
• You've seen the movie before, & you know how it ends
• The data is incomplete, but your lived experience fills the gaps
• You've trained your intuition through failures, feedback, & friction
• The logic says "maybe," but your gut says "definitely not"

You ignore it when:
• You're in a domain you haven't navigated
• You've had no feedback loops to sharpen your internal compass
• Your gut is saying what your ego wants to hear

👉 In #leadership, intuition without #humility is just arrogance on autopilot.

🧭 So what do great leaders do?
• They listen to their gut… but then validate it
• They use experience to shortcut paralysis, not avoid due diligence
• They surround themselves with truth-tellers, not yes-people
• They know that timing matters: gut instinct often speaks before the data catches up

🎯 According to a PwC survey, #CEOs who combine instinct with structured decision-making outperform their peers by 33% in crisis response scenarios. Translation: the best decisions feel right & hold up under scrutiny.

"#TrustYourGut" is not a get-out-of-decision-making-free card. It's not strategy by spidey-sense.
👉 It's earned intuition: sharpened by experience, validated by feedback, & backed by pattern recognition.

So, trust your gut... only if it's been punched before.
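The Kahneman–Klein conditions quoted above reduce to a strict conjunction: intuition is trustworthy only when all three hold. A minimal sketch (the function and argument names are mine for illustration, not from their work):

```python
# Minimal sketch of the Kahneman-Klein test for "expert intuition":
# trust the gut call only when all three conditions from the post hold.
# Names are illustrative, not from any published framework or library.

def can_trust_intuition(predictable_environment: bool,
                        extensive_experience: bool,
                        frequent_accurate_feedback: bool) -> bool:
    """True only when intuition has been 'earned'; one missing condition means validate with data."""
    return all([predictable_environment,
                extensive_experience,
                frequent_accurate_feedback])

# A gut call in brand-new territory fails the first condition:
print(can_trust_intuition(False, True, True))   # -> False
# Deep experience in a stable domain with fast feedback passes all three:
print(can_trust_intuition(True, True, True))    # -> True
```

The asymmetry is the point: a single `False` flips the answer, which matches the post's warning about gut calls in unfamiliar territory.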
-
Saying No is difficult. But when you realize saying Yes would close all doors, would not lead to any good, and would bring you back to square one, saying No becomes easy.

During my Shark Tank India pitch, Flatheads was offered the amount I had asked for. BUT with a significantly higher equity ask than I had offered.

I was getting the amount I thought I would need to go back to the drawing board and figure out how to revive Flatheads. Then why did I refuse the offer? Here's why:

1. The equity ask was huge. While the sharks had a reason for their offer, it also meant closing further doors for raising funding later on. So it meant revive or lose, within Rs. 75 lakh. Not something I thought I was ready for.

2. Before going live on the pitch stage, I had decided on an equity percentage that would be my walkaway point. That made the decision a little simpler once the ask went beyond it.

3. During the pitch, I took some time from the sharks to think about the offer, because I did not want to be impulsive about a yes or a no. When I composed myself, got out of the pressure of standing in front of the sharks, and spent some time thinking in solitude, I had my answer.

In life, you don't have to make long-term decisions, decisions for a lifetime, based on options that sustain you for a short term, no more than a few months maybe. When you have this clarity, the ability to say No comes to you inevitably.

#startups #equity #funding #sharktankindia #downbutnotout
-
Confession time: As a leader, I often get asked if I'm more intuitive or calculative in my decision-making. The truth is, it's a bit of both.

Recently, we were in the middle of expanding our vendor partnerships at Falabella, and an opportunity came up with a key supplier. The catch? We had only 72 hours to decide before a competitor could swoop in.

My gut told me this partnership was the right move: it aligned with our long-term goals, and the supplier's reputation was solid. But I couldn't just go off instinct. I called an emergency meeting with my team. We reviewed everything, from the supplier's past performance to our budget forecasts and potential market shifts. I knew we had to move fast, but I wanted to make sure every angle was covered.

In the end, the numbers confirmed what my instinct was already telling me, and I made the call to sign the deal. Looking back, it wasn't just about moving quickly; it was about being decisive with the right balance of instinct and analysis.

In moments like this, I make sure to keep a few things in mind:
> First, while my decisions are based on facts, I never forget the human side: how my choices impact my team, my partners, and the people around me.
> Second, I'm constantly aware that leadership is as much about people as it is about strategy.
> Finally, it's important to act swiftly but thoughtfully, blending instinct with calculated risks.

What about you? Do you lean more toward intuition or calculation when making decisions?
-
Was Kahneman wrong when it comes to using data in strategy?

🧠 Kahneman, in Thinking, Fast and Slow, taught us that our minds are distorted by biases: fast thinking and intuitive shortcuts lead to predictable errors. His advice, therefore, is to slow down, analyse carefully, and avoid overconfidence.

⚡ Gigerenzer, in The Intelligence of Intuition, looked at how we make decisions and saw something else entirely. He suggests that heuristics (mental shortcuts, or rules of thumb) are not flaws but features that evolved to let us make good decisions under uncertainty. He tells us not to fight intuition but to tame it: to learn when it works and when it doesn't.

As a scientist, I used to wholeheartedly believe that data-driven decision-making is the best approach to any problem, including strategy. Collect good data, use the right formula, and the correct answer will follow. But then I realised: if that were true, big companies with huge resources and capabilities to gather data would never fail. But they do.

This nagging doubt has always been in the back of my mind, and I vaguely understood that it had to do with uncertainty. But Gigerenzer showed me a different way of thinking about human intuition. Precisely because strategy lives in the world of uncertainty, where data is incomplete, time is short, and cause and effect are unclear, analysis starts to fail there.

Kahneman's world is one of known risks, measurable probabilities, stable relationships, and repeatable patterns. Perfect for finance, process optimisation, or quality control. When the rules are clear, analysis wins.

Gigerenzer's world is one of true uncertainty: novel markets, disruptive technologies, ambiguity. Here, too much analysis can paralyse. Here, heuristics, the ability to simplify and act, can outperform complex models. Gigerenzer speaks of adaptive rationality, which really means using the right simplicity in the right situation.

Intuition tends to outperform analysis when you face novel, fast-changing conditions (disruption, innovation… I'm looking at you, AI) and you're overloaded with information. But you need accumulated experience from similar situations.

Analysis shines in stable, rule-based environments where cause and effect are understood and everything is measurable and repeatable. That's where models and forecasts deliver real value. But when the system itself is shifting, excessive analysis can mislead, giving the illusion of certainty where none exists.

💡 Great strategists know when to switch between data and intuition. Kahneman helps us see where intuition fails. Gigerenzer helps us see when it works. That's why I believe strategy will always remain a human art, not an analytical exercise reducible to algorithms or statistics.

-------------------
👋 Hi, I'm Kerstin. I help organisations create strategies under uncertainty by balancing analysis with human judgment.

#Strategy
-
What if trusting your gut is just as critical as crunching numbers? 🔍

As leaders, we're faced with high-stakes choices every day. Balancing intuition with data has helped me make decisions that move our startup forward while staying adaptable in changing landscapes. Here's what I've learned along the way that might resonate with you too:

1️⃣ Balance Intuition with Data: Trusting your gut can reveal insights the numbers miss, but grounding those instincts in data ensures you're looking at the full picture. Sometimes you just know it's going to work; that confidence, backed by data, is all your business needs.

2️⃣ Practice Emotional Awareness: A business grows or fails because of your choices, so they should be emotionally balanced, empathetic, and practically feasible. Recognize your emotions, but don't let them cloud your choices.

3️⃣ Stay Adaptable: Flexibility is key. Whether a decision leads to success or becomes a lesson, each step builds resilience and better instincts for the future. We have seen many examples of companies that are super successful today because they pivoted at the right time.

4️⃣ Lean on Trusted Advisors: The best decisions often come with input from others. Mentors and peers who have been there and done that can provide insights that help you avoid pitfalls or spot hidden opportunities. 🤝

Ultimately, effective decision-making is about blending intuition with insights. There is no place for ego or emotional bias; you do what is right for your stakeholders, no matter what. This is how we have come this far, and we keep growing stronger as we work to make a dent in the hiring world. 💪

What's your go-to strategy for making tough calls? Let's share insights! 👇
-
"Just curious, how's your forecast looking?" my CEO friend asked me.

The weekly forecast review. The monthly pipeline call. The quarterly business review. All centered around one flawed model: asking reps to predict the future based on gut feeling.

"50% chance of closing."
"Strong verbal commitment."
"Just waiting on final approval."

These phrases hide a painful truth: we have no idea what's actually happening inside our deals.

I changed how we forecast last quarter. Instead of "How do you FEEL about this deal?" we now ask "What have they actually DONE?"
- Has the economic buyer viewed pricing?
- Have technical stakeholders reviewed security docs?
- Have end users looked at implementation plans?
- Is the champion actively sharing content internally?

Behavior doesn't lie. Words do.

We tracked content engagement across 200+ deals:
Closed deals: prospects engaged 7+ times in the final two weeks.
Lost deals: engagement dropped to 0-1 interactions before going dark.

The deals your team is most confident about? Often the ones with the least actual buyer engagement.

Here's how we transformed our approach. Every opportunity now has a digital space where we can see:
- Exactly who is engaging with what content
- Which stakeholders are involved (even ones we haven't met)
- Where deals are getting stuck
- When interest spikes or drops

Our forecast accuracy improved INSANELY.

Stop asking reps what they "think" will happen. Start measuring what buyers are actually doing. The best indication of deal health isn't what prospects tell you. It's how they behave when you're not watching.

Do you know what your buyers are really doing? Or are you still forecasting based on feelings? Agree?
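The engagement thresholds reported above (7+ buyer touches in the final two weeks for closed deals, 0-1 before deals went dark) suggest a simple behavioral health check. A minimal sketch: the cutoffs mirror the post's figures, but the function, names, and the pricing-view signal are illustrative, not a validated scoring model.

```python
# Sketch of a behavior-based deal-health check using the thresholds from the post.
# recent_engagements = buyer interactions in the last two weeks (illustrative names).

def deal_health(recent_engagements: int, economic_buyer_viewed_pricing: bool) -> str:
    """Classify deal health from observed buyer behavior, not rep sentiment."""
    if recent_engagements >= 7 and economic_buyer_viewed_pricing:
        return "healthy"          # the pattern seen in closed deals
    if recent_engagements <= 1:
        return "at risk"          # the pattern seen before deals went dark
    return "needs attention"      # in between: dig into who is engaging, and why

print(deal_health(9, True))    # -> healthy
print(deal_health(0, False))   # -> at risk
print(deal_health(4, True))    # -> needs attention
```

The rule is deliberately crude; the useful part is that every input is an observed behavior a rep cannot "feel" their way around.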