AI can not only augment but actively increase human creativity. Two recent large-scale empirical studies on Humans + AI creativity yield a number of important lessons and insights.

🧭 Iteration doesn’t self-improve
Across 10 rounds, human–GenAI pairs didn’t become more creative just by repeating the cycle. “More rounds” isn’t a strategy by itself. Creativity improved only when people were explicitly pushed into co-development behaviours (critique + refinement) rather than defaulting to fresh generation each time.

🛠️ Co-development is the real engine
The strongest gains came from treating AI as a partner for sharpening an idea - stress-testing, reframing, combining, and tightening - not for generating endless options. If you want creativity to rise over time, design the workflow so refinement is unavoidable.

🔁 Galleries beat blank prompts
In a study of over 800 people, simply exposing them to galleries of examples increased engagement and led to better-quality outcomes. The intervention wasn’t “smarter prompting”; it was changing the interaction pattern so people could browse, compare, and build.

👀 Attention is impact, not just edits
Simply viewing AI suggestions can influence the creative process before any copying or modification happens. Action-based metrics alone undercount value. Evaluation should include attention measures, especially time spent reviewing suggestion galleries, to capture the cognitive engagement that drives outcomes.

🧩 Different people need different AI “shapes”
The value of a given generation strategy (e.g., AI-generated vs. random suggestions) varies with the designer’s approach, and that approach can change mid-process. Static, one-size-fits-all AI assistance will underperform systems that adapt to the user’s current mode.

🕰️ Don’t promise time savings
Galleries increased engagement and led to longer sessions, and those longer sessions produced better outcomes. The win is quality, not speed.
Treat these tools as creativity amplifiers, not efficiency hacks. Here is a repeatable Humans + AI creativity process, derived directly from the research, that you can implement:
1️⃣ Scan a shared gallery of AI suggestions/examples before prompting from scratch.
2️⃣ Select and lock 1–2 candidate ideas to refine instead of generating more.
3️⃣ Run 2–3 co-development passes: critique → strengthen → rewrite the same idea.
4️⃣ Re-open the gallery to compare, recombine, and upgrade the refined version.
5️⃣ Track time spent viewing suggestions and the ratio of refining to generating.
6️⃣ Repeat this cadence regularly and optimize for quality over speed.
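Step 5's tracking could be implemented with a small session logger. This is a minimal sketch under my own assumptions; `SessionTracker` and its action names are hypothetical, not taken from the studies:

```python
from dataclasses import dataclass, field

@dataclass
class SessionTracker:
    """Hypothetical logger for step 5: record each action so you can
    report the refine-to-generate ratio and gallery viewing time."""
    events: list = field(default_factory=list)

    def log(self, action: str, seconds: float = 0.0) -> None:
        # action is one of "generate", "refine", "view_gallery"
        self.events.append((action, seconds))

    def refine_to_generate_ratio(self) -> float:
        # Higher values mean the session leaned on co-development
        # (critique + refinement) rather than fresh generation.
        refines = sum(1 for a, _ in self.events if a == "refine")
        generates = sum(1 for a, _ in self.events if a == "generate")
        return refines / max(generates, 1)

    def gallery_view_seconds(self) -> float:
        # Attention measure: total time spent browsing the gallery.
        return sum(s for a, s in self.events if a == "view_gallery")
```

For example, a session with one generation pass, three refinement passes, and two minutes in the gallery would report a ratio of 3.0 and 120 seconds of viewing time, capturing both the co-development emphasis and the attention measure the studies highlight.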
Boosting Creativity With Retrieval-Augmented Thinking
Summary
Boosting creativity with retrieval-augmented thinking means combining human imagination with AI-powered tools that present examples or suggestions, helping people generate, refine, and build new ideas in collaborative cycles. This approach transforms AI from a passive generator into an active partner, expanding the range of possibilities and deepening creative outcomes.
- Engage with examples: Browse galleries of AI-generated ideas before starting your own creative process to spark inspiration and broaden your perspective.
- Refine collaboratively: Treat AI suggestions as starting points and work through several rounds of critique and improvement for stronger, more meaningful results.
- Mix thinking modes: Alternate between human-driven brainstorming and AI-powered variation, blending both approaches to explore new directions and synthesize unique solutions.
If your AI brainstorming starts with a prompt such as “give me ideas for X,” you’re limiting your imagination. I learned this while working through IDEO U’s Human-Centered Design and AI certificate program, which keeps reminding me that AI only supports creativity when humans stay actively involved.

To test this, I ran a small experiment tied to my design challenge: how can nonprofit professionals use AI to augment their thinking so their work becomes more strategic, creative, and human-centered? Here’s what happened.

When I began with human-only ideation (my own brain or a brainstorming session with other humans), the ideas were grounded in mission, constraints, and real community needs. When I switched to AI with a clear creative direction, I asked for absurdity. AI delivered: costume-based learning scenes, dramatic falling sequences, Play-Doh brains, even a human–AI tango. These weren’t solutions, but they weren’t a waste of time either. They were creative provocations that loosened up the tight mental space we often operate within.

The best ideas emerged only after I cycled through several layers of human grounding, AI variation, and human synthesis. It felt like a club sandwich of thinking modes. Humans brought mission and ethics. AI widened the possibility space. Humans shaped meaning.

The infographic (created in Nano Banana) shows the practices that made this work:
💡 Begin with human insight.
💡 Give AI a clear creative direction.
💡 Separate idea expansion from idea selection.
💡 Use reflective checkpoints.
💡 Treat AI as a partner, not a replacement.

This experiment makes me think that the real value of AI in nonprofit brainstorming is less about efficiency and more about expanding imagination. When humans guide the process, AI becomes a thought-partner for more human-centered creativity. What would open up in your work if your organization treated AI as a creative partner instead of a shortcut?
Worried AI adoption will kill your team's creativity? A new study shows that AI tools can boost creativity if you know how to think with them.

A field experiment published in the prestigious Journal of Applied Psychology followed 250 tech consultants using LLMs in their actual work. The results? AI tools boosted creativity by providing "cognitive job resources" such as:
- Access to broader information
- The ability to switch between tasks
- Opportunities for mental breaks

But not all employees got the same boost. The key differentiator? Employees’ metacognitive strategies.

What are metacognitive strategies? They're the skills that help you:
► Evaluate whether your prompts are working
► Think through what you need from the LLM before starting
► Use AI for information gathering while you focus on synthesis
► Change your strategy rather than repeating the same prompts

In the end, this study shows us that passively consuming AI outputs yields minimal creative benefit. ✅ Instead, active engagement, the mental collaboration between human and machine, is what drives real creative output.

For leaders, this study offers a helpful insight: simply deploying AI tools isn't enough. 👇 Organizations should assess and train employees in AI collaboration to get the most from their investments.
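The last two strategies above, evaluating outputs and switching approach instead of re-sending the same prompt, can be sketched as a simple loop. Everything here is a hypothetical illustration: `ask_llm` and the scoring rule are stand-ins, not a real API or the study's method:

```python
def ask_llm(prompt: str) -> str:
    # Placeholder for a real LLM call; echoes the prompt for illustration.
    return f"response to: {prompt}"

def score(response: str, goal: str) -> float:
    # Placeholder self-evaluation: does the response address the goal?
    return 1.0 if goal in response else 0.0

def metacognitive_loop(goal: str, strategies: list[str], threshold: float = 0.5) -> str:
    """Try each prompting strategy in turn, evaluating as you go,
    rather than repeating one prompt and hoping for a better answer."""
    response = ""
    for strategy in strategies:
        prompt = strategy.format(goal=goal)   # vary the framing, not just retry
        response = ask_llm(prompt)
        if score(response, goal) >= threshold:
            return response                   # good enough: stop iterating
    return response                           # fall back to the last attempt
```

The point of the sketch is the structure, not the placeholders: deciding up front what "good enough" means, checking each output against it, and changing strategy when the check fails is exactly the active engagement the study associates with creative gains.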