  • 1
    "Learning" is intentional because historically Stack users not only want answers, but also want and appreciate learning. We've designed AI Assist to enforce this via its responses. We're continually improving the search portion of AI Assist to return the best results from the community. Thanks for sharing. I'm not sure what your question re: conversations is. "Conversations" is what each session of AI Assist is called. Commented Dec 2, 2025 at 17:35
  • Thanks, @AshZade. Sure, but I'm not saying it's not what people want to hear, just that it's not really what the AI Assist can help with. It's unclear communication, like some other things I pointed out. And aha, the conversations part now makes complete sense and I feel embarrassed I didn't pick up on that :| Commented Dec 2, 2025 at 18:07
  • 1
    no need to be embarrassed at all. AI Assist can be used to learn. It doesn't just return search results. You can ask about how things work, how to get started, trade-offs, etc. It may be semantics re: what we think "learning" is, but it's not designed to "just give the user the answer". One big reason users like chatbots is that they're conversational: they respond naturally, and users can refine the conversation to get what they want from it in a way they understand. Commented Dec 2, 2025 at 18:10
  • "Then why does it ask me what I'd like to learn today". Instead of "What would you like to learn today?", how about: Where do you want to go today? Commented Dec 3, 2025 at 7:54
  • 3
    @Lundin That's similarly confusing, which is quite typical for adspeak. For a car or airline company it makes sense, not for an aggregator tool. Commented Dec 3, 2025 at 10:35
  • 2
    @AshZade That it doesn't just give the answer is not the problem I have with it, rather the opposite: it's only collecting information. It helps one learn as much as looking up a Q&A oneself, or reading an article on Wikipedia. But Wikipedia itself provides said information, while this tool does not; its purpose is not even to help you learn, because it can't structure information properly or tailored to the user, its purpose is to quickly go through and recap pre-existing information that would otherwise take a while to find. "What can I help you look for?" would be more appropriate. Commented Dec 3, 2025 at 10:44
  • 4
    @Joachim My point: if you work in IT and didn't live underneath a rock in the 90s, you wouldn't pick something that sounds just like Microsoft's old, bad sales blurb from the 1990s. Unless you want the user to associate your product with old bad Microsoft products... Commented Dec 3, 2025 at 11:12
  • "...what exactly are those conversations..." To add to other answers. The idea is nowadays that you can refine results of a search by giving additional feedback. Basically you create different more and more extended versions of your search phrase that hopefully converge to what you wanted to get. Commented Dec 3, 2025 at 11:26
  • @NoDataDumpNoContribution Ah, right. "Prompting is Hard: Let's go do Our Own Research". Commented Dec 3, 2025 at 11:32
  • 1
    @AshZade Can you elaborate on the "trusted human intelligence layer", by the way? Commented Dec 3, 2025 at 11:33
  • 1
    @Joachim on this point "it can't structure information properly or tailored to the user", that's exactly what we're trying to do with how we structure the response with the sections like "tips & alternatives, trade-offs, next steps". They're tailored for learning. The "trusted human intelligence layer" is very wordy but it is to do with prioritizing SO/SE content (where it exists) and the LLM supplementing it. Commented Dec 3, 2025 at 14:15
  • 2
    @AshZade So that 'layer' is what the Assist finds? And since what it finds is based on human intelligence (i.e. written by humans—for now, at least) and is trusted (because it prioritizes answers based on votes?) the team came up with that phrase? As for the "tailored" part: the Assist is tailored for finding solutions quicker, sure, but not "to the user" (I'm definitely nitpicking here). Where does it find those additional tips, actually? It doesn't seem to take them from the community answer, so is it searching the web, as well? Commented Dec 3, 2025 at 14:30
  • 1
    @AshZade Imagine if you just removed the AI Assist and were left with only the "trusted human intelligence layer", i.e. a Q&A site like SO used to be. Once again it's confusing why so much effort is being put into a feature no one wants, whose best answers are just pointers to actual SO content. Commented Dec 16, 2025 at 10:30
  • @pilchard I've commented a few times on the "no one wants" sentiment. I understand a shared opinion here on meta, but the overall usage and data tell a different story. I understand that a lot of folks here won't respond well to that, or will demand more information. We would not invest months of time and energy into AI Assist if we didn't have the data to back it up. Commented Dec 16, 2025 at 13:44
  • 1
    @AshZade SO has a track record of continuing to develop unwanted features based on 'data' that are eventually abandoned or just added to the pile of things people block with UI addons, so yes, you would invest months of time. Generally this seems to be due to skewed metrics based on poor A/B testing, poor design decisions, or a seemingly willful process of naively interpreting metrics exactly so that development can continue in the direction the company wants to go, not what is best for the content or community. Commented Dec 16, 2025 at 16:13