Copilot doesn’t work if profanity exists #55630
Replies: 21 comments 12 replies
- It also doesn't work at all if the project directory name contains profanity. #55378
- We have been intentionally spreading profanity through our code bases to keep Copilot from stealing. It has been very successful so far, and I hope it stays this way. It's not a bug. It's a feature.
- Just spent time debugging Copilot not working because I had shortened an "assembly" variable to "ass" in a comment. I love the future.
- I was working on a small project for processing ASS (Advanced SubStation Alpha) subtitles (https://en.wikipedia.org/wiki/Subtitles#Subtitle_formats), and Copilot refused to work because a variable name contained "ass". This is ridiculous; "ass" as a word or abbreviation is far too common to be put on a profanity word list.
- Found this pretty funny lol, but I strongly believe you should be able to have anything you want in your code... Microsoft doesn't appear to have changed its mind a year on, so I built my own Copilot extension that handles profanities (no words are banned). You can try it here if you're still looking for a solution.
- The reported behavior of Copilot, refusing to function in the vicinity of profanity within code files, is concerning and frustrating. It's understandable that code-related tools apply some profanity filtering, but the described behavior seems inconsistent and puzzling.
- Just discovered this. The irony is that I only start putting profanity in my code when I'm frustrated; then Copilot stops working, which frustrates me further.
- While the completion quality doesn't seem to match GitHub Copilot yet, CodeGemma 2B has no issues with profanities. I use it locally with Continue.dev and Ollama:

  ```json
  "tabAutocompleteModel": {
    "title": "Code Gemma",
    "provider": "ollama",
    "model": "codegemma:code"
  }
  ```

  Hopefully the existence of good, free models will make GitHub and Microsoft change their stance on these quite excessive guardrails.
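For anyone trying the same setup: in Continue, that `tabAutocompleteModel` snippet is a fragment of the extension's `config.json` (typically under `~/.continue/`), roughly like the sketch below. This assumes Ollama is running locally and that the `codegemma:code` model from the comment above has already been pulled; the surrounding keys are only a minimal illustration, not a complete config.

```json
{
  "models": [],
  "tabAutocompleteModel": {
    "title": "Code Gemma",
    "provider": "ollama",
    "model": "codegemma:code"
  }
}
```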
- Wasted 20 minutes trying to figure out why Copilot wouldn't work... it was the `print("Hello ass!");` statement, something I tend to type instead of hello world. Great. On top of the horrible decision to be this sensitive, why on earth is it a totally silent failure, so the tool just seems broken?!
- This is absurd. I've written an entire function just to check whether my data is screwed, and Copilot won't touch it. Is this some kind of American prudishness, where they think swearing will cause the apocalypse or universal healthcare? The English swear constantly, and the Australians practically use it as punctuation.
- >llm
- I had the same issue and spent hours debugging before I came here. To identify the offending word, I typed something at the start of the file where completions worked, then moved down line by line until completions stopped working. In my case, the culprit was the Norwegian word "slutt" (meaning "end") in a comment. What a useless feature. The least they could do is log something. The only clue in the log is this:
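Since the banned list isn't published, the manual line-by-line search described above is about the only reliable method; a small script can at least pre-scan a file for words reported in this thread. A minimal sketch in Python (the `REPORTED_TRIGGERS` set contains only the handful of words commenters here say tripped the filter, not Copilot's actual, unpublished wordlist):

```python
import re

# Words reported in this thread as silently disabling completions.
# Illustrative only -- Copilot's real list is not public.
REPORTED_TRIGGERS = {"ass", "damn", "sex", "trans", "slutt"}

def find_trigger_words(text):
    """Return (line_number, word) pairs for reported trigger words."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for word in re.findall(r"[a-zA-Z]+", line.lower()):
            if word in REPORTED_TRIGGERS:
                hits.append((lineno, word))
    return hits

sample = 'int main() {\n  // slutt\n  printf("Hello ass!");\n}\n'
print(find_trigger_words(sample))  # [(2, 'slutt'), (3, 'ass')]
```

Because the matching is whole-word and case-insensitive, it will also flag legitimate identifiers (like the "assembly" abbreviation mentioned earlier), which is exactly the over-blocking people are complaining about.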
- This problem is now in its second year without a fix. I'm thinking of canceling my subscription and looking for another solution.
- This is still an annoying bug. It seems to trigger on extremely mild profanity, and on words that aren't considered profanity in my locale.
- Is this just going to be ignored forever? I'm still having this issue, this time while using Copilot in a Vue component about an album that contains the word "damn" in one of the B-side song titles (not even in a profane way, just a reference to a somewhat silly mnemonic). Are we really censoring "damn" now? What is this, the 80s/90s? What could an LLM say using "damn" that couldn't also be said using either "screw" or "stupid"? Who is this supposed to protect?
- This royally pisses me off. I PAY for this software, and I PAY for it to do as I ask. I don't pay for censorship. I really fucking love this future where I have to base64-encode abbreviations because our corpo-overlords are scared of reality.
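For what it's worth, the base64 workaround mentioned above is trivial, which makes the filter look even more pointless; a quick Python sketch:

```python
import base64

# Encode a flagged abbreviation so the raw word never appears in the file...
word = "ass"
encoded = base64.b64encode(word.encode()).decode()
print(encoded)  # YXNz

# ...and decode it at runtime wherever the real value is needed.
decoded = base64.b64decode(encoded).decode()
print(decoded)  # ass
```

Of course, this only hides the word from a literal string match; it does nothing for identifiers or comments you actually need to read.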
- +1, absolutely mind-boggling
- +1
- Oh, the notification reminded me: honestly, I've just stopped using "autocomplete" LLMs entirely for a while. Using a local uncensored one is probably fine, but my hardware is too old to run one at any reasonable speed. For the "boilerplate" I would typically use Copilot for, like formatting data into strings, imports, and other crap, I now use Gemini CLI. It feels separate enough from my actual editing process that programming feels fun again. It's also not nearly as "integrated" into the workflow, so I never feel like I'd be "missing out" if I typed some string Big Tech doesn't like and lost access. Not to mention IT DOESN'T HAVE THIS STUPID ISSUE BUILT INTO IT!!!!!!!!
- This policy is absolutely asinine. The mere presence of these words in a file prevents code completion from working for anything around them. "trans" is listed as one of their banned words, which is absurd because it isn't profane and is used SO commonly in data science as shorthand for "transform." "sex" is also banned, even though it's a medical term and most medical data includes a sex designation; anyone writing code to analyze epidemiology data, clinical trial data, or mouse studies cannot use the word "sex."
Select Topic Area
Bug
Body
I’m unsure whether this is truly a bug, but it is frustrating. Whenever there’s profanity in a file, Copilot refuses to work.
If I’m "far enough" away from the code or comments containing expletives, Copilot works in that file, and it also works in other files; it just won’t work near the expletives, presumably due to the context window.
It’s quite bizarre, especially since Copilot is capable of generating profanities itself. I don’t understand why it decides not to work when it’s close to profanity.