
Conversation

@jamestut
Contributor

This PR addresses issue #240918, where large files still get tokenized even when the editor.largeFileOptimizations option is enabled.

The bug is caused by _tokenizationTextModelPart being initialized before _isTooLargeForTokenization is computed, so the tokenization part never sees the flag. This PR moves the computation of _isTooLargeForTokenization ahead of the initialization of _tokenizationTextModelPart.
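A minimal sketch of the ordering issue and the fix. The field names `_isTooLargeForTokenization` and `_tokenizationTextModelPart` come from the PR; everything else (the `TokenizationPart` class, the `LARGE_FILE_SIZE_THRESHOLD` constant, the constructor signature) is hypothetical and only illustrates why initialization order matters here:

```typescript
// Assumed threshold for illustration only; not the real VS Code value.
const LARGE_FILE_SIZE_THRESHOLD = 20 * 1024 * 1024;

// Stand-in for the real tokenization part: it reads the "too large" flag
// once, at construction time, to decide whether tokenization is enabled.
class TokenizationPart {
  constructor(private readonly _tooLarge: boolean) {}
  get tokenizationEnabled(): boolean {
    return !this._tooLarge;
  }
}

class TextModel {
  private readonly _isTooLargeForTokenization: boolean;
  private readonly _tokenizationTextModelPart: TokenizationPart;

  constructor(textLength: number, largeFileOptimizations: boolean) {
    // The fix: compute the flag BEFORE constructing the tokenization part.
    // If these two statements were swapped (the pre-fix ordering), the part
    // would capture the field's default value and tokenize large files anyway.
    this._isTooLargeForTokenization =
      largeFileOptimizations && textLength > LARGE_FILE_SIZE_THRESHOLD;
    this._tokenizationTextModelPart = new TokenizationPart(
      this._isTooLargeForTokenization
    );
  }

  get tokenizationEnabled(): boolean {
    return this._tokenizationTextModelPart.tokenizationEnabled;
  }
}
```

With this ordering, a model over the threshold constructs its tokenization part with the flag already set, so tokenization is skipped for large files while small files are unaffected.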

@alexdima alexdima enabled auto-merge (squash) February 19, 2025 18:24
@alexdima
Member

Thank you!

@alexdima alexdima added this to the February 2025 milestone Feb 19, 2025
@alexdima alexdima merged commit e1c80bc into microsoft:main Feb 19, 2025
7 checks passed
@vs-code-engineering vs-code-engineering bot locked and limited conversation to collaborators Apr 5, 2025
