[8.x] [ML] Fix for Deberta tokenizer when input sequence exceeds 512 tokens (#117595) #127387

Merged: 2 commits into elastic:8.19 on May 1, 2025

Conversation

davidkyle (Member)
Backport

This will backport the following commits from main to 8.x:

Questions?

Please refer to the Backport tool documentation.

…elastic#117595)

* Add test and fix

* Update docs/changelog/117595.yaml

* Remove test which wasn't working

(cherry picked from commit 433a00c)
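The fix concerns the DeBERTa tokenizer failing when an input sequence exceeds the model's 512-token limit. As an illustrative sketch only (not the Elasticsearch implementation; token IDs and helper names here are hypothetical), a tokenizer handling this case must truncate the encoded sequence so that, once the special tokens are added at both ends, the total never exceeds the maximum length:

```python
# Hypothetical sketch of max-length truncation in a tokenizer.
# CLS/SEP IDs and the helper are illustrative, not DeBERTa's actual values.

CLS, SEP = 1, 2          # assumed special-token IDs
MAX_SEQ_LEN = 512        # DeBERTa's maximum sequence length

def truncate_and_wrap(token_ids, max_len=MAX_SEQ_LEN):
    """Truncate token_ids (without special tokens) so that the final
    sequence [CLS] + ids + [SEP] never exceeds max_len."""
    budget = max_len - 2                  # reserve room for [CLS] and [SEP]
    return [CLS] + token_ids[:budget] + [SEP]

# A 1000-token input is cut down to exactly the model's limit.
seq = truncate_and_wrap(list(range(100, 1100)))
assert len(seq) == MAX_SEQ_LEN
```

If the truncation step is skipped or off by one, sequences longer than the limit are passed to the model and inference fails, which is the class of bug this backport addresses.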
@davidkyle davidkyle added the :ml Machine learning label Apr 25, 2025
@davidkyle davidkyle added the auto-merge-without-approval Automatically merge pull request when CI checks pass (NB doesn't wait for reviews!) label Apr 25, 2025
@elasticsearchmachine elasticsearchmachine merged commit cb51ef0 into elastic:8.19 May 1, 2025
15 checks passed
@davidkyle davidkyle deleted the backport/8.x/pr-117595 branch May 1, 2025 12:29
Labels
auto-merge-without-approval Automatically merge pull request when CI checks pass (NB doesn't wait for reviews!) backport :ml Machine learning v8.19.0
3 participants