Are there any plans to support the ElectraTokenizer?
Running

```python
from transformers import AutoTokenizer
from onnxruntime_extensions import gen_processing_models

tokenizer = AutoTokenizer.from_pretrained("cross-encoder/monoelectra-base", use_fast=False)
gen_processing_models(tokenizer, pre_kwargs={})[0]
```

produces this error:
```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[17], line 1
----> 1 onnx_tokenizer_model = gen_processing_models(tokenizer, pre_kwargs={})[0]

File /opt/homebrew/Caskroom/miniconda/base/envs/bedrock/lib/python3.12/site-packages/onnxruntime_extensions/cvt.py:306, in gen_processing_models(processor, pre_kwargs, post_kwargs, opset, schema_v2, **kwargs)
    303     return make_onnx_model(pre_g) if pre_g else None, \
    304         make_onnx_model(post_g) if post_g else None
    305 else:
--> 306     raise ValueError(f"Unsupported processor/tokenizer: {cls_name}")

ValueError: Unsupported processor/tokenizer: ElectraTokenizer
```