Multilingual Text Embedding Model Pruner
MTEM Pruner is a tool for pruning multilingual text embedding models. It lets users optimize large multilingual models by focusing on a specific language or set of languages, making the pruned model smaller and more efficient while maintaining high performance for the target language(s). MTEM Pruner is particularly useful for developers and researchers working on model benchmarking and fine-tuning.
Install the Tool
Install MTEM Pruner using pip or directly from source:
pip install mtem-pruner
Import the Library
Load the required libraries and initialize the pruner:
from mtem_pruner import MTEMPruner
pruner = MTEMPruner()
Load the Model
Load the pre-trained multilingual model you want to prune:
from transformers import AutoModel

model = AutoModel.from_pretrained("your_multilingual_model")  # e.g. "bert-base-multilingual-cased"
Define Pruning Parameters
Specify the target language(s) and pruning settings:
params = {
    "target_language": "en",  # language(s) to retain; accepts a list for multiple languages (see FAQ)
    "pruning_ratio": 0.5,     # fraction of the model to prune away
    "device": "cuda"          # use "cpu" if no GPU is available
}
Perform Pruning
Apply the pruning process to the model:
pruned_model = pruner.prune_model(model, **params)
Export the Pruned Model
Save the pruned model for deployment or further use:
pruned_model.save_pretrained("pruned_model_directory")
Deploy the Model
Use the pruned model in your application, taking advantage of its reduced size and improved efficiency for the target language(s).
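A minimal usage sketch, assuming the pruned model (and a tokenizer saved alongside it) still loads through the standard transformers API; the mean-pooling step is an illustrative choice, not something prescribed by MTEM Pruner:

import torch
from transformers import AutoModel, AutoTokenizer

# Load the pruned artifacts saved in the previous step.
tokenizer = AutoTokenizer.from_pretrained("pruned_model_directory")
model = AutoModel.from_pretrained("pruned_model_directory")

inputs = tokenizer("A short sentence in the target language.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings into a single sentence embedding.
embedding = outputs.last_hidden_state.mean(dim=1)
print(embedding.shape)  # (1, hidden_size)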
What models does MTEM Pruner support?
MTEM Pruner is compatible with most multilingual text embedding models, including popular ones like Multilingual BERT, DistilBERT, and XLM-RoBERTa.
Can I prune the model for more than one language?
Yes, MTEM Pruner allows you to define multiple target languages. Simply specify them in the target_language parameter as a list:
"target_language": ["en", "es", "fr"]
How do I choose the optimal pruning ratio?
The pruning ratio depends on your specific needs. Start with a lower ratio (e.g., 0.3) and evaluate performance. Gradually increase the ratio while monitoring accuracy and model size to find the best balance for your use case.
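A rough sweep over candidate ratios, assuming prune_model accepts the parameters shown above and returns a standard PyTorch module (both assumptions, not verified against the library); pair the size readout with your own task metric, such as retrieval accuracy on held-out data, to pick the best trade-off:

from transformers import AutoModel
from mtem_pruner import MTEMPruner

pruner = MTEMPruner()

for ratio in (0.3, 0.4, 0.5, 0.6):
    # Reload the base model each iteration in case pruning mutates it in place
    # (behavior assumed, not verified).
    model = AutoModel.from_pretrained("your_multilingual_model")
    pruned = pruner.prune_model(model, target_language="en",
                                pruning_ratio=ratio, device="cuda")
    # Assumes the pruned model is a torch.nn.Module.
    n_params = sum(p.numel() for p in pruned.parameters())
    print(f"pruning_ratio={ratio}: {n_params / 1e6:.1f}M parameters")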