Multilingual Text Embedding Model Pruner
MTEM Pruner is a tool for pruning multilingual text embedding models. It lets you optimize a large multilingual model by focusing on a specific language or set of languages, making the model smaller and more efficient while maintaining high performance for the target language(s). MTEM Pruner is particularly useful for developers and researchers working on model benchmarking and fine-tuning.
Install the Tool
Install MTEM Pruner using pip or directly from source:
pip install mtem-pruner
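To install from source instead (the repository URL below is an assumption; substitute the project's actual repository):

git clone https://github.com/your-username/mtem-pruner.git
cd mtem-pruner
pip install .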
Import the Library
Load the required libraries and initialize the pruner:
from mtem_pruner import MTEMPruner
pruner = MTEMPruner()
Load the Model
Load the pre-trained multilingual model you want to prune:
from transformers import AutoModel

model = AutoModel.from_pretrained("your_multilingual_model")
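For example, to load Multilingual BERT, one of the supported models listed in the FAQ below:

model = AutoModel.from_pretrained("bert-base-multilingual-cased")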
Define Pruning Parameters
Specify the target language(s) and pruning settings:
params = {
    "target_language": "en",  # language code(s) to keep; a list is also accepted (see FAQ)
    "pruning_ratio": 0.5,     # how aggressively to prune; see the FAQ on choosing a ratio
    "device": "cuda"          # device to run pruning on; use "cpu" if no GPU is available
}
Perform Pruning
Apply the pruning process to the model:
pruned_model = pruner.prune_model(model, **params)
Export the Pruned Model
Save the pruned model for deployment or further use:
pruned_model.save_pretrained("pruned_model_directory")
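If the pruned model is still a standard transformers PreTrainedModel (an assumption; check the object returned by prune_model), you can also publish it straight to the Hugging Face Hub:

pruned_model.push_to_hub("your-username/pruned-multilingual-model")  # requires huggingface-cli login; repo name is illustrative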
Deploy the Model
Use the pruned model in your application, benefiting from reduced size and optimized performance.
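A minimal usage sketch, assuming the pruned model loads back as a standard transformers model and that a matching tokenizer was saved to the same directory (save one with tokenizer.save_pretrained if needed):

import torch
from transformers import AutoModel, AutoTokenizer

# Load the pruned model and tokenizer from the export directory
tokenizer = AutoTokenizer.from_pretrained("pruned_model_directory")
model = AutoModel.from_pretrained("pruned_model_directory")
model.eval()

# Embed a sentence with mean pooling over token embeddings
inputs = tokenizer("Model pruning keeps only what you need.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(1) / mask.sum(1)
print(embedding.shape)  # e.g., torch.Size([1, 768]) for a BERT-sized model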
What models does MTEM Pruner support?
MTEM Pruner is compatible with most multilingual text embedding models, including popular ones like Multilingual BERT, DistilBERT, and XLM-RoBERTa.
Can I prune the model for more than one language?
Yes, MTEM Pruner allows you to define multiple target languages. Simply specify them in the target_language parameter as a list:
"target_language": ["en", "es", "fr"]
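For example, the parameter dictionary from the walkthrough above becomes:

params = {
    "target_language": ["en", "es", "fr"],  # keep English, Spanish, and French
    "pruning_ratio": 0.5,
    "device": "cuda"
}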
How do I choose the optimal pruning ratio?
The pruning ratio depends on your specific needs. Start with a lower ratio (e.g., 0.3) and evaluate performance. Gradually increase the ratio while monitoring accuracy and model size to find the best balance for your use case.
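A sketch of that search, assuming prune_model returns a standard PyTorch module (the parameter count is a rough proxy for model size; substitute your own accuracy evaluation):

# Sweep pruning ratios and report the resulting model size
for ratio in [0.3, 0.4, 0.5, 0.6, 0.7]:
    pruned = pruner.prune_model(model, target_language="en", pruning_ratio=ratio, device="cuda")
    n_params = sum(p.numel() for p in pruned.parameters())
    print(f"ratio={ratio}: {n_params / 1e6:.1f}M parameters")
    # Evaluate embedding quality on your own benchmark before settling on a ratio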