Multilingual Text Embedding Model Pruner
MTEM Pruner is a tool for pruning multilingual text embedding models. It allows users to optimize large multilingual models by focusing on a specific language or set of languages, making the model more efficient and lightweight while maintaining high performance for the target language(s). MTEM Pruner is particularly useful for developers and researchers working on model benchmarking and fine-tuning.
Install the Tool
Install MTEM Pruner using pip or directly from source:
pip install mtem-pruner
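If you prefer a source install, a common pattern is to clone the repository and install it in editable mode. The URL below is a placeholder, since the project's actual repository location is not given here:

git clone https://github.com/your-org/mtem-pruner.git  # placeholder URL
cd mtem-pruner
pip install -e .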
Import the Library
Load the required libraries and initialize the pruner:
from mtem_pruner import MTEMPruner
pruner = MTEMPruner()
Load the Model
Load the pre-trained multilingual model you want to prune:
from transformers import AutoModel

model = AutoModel.from_pretrained("your_multilingual_model")
Define Pruning Parameters
Specify the target language(s) and pruning settings:
params = {
    "target_language": "en",  # language to keep; pass a list to keep several (see FAQ)
    "pruning_ratio": 0.5,     # fraction of the model to prune away
    "device": "cuda"          # use "cpu" if no GPU is available
}
Perform Pruning
Apply the pruning process to the model:
pruned_model = pruner.prune_model(model, **params)
Export the Pruned Model
Save the pruned model for deployment or further use:
pruned_model.save_pretrained("pruned_model_directory")
Deploy the Model
Use the pruned model in your application to benefit from its smaller size and faster inference.
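As a minimal sketch of this step, the snippet below reloads the exported model with the standard transformers API and computes a sentence embedding via mean pooling. It assumes the export directory also contains the (pruned) tokenizer, and mean pooling is only one common choice; match whatever pooling your base model was trained with:

from transformers import AutoModel, AutoTokenizer
import torch

# Reload the pruned model and tokenizer from the export directory
tokenizer = AutoTokenizer.from_pretrained("pruned_model_directory")
model = AutoModel.from_pretrained("pruned_model_directory")

# Embed a sentence: tokenize, encode, then mean-pool over non-padding tokens
inputs = tokenizer("A short example sentence.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
mask = inputs["attention_mask"].unsqueeze(-1)
embedding = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)
print(embedding.shape)  # (1, hidden_size)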
What models does MTEM Pruner support?
MTEM Pruner is compatible with most multilingual text embedding models, including popular ones like Multilingual BERT, DistilBERT, and XLM-RoBERTa.
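For reference, those families correspond to Hugging Face checkpoints such as the following (the specific variants are examples, not requirements):

from transformers import AutoModel

mbert = AutoModel.from_pretrained("bert-base-multilingual-cased")         # Multilingual BERT
distil = AutoModel.from_pretrained("distilbert-base-multilingual-cased")  # multilingual DistilBERT
xlmr = AutoModel.from_pretrained("xlm-roberta-base")                      # XLM-RoBERTa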
Can I prune the model for more than one language?
Yes, MTEM Pruner allows you to define multiple target languages. Simply specify them in the target_language parameter as a list:
"target_language": ["en", "es", "fr"]
How do I choose the optimal pruning ratio?
The pruning ratio depends on your specific needs. Start with a lower ratio (e.g., 0.3) and evaluate performance. Gradually increase the ratio while monitoring accuracy and model size to find the best balance for your use case.
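A minimal sketch of that tuning loop, assuming a hypothetical evaluate_embeddings() helper that stands in for whatever benchmark matters to you (retrieval accuracy, STS correlation, etc.):

from mtem_pruner import MTEMPruner
from transformers import AutoModel

pruner = MTEMPruner()

# Sweep increasing ratios and record task quality at each step
for ratio in [0.3, 0.4, 0.5, 0.6]:
    # Reload a fresh copy each iteration in case pruning modifies the model in place
    model = AutoModel.from_pretrained("your_multilingual_model")
    pruned = pruner.prune_model(model, target_language="en", pruning_ratio=ratio, device="cuda")
    score = evaluate_embeddings(pruned)  # hypothetical: your own evaluation
    print(f"ratio={ratio:.1f} score={score:.4f}")
    # Stop increasing the ratio once the score drops below your tolerance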