Multilingual Text Embedding Model Pruner
MTEM Pruner is a tool for pruning multilingual text embedding models. It lets users optimize large multilingual models by focusing on a specific language or set of languages, making the model more efficient and lightweight while maintaining high performance for the target language(s). MTEM Pruner is particularly useful for developers and researchers working on model benchmarking and fine-tuning.
Install the Tool
Install MTEM Pruner using pip or directly from source:
pip install mtem-pruner
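Or install directly from source; the repository URL below is a placeholder, so substitute the actual project repository:
git clone https://github.com/your-org/mtem-pruner.git  # placeholder URL
cd mtem-pruner
pip install -e .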
Import the Library
Load the required libraries and initialize the pruner:
from mtem_pruner import MTEMPruner
pruner = MTEMPruner()
Load the Model
Load the pre-trained multilingual model you want to prune:
from transformers import AutoModel
model = AutoModel.from_pretrained("your_multilingual_model")
Define Pruning Parameters
Specify the target language(s) and pruning settings:
params = {
"target_language": "en",
"pruning_ratio": 0.5,
"device": "cuda"
}
Perform Pruning
Apply the pruning process to the model:
pruned_model = pruner.prune_model(model, **params)
Export the Pruned Model
Save the pruned model for deployment or further use:
pruned_model.save_pretrained("pruned_model_directory")
Deploy the Model
Use the pruned model in your application and benefit from its reduced size and faster inference, as in the sketch below.
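For example, here is a minimal sketch of loading and embedding text with the pruned model, assuming it remains compatible with the standard transformers API and that you also saved the matching tokenizer alongside it (e.g., with tokenizer.save_pretrained("pruned_model_directory")):
from transformers import AutoModel, AutoTokenizer

# Reload the pruned artifacts saved in the previous step.
tokenizer = AutoTokenizer.from_pretrained("pruned_model_directory")
model = AutoModel.from_pretrained("pruned_model_directory")

# Embed a sentence in the target language.
inputs = tokenizer("An example sentence.", return_tensors="pt")
outputs = model(**inputs)
embedding = outputs.last_hidden_state.mean(dim=1)  # mean-pooled sentence embedding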
What models does MTEM Pruner support?
MTEM Pruner is compatible with most multilingual text embedding models, including popular ones like Multilingual BERT, DistilBERT, and XLM-RoBERTa.
Can I prune the model for more than one language?
Yes, MTEM Pruner allows you to define multiple target languages. Simply specify them in the target_language parameter as a list:
"target_language": ["en", "es", "fr"]
How do I choose the optimal pruning ratio?
The pruning ratio depends on your specific needs. Start with a lower ratio (e.g., 0.3) and evaluate performance. Gradually increase the ratio while monitoring accuracy and model size to find the best balance for your use case.
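As a rough starting point, the sweep below is a minimal sketch using the prune_model API shown above; evaluate_embeddings is a hypothetical placeholder for your own evaluation, such as a retrieval or semantic-similarity benchmark in the target language:
from mtem_pruner import MTEMPruner
from transformers import AutoModel

pruner = MTEMPruner()
model = AutoModel.from_pretrained("your_multilingual_model")

def evaluate_embeddings(candidate_model):
    # Hypothetical placeholder: score the model on your own validation set.
    return 0.0

best = None
for ratio in (0.3, 0.4, 0.5, 0.6):
    candidate = pruner.prune_model(
        model, target_language="en", pruning_ratio=ratio, device="cuda"
    )
    score = evaluate_embeddings(candidate)
    print(f"pruning_ratio={ratio:.1f} -> score={score:.4f}")
    # Track the ratio that gives the best accuracy/size trade-off.
    if best is None or score > best[1]:
        best = (ratio, score)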