plT5 Small

allegro/plt5-small

plT5 models are T5-based language models trained on Polish corpora. The models were optimized for the original T5 denoising objective.

Corpus

plT5 was trained on six different corpora available for the Polish language.

Tokenizer

The training dataset was tokenized into subwords using a SentencePiece unigram model with a vocabulary size of 50k tokens.

Usage

Example code: see the sketch below.

License

CC BY 4.0

Citation

If you use this model, please cite the accompanying paper.

Authors

The model was trained by the Machine Learning Research Team at Allegro and the Linguistic Engineering Group at the Institute of Computer Science, Polish Academy of Sciences. You can contact us at klejbenchmark@allegro.pl.
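The original usage snippet is not reproduced on this page. What follows is a minimal sketch, not the authors' official example: it assumes the checkpoint is published on HuggingFace as allegro/plt5-small (inferred from the creator and model name above) and uses the standard transformers seq2seq API; the Polish sentences are made-up illustrations.

    # A minimal sketch (assumption: checkpoint "allegro/plt5-small" on HuggingFace).
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("allegro/plt5-small")
    model = AutoModelForSeq2SeqLM.from_pretrained("allegro/plt5-small")

    # The tokenizer is a SentencePiece unigram model with a 50k subword
    # vocabulary; tokenize() shows how a Polish sentence splits into subwords.
    print(tokenizer.tokenize("Wszystko zaczyna się od marzeń."))

    # T5 denoising objective: <extra_id_0> is a sentinel token marking a
    # masked span that the model is asked to fill in.
    inputs = tokenizer("Stolica Polski to <extra_id_0>.", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=10)
    print(tokenizer.decode(outputs[0], skip_special_tokens=False))

Since the model was optimized only for the denoising objective, it is a base model; downstream use (for example the translation task in the tags) would typically require fine-tuning in the usual text-to-text fashion.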

Pricing

Cost per run: $- USD
Avg run time: - seconds
Hardware: -

Creator Models

Model                            Cost    Runs
Plt5 Base                        $?      2,200
Plt5 Large                       $?      624
Herbert Klej Cased Tokenizer V1  $?      362
Herbert Klej Cased V1            $?      770
Herbert Large Cased              $?      1,631


Overview

Summary of this model and related resources.

Creator: allegro
Model Name: plT5 Small
Description: plT5 models are T5-based language models trained on Polish corpora.
Tags: translation
Model Link: View on HuggingFace
API Spec: View on HuggingFace
GitHub Link: No GitHub link provided
Paper Link: No paper link provided

Popularity

How popular is this model, by number of runs? How popular is the creator, by the sum of all their runs?

Runs: 1,215
Model Rank: -
Creator Rank: -

Cost

How much does it cost to run this model? How long, on average, does it take to complete a run?

Cost per Run: $-
Prediction Hardware: -
Average Completion Time: -