Lucadiliello
Rank:
Average Model Cost: $0.0000
Number of Runs: 3,591
Models by this creator
BLEURT-20-D12
This model is based on a custom Transformer implementation that can be installed with pip and then loaded to make predictions, as sketched below. Take a look at the accompanying repository for the definitions of BleurtConfig, BleurtForSequenceClassification and BleurtTokenizer in PyTorch.
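A minimal sketch of that flow, assuming the bleurt-pytorch package from https://github.com/lucadiliello/bleurt-pytorch and the checkpoint name lucadiliello/BLEURT-20-D12 on the Hugging Face Hub:

```
pip install git+https://github.com/lucadiliello/bleurt-pytorch.git
```

```python
import torch
from bleurt_pytorch import BleurtConfig, BleurtForSequenceClassification, BleurtTokenizer

# Assumed Hub checkpoint name, taken from the model title above.
checkpoint = 'lucadiliello/BLEURT-20-D12'

config = BleurtConfig.from_pretrained(checkpoint)   # model hyperparameters, if you need to inspect them
model = BleurtForSequenceClassification.from_pretrained(checkpoint)
tokenizer = BleurtTokenizer.from_pretrained(checkpoint)

references = ["a bird chirps by the window"]
candidates = ["a bird chirps by the windows"]

model.eval()
with torch.no_grad():
    # BLEURT scores each (reference, candidate) pair; higher means more similar.
    inputs = tokenizer(references, candidates, padding='longest', return_tensors='pt')
    scores = model(**inputs).logits.flatten().tolist()
print(scores)
```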
$-/run · 1.5K runs · Huggingface
bart-small
$-/run · 460 runs · Huggingface
bleurt-base-512
This model is based on the same custom Transformer implementation as BLEURT-20-D12 above; it is installed and used the same way, with the checkpoint name swapped as sketched below. Take a look at the accompanying repository for the definitions of BleurtConfig, BleurtForSequenceClassification and BleurtTokenizer in PyTorch.
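The same bleurt-pytorch package and loading pattern should apply here; the Hub path lucadiliello/bleurt-base-512 is assumed from the title (the 512 presumably being the maximum sequence length):

```python
from bleurt_pytorch import BleurtForSequenceClassification, BleurtTokenizer

# Assumed Hub checkpoint name; scoring works exactly as in the BLEURT-20-D12 example above.
model = BleurtForSequenceClassification.from_pretrained('lucadiliello/bleurt-base-512')
tokenizer = BleurtTokenizer.from_pretrained('lucadiliello/bleurt-base-512')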
$-/run · 123 runs · Huggingface
opt-30b-deepspeed-inference-fp16-shard-8
This is a copy of the original OPT weights repackaged for more efficient use with DeepSpeed-MII and DeepSpeed-Inference. In this repo the original tensors are split into 8 shards targeting 8 GPUs, which lets the user run the model with DeepSpeed-Inference tensor parallelism. For details about the OPT model itself, please see the original OPT model card. For examples of using this repo, please see:
https://github.com/huggingface/transformers-bloom-inference
https://github.com/microsoft/DeepSpeed-MII
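A minimal sketch of 8-way tensor-parallel inference with the standard deepspeed.init_inference entry point; for simplicity it loads the original facebook/opt-30b weights, whereas the two linked repositories show how to point DeepSpeed at the pre-sharded fp16 weights in this repo instead:

```python
# Launch with something like: deepspeed --num_gpus 8 run_opt30b.py
import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-30b"  # original checkpoint; the linked examples substitute this repo's 8 shards
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

# Shard the model across 8 GPUs (one shard per rank) with DeepSpeed kernel injection.
ds_engine = deepspeed.init_inference(
    model,
    mp_size=8,
    dtype=torch.float16,
    replace_with_kernel_inject=True,
)

inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(torch.cuda.current_device())
outputs = ds_engine.module.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```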
$-/run · 21 runs · Huggingface
deberta-small
$-/run · 13 runs · Huggingface
bleurt-large-128
This model is based on the same custom Transformer implementation as BLEURT-20-D12 above; it is installed and used the same way, with the checkpoint name swapped as sketched below. Take a look at the accompanying repository for the definitions of BleurtConfig, BleurtForSequenceClassification and BleurtTokenizer in PyTorch.
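Again, the same bleurt-pytorch loading pattern should apply; the Hub path lucadiliello/bleurt-large-128 is assumed from the title (a BLEURT-large checkpoint, with the 128 presumably being the maximum sequence length):

```python
from bleurt_pytorch import BleurtForSequenceClassification, BleurtTokenizer

# Assumed Hub checkpoint name; scoring works exactly as in the BLEURT-20-D12 example above.
model = BleurtForSequenceClassification.from_pretrained('lucadiliello/bleurt-large-128')
tokenizer = BleurtTokenizer.from_pretrained('lucadiliello/bleurt-large-128')
```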
$-/run · 11 runs · Huggingface