Akreal
Rank: Average
Model Cost: $0.0000
Number of Runs: 85,603
Models by this creator
tiny-random-mbart
The tiny-random-mbart model is a copy of the original MBART (Multilingual BART) test model, with some changes to the file format. The MBART architecture is used for natural language processing tasks such as machine translation, text generation, and text summarization, and can handle multiple languages; this tiny variant is optimized for efficient use of computational resources.
$-/run
31.2K
Huggingface
tiny-random-t5
tiny-random-t5 is a small language model based on T5 (Text-To-Text Transfer Transformer), an architecture designed to generate text from given input prompts. The model ships with randomly initialized weights and can be fine-tuned for various natural language processing tasks.
$-/run
20.8K
Huggingface
tiny-random-bert
tiny-random-bert is a tiny, randomly initialized model based on BERT (Bidirectional Encoder Representations from Transformers). It is provided in PyTorch format and can be used for a variety of natural language processing tasks, such as text classification, named entity recognition, and question answering. The model is designed to be small and fast, making it suitable for applications with limited computational resources.
$-/run
13.3K
Huggingface
tiny-random-xlnet
The tiny-random-xlnet model is a tiny, randomly initialized version of the XLNet language model. It is much smaller and faster than the original XLNet, making it suitable for applications with limited computational resources. The XLNet architecture supports natural language processing tasks such as text classification, question answering, and language generation.
$-/run
6.9K
Huggingface
tiny-random-mpnet
The tiny-random-mpnet model is a copy of the original "https://huggingface.co/hf-internal-testing/tiny-random-mpnet" model. The only change made is to use an old format for the `pytorch_model.bin` file. The purpose and functionality of the model remain the same.
$-/run
6.6K
Huggingface
tiny-random-gpt2
tiny-random-gpt2 is a small language model based on OpenAI's GPT-2 architecture, reduced in size to be lightweight. The GPT-2 architecture generates coherent, contextually relevant text from a given prompt.
$-/run
6.6K
Huggingface
mbart-large-50-finetuned-media
$-/run
26
Huggingface
mbart-large-50-finetuned-portmedia-lang
Platform did not provide a description for this model.
$-/run
26
Huggingface
mbart-large-50-finetuned-catslu
$-/run
14
Huggingface
mbart-large-50-finetuned-portmedia-dom
Platform did not provide a description for this model.
$-/run
10
Huggingface