Philschmid
Rank:
Average Model Cost: $0.0000
Number of Runs: 691,652
Models by this creator
bart-large-cnn-samsum
bart-large-cnn-samsum is a summarization model based on the BART architecture. It is fine-tuned on the SAMSum corpus, a dataset of messenger-like dialogues paired with human-written summaries, and produces high-quality abstractive summaries of such conversations. A usage sketch follows this entry.
$-/run
585.3K
Huggingface
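A minimal sketch of using this model through the transformers summarization pipeline. The repo id "philschmid/bart-large-cnn-samsum" assumes the model is published under the creator's namespace on the Hugging Face Hub.

```python
# Summarize a short dialogue with the high-level pipeline API.
from transformers import pipeline

# Assumed Hub id; adjust if the checkpoint lives elsewhere.
summarizer = pipeline("summarization", model="philschmid/bart-large-cnn-samsum")

dialogue = """Anna: Are we still on for lunch tomorrow?
Ben: Yes! 12:30 at the usual place?
Anna: Perfect, see you there."""

print(summarizer(dialogue)[0]["summary_text"])
```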
distilbert-onnx
distilbert-onnx is a DistilBERT question-answering model exported to the ONNX format. Given a question and a context passage, it predicts the answer span within that context. The ONNX export allows the model to run on ONNX Runtime and be deployed across a wider range of platforms and applications. A usage sketch follows this entry.
$-/run
44.5K
Huggingface
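A hedged sketch of running the ONNX export for extractive question answering via optimum's ONNX Runtime backend. Whether "philschmid/distilbert-onnx" ships the files ORTModelForQuestionAnswering expects (and its own tokenizer) is an assumption; a repo containing only a raw .onnx file would instead be loaded with onnxruntime directly.

```python
# Question answering with an ONNX model through optimum + ONNX Runtime.
from optimum.onnxruntime import ORTModelForQuestionAnswering
from transformers import AutoTokenizer, pipeline

repo = "philschmid/distilbert-onnx"  # assumed Hub id
model = ORTModelForQuestionAnswering.from_pretrained(repo)
tokenizer = AutoTokenizer.from_pretrained(repo)

qa = pipeline("question-answering", model=model, tokenizer=tokenizer)
result = qa(
    question="What format is the model in?",
    context="The DistilBERT checkpoint was converted to the ONNX format.",
)
print(result["answer"], result["score"])
```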
pyannote-segmentation
$-/run
14.5K
Huggingface
pyannote-speaker-diarization-endpoint
The pyannote-speaker-diarization-endpoint model is a deep learning pipeline for speaker diarization, the task of determining "who spoke when" in an audio recording. It takes an audio signal as input and outputs the start and end time of each speech segment, labeled by speaker. Diarization output of this kind feeds applications such as speech recognition, speaker recognition, and audio indexing. A usage sketch follows this entry.
$-/run
11.7K
Huggingface
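A sketch using the standard pyannote.audio pipeline API. How this endpoint-packaged variant is meant to be loaded is not documented here, so the snippet uses the upstream pyannote/speaker-diarization pipeline as a stand-in; pyannote models are gated and require accepting the license and passing a Hugging Face access token.

```python
# Speaker diarization: who spoke when, as speaker-labeled time segments.
from pyannote.audio import Pipeline

# "hf_..." is a placeholder for your Hugging Face access token.
pipeline = Pipeline.from_pretrained(
    "pyannote/speaker-diarization", use_auth_token="hf_..."
)

diarization = pipeline("meeting.wav")
for turn, _, speaker in diarization.itertracks(yield_label=True):
    print(f"{turn.start:.1f}s - {turn.end:.1f}s: {speaker}")
```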
distilbart-cnn-12-6-samsum
distilbart-cnn-12-6-samsum is a text-summarization model based on the BART architecture, distilled from a larger BART checkpoint and fine-tuned on the SAMSum dataset of dialogues for abstractive dialogue summarization. It generates concise summaries of input conversations; a usage sketch follows this entry.
$-/run
10.6K
Huggingface
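The pipeline call shown earlier works here by swapping in this model id; the sketch below instead uses the lower-level generate() path, which exposes decoding parameters such as beam search. The Hub id "philschmid/distilbart-cnn-12-6-samsum" is assumed.

```python
# Tokenize, generate with beam search, and decode a summary.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "philschmid/distilbart-cnn-12-6-samsum"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer(
    "Tom: Did you send the report?\nSue: Yes, this morning.",
    return_tensors="pt",
)
ids = model.generate(**inputs, max_length=60, num_beams=4)
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```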
flan-t5-xxl-sharded-fp16
$-/run
9.7K
Huggingface
flan-t5-base-samsum
The flan-t5-base-samsum model is a text-to-text generation model in the FLAN-T5 family, which handles tasks such as summarization, translation, and question answering; the name indicates a fine-tune on the SAMSum dialogue-summarization dataset, though the creator did not provide a specific description of this particular model. A usage sketch follows this entry.
$-/run
6.6K
Huggingface
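Because FLAN-T5 checkpoints are text-to-text, the task can be expressed in the prompt itself. Both the Hub id "philschmid/flan-t5-base-samsum" and the exact prompt format the fine-tune expects are assumptions.

```python
# Text-to-text generation: the task prefix tells the model what to do.
from transformers import pipeline

generator = pipeline("text2text-generation", model="philschmid/flan-t5-base-samsum")
out = generator("summarize: Anna: Running late, start without me.\nBen: No problem!")
print(out[0]["generated_text"])
```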
BERT-Banking77
BERT-Banking77 is a language model for the banking and financial domain, based on BERT (Bidirectional Encoder Representations from Transformers), a transformer architecture known for strong performance on natural language processing tasks. It is fine-tuned on Banking77, a dataset of online banking customer queries labeled with 77 intent classes, and is suited to text-classification tasks in this domain, most directly intent detection; it can also serve as a starting point for further fine-tuning on related tasks such as sentiment analysis or document classification. A usage sketch follows this entry.
$-/run
5.3K
Huggingface
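A minimal sketch of intent classification with the text-classification pipeline, assuming the Hub id "philschmid/BERT-Banking77" and that the checkpoint's labels are the 77 Banking77 intents.

```python
# Classify a customer query into one of the Banking77 intent labels.
from transformers import pipeline

classifier = pipeline("text-classification", model="philschmid/BERT-Banking77")
# Returns a list with the predicted intent label and its confidence score.
print(classifier("I lost my card, how do I freeze my account?"))
```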
instruct-igel-001
$-/run
2.0K
Huggingface
tiny-bert-sst2-distilled
$-/run
1.4K
Huggingface