Rinna

Average Model Cost: $0.0000

Number of Runs: 1,570,919

Models by this creator

japanese-hubert-base

The japanese-hubert-base model is a self-supervised speech representation model trained on a large-scale Japanese audio dataset. It uses the Hidden-Unit BERT (HuBERT) architecture; the base configuration consists of 12 transformer layers with 12 attention heads. The model was trained using code from the official repository, and the training configuration is documented there and in the original paper. It was trained on approximately 19,000 hours of the ReazonSpeech corpus and is released under the Apache 2.0 license.
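As a sketch of how a HuBERT model like this might be used for feature extraction (assuming the `transformers` and `torch` packages are installed, and using the public model ID `rinna/japanese-hubert-base`):

```python
# Hypothetical sketch: extracting frame-level speech features with
# japanese-hubert-base. Generation/model details should be checked
# against the model card.
def extract_features(waveform, sampling_rate=16000):
    """waveform: 1-D float array of 16 kHz audio samples."""
    # Imports are deferred so the sketch can be loaded without the
    # heavyweight dependencies installed.
    import torch
    from transformers import AutoFeatureExtractor, HubertModel

    extractor = AutoFeatureExtractor.from_pretrained("rinna/japanese-hubert-base")
    model = HubertModel.from_pretrained("rinna/japanese-hubert-base")
    inputs = extractor(waveform, sampling_rate=sampling_rate, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # One 768-dimensional vector per ~20 ms frame of audio.
    return outputs.last_hidden_state
```

These frame-level features are typically fed into a downstream head (e.g. for speech recognition or speaker tasks) rather than used directly.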

$-/run

1.5M

Huggingface

japanese-gpt-neox-3.6b-instruction-ppo

The japanese-gpt-neox-3.6b-instruction-ppo model is a Japanese language model based on the GPT-NeoX architecture and aligned using Proximal Policy Optimization (PPO), a reinforcement-learning step applied on top of the supervised instruction-tuned (SFT) variant. Trained on a large corpus of Japanese text, it generates Japanese text conditioned on its input, making it useful for instruction following, text generation, and natural language understanding in Japanese.
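Rinna's instruction-tuned models expect a turn-based prompt. The helper below sketches the speaker-label-plus-`<NL>` convention described on the model cards; treat the exact labels ("ユーザー"/"システム") and the "<NL>" separator as assumptions to verify against the card for the specific model:

```python
# Sketch of the turn-based prompt format for rinna's instruction-tuned
# models. The "<NL>" separator and speaker labels follow the model card
# convention and should be treated as assumptions.
def build_prompt(turns):
    """turns: list of (speaker, text) tuples, e.g. [("ユーザー", "...")]."""
    joined = "<NL>".join(
        f"{speaker}: {text.replace(chr(10), '<NL>')}" for speaker, text in turns
    )
    # End with an empty system turn so the model continues as the assistant.
    return joined + "<NL>" + "システム: "

# The result ends with "システム: ", prompting the model to reply:
print(build_prompt([("ユーザー", "日本の首都はどこですか？")]))
```

Real newlines inside a turn are replaced by the literal "<NL>" token, since these models were trained with that substitution.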

$-/run

18.8K

Huggingface

japanese-gpt2-medium

The japanese-gpt2-medium model is a text generation model trained specifically for Japanese. It is a medium-sized variant of GPT-2 and generates coherent, contextually relevant Japanese text from a given prompt. Trained on a large amount of Japanese text data, it can be used for natural language processing tasks such as language generation, summarization, question answering, and more.
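A minimal generation sketch, assuming `transformers` and `torch` are installed and using the public model ID `rinna/japanese-gpt2-medium`; the sampling parameters are illustrative, not recommendations from the model card:

```python
# Hypothetical sketch: sampling Japanese text from japanese-gpt2-medium.
def generate(prompt, max_new_tokens=50):
    # Deferred imports keep the sketch loadable without the dependencies.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # rinna's GPT-2 tokenizers are sentencepiece-based; use_fast=False is
    # an assumption taken from the model card's usage notes.
    tokenizer = AutoTokenizer.from_pretrained(
        "rinna/japanese-gpt2-medium", use_fast=False
    )
    model = AutoModelForCausalLM.from_pretrained("rinna/japanese-gpt2-medium")
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    with torch.no_grad():
        output_ids = model.generate(
            input_ids,
            max_new_tokens=max_new_tokens,
            do_sample=True,
            top_p=0.95,
        )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

The same pattern applies to the other GPT-family models on this page by swapping the model ID.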

$-/run

11.6K

Huggingface

japanese-gpt2-small

The japanese-gpt2-small model is a compact Japanese text generation model trained by rinna on OpenAI's GPT-2 architecture. It generates human-like Japanese text from given prompts, has been pre-trained on a large corpus of Japanese text, and can be fine-tuned for specific tasks or applications. It can be used for natural language processing tasks such as chatbots and question answering systems.

$-/run

9.4K

Huggingface

japanese-gpt-neox-3.6b-instruction-sft

The japanese-gpt-neox-3.6b-instruction-sft model is a 3.6-billion-parameter Japanese language model that has been instruction-tuned via supervised fine-tuning (SFT). It generates coherent, contextually relevant text from given prompts or instructions, making it useful for applications such as chatbots and content generation. Its parameter count gives it a large capacity for understanding and generating natural language in Japanese.

$-/run

8.1K

Huggingface

japanese-gpt-1b

japanese-gpt-1b is a text generation model for Japanese built on the GPT architecture. It generates coherent, contextually relevant Japanese text from a given prompt and was trained on a large corpus of Japanese text spanning articles, stories, poetry, and more. It can be used for applications such as content creation and text generation for chatbots or virtual assistants.

$-/run

6.0K

Huggingface
