Openchat

Rank:

Average Model Cost: $0.0000

Number of Runs: 3,213

Models by this creator

openchat_8192


openchat

Platform did not provide a description for this model.


$-/run

1.6K

Huggingface

opencoderplus


OpenChat: Less is More for Open-source Models

OpenChat is a series of open-source language models fine-tuned on a diverse and high-quality dataset of multi-round conversations. With only ~6K GPT-4 conversations filtered from the ~90K ShareGPT conversations, OpenChat is designed to achieve high performance with limited data.

Generic models:

- OpenChat: based on LLaMA-13B (2048 context length). 🚀 105.7% of ChatGPT score on Vicuna GPT-4 evaluation; 🔥 80.9% Win-rate on AlpacaEval; 🤗 only ~6K conversations used for fine-tuning.
- OpenChat-8192: based on LLaMA-13B (extended to 8192 context length). 106.6% of ChatGPT score on Vicuna GPT-4 evaluation; 79.5% Win-rate on AlpacaEval.

Code models:

- OpenCoderPlus: based on StarCoderPlus (native 8192 context length). 102.5% of ChatGPT score on Vicuna GPT-4 evaluation; 78.7% Win-rate on AlpacaEval.

Note: Please load the pretrained models using bfloat16.

Code and Inference Server: The full source code, including an inference server compatible with the "ChatCompletions" API, is provided in the OpenChat GitHub repository.

Web UI: OpenChat also includes a web UI for a better user experience. See the GitHub repository for instructions.

Conversation Template: The conversation template is built by concatenating tokens. In addition to the base model vocabulary, an end-of-turn token <|end_of_turn|> is added, with id eot_token_id. Hint: in BPE, tokenize(A) + tokenize(B) does not always equal tokenize(A + B). The repository provides the code for generating the conversation templates; a sketch of the idea follows.
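Since the repository's template code is not reproduced on this page, here is a minimal sketch of how such a template can be assembled, assuming the "GPT4 User:" / "GPT4 Assistant:" role prefixes and the "openchat/openchat_8192" model id; the helper names and prefixes are assumptions for illustration, not the repository's verbatim code.

```python
# Sketch of conversation-template construction (assumed role prefixes).
# Each turn is tokenized separately and closed with the <|end_of_turn|>
# special token id, because in BPE tokenize(A) + tokenize(B) is not
# guaranteed to equal tokenize(A + B).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openchat/openchat_8192")  # assumed model id
eot_token_id = tokenizer.convert_tokens_to_ids("<|end_of_turn|>")

def build_conversation(messages):
    """messages: list of {"role": "user" | "assistant", "content": str}."""
    role_prefix = {"user": "GPT4 User: ", "assistant": "GPT4 Assistant: "}  # assumed prefixes
    tokens = []
    for msg in messages:
        # Tokenize this turn's prefix + content on its own,
        # then close the turn with the end-of-turn special token.
        turn_text = role_prefix[msg["role"]] + msg["content"]
        tokens += tokenizer(turn_text, add_special_tokens=False).input_ids
        tokens.append(eot_token_id)
    # Leave the assistant prefix open so the model generates the reply.
    tokens += tokenizer("GPT4 Assistant:", add_special_tokens=False).input_ids
    return tokens

input_ids = build_conversation([{"role": "user", "content": "Hello"}])
```

The resulting token ids can then be passed to the model's generate method; per the note above, the model weights themselves should be loaded in bfloat16.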


$-/run

328

Huggingface

Similar creators