How to use dvitel/h0 with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="dvitel/h0")

# Or load the model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("dvitel/h0")
model = AutoModelForCausalLM.from_pretrained("dvitel/h0")
```
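Once loaded, the pipeline can be called directly for a quick smoke test. A minimal sketch; the prompt string and generation parameters here are placeholder choices, not the card-description format the model was trained on:

```python
# Generate a short continuation; parameters are illustrative, not tuned.
outputs = pipe("Once upon a time,", max_new_tokens=128, do_sample=True, temperature=0.5)
print(outputs[0]["generated_text"])
```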
How to use dvitel/h0 with vLLM:

```sh
# Install vLLM from pip:
pip install vllm

# Start the vLLM server:
vllm serve "dvitel/h0"

# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:8000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "dvitel/h0",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'
```
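Because the server exposes an OpenAI-compatible API, it can also be called from Python. A minimal sketch using the `openai` client package (the API key is a dummy value, since the local server does not check it by default); the same client works against the SGLang server below by switching `base_url` to port 30000:

```python
from openai import OpenAI

# Point the client at the local vLLM server; the key is a placeholder.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.completions.create(
    model="dvitel/h0",
    prompt="Once upon a time,",
    max_tokens=512,
    temperature=0.5,
)
print(completion.choices[0].text)
```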
How to use dvitel/h0 with SGLang:

```sh
# Install SGLang from pip:
pip install sglang

# Start the SGLang server:
python3 -m sglang.launch_server \
  --model-path "dvitel/h0" \
  --host 0.0.0.0 \
  --port 30000
# Call the server using curl (OpenAI-compatible API):
curl -X POST "http://localhost:30000/v1/completions" \
  -H "Content-Type: application/json" \
  --data '{
    "model": "dvitel/h0",
    "prompt": "Once upon a time,",
    "max_tokens": 512,
    "temperature": 0.5
  }'

# Alternatively, launch the server in Docker instead of via pip:
docker run --gpus all \
  --shm-size 32g \
  -p 30000:30000 \
  -v ~/.cache/huggingface:/root/.cache/huggingface \
  --env "HF_TOKEN=<secret>" \
  --ipc=host \
  lmsysorg/sglang:latest \
  python3 -m sglang.launch_server \
    --model-path "dvitel/h0" \
    --host 0.0.0.0 \
    --port 30000
```

How to use dvitel/h0 with Docker Model Runner:

```sh
docker model run hf.co/dvitel/h0
```

This model is a fine-tuned version of distilgpt2 on the HearthStone dataset (GitHub repo): DistilGPT2 fine-tuned for 200 epochs for HearthStone card code synthesis, i.e. generating a card's implementation code from its description. See the split of the hearthstone dataset.
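The dataset can be pulled from the Hub for inspection. A minimal sketch, assuming it is published as `dvitel/hearthstone` and that the standard `datasets` API applies; split names are not assumed here:

```python
from datasets import load_dataset

# Download the HearthStone card-code dataset and inspect its splits/features.
ds = load_dataset("dvitel/hearthstone")
print(ds)
```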
It achieves the following results on the evaluation set over the course of training:
| Training Loss | Epoch | Step | Validation Loss | Exact Match | BLEU | CodeBLEU | Ngram Match Score | Weighted Ngram Match Score | Syntax Match Score | Dataflow Match Score | chrF |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.543 | 11.94 | 1600 | 0.2701 | 0.0152 | 0.8552 | 0.6144 | 0.6027 | 0.6136 | 0.6431 | 0.5982 | 89.0280 |
| 0.1459 | 23.88 | 3200 | 0.2408 | 0.0909 | 0.8841 | 0.6733 | 0.6610 | 0.6719 | 0.7210 | 0.6393 | 91.2517 |
| 0.0801 | 35.82 | 4800 | 0.2498 | 0.1515 | 0.8966 | 0.6999 | 0.6954 | 0.7054 | 0.7326 | 0.6662 | 92.1356 |
| 0.0498 | 47.76 | 6400 | 0.2569 | 0.1818 | 0.9012 | 0.7015 | 0.7022 | 0.7114 | 0.7428 | 0.6496 | 92.4668 |
| 0.0323 | 59.7 | 8000 | 0.2732 | 0.1667 | 0.9044 | 0.7241 | 0.7025 | 0.7123 | 0.7551 | 0.7266 | 92.5429 |
| 0.0214 | 71.64 | 9600 | 0.2896 | 0.1667 | 0.9034 | 0.7228 | 0.7101 | 0.7195 | 0.7670 | 0.6945 | 92.4258 |
| 0.015 | 83.58 | 11200 | 0.2870 | 0.1667 | 0.9046 | 0.7292 | 0.7137 | 0.7228 | 0.7667 | 0.7137 | 92.5979 |
| 0.0121 | 95.52 | 12800 | 0.2907 | 0.1667 | 0.9075 | 0.7287 | 0.7198 | 0.7297 | 0.7696 | 0.6958 | 92.7074 |
| 0.0093 | 107.46 | 14400 | 0.2976 | 0.1667 | 0.9073 | 0.7365 | 0.7134 | 0.7238 | 0.7732 | 0.7356 | 92.8347 |
| 0.0073 | 119.4 | 16000 | 0.3037 | 0.1818 | 0.9085 | 0.7326 | 0.7154 | 0.7241 | 0.7529 | 0.7381 | 92.8343 |
| 0.006 | 131.34 | 17600 | 0.3047 | 0.1970 | 0.9104 | 0.7410 | 0.7230 | 0.7312 | 0.7667 | 0.7433 | 92.8286 |
| 0.005 | 143.28 | 19200 | 0.3080 | 0.1970 | 0.9088 | 0.7377 | 0.7232 | 0.7316 | 0.7746 | 0.7214 | 92.8035 |
| 0.0044 | 155.22 | 20800 | 0.3071 | 0.1970 | 0.9076 | 0.7343 | 0.7196 | 0.7283 | 0.7783 | 0.7112 | 92.7742 |
| 0.004 | 167.16 | 22400 | 0.3097 | 0.1970 | 0.9082 | 0.7440 | 0.7236 | 0.7334 | 0.7601 | 0.7587 | 92.8117 |
| 0.0035 | 179.1 | 24000 | 0.3111 | 0.1970 | 0.9080 | 0.7355 | 0.7204 | 0.7295 | 0.7616 | 0.7304 | 92.7990 |
| 0.0036 | 191.04 | 25600 | 0.3117 | 0.1970 | 0.9085 | 0.7341 | 0.7211 | 0.7299 | 0.7536 | 0.7317 | 92.8689 |
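The surface metrics in this table (exact match, BLEU, chrF) can be recomputed from generated vs. reference card code. A minimal sketch using the Hugging Face `evaluate` package on toy strings; CodeBLEU and its n-gram/syntax/dataflow components need a dedicated implementation and are not covered here:

```python
import evaluate

# Toy predictions/references standing in for generated vs. gold card code.
predictions = ["class FireElemental(MinionCard): pass"]
references = [["class FireElemental(MinionCard): pass"]]

bleu = evaluate.load("bleu")
chrf = evaluate.load("chrf")

print(bleu.compute(predictions=predictions, references=references)["bleu"])
print(chrf.compute(predictions=predictions, references=references)["score"])

# Exact match: fraction of predictions identical to their reference.
em = sum(p == r[0] for p, r in zip(predictions, references)) / len(predictions)
print(f"exact match: {em:.4f}")
```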