---
base_model: teknium/OpenHermes-2.5-Mistral-7B
license: apache-2.0
datasets:
- teknium/openhermes
- allenai/ultrafeedback_binarized_cleaned
- Intel/orca_dpo_pairs
language:
- en
library_name: transformers
pipeline_tag: text-generation
quantized_by: bartowski
---

# Exllama v2 Quantizations of DPOpenHermes-7B-v2 at 6.0 bits per weight
|
|
| Using <a href="https://github.com/turboderp/exllamav2/releases/tag/v0.0.10">turboderp's ExLlamaV2 v0.0.10</a> for quantization. |
|
|
Conversion was done using wikitext-103-raw-v1-test.parquet as the calibration dataset.
|
|
| Original model: https://huggingface.co/openaccess-ai-collective/DPOpenHermes-7B-v2 |
|
|
| ## Download instructions |
|
|
| With git: |
|
|
| ```shell |
git clone --single-branch --branch 6_0 https://huggingface.co/bartowski/DPOpenHermes-7B-v2-exl2
| ``` |
|
|
With the `huggingface-hub` CLI (credit to TheBloke for the instructions), first install the package:
|
|
| ```shell |
| pip3 install huggingface-hub |
| ``` |
|
|
Then download the `6_0` branch (the 6.0 bpw weights) to a local folder, selecting it with the `--revision` parameter:
|
|
| ```shell |
| mkdir DPOpenHermes-7B-v2-exl2 |
| huggingface-cli download bartowski/DPOpenHermes-7B-v2-exl2 --revision 6_0 --local-dir DPOpenHermes-7B-v2-exl2 --local-dir-use-symlinks False |
| ``` |
|
|