Instructions to use Jofthomas/Llama-3-8B-function_calling with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use Jofthomas/Llama-3-8B-function_calling with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the function-calling LoRA adapter.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "Jofthomas/Llama-3-8B-function_calling")
```
- Notebooks
- Google Colab
- Kaggle
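Once the adapter is loaded, a typical function-calling loop is: describe the available tools in the prompt, let the model emit a JSON tool call, then parse and execute it in your code. The helper below is a minimal sketch of the parsing step; the exact output format this adapter was trained on is not documented here, so the `{"name": ..., "arguments": {...}}` shape is an assumption for illustration.

```python
import json

def parse_tool_call(model_output: str):
    """Extract a JSON tool call from model text.

    Assumes the model emits a JSON object of the form
    {"name": "...", "arguments": {...}} somewhere in its reply;
    the exact schema depends on the adapter's training data.
    """
    start = model_output.find("{")
    end = model_output.rfind("}")
    if start == -1 or end == -1:
        return None
    try:
        call = json.loads(model_output[start:end + 1])
    except json.JSONDecodeError:
        return None
    # Only accept objects that look like a tool call.
    if "name" in call and "arguments" in call:
        return call
    return None

# Hypothetical model reply, used only to exercise the parser.
reply = 'Calling the tool: {"name": "get_weather", "arguments": {"city": "Paris"}}'
call = parse_tool_call(reply)
print(call["name"])  # get_weather
```

In practice you would dispatch `call["name"]` to the matching Python function with `call["arguments"]`, then feed the result back to the model as a follow-up message.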