# Chewie 1.2: African Community Health Worker AI Assistant
Chewie 1.2 is a medical AI assistant fine-tuned specifically for Community Health Workers (CHWs) in Africa. Built on Google's MedGemma-4B, it provides structured clinical guidance following a consistent Assessment → Action → Advice format.
## Model Details

### Model Description
Chewie 1.2 is a LoRA adapter fine-tuned on top of google/medgemma-4b-it using a curated dataset of 11,880 multilingual medical conversations designed for community health settings in Africa.
- Developed by: Electric Sheep Africa
- Model type: Causal Language Model (LoRA Adapter)
- Language(s): English, Swahili, Hausa, Yoruba, Amharic, Zulu, French
- License: Apache 2.0
- Finetuned from: google/medgemma-4b-it
### Model Sources
- Repository: electricsheepafrica/chewie-1.2
- Previous Version: electricsheepafrica/chewie-llama-3b
## Intended Use

### Primary Use Cases
- Clinical Decision Support: Helping CHWs assess symptoms and determine appropriate actions
- Triage Assistance: Identifying danger signs that require immediate referral
- Health Education: Providing patient-friendly explanations and preventive advice
- Multilingual Support: Serving diverse African language communities
### Target Users
- Community Health Workers (CHWs)
- Primary healthcare providers in resource-limited settings
- Health education programs
- Mobile health (mHealth) applications
### Out-of-Scope Use
- NOT for direct patient diagnosis - Always requires human clinical oversight
- NOT a replacement for professional medical care
- NOT validated for emergency/critical care decisions
- Should not be used without proper clinical supervision
## How to Get Started

### Installation

```bash
pip install transformers peft bitsandbytes accelerate
```
### Inference Code

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_model_id = "google/medgemma-4b-it"
adapter_id = "electricsheepafrica/chewie-1.2"

# Load base model with 4-bit quantization
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    quantization_config=bnb_config,
    attn_implementation="eager",
    device_map="auto",
)

# Load the LoRA adapter on top of the base model
model = PeftModel.from_pretrained(base_model, adapter_id)
tokenizer = AutoTokenizer.from_pretrained(adapter_id)

def chat(prompt):
    """Generate a response for a single user prompt."""
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages,
        add_generation_prompt=True,
        return_tensors="pt",
    ).to(model.device)
    with torch.no_grad():
        outputs = model.generate(
            inputs,
            max_new_tokens=512,
            do_sample=True,
            temperature=0.7,
            top_p=0.9,
        )
    # Decode only the newly generated tokens, not the prompt
    return tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True)

# Example usage
response = chat("What are the danger signs in a child with diarrhea?")
print(response)
```
## Training Details

### Training Data
- Dataset Size: 11,880 examples
- Languages: English, Swahili, Hausa, Yoruba, Amharic, Zulu, French
- Domain: Community health, primary care, maternal/child health, infectious diseases
- Format: Structured Assessment → Action → Advice responses
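In the chat format consumed by `SFTTrainer`, a single example of this kind could look like the following. This is an illustrative sketch, not an actual record from the dataset:

```python
# Hypothetical training example in the messages format used for supervised fine-tuning
example = {
    "messages": [
        {
            "role": "user",
            "content": "A mother brings in a 2-year-old with diarrhea for three days. What should I check?",
        },
        {
            "role": "assistant",
            "content": (
                "**Assessment:** Check for signs of dehydration: sunken eyes, dry mouth, reduced urination.\n"
                "**Action:** If any danger signs are present, refer the child to the nearest clinic immediately.\n"
                "**Advice:** Continue breastfeeding and give oral rehydration solution (ORS) after each loose stool."
            ),
        },
    ]
}
```

Each assistant turn follows the Assessment → Action → Advice structure described above.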
### Training Procedure
- Base Model: google/medgemma-4b-it
- Method: LoRA (Low-Rank Adaptation)
- Framework: Hugging Face TRL (SFTTrainer)
### Training Hyperparameters
| Parameter | Value |
|---|---|
| LoRA Rank (r) | 16 |
| LoRA Alpha | 16 |
| LoRA Dropout | 0.05 |
| Target Modules | all-linear |
| Learning Rate | 2e-4 |
| Batch Size | 1-2 |
| Gradient Accumulation | 8-16 |
| Epochs | 1 |
| Precision | bf16/fp16 |
| Optimizer | AdamW (fused) |
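The per-device batch size and gradient accumulation ranges are tied: assuming the smaller batch size is paired with the larger accumulation count (which the table implies but does not state outright), both hardware settings land on the same effective batch size. A quick check (the helper name is illustrative):

```python
def effective_batch_size(per_device_batch, grad_accum_steps, num_devices=1):
    """Examples consumed per optimizer update."""
    return per_device_batch * grad_accum_steps * num_devices

# Assumed pairings: batch 1 with accumulation 16, batch 2 with accumulation 8
low_memory = effective_batch_size(1, 16)
high_memory = effective_batch_size(2, 8)
```

Both configurations yield 16 examples per optimizer step, so the optimization dynamics stay comparable across GPUs.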
### Training Results
| Metric | Value |
|---|---|
| Final Training Loss | 1.08 |
| Final Validation Loss | 1.13 |
| Mean Token Accuracy | 72.5% |
| Training Time | ~1 hour |
### Compute Infrastructure
- Hardware: NVIDIA A100 40GB or T4 16GB (the training setup runs on either)
- Cloud Provider: Google Colab
- Training Framework: Transformers, PEFT, TRL, BitsAndBytes
## Evaluation

### Sample Outputs
**Input:** "What are the symptoms of malaria?"

**Output:**

> **Assessment:** Malaria can cause various symptoms, including fever, chills, sweating, headache, muscle aches, nausea, vomiting, and fatigue. In severe cases, it may lead to anemia, kidney failure, or brain damage.
>
> **Action:** If you or someone you know has a high fever (above 102°F or 39°C), persistent headaches, or unusual weakness, seek medical help immediately at a clinic. These could be danger signs for severe malaria.
>
> **Advice:** To prevent malaria, use insect repellent when outdoors, sleep under mosquito nets, and take preventative medications if recommended by your healthcare provider.
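Downstream applications can rely on this consistent three-part structure. A minimal sketch of a parser that splits a response into its sections (the function name and regex are illustrative, not part of any published API):

```python
import re

def parse_response(text):
    """Split an Assessment/Action/Advice response into a dict of sections."""
    sections = {}
    # Capture each bold header and everything up to the next header (or end of text)
    pattern = (
        r"\*\*(Assessment|Action|Advice):\*\*\s*"
        r"(.*?)(?=\*\*(?:Assessment|Action|Advice):\*\*|\Z)"
    )
    for name, body in re.findall(pattern, text, flags=re.DOTALL):
        sections[name.lower()] = body.strip()
    return sections

reply = (
    "**Assessment:** Fever and chills suggest possible malaria.\n"
    "**Action:** Refer for a rapid diagnostic test.\n"
    "**Advice:** Sleep under a treated mosquito net."
)
parsed = parse_response(reply)
# parsed has keys "assessment", "action", "advice"
```

A structure check like this can also serve as a lightweight guard: if a generated reply is missing one of the three sections, an mHealth client could re-prompt or flag the response for review.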
## Limitations and Risks

### Known Limitations
- Model outputs should always be verified by qualified healthcare professionals
- May not reflect the most current medical guidelines
- Performance may vary across different African languages
- Not trained on region-specific drug formularies or protocols
### Ethical Considerations
- This model is intended as a decision-support tool, not a replacement for clinical judgment
- Healthcare providers should use their professional expertise alongside model outputs
- Patient safety must always take precedence over model recommendations
## Citation

```bibtex
@misc{chewie-1.2,
  author    = {Electric Sheep Africa},
  title     = {Chewie 1.2: African Community Health Worker AI Assistant},
  year      = {2026},
  publisher = {Hugging Face},
  url       = {https://huggingface.co/electricsheepafrica/chewie-1.2}
}
```
## Model Card Contact
- Organization: Electric Sheep Africa
- Issues: Please open an issue on the model repository
## Framework Versions
- PEFT: 0.18.0
- Transformers: 4.x
- TRL: 0.x
- BitsAndBytes: 0.x
- PyTorch: 2.x