Phi Nguyen (nxphi47)
AI & ML interests: NLP
Cool model · ❤️ 1 · 1 comment · #1 opened 5 months ago by nxphi47
Is this model native 128K context length, or YaRN extended? · 7 comments · #28 opened 10 months ago by danielhanchen
Prompt Template with Langchain · 5 comments · #5 opened over 1 year ago by Hanifahreza
Garbled output when integrating with Xinference · 2 comments · #4 opened over 1 year ago by Livinng
Best way to prompt to get consistent Tagalog-English responses · 1 comment · #8 opened almost 2 years ago by Pats14
Correct template for use with this model · 2 comments · #1 opened over 1 year ago by saknarak
Responses from model are showing to other users · ➕ 4 · 7 comments · #18 opened over 1 year ago by djstrong
Built on Gemma or Llama? · 1 comment · #3 opened over 1 year ago by YaoLiu61
Dedicated endpoints issue with v2.5 compared with v2 · 4 comments · #1 opened over 1 year ago by trungnx26
Why use an unusual chat template? · 2 comments · #2 opened over 1 year ago by LoneRanger44
What is the maximum context window that SeaLLMs can handle? · 1 comment · #7 opened almost 2 years ago by trungnx26
Repeated instructions · 2 comments · #6 opened almost 2 years ago by Mewband12
`std::runtime_error: [Matmul::eval_cpu] Currently only supports float32` · 2 comments · #2 opened almost 2 years ago by adhishthite
Best way to prompt SeaLLM for Thai language · 7 comments · #4 opened almost 2 years ago by Mewband12
Adding `safetensors` variant of this model · 1 comment · #2 opened almost 2 years ago by SFconvertbot
Sharded not supported for AutoModel · 👍 1 · 1 comment · #3 opened almost 2 years ago by Mewband12
When will the model be released? · 👍 8 · 9 comments · #1 opened about 2 years ago by wiccanmind
Upload folder using huggingface_hub · #1 opened almost 2 years ago by nxphi47