context length?
#1
by tpiperseek - opened
Have I interpreted this correctly? Does this mean the model should have a context length of 8192 tokens?
> Extended context length: Processes sequences up to 8192 tokens, great for LLM output evals.
The reason I'm confused is that config.json sets `"max_position_embeddings": 512`, which seems contradictory.
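For reference, this is the field I'm looking at. A minimal sketch of how I'm reading it (the config fragment below is made up to illustrate, not this model's full config.json):

```python
import json

# Hypothetical fragment mirroring the relevant field in config.json
config_text = '{"max_position_embeddings": 512, "model_type": "bert"}'
config = json.loads(config_text)

# max_position_embeddings is normally the hard limit on input sequence
# length baked into the learned position embeddings (unless the model
# uses a position scheme like RoPE or ALiBi that can extrapolate).
print(config["max_position_embeddings"])  # → 512
```

So unless the positions are handled some other way, I'd expect 512, not 8192, to be the effective limit.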