Is the completion window length included in the context window length for OpenAI Codex models?
For da-vinci, the context window length is set to 4000 tokens.
From what I understand, if the prompt is 3500 tokens, for example, then the remaining 500 tokens are available for the completion, and there is no way to use the whole 4000 tokens as the prompt.
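To illustrate the arithmetic I have in mind, here is a minimal sketch (not an official OpenAI utility) that counts prompt tokens with the `tiktoken` library and computes how much room is left for the completion; the 4000-token limit and the `r50k_base` encoding are assumptions for the older davinci-style models.

```python
import tiktoken

CONTEXT_WINDOW = 4000  # assumed total window shared by prompt + completion

def max_completion_tokens(prompt: str) -> int:
    """Return how many completion tokens remain once the prompt is counted."""
    # r50k_base is assumed here as the tokenizer for older GPT-3 models.
    enc = tiktoken.get_encoding("r50k_base")
    prompt_tokens = len(enc.encode(prompt))
    return max(CONTEXT_WINDOW - prompt_tokens, 0)

# e.g. a 3500-token prompt would leave at most 500 tokens,
# so max_tokens passed to the API must not exceed that remainder.
```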
I am fairly sure of my understanding, but it would be helpful to have it confirmed by someone knowledgeable.