
Are window size and context length of a language model one and the same thing?

(Adding some context, since the question alone was too short to post.) I am trying to understand how GPT models are trained, and this question came to my mind. I tried to search for an answer on Google but couldn't find one, so I am asking here.

Vinay Sharma

1 Answer

Yes. In large language models, "window size" (often called the "context window") and "context length" refer to the same thing: the maximum number of tokens the model can process at once.
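To make this concrete, here is a minimal sketch (with hypothetical token IDs and a toy window size) of what the limit means in practice: inputs longer than the context length must be truncated or chunked before the model's forward pass. Real models use much larger windows, e.g. 1024 tokens for GPT-2 and 2048 for GPT-3.

```python
CONTEXT_LENGTH = 8  # toy value for illustration; GPT-2 uses 1024

def fit_to_context(token_ids, context_length=CONTEXT_LENGTH):
    """Keep only the most recent `context_length` tokens,
    since the model cannot attend to anything beyond its window."""
    if len(token_ids) <= context_length:
        return token_ids
    return token_ids[-context_length:]

tokens = list(range(12))  # 12 tokens, longer than the window
print(fit_to_context(tokens))  # keeps only the last 8 tokens
```

Truncating from the left (keeping the most recent tokens) is the usual choice for autoregressive generation, since the model predicts the next token from the most recent context.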

noe