What are prompts and context windows in Large Language Models (LLMs)?

RAHULRAJ P V
2 min read · Dec 9, 2023


Large Language Models (LLMs) have become increasingly popular for various natural language processing tasks. Understanding two key concepts, “prompt” and “context window,” is crucial when working with these models.

Prompt

A “prompt” is a set of instructions or a query that you provide to the LLM to get a specific response. Think of it as the starting point for a conversation with the model. For example, if you want the model to write a story about a talking cat, your prompt could be: “Write a story about a cat that can talk.” The prompt guides the model in generating text that aligns with your request.
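To make this concrete, here is a minimal sketch of sending a prompt to an LLM, assuming the OpenAI Python client; the model name is only illustrative, and any chat-style LLM API follows the same pattern of passing your prompt as a message.

```python
from openai import OpenAI

# The client reads the API key from the OPENAI_API_KEY environment variable.
client = OpenAI()

# The prompt is simply the text we send as the user message.
prompt = "Write a story about a cat that can talk."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name; use any model you have access to
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```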

Context Window

The “context window” is the maximum amount of text, measured in tokens, that the model can take into account when generating a response. LLMs can only analyze a limited number of tokens at a time. If the context window is too small, the model might miss important information; if it is very large, processing it becomes computationally expensive. Imagine it as a frame through which the model views the text.

For example, if you’re having a conversation with the model, everything it “remembers” about the conversation has to fit inside the context window. If earlier messages fall outside the window, the model effectively forgets them, leading to confusing or inconsistent responses.
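Here is a rough sketch of how an application might keep a conversation inside a fixed context window. It assumes the tiktoken library for counting tokens and an arbitrary 4,096-token budget, and it simply drops the oldest messages when the limit is exceeded; real chat applications use similar, if more sophisticated, strategies.

```python
import tiktoken

# Encoding used by many recent OpenAI models; token counts differ per model.
enc = tiktoken.get_encoding("cl100k_base")

def count_tokens(text: str) -> int:
    return len(enc.encode(text))

def trim_to_context_window(messages: list[str], max_tokens: int = 4096) -> list[str]:
    """Drop the oldest messages until the whole conversation fits in the window."""
    trimmed = list(messages)
    while trimmed and sum(count_tokens(m) for m in trimmed) > max_tokens:
        trimmed.pop(0)  # the oldest turn falls outside the "frame"
    return trimmed

conversation = [
    "User: Write a story about a cat that can talk.",
    "Assistant: Once upon a time, a cat named Milo discovered he could speak...",
    "User: Continue the story, but set it in space.",
]
print(trim_to_context_window(conversation))
```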

Both the prompt and the context window are vital for effective communication with LLMs, and both play a role in shaping the responses you receive.
