By Bernat Sampera · 1 min read

4 Context Engineering Principles for building AI Agents

Context Engineering is taking over the AI Engineering space. Learn how to offload, one-shot, isolate, and reduce context in this guide.

Many engineers building AI Agents have realized that a major blocker is controlling the information that is sent to the LLM in every call.

It’s CRUCIAL to send as little information as possible to the LLM in order to:

  1. Get accurate responses

  2. Reduce costs

  3. Reduce the time the LLM needs to answer

BUT, it’s still important to pass all the relevant information to the LLM, so the response is correct.

Techniques

There are various techniques to achieve this; some of them are:

  1. Context Offloading: Instead of keeping all the information in the state, write it to a txt or md file to come back to later, or to track the progress of completed tasks.

  2. Prompt One-Shotting: This is often used in applications whose first step involves research. While it can be tempting to convert the information to the final desired output after every iteration, it performs much better to keep all the research as simple text and convert it to the final output in a single step at the end, once all the research has been completed.

  3. Isolating Context: This technique is analogous to the “Single Responsibility” principle in software, which consists in keeping functions and classes focused on a single purpose. In AI Engineering, this means having multiple subagents inside the main agent, where each subagent gets only the context it needs.

  4. Reduce Context: This may involve techniques like summarization or deleting messages and information from the state, so that only the essential information the LLM needs for successive calls is kept. ATTENTION: This can be dangerous, as important context may be lost.
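Context offloading (technique 1) can be sketched in a few lines. This is a minimal illustration, assuming a local markdown scratch file; the file name and helper names are made up for the example, not a real API:

```python
# Context offloading: persist intermediate notes to a markdown file
# instead of carrying them in the agent's state between LLM calls.
from pathlib import Path

NOTES_FILE = Path("agent_notes.md")  # illustrative scratch file

def offload(note: str) -> None:
    """Append a note to the scratch file so it can be dropped from state."""
    with NOTES_FILE.open("a", encoding="utf-8") as f:
        f.write(f"- {note}\n")

def reload_notes() -> str:
    """Read the notes back only when the agent actually needs them."""
    return NOTES_FILE.read_text(encoding="utf-8") if NOTES_FILE.exists() else ""

offload("Step 1: collected pricing data")
offload("Step 2: summarized competitor features")
print(reload_notes())
```

The point is that the notes live on disk, not in the prompt, so each LLM call only pays for the notes it explicitly reloads.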
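Prompt one-shotting (technique 2) boils down to one structural choice: accumulate raw text during research and format only once at the end. In this sketch, `call_llm` is a stand-in stub for a real LLM client call:

```python
# Prompt one-shotting: keep research as plain text per iteration and
# convert it to the final structured output in a single pass at the end.
def call_llm(prompt: str) -> str:
    # Stub standing in for a real LLM API call.
    return f"[LLM output for: {prompt[:40]}...]"

research_notes: list[str] = []

# Research loop: store raw findings, do NOT format on every iteration.
for topic in ["market size", "competitors", "pricing"]:
    research_notes.append(f"Findings about {topic}: ...")

# Single formatting pass at the very end (the "one shot").
final_prompt = (
    "Convert the following research notes into a structured report:\n"
    + "\n".join(research_notes)
)
report = call_llm(final_prompt)
print(report)
```

Formatting once instead of per iteration means only one call carries the output-schema instructions, and intermediate calls stay cheap.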
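Context isolation (technique 3) can be shown with two subagents that each read only their slice of the shared state. The agent names and state keys here are illustrative, and `call_llm` is again a stub:

```python
# Context isolation: each subagent receives only the part of the state
# it needs, mirroring the Single Responsibility principle.
def call_llm(prompt: str) -> str:
    # Stub standing in for a real LLM API call.
    return f"[LLM output for: {prompt[:40]}...]"

state = {
    "user_query": "Plan a product launch",
    "research": "Competitor A launched in May...",
    "draft": "Launch plan v1: ...",
}

def research_agent(state: dict) -> str:
    # Sees only the user query, not the draft or prior research.
    return call_llm(f"Research this topic: {state['user_query']}")

def writer_agent(state: dict) -> str:
    # Sees only the research, not the raw query history.
    return call_llm(f"Write a plan from these notes: {state['research']}")

print(research_agent(state))
print(writer_agent(state))
```

Because neither function touches keys outside its purpose, each subagent's prompt stays small and focused.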
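Context reduction (technique 4) can be sketched as trimming old messages and replacing them with a summary. Here `summarize` is a placeholder for a real summarization call, and the `keep_last` cutoff is an arbitrary choice for the example:

```python
# Context reduction: keep only the last N messages and replace older
# ones with a one-line summary before the next LLM call.
def summarize(messages: list[str]) -> str:
    # Placeholder for a real summarization step (e.g. another LLM call).
    return f"[Summary of {len(messages)} earlier messages]"

def reduce_context(messages: list[str], keep_last: int = 3) -> list[str]:
    if len(messages) <= keep_last:
        return messages
    older, recent = messages[:-keep_last], messages[-keep_last:]
    # CAUTION: summarization can silently drop important details.
    return [summarize(older)] + recent

history = [f"message {i}" for i in range(10)]
print(reduce_context(history))
```

The trade-off from the article applies directly: the summary line keeps the prompt short, but anything the summarizer omits is gone for all successive calls.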

Let's connect !!

Get in touch if you want updates, examples, and insights on how AI agents, Langchain and more are evolving and where they’re going next.