Introduction
You have probably already encountered models from AI21 Labs. I enjoy working with them, but one recurring challenge has been manually recording the history of my interactions with the model. Fortunately, LangChain makes it possible to handle this quickly and efficiently!
In this tutorial, I will explain how to implement this feature rapidly, allowing you to experiment further on your own.
Implementation
Dependencies
First, we need to create a project directory, set up a virtual environment, and install some necessary dependencies. Let’s get started!
Creating a Virtual Environment
To create a virtual environment, you can use the following command in your terminal:
python -m venv myenv
Replace myenv with your desired environment name, then activate it with source myenv/bin/activate on macOS/Linux or myenv\Scripts\activate on Windows.
Installing Dependencies
Once your environment is set up, you can install the required dependencies using:
pip install langchain ai21 python-dotenv
This installs LangChain, the AI21 integration, and python-dotenv, which we will use to read the API key from a .env file.
Coding Time!
Now we can start coding! Begin by creating a .env file in your project directory to store your API key from AI21 Labs Studio. Use AI21_API_KEY as the variable name.
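The .env file contains a single line (the value below is a placeholder; use your own key from AI21 Labs Studio):

```
AI21_API_KEY=your-api-key-here
```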
Creating the Main Code File
Next, create a main.py file where we will write our code. To get started, import all necessary modules and load your API key:
import os
from dotenv import load_dotenv
from langchain.llms import AI21
from langchain.memory import ConversationBufferMemory
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

load_dotenv()  # reads AI21_API_KEY from the .env file
Building the Chatbot
For the purpose of this tutorial, I will demonstrate how to create a simple chatbot that runs in the terminal. However, feel free to adapt this method for your unique applications!
First, I will create a prompt template to guide the model in understanding our task better. Since I will use a regular LLM instead of a chat model, this step is essential.
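Concretely, a prompt template is just a string with named slots that get filled in on every turn: the memory supplies the conversation so far, and the user supplies the newest message. A minimal plain-Python sketch of the idea (the strings here are illustrative, not part of LangChain):

```python
# A prompt template: a string with slots filled on each turn.
template = "The following is a friendly conversation.\n\n{history}\nHuman: {input}\nAI:"

# The memory fills {history}; the user's new message fills {input}.
filled = template.format(history="Human: hi\nAI: Hello!", input="What can you do?")
print(filled)
```

LangChain's PromptTemplate does exactly this substitution for us, which is why the template's input variables must match the memory key and the user input key.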
I will also establish a memory object that will store the conversation history. While I chose ConversationBufferMemory, LangChain offers other memory types that are worth exploring.
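To make the idea concrete, here is a toy version of what a buffer memory does: it appends every exchange verbatim and hands the accumulated transcript back on the next turn. The class below is a hand-rolled illustration, not LangChain's actual implementation:

```python
class ToyBufferMemory:
    """Appends every exchange verbatim, like ConversationBufferMemory."""

    def __init__(self):
        self.turns = []

    def save_context(self, user_text, ai_text):
        # Record both sides of the exchange.
        self.turns.append(f"Human: {user_text}")
        self.turns.append(f"AI: {ai_text}")

    def load_history(self):
        # This string is what gets injected into the prompt's {history} slot.
        return "\n".join(self.turns)

mem = ToyBufferMemory()
mem.save_context("hi", "hello there")
print(mem.load_history())
```

Because the buffer grows with every turn, long conversations eventually exceed the model's context window, which is why LangChain also offers windowed and summarizing memory variants.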
Creating the LLMChain
Let’s set up the LLMChain, which will handle our conversation. I will use verbose=True to monitor the input passed from memory to the chain.
prompt = PromptTemplate(input_variables=["history", "input"], template="{history}\nHuman: {input}\nAI:")
memory = ConversationBufferMemory()
llm = AI21(ai21_api_key=os.getenv("AI21_API_KEY"))
chain = LLMChain(llm=llm, prompt=prompt, memory=memory, verbose=True)
Main Loop for the Conversation
Now we can create the main loop, which reads the user's input, runs the chain, and prints the model's response.
while True:
    user_input = input("You: ")
    if user_input.strip().lower() in ("exit", "quit"):
        break  # let the user end the conversation cleanly
    response = chain.run(user_input)
    print(f"Bot: {response}")
Results
Let’s run our application!
Conversation Testing
In testing the chatbot, I ran a short conversation to assess the program's behavior. The conversation history is maintained and functions correctly beyond the first exchange!
This implementation is incredibly easy and quick, showcasing the power of combining AI21 and LangChain to build robust applications. I encourage you to participate in our upcoming Plug into AI with AI21 Hackathon to explore and create even more!