Bagisto Chatbot Using OpenAI (ChatGPT) LLM: This chatbot module is built on the Langchain framework and uses Pinecone and Chroma DB to store embeddings.
It allows store admins to create powerful chatbots using OpenAI GPT-4o model that can interact with customers in a natural way.
The chatbot can be used for a variety of purposes, such as customer service and providing information about your store's products.
- Admin can enable/disable the chatbot functionality for the store.
- Customers can ask about store products, prices, availability, and more to receive instant responses.
- This module supports an advanced OpenAI GPT-4o model for intelligent and contextually relevant answers.
- Built on the Langchain framework, which is also used to index store products automatically.
- Displays data based on the customer’s last search and preferences.
- Customers receive product details and can ask follow-up questions about a product.
- Supports Pinecone and Chroma DB for vector storage
- It can be used to build multiple chatbots for SaaS solutions.
Check the video below for an overview of this module:
Installation
After a fresh installation of Bagisto Chatbot Using OpenAI (ChatGPT) LLM, we need to configure all the Langchain-related information to set up the vector storage. Once that is done, follow the installation process below.
Unzip the extension zip file, then merge its “packages” folder into the project root directory.
Go to the config/app.php file and add the following line under ‘providers’
Webkul\AIChatbot\Providers\AIChatbotServiceProvider::class
Go to the composer.json file and add the following line under ‘psr-4’
"Webkul\\AIChatbot\\": "packages/Webkul/AIChatbot/src"
Run the below commands to complete the setup
composer dump-autoload
php artisan optimize
php artisan migrate
php artisan route:cache
php artisan vendor:publish --force
When prompted, enter the number shown before “Webkul\AIChatbot\Providers\AIChatbotServiceProvider” and press Enter to publish all assets and configurations.
Generate OpenAI Key – OpenAI Account
The user has to first create an OpenAI account and log in to the OpenAI dashboard as shown in the screenshot.
Now, the user can click on the Personal option located in the upper-right corner. A popup appears on the screen; within this popup, users will find the “View API Keys” option, which they can select to proceed.
The user is redirected to another page where the Create New Secret Key button is visible, as shown in the screenshot.
When the user clicks on this button, a pop-up will appear showing the API key. In the pop-up, there will be a “Copy” button that allows the user to easily copy the key to their clipboard.
The generated key is to be used in the configuration settings section under OpenAI Key.
Storing store and product-related information as embeddings requires vector storage; for this, the module uses Pinecone or Chroma.
Generate Pinecone – Secret Key & Index
To create your Pinecone secret key and index, simply visit the Pinecone website and complete the registration process.
After registration, you can sign in to the Pinecone account.
After login, navigate to the API Keys section. Here, you can find all of the keys that you have created.
To create a new API key, click the Create API Key button at the top right. Give your key a name, then click Create Key.
Once created, copy the key. You’ll need to paste it into your extension’s configuration settings later.
Now, find the Indexes section in the side menu. Click Create your first index.
A pop-up will appear. Enter an index name, set the dimension to 1536 (the size of OpenAI embeddings), choose the metric, select the Pod Type as needed, and click Create Index.
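As a sketch, the same index could also be created through Pinecone’s legacy controller REST API; the URL pattern, field names, and the placeholder key and environment below are assumptions to adapt from your own Pinecone console:

```python
import json
import urllib.request

def build_create_index_request(api_key, environment, index_name):
    """Build (but do not send) a request that creates a Pinecone index
    with the 1536-dimension setting used by OpenAI embeddings."""
    body = json.dumps({
        "name": index_name,
        "dimension": 1536,   # matches OpenAI's embedding vector size
        "metric": "cosine",
    }).encode()
    return urllib.request.Request(
        f"https://controller.{environment}.pinecone.io/databases",
        data=body,
        headers={"Api-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder values -- replace with your own key, environment, and index name
req = build_create_index_request("YOUR_API_KEY", "us-east-1-aws", "bagisto-products")
# urllib.request.urlopen(req) would then submit the request
```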
After creating the index, copy its name. You’ll use it in your extension’s Pinecone index section.
Moreover, you can also find the environment key in the Indexes section.
For the Chroma setup, follow similar steps: enter the Chroma collection name as described in the documentation.
API URL – Langchain
We need to add the following configurations on the Langchain server.
Follow these steps to set up the module API:
Install Dependencies: Run the command below to install the necessary npm dependencies:
npm i
Start the Server: The npm run dev command runs the corresponding script from the scripts section of your package.json file.
npm run dev
1- Create Embeddings: This process fetches product details from your store and stores them in vector storage.
API Endpoint:
POST /api/embeddings/upsert
Usage: Send a request with up to 20 products to be stored in one go.
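Since each upsert call accepts at most 20 products, a client has to batch larger catalogs. A minimal sketch of that batching, where the base URL and the `products` payload shape are assumptions to adapt to your Langchain server:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"  # assumed Langchain server address

def build_upsert_requests(products, batch_size=20):
    """Split a product list into batches of at most 20 items and build
    one POST /api/embeddings/upsert request per batch (without sending)."""
    requests = []
    for i in range(0, len(products), batch_size):
        body = json.dumps({"products": products[i:i + batch_size]}).encode()
        requests.append(urllib.request.Request(
            f"{BASE_URL}/api/embeddings/upsert",
            data=body,
            headers={"Content-Type": "application/json"},
            method="POST",
        ))
    return requests

# 45 products are split into 3 batched requests (20 + 20 + 5)
batches = build_upsert_requests([{"id": n} for n in range(45)])
```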
2- Remove Embedding by ID: This action removes a specific product from the vector storage using its ID.
API Endpoint:
DELETE /api/embeddings/remove/{id}
Usage: Use this endpoint to remove a product by providing its ID.
3- Remove All Embeddings: Clear all stored information from the vector storage.
API Endpoint:
DELETE /api/embeddings/remove-all
Usage: Remove all stored information with this endpoint.
4- Conversations: Engage in chatbot interactions with customers.
API Endpoint:
POST /api/conversation
Usage: Send a customer’s message to the chatbot and receive its response.
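The conversation call can be sketched the same way; the base URL and the payload field names (`name`, `email`, `message`) are assumptions about what the Langchain server expects:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000"  # assumed Langchain server address

def build_conversation_request(name, email, message):
    """Build (but do not send) a POST /api/conversation request carrying
    the customer's details and their message for the chatbot."""
    body = json.dumps({
        "name": name,        # assumed field names for the payload
        "email": email,
        "message": message,
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/api/conversation",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_conversation_request("John", "john@example.com", "Do you have red sneakers?")
# urllib.request.urlopen(req) would then submit the message to the chatbot
```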
By following these steps, you’ll successfully set up the Langchain commands and API endpoints for module integration.
Admin Configuration of the AI Chatbot:
- API URL: Enter the web address where your module will communicate with another service
- Providing the OpenAI API key allows the module to connect and communicate with the OpenAI service.
- Vector Store: The admin selects the vector store to use, either Pinecone or Chroma.
- Pinecone API Key: Enter your Pinecone API key, which enables fast similarity search.
- Pinecone Environment: Provide the Pinecone environment, which you will find alongside the Pinecone API key.
- Pinecone Index: Enter the name of your Pinecone index.
- Chroma Collection Name: If you select the Chroma vector store, enter the collection name here. The setup follows the same process as Pinecone.
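The settings above depend on which vector store is chosen; that dependency can be sketched as a small validation check (the field names below are illustrative, not the module’s own configuration keys):

```python
def validate_chatbot_config(cfg):
    """Return the list of missing configuration fields for the chosen
    vector store; an empty list means the configuration is complete.
    Field names here are illustrative, not the module's actual keys."""
    required = ["api_url", "openai_api_key", "vector_store"]
    if cfg.get("vector_store") == "pinecone":
        # Pinecone needs a key, an environment, and an index name
        required += ["pinecone_api_key", "pinecone_environment", "pinecone_index"]
    elif cfg.get("vector_store") == "chroma":
        # Chroma only needs a collection name
        required += ["chroma_collection_name"]
    return [field for field in required if not cfg.get(field)]

missing = validate_chatbot_config({
    "api_url": "http://localhost:3000",
    "openai_api_key": "sk-...",
    "vector_store": "chroma",
    "chroma_collection_name": "bagisto-products",
})
```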
Note: The fields marked [default] refer to the settings of the default channel, and the fields marked [default-en] refer to the settings of the English locale.
Note: The admin can also use the Bagisto AI content generator to generate the product, category, and CMS page description.
Storefront Workflow – Bagisto Chatbot Using OpenAI ( ChatGPT ) LLM
The users of the website click on the ChatGPT bot at the storefront and provide their name and email to get started.
The chatbot courteously asks, “How can we help you?” The user then asks a question or states their need.
The chatbot processes the query and provides a helpful response. It offers product information, recommendations, support, and more, based on the user’s needs.
It displays data based on the customer’s last search and preferences.
By clicking a product link, the user is redirected to the product page, from where they can check out as required.
Support
That is all about the Bagisto Chatbot Using OpenAI ( ChatGPT ) LLM extension. If you have any queries regarding the plugin, please contact us at Webkul Support System.
You can add ChatGPT to your Bagisto Laravel E-commerce platform as well.
You may also check our quality Bagisto Extensions.