
How to Use Llama 2 in Magento

    Updated 4 April 2024

Have you ever considered having a conversation with an intelligent computer? Or perhaps you're interested in creating innovative Magento projects based on the AI LLM Llama 2? Well, take a look at Llama 2! It's the newest highly intelligent language model developed by Meta AI, a division of Meta (formerly known as Facebook).

What is Llama 2?

Llama 2 is part of a family of AI systems called large language models (LLMs). These models can create and understand human language. They are trained on huge amounts of text from all sorts of places, like books, websites, and social media, which helps them learn how words and sentences work across different situations and topics.

How can we use it, and what is the installation process?

There are multiple ways to use Llama 2. Here we discuss running it on a local system or on a cloud-hosted server, and for that we can use Ollama.

About Ollama

Ollama is a handy tool that lets you use open-source large language models (LLMs) right on your own computer. It works with different models like Llama 2, Code Llama, and more. It gathers everything these models need – like their settings, data, and code – into one neat package defined by a Modelfile. Ollama simplifies autonomous learning and adjustment, minimising the requirement for extensive manual reprogramming and fine-tuning of AI systems. We can use this to build various applications, such as natural language processing tools.
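As an illustration, a minimal Modelfile might look like the sketch below. It uses Ollama's documented directives (FROM, PARAMETER, SYSTEM); the system prompt and model name are made-up examples.

```
# Base the custom model on Llama 2.
FROM llama2

# Sampling temperature: higher values make the output more creative.
PARAMETER temperature 0.8

# A hypothetical system prompt for an eCommerce assistant.
SYSTEM "You are a helpful shopping assistant for a Magento store."
```

You could then build and run it with `ollama create shop-assistant -f Modelfile` followed by `ollama run shop-assistant` (the name `shop-assistant` is just an example).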


    Ollama Installation:

    For Windows:

    https://ollama.com/download/windows

    For Mac:

    https://ollama.com/download/mac

    For Linux:

    Install it using a single command:

    curl -fsSL https://ollama.com/install.sh | sh

Ollama supports a list of models, available at https://ollama.com/library

Below we use one of these models:

Llama 2

Please keep in mind that running the 7B models requires a minimum of 8 GB of RAM, while the 13B models need 16 GB and the 33B models need 32 GB.

After installing Ollama, you can easily fetch any of the models using a simple pull command.

    ollama pull llama2

Once the pull completes, you can run the model directly in the terminal for text generation.

For example:

    ollama run llama2

Llama 2 running in the terminal.

    Ollama also offers a REST API for running and managing models.

    See the API Documentation for the endpoints.
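As a quick sketch of that REST API, the snippet below builds a text-generation request for the /api/generate endpoint (the same endpoint the Magento code further down uses). It assumes Ollama is running locally on its default port 11434; the product name is a placeholder.

```shell
# Build the JSON request body; model, prompt, and stream are the
# fields the /api/generate endpoint expects.
PRODUCT_NAME="Leather Wallet"
PAYLOAD=$(printf '{"model":"llama2","prompt":"Product specification and description for %s","stream":false}' "$PRODUCT_NAME")
echo "$PAYLOAD"

# Uncomment to send the request to a locally running Ollama instance;
# the generated text is in the "response" field of the returned JSON:
# curl -s http://localhost:11434/api/generate -d "$PAYLOAD"
```

With `"stream": false` the whole answer comes back in a single JSON object, which is easier to parse from server-side code.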

    How can we use it in Magento?

As a shopping assistant:

With Llama 2, you can develop chatbots capable of conversing naturally with shoppers. You have the flexibility to tailor these chatbots to fit your specific field, personality, and tone.
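For a conversational assistant like this, Ollama also exposes a /api/chat endpoint that takes a list of role-tagged messages. The sketch below builds such a request; it assumes Ollama is running on its default port 11434, and the system prompt and question are illustrative examples.

```shell
# A chat-style request: the system message sets the assistant's
# persona, the user message carries the shopper's question.
PAYLOAD='{"model":"llama2","messages":[{"role":"system","content":"You are a shopping assistant for a Magento store."},{"role":"user","content":"Which wallet would you recommend?"}],"stream":false}'
echo "$PAYLOAD"

# Uncomment to send the request; the reply is in the "message"
# field of the returned JSON:
# curl -s http://localhost:11434/api/chat -d "$PAYLOAD"
```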

To auto-generate product content:
Using Llama 2, you can generate product content based on the product name, SKU, or other attributes, as needed.

For example, with the following method you can generate content based on the product name and set it as the product description.

    <?php
    namespace Webkul\Ollama\Model;
    
    class AIContent
    {
        /**
         * @var \Magento\Framework\HTTP\Client\Curl
         */
        protected $curl;
    
        /**
         * @param \Magento\Framework\HTTP\Client\Curl $curl
         */
        public function __construct(
            \Magento\Framework\HTTP\Client\Curl $curl
        ) {
            $this->curl = $curl;
        }
    
        /**
         * Get AI content
         *
         * @param string $productName
         * @return string
         */
    public function getAIContent($productName)
    {
        $this->curl->addHeader('Content-Type', 'application/json');
        $arr = [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_ENCODING => '',
            CURLOPT_MAXREDIRS => 10,
            CURLOPT_TIMEOUT => 0,
            CURLOPT_FOLLOWLOCATION => true,
            CURLOPT_HTTP_VERSION => CURL_HTTP_VERSION_1_1,
        ];
        $this->curl->setOptions($arr);

        // Build the request body with json_encode so the JSON stays
        // valid even if the product name contains quotes.
        $payload = json_encode([
            'model' => 'llama2',
            'prompt' => 'Product specification and description for ' . $productName,
            'stream' => false
        ]);

        // Here we pass the Ollama API endpoint (replace with your server's address).
        $this->curl->post('http://XXX.XXX.XX.XXX:11434/api/generate', $payload);
        $productContent = json_decode($this->curl->getBody(), true);
        return $productContent['response'] ?? '';
    }
    }

Thank you!
