AI

Integrating Google Bard with Python via Bard-API

Using Google Bard in Python via bardapi

Google Bard is a large language model (LLM) similar to OpenAI’s ChatGPT, capable of answering questions, translating languages, and generating various forms of creative content. Google Bard is currently available to the public as a limited beta release; users can join a waitlist to apply for access.

For those eager to integrate Google Bard with Python, it is important to note that there is currently no official API. However, we can use Daniel Park’s bardapi package to access Bard from a Python environment.

There are several advantages to using Google Bard (via bardapi) over the OpenAI API.

  1. Bard’s knowledge is more up-to-date than OpenAI’s models, whose knowledge cutoff is September 2021.
  2. Bard is currently free to use, unlike the OpenAI API.
  3. bardapi understands chat history out of the box, which the OpenAI API does not offer directly. However, by leveraging tools like LangChain, it is possible to add memory state to the OpenAI API and achieve similar capabilities, as sketched in the example after this list.
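
For comparison, here is a minimal sketch of point 3 using LangChain’s conversation memory with the OpenAI API. It assumes you have an OpenAI API key and the langchain and openai packages installed; it is an illustration of the memory workaround, not part of the Bard setup.

from langchain.llms import OpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Keep the chat history in memory so follow-up questions retain context
llm = OpenAI(openai_api_key="YOUR_OPENAI_API_KEY", temperature=0)
conversation = ConversationChain(llm=llm, memory=ConversationBufferMemory())

conversation.predict(input="We will talk about the latest presidents in Germany and Italy. Who are they?")
conversation.predict(input="What did we talk about just now?")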

Below is a sample guide on how to integrate Bard with Python using bardapi.


# Script reference: https://github.com/dsdanielpark/Bard-API

from bardapi import Bard
import os
import requests
os.environ['_BARD_API_KEY'] = 'xxxxxxx'

# Reusing a requests session lets us continue the conversation with Bard across separate queries
session = requests.Session()

session.headers = {
    "Host": "bard.google.com",
    "X-Same-Domain": "1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36",
    "Content-Type": "application/x-www-form-urlencoded;charset=UTF-8",
    "Origin": "https://bard.google.com",
    "Referer": "https://bard.google.com/",
}

session.cookies.set("__Secure-1PSID", os.getenv("_BARD_API_KEY")) 

bard = Bard(token=os.getenv("_BARD_API_KEY"), session=session, timeout=30)
bard.get_answer("We will talk about latest presidents in Germany and Italy. Who are they")['content']

# Continue the conversation without creating a new session
bard.get_answer("What we talk about just now??")['content']


[Screenshot: Python Bard API continuing the conversation within a session]

Google Bard also has the ability to return images, which has been enabled in the development version of Bard-API. We can expect this feature to be available in the production version soon.

Conclusion

In conclusion, the Bard-API package provides a practical and effective means to interact with Google Bard’s response API within your Python environment, bridging the gap until the official release of the Bard API.

By utilizing Bard-API, you can fully explore and leverage the capabilities of Bard, allowing you to experiment with various queries and unlock the valuable insights it has to offer.

This post has also been published on Medium.

PandasAI — Exploratory Data Analysis with Pandas and AI prompts

I came across PandasAI while searching for AI integration with Pandas dataframes. My primary objective is to conduct fast exploratory data analysis on new datasets, which would guide my subsequent analysis approach. PandasAI appeared to meet my needs in this regard. In summary, PandasAI is a Python library that seamlessly integrates generative artificial intelligence capabilities (e.g., OpenAI) into Pandas, enabling users to perform basic Pandas operations using simple text prompts. It’s worth noting that PandasAI is designed to complement rather than replace Pandas.

What I like about PandasAI

  1. Alternative LLM integration: Besides OpenAI, PandasAI supports integration with Hugging Face’s Starcoder, which is free to use and works quite well with PandasAI.
  2. Returns DataFrame objects: PandasAI returns dataframe objects that can be further processed by Pandas or PandasAI itself.
  3. Simplified plotting: PandasAI simplifies common plotting tasks for easy data visualization.

In the following sections, we explore a range of common tasks that can be performed by prompting the dataframe instead of using the usual Pandas operations. We will use the “Penguins” sample dataset loaded from seaborn as our case study, together with the free Hugging Face Starcoder LLM. However, I find that OpenAI delivers the right output more reliably for longer and more complex prompts.

!pip install pandasai
# Setting up for prompt
import pandas as pd
from pandasai import PandasAI
from pandasai.llm.starcoder import Starcoder
from pandasai.llm.openai import OpenAI
import seaborn as sns

# Instantiate an LLM
# Openai
# llm = OpenAI(api_token="openai_key")

# Starcoder
llm = Starcoder(api_token="hugging face api key")
pandas_ai = PandasAI(llm)

# Load dataset
penguins = sns.load_dataset("penguins")

[Screenshots in the original post show example prompts for basic operations, NA operations, and fillna/row operations.]
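
Since those screenshots are not reproduced here, prompts of that kind look roughly like the following (a hedged illustration; the exact wording and the outputs will vary with the LLM used):

# Basic operations: ask questions about the dataframe in plain English
pandas_ai(penguins, prompt="Which island has the most penguins?")

# NA operations: inspect and handle missing values
pandas_ai(penguins, prompt="How many rows contain missing values?")
pandas_ai(penguins, prompt="Fill the missing values in body_mass_g with the column mean")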

There are some cases where I did not manage to get a correct output (the OpenAI LLM might do a better job), such as the ones below.

# This modified the penguins dataframe in place instead of returning a copy
penguins_update = pandas_ai(penguins, prompt= 'return a copy. penguin[ bill_length_mm] = 0 if island = Torgersen', show_code=True)
# Does not return any output
penguins_newcol = pandas_ai(penguins, prompt= 'Add new column "bill_length_cm" by taking "bill_length_mm" /100.')
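
For reference, the operations those two prompts were asking for can be written directly in plain pandas (a plain-pandas equivalent, not PandasAI output):

# Return a copy with bill_length_mm set to 0 for rows on Torgersen island
penguins_update = penguins.copy()
penguins_update.loc[penguins_update["island"] == "Torgersen", "bill_length_mm"] = 0

# Add the new column as described in the prompt (which divides by 100; a true mm-to-cm conversion would divide by 10)
penguins_newcol = penguins.copy()
penguins_newcol["bill_length_cm"] = penguins_newcol["bill_length_mm"] / 100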

In conclusion, PandasAI excels at enabling simple and clean exploratory analysis, particularly with its seamless integration of Starcoder, which eliminates cost concerns. However, it may not perform as effectively with longer and more complex prompts, especially when used with Starcoder. It’s important to note that while PandasAI offers valuable functionalities, you will still rely on Pandas for more extensive data manipulation and analysis tasks.

This post has also been published on Medium.

Effortless Prompt Generation: Auto-generating AI System Prompt Phrases with ChatGPT

Prompt engineering is essential for optimizing the behavior and output of AI systems like ChatGPT. Creating effective prompts is challenging and time-consuming, especially when tailoring them for different roles or personas. However, we can use ChatGPT’s own capabilities to generate these prompts for us.

By empowering ChatGPT with custom knowledge on crafting effective prompts, we enable it to learn the skill of generating prompts tailored to various roles. To impart this custom knowledge, we leverage a collection of websites that provide good instances of well-crafted system prompts.

In this blog post, we will explore using Python, ChatGPT, and the LangChain module to generate role-specific prompt phrases. LangChain is a versatile tool that enables the conversion of external sources into document objects. Converted document objects can then be indexed in databases like Chroma, enabling fast and efficient information retrieval. This integration allows AI systems such as ChatGPT to access a broad spectrum of knowledge sources.

Setting up the Environment

The code snippet below demonstrates the necessary steps: extracting additional data from websites, creating embeddings, setting up a Chroma vector store, loading documents from the web, and persisting the processed instance. These steps lay the foundation for generating system prompt phrases from the extracted data.

# installing the necessary libraries in Jupyter
!pip install tiktoken
!pip install openai
!pip install chromadb
!pip install langchain
!pip install nest_asyncio
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.llms import OpenAI
from langchain.chains import RetrievalQA
from langchain.chat_models import ChatOpenAI
from langchain.document_loaders import WebBaseLoader
from langchain.prompts import PromptTemplate
import nest_asyncio

nest_asyncio.apply()

# sample website with good system prompts
tgt_sites = ['https://github.com/f/awesome-chatgpt-prompts',
'https://www.greataiprompts.com/prompts/best-system-prompts-for-chatgpt/',
'https://stackdiary.com/chatgpt/role-based-prompts/']

def add_documents(loader, instance):
    documents = loader.load()
    text_splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100, separators= ["\n\n", "\n", ".", ";", ",", " ", ""])
    texts = text_splitter.split_documents(documents)
    instance.add_documents(texts)

embeddings = OpenAIEmbeddings(openai_api_key='YOUR_OPENAI_API_KEY')
instance = Chroma(embedding_function=embeddings, persist_directory='PATH_TO_PERSIST_DIRECTORY')

loader = WebBaseLoader(tgt_sites)
if loader:
    add_documents(loader, instance)

# Persist the indexed documents to disk, then reload the vector store from the persist directory
instance.persist()
instance = None

instance = Chroma(persist_directory='PATH_TO_PERSIST_DIRECTORY', embedding_function=embeddings)
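
As an optional sanity check, you can query the reloaded store directly to confirm that the web pages were indexed, for example with a similarity search:

# Retrieve the two chunks most similar to a sample query
docs = instance.similarity_search("I want you to act as", k=2)
print(len(docs), "documents retrieved")
print(docs[0].page_content[:200])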

Generating System Prompt Phrases

Now that we have set up our environment and loaded the necessary data, we can proceed to generate system prompt phrases using ChatGPT. We will utilize the RetrievalQA class from LangChain, which incorporates the ChatOpenAI model to interact with the language model. Here is the code snippet to generate system prompt phrases:

qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(
        model_name="gpt-3.5-turbo",
        temperature=0,
        openai_api_key='YOUR_OPENAI_API_KEY'
    ),
    chain_type="stuff",
    retriever=instance.as_retriever()
)

query_str = """
              Craft a paragraph of how chatgpt (address as you) supposed to act based on the role stated. 
              Provide expectation of the required scope, skillset and knowledge. 
              If there is no specific role found, use relative reference if necessary. 
              The role is "python blog professional writer". Maximum 5 sentences.
              Start the paragraph with "I want you to act as a "

            """
output_string = qa.run(query_str)
print(output_string)
[Sample output]
I want you to act as a Python blog professional writer. As a ChatGPT, you are expected to have a good understanding of Python programming language and its various libraries and frameworks. You should be able to write informative and engaging blog posts that cater to both beginners and advanced users. Your writing should be clear, concise, and well-structured, with a focus on providing practical examples and use cases. Additionally, you should be able to keep up with the latest trends and developments in the Python community and incorporate them into your writing.

In this script, our focus is on generating system role prompts. You can append your specific requests or target tasks to these prompts for ChatGPT to understand and respond accordingly. To enhance ChatGPT’s capabilities, you can include additional relevant websites, expanding its knowledge base for prompt generation in various roles and scenarios.
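
As a minimal sketch of this (assuming the same gpt-3.5-turbo model and API key as above), the generated system prompt can be combined with a concrete request as a pair of chat messages:

from langchain.chat_models import ChatOpenAI
from langchain.schema import SystemMessage, HumanMessage

chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0, openai_api_key='YOUR_OPENAI_API_KEY')

# Use the generated prompt as the system message, then append the actual task
messages = [
    SystemMessage(content=output_string),
    HumanMessage(content="Write a short introduction for a blog post about Python list comprehensions."),
]
response = chat(messages)
print(response.content)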

Conclusion

In this blog post, we explored the process of using ChatGPT and additional data extracted from a webpage to generate system prompt phrases. By leveraging the power of language models and retrieval techniques, we can create informative and context-aware prompts to guide AI systems more efficiently.

This post has also been published on Medium.