How to Automate your Twitter News Account with OpenAI ChatGPT and NewsAPI in Python

It’s no secret that Large Language Models (LLMs) are a powerful tool for automating social media tasks. Not only can they curate relevant content that matches your audience’s interests, they can also create that content and tailor it to the interests of your customers. This article describes how to create a Twitter News Bot such as the one that now automates sharing news updates on the relataly Twitter channel. The bot uses the NewsAPI and OpenAI’s GPT-3.5 model to retrieve and share relevant news updates. This tutorial will equip you with the skills to build a similar Twitter News Bot in Python that provides news on your own Twitter account.

We’ll start by exploring NewsAPI and show you how to use it to fetch news articles from various sources. Then we’ll use OpenAI to enhance the bot’s capabilities. By refining the news selection process and generating engaging tweets with ChatGPT, you can ensure your updates are unique, informative, and captivating. We’ll provide clear explanations and code samples to help you succeed. By the end, you’ll have a powerful News Bot that delivers curated news updates fully automatically. Best of all, it is so efficient that you can run it in the cloud almost for free. So, let’s dive in and get started on creating an impressive NewsBot using OpenAI and NewsAPI in Python!

This is what the posts from the bot look like:

The Relataly Twitter account runs the same news bot described in this tutorial. It leverages a serverless cloud architecture based on Azure Functions.


The Architecture of the Twitter NewsBot

Let’s begin with an overview of what the NewsBot looks like. The illustration below shows the architecture.

The architecture of the Relataly News Bot.

The modular architecture of the bot makes it easy to deploy as microservices to the cloud, just like relataly does with Azure Functions. This approach also makes it simple to customize the bot by changing topics or by defining a different style for the tweet creation process.

Fetching News

The bot is scheduled to run regularly, typically every hour. During each call, it retrieves the latest news articles from the NewsAPI. The NewsAPI allows filtering for specific categories:

  • business
  • entertainment
  • general
  • health
  • science
  • sports
  • technology

The news bot only retrieves news from the “technology” category. However, this is not specific enough, as we only want to retrieve news related to AI, data science, and machine learning. To address this, we rely on OpenAI to determine which news is relevant. ChatGPT provides us with a list that marks relevant news with True and irrelevant news with False. In addition, the bot tracks which news articles have already been shared in a CSV file. This ensures that the same news is not shared multiple times.
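To make this idea more concrete, here is a minimal, self-contained sketch with made-up headlines and a hard-coded relevance list standing in for ChatGPT’s response. The full implementation follows later in the tutorial.

import csv
import os

# made-up headlines and a hard-coded stand-in for the model's True/False response
headlines = ["New GPT model released", "Local football results", "XGBoost update announced"]
relevance = [True, False, True]

# titles that were already posted are kept in a CSV log
already_posted = set()
if os.path.exists("news_log.csv"):
    with open("news_log.csv", newline="") as file:
        already_posted = {row[0] for row in csv.reader(file) if row}

for headline, is_relevant in zip(headlines, relevance):
    if is_relevant and headline not in already_posted:
        print("Would tweet:", headline)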

Creating News Tweets

Once we have the relevant news, the bot hands over the information to the second component responsible for creating the tweets. For each post, the bot makes an API call to OpenAI GPT, which generates a tweet based on a prompt that combines the article title, description, and URL. The resulting tweet is then sent to the Twitter API, which adds it to the account’s feed.

By automating the process of sharing news updates on Twitter, you can save time and focus on more important tasks. Additionally, using OpenAI to generate engaging tweets can help increase social media engagement and attract more followers.


Customization Options

You can easily customize the bot by changing the relevant topics or by defining another style in which OpenAI creates the tweets. In the next section, let’s look at the APIs used in this architecture and how you gain access to them.

Also: ChatGPT Style Guide: Understanding Voice and Tone Prompt Options for Engaging Conversations

APIs used in this Tutorial

To use the functionalities of our application, you will need to obtain the following API keys: NewsAPI, OpenAI API, and Twitter API. Without the keys, the API calls will fail with an error, so there is no way around signing up.

NewsAPI.org API

The NewsAPI provides access to over 30,000 news sources from around the world, including major news organizations such as CNN, BBC, and Reuters. We will walk you through how to specify criteria such as sources, keywords, and categories to retrieve relevant news articles. The free tier allows you to make a limited number of requests free of charge.

The NewsAPI key provides access to news articles and headlines from various sources. To obtain the API key, you can sign up for a NewsAPI account on their website at NewsAPI Registration and generate a unique key specifically for your application.
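If you want to experiment with the API before building the bot, the sketch below queries the “everything” endpoint with a keyword filter using the requests library; the bot itself uses the “top-headlines” endpoint with a category filter, as shown in Step #2. The key is a placeholder that you need to replace with your own.

import requests

API_KEY = "<your NewsAPI key>"  # placeholder
params = {"q": "machine learning", "language": "en", "pageSize": 5, "apiKey": API_KEY}
response = requests.get("https://newsapi.org/v2/everything", params=params).json()

for article in response.get("articles", []):
    print(article["title"], "-", article["url"])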

OpenAI API

The OpenAI API key is required to leverage the power of artificial intelligence and natural language processing provided by OpenAI. You can obtain an API key by signing up for an OpenAI account at OpenAI Registration and following their documentation to generate the key associated with your account.

While OpenAI provides powerful language models through its API, it is important to note that there is a cost associated with using the GPT models that depends on the amount of text and the model type you are using. In this tutorial, we will be using the GPT-3.5 Turbo model, which is highly cost-efficient. Because we only process tiny bits of text, each inference typically costs a fraction of a cent. However, the cost might increase if you adjust the code and process more text. Check the OpenAI pricing page for the latest price information.

To give you an idea, my monthly OpenAI costs for running the bot are less than $2:

Twitter Developer API

Finally, you will need to obtain Twitter API keys to interact with the Twitter API and perform actions such as posting tweets. These include the API key, API secret key, access token, and access token secret. You can obtain these keys by creating a Twitter Developer account at the Twitter Developer Portal, setting up a new application, and generating the required keys within the Twitter Developer dashboard. It can take a few days until your application is approved, but the good news is that the basic tier allows you to create up to 1500 tweets for free.

Once you have obtained these API keys, you should store them in a secret management service such as Azure Key Vault rather than in your code. This keeps your keys confidential while still allowing your application to access the necessary services seamlessly.
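As a minimal sketch of this approach, assuming the azure-identity and azure-keyvault-secrets packages are installed and using hypothetical secret names, the keys could be loaded like this instead of being hard-coded:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# the vault URL and secret names are placeholders for your own setup
vault_url = "https://<your-vault-name>.vault.azure.net/"
client = SecretClient(vault_url=vault_url, credential=DefaultAzureCredential())

NEWSAPI_API_KEY = client.get_secret("newsapi-key").value
OPENAI_API_KEY = client.get_secret("openai-key").value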

Also: Accessing Remote Data Sources via REST APIs in Python

The relataly Twitter news bot leverages and integrates the following APIs: NewsAPI, OpenAI, and Twitter.

Creating an OpenAI NewsBot for Twitter using NewsAPI in Python

Now that you are familiar with the architecture of the news bot, it’s time to roll up your sleeves and get into the coding. We will guide you through the necessary steps to create an OpenAI News Bot for Twitter using Python. By leveraging the advanced language model of OpenAI and the vast resources available through NewsAPI, we will develop a sophisticated bot that automatically fetches and tweets the latest news updates on Twitter. Our bot will go beyond basic news retrieval and have the ability to curate relevant news articles within a specific scope and generate compelling tweets that stand out in the noise of social media. Let’s get started!

The code is available on the GitHub repository.

Update August 2023: If you are wondering what the exact code is for the automation of the relataly Twitter account in Azure Functions, I have published it in a separate GitHub repository. In the meantime, I have slightly adjusted the code and added functionality for retrieving articles from Hacker News and posting fact tweets.

OpenAI, with its language models and natural language processing capabilities, can be a valuable tool for automating social media tasks.

Setup and APIs

Before diving into the code, it’s essential to ensure that you have the proper setup for your Python 3 environment and have installed all the necessary packages. If you do not have a Python environment, follow the instructions in this tutorial to set up the Anaconda Python environment. This will provide you with a robust and versatile environment well-suited for machine learning and data science tasks.

In this tutorial, we will be working with the OpenAI library. You can install the OpenAI Python library using one of the following console commands:

  • pip install openai
  • conda install openai (if you are using the Anaconda package manager)

The code below also uses the tweepy, pandas, and requests packages, which you can install the same way.

Step #1 Imports and Authentication

We begin by making the necessary imports and setting up the API keys required to authenticate against the different APIs. In the cloud deployment, I store the API keys securely in Azure Key Vault. In the example below, simple placeholders are used instead; you can also load the API keys from a YAML file or store them directly in the code (not advised for security reasons).

We retrieve and set API keys and authentication credentials for services like NewsAPI, OpenAI, and Twitter. These keys and credentials are essential for authenticating our application and ensuring secure access to the respective APIs. Additionally, we implement logging functionality to record important information about our program’s execution, enabling us to monitor and troubleshoot as needed.

In addition, we assign variables for the CSV file names we’ll use to store our news and fact logs.

By executing these steps, we lay the foundation for seamless integration with external APIs and services, enabling our application to perform a wide range of functionalities in subsequent parts of our code.

# A tutorial for this file is available at www.relataly.com
# Tested with Python 3.9.13, Pandas 1.3.4, OpenAI 0.27.3, Tweepy 4.13.0, Requests 2.26.0

import logging
import random
import csv
import pandas as pd
import openai
import tweepy
import requests

# Set API Keys and Authentication
logging.info('Setting NewsAPI API Key')
NEWSAPI_API_KEY = '<your API key>'  # replace with your own API key

logging.info('Setting OpenAI API Key')
openai.api_key = '<your API key>'  # replace with your own API key

logging.info('Setting Twitter API Key')
auth = tweepy.OAuthHandler('<your API key>',
                           '<your API secret>')
auth.set_access_token('<your access token>',
                      '<your access secret>')
twitter_api = tweepy.API(auth)

# CSV files that track previously posted news titles and fact-tweet terms
# (the facts file name is a suggestion; adjust it to your setup)
CSV_NEWS_NAME = 'news_log.csv'
CSV_FACTS_NAME = 'facts_log.csv'

Step #2 Collecting the News from the NewsAPI

Next, we define a function to retrieve technology news from the NewsAPI.

The function below takes an optional parameter number to specify the desired number of news items. Inside the function, we construct a URL using the NewsAPI endpoint and our API key. We send a request to the URL and receive a JSON response containing the news data. Then, we extract the relevant information, such as the title, description, and URL, from the response.

Using the Pandas library, we create a DataFrame to organize the news items neatly. We filter out any rows with missing values, ensuring we have clean and complete data. Finally, we return the requested number of news items as a DataFrame.

By utilizing the fetch_news function, we can effortlessly access a collection of up-to-date technology news articles. This allows us to stay informed and integrate the latest news seamlessly into our application.

#### NewsAPI
def fetch_news(number=10):
    # Fetch tech news from NewsAPI
    url = f"https://newsapi.org/v2/top-headlines?country=us&category=technology&apiKey={NEWSAPI_API_KEY}"
    response = requests.get(url).json()
    news_items = response["articles"]
    df = pd.DataFrame(news_items)
    df = df[["title", "description", "url"]].dropna()
    return df.head(number)
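For a quick test, you can call the function directly and inspect the returned DataFrame. Note that this consumes one of your NewsAPI requests.

# Preview the five most recent technology headlines
df_news = fetch_news(5)
print(df_news[["title", "url"]])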

Step #3 OpenAI Functions for News Relevance and Tweet Generation


We proceed with the OpenAI integration, which performs two important tasks in our application. First, we utilize the GPT-3.5 Turbo model to determine the relevance of news articles with respect to a specific scope of topics. If you are interested in retrieving other news, you can easily adjust the prompt and include other topics that you deem relevant for your application.

In addition, we define a fallback for the case that the bot does not identify any news as relevant. In this case, the bot creates a fact tweet that explains a common term from the domain of machine learning and data science.

Furthermore, we harness the power of OpenAI to create unique and engaging tweets. By generating prompts that provide a title, a description, and a shortened URL, we can generate informative tweets within the 280-character limit. This enables us to share compelling news content while utilizing hashtags to expand the reach of our tweets. As you can see in the code of the function “select_relevant_news_prompt,” we provide sample responses to ChatGPT. This approach is known as few-shot learning, and it typically improves the results significantly compared to a prompt without any samples (zero-shot learning).

In the second function, “create_tweet_prompt,” we do not provide any sample response to minimize the number of tokens processed and lower the cost of calling the model. This approach is known as zero-shot learning because the model is not given any sample.

#### OpenAI Engine
def openai_request(instructions, task, sample = [], model_engine='gpt-3.5-turbo'):
    prompt = [{"role": "system", "content": instructions }, 
              {"role": "user", "content": task }]
    prompt = sample + prompt
    completion = openai.ChatCompletion.create(model=model_engine, messages=prompt, temperature=0.2, max_tokens=400)
    return completion.choices[0].message.content


#### Define OpenAI prompt for news relevance
def select_relevant_news_prompt(news_articles, topics, n):
    instructions = 'Your task is to examine a list of News and return a list of boolean values that indicate which of the News are in scope of a list of topics. \
    Return a list of True or False values that indicate the relevance of the News.'
    task = f"{news_articles} \n {topics}?"
    sample = [
        {"role": "user", "content": f"[new AI model available from Nvidia, We Exploded the AMD Ryzen 7, Release of b2 Game, XGBoost 3.0 improves Decision Forest Algorithms, New Zelda Game Now Available, Ukraine Uses a New Weapon] \n {topics}?"},
        {"role": "assistant", "content": "[True, False, False, True, False, False]"},
        {"role": "user", "content": f"[Giga Giraffe found in South Africa, We Exploded the AMD Ryzen 7, Release of b2 Game, Donald Trump to make a come back, New Zelda Game Now Available, Ukraine Uses a New Weapon] \n {topics}?"},
        {"role": "assistant", "content": "[False, False, False, False, False, False]"}]
    return instructions, task, sample


#### Define OpenAI prompt for checking overlap with previous posts
def check_previous_posts_prompt(title, old_posts):
    instructions = 'Your objective is to compare a news title with a list of previous news and determine whether it covers a similar topic that was already covered by a previous title. \
        Rate the overlap on a scale between 1 and 10 with 1 being a full overlap and 10 representing an unrelated topic.'
    task =  f"'{title}.' Previous News: {old_posts}."
    sample = [
        {"role": "user", "content": "'Nvidia launches new AI model.' Previous News: [new AI model available from Nvidia, We Exploded the AMD Ryzen 7 7800X3D, The Lara Croft Collection For Switch Has Been Rated By The ESRB]."},
        {"role": "assistant", "content": "1"},
        {"role": "user", "content": "'Big Explosion of an AMD Ryzen 7.' Previous News: [Improving Mental Wellbeing Through Physical Activity, The Lara Croft Collection For Switch Has Been Rated By The ESRB]."},
        {"role": "assistant", "content": "10"},
        {"role": "user", "content": "'new AI model available from Google.' Previous News : [new AI model available from Nvidia, The Lara Croft Collection For Switch Has Been Rated By The ESRB]."},
        {"role": "assistant", "content": "9"},
        {"role": "user", "content": "'What Really Made Geoffrey Hinton Into an AI Doomer - WIRED.' Previous News : [Why AI's 'godfather' Geoffrey Hinton quit Google, new AI model available from Nvidia, The Lara Croft Collection For Switch Has Been Rated By The ESRB]."},
        {"role": "assistant", "content": "4"}]
    return instructions, task, sample


#### Define OpenAI Prompt for News Tweet
def create_tweet_prompt(title, description, tiny_url):
    instructions = f'You are a twitter user that creates tweets with a maximum length of 280 characters.'
    task = f"Create an informative tweet on twitter based on the following news title and description. \
        The tweet must use a maximum of 280 characters. \
        Include the {tiny_url}. But do not include any other urls.\
        Title: {title}. \
        Description: {description}. \
        Use hashtags to reach a wider audience. \
        Do not include any emojis in the tweet"
    return instructions, task


#### Check whether a news title overlaps with previous posts
def previous_post_check(title, old_posts):
    instructions, task, sample = check_previous_posts_prompt(title, old_posts)
    response = openai_request(instructions, task, sample)
    return eval(response)  # convert the returned rating (1-10) to a number


#### Define OpenAI prompt for the fact tweet
def create_fact_tweet_prompt(old_terms):
    instructions = 'You are a twitter user that creates tweets with a length below 280 characters.'
    task = "Choose a technical term from the field of AI, machine learning or data science. Then create a twitter tweet that describes the term. Just return a python dictionary with the term and the tweet. "
    # avoid terms that were already covered in previous fact tweets
    if old_terms != []:
        avoid_terms = f'Avoid the following terms, because you have previously tweeted about them: {old_terms}'
        task = task + avoid_terms
    sample = [
        {"role": "user", "content": f"Choose a technical term from the field of AI, machine learning or data science. Then create a twitter tweet that describes the term. Just return a python dictionary with the term and the tweet."},
        {"role": "assistant", "content": "{'GradientDescent': '#GradientDescent is a popular optimization algorithm used to minimize the error of a model by adjusting its parameters. \
         It works by iteratively calculating the gradient of the error with respect to the parameters and updating them accordingly. #ML'}"}]
    return instructions, task, sample

# Load previously posted titles from a CSV file (create the file if it does not exist)
def get_history_from_csv(csv_name):
    try:
        # try loading the csv file
        df = pd.read_csv(csv_name)
    except (FileNotFoundError, pd.errors.EmptyDataError):
        # create an empty log file
        df = pd.DataFrame(columns=['title'])
        df.to_csv(csv_name, index=False)
    return df
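To see the relevance check in isolation, you can combine the prompt builder with the request helper as in the short example below. The headlines are made up and the printed list is only an illustrative response; each call incurs a small API cost.

# Classify a few made-up headlines against a topic list
titles = ["OpenAI releases new model", "Local football results", "New GPU benchmark leaked"]
topics = "[AI, Machine Learning, Data Science]"
instructions, task, sample = select_relevant_news_prompt(titles, topics, len(titles))
print(openai_request(instructions, task, sample))  # e.g. "[True, False, False]"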

Step #4 Functions for Publishing Twitter Tweets


Next, we will create a series of functions that enable us to create and post tweets based on specific news articles. The first function, check_tweet_length, checks the length of a tweet and ensures it does not exceed the 280-character limit. If the tweet is too long, it returns False; otherwise, it returns True.

Given the character limit of Twitter tweets at only 280 characters, including a URL can often take up a significant amount of that limit. To avoid wasting precious characters on lengthy URLs, we can use a URL-shortening service called TinyURL. This service provides a shortened version of any input URL, allowing us to fit more text within the character limit. The good news is that TinyURL offers an API that we can use without the need for an API key.

The create_news_tweet function takes the title, description, and URL of a news article as inputs. It generates a tiny URL using the create_tiny_url function and constructs a prompt for tweet creation using the create_tweet_prompt function. The prompt is then sent to the OpenAI engine to generate the tweet content. The generated tweet is checked for length using the check_tweet_length function, and if it passes the length check, it is posted using the Twitter API.

By utilizing these functions together, we can effectively create and post tweets based on news articles while ensuring they meet the length requirements and include relevant information for our audience. This streamlines the process of sharing news updates and engaging with our followers in a concise and informative manner.

def check_tweet_length(tweet):
    return False if len(tweet) > 280 else True


# Create the fact tweet
def create_fact_tweet(chance_for_tweet=0.5):
    df_old_facts = get_history_from_csv(CSV_FACTS_NAME)

    if random.random() < chance_for_tweet:
        # create a fact tweet about a term that has not been covered recently
        instructions, task, sample = create_fact_tweet_prompt(list(df_old_facts.tail(10)['title']))
        tweet = openai_request(instructions, task, sample)
        tweet_dict = eval(tweet)
        term = list(tweet_dict.keys())[0]
        tweet_text = list(tweet_dict.values())[0]

        print(f'Creating fact tweet: {tweet_text}')

        # check tweet length and post the tweet
        if check_tweet_length(tweet_text):
            twitter_api.update_status(tweet_text)
            # save the term in the csv file so it is not repeated
            with open(CSV_FACTS_NAME, 'a', newline='') as file:
                writer = csv.writer(file)
                writer.writerow([term])
        else:
            print('error: tweet too long')
    else:
        print('No fact tweet created')
    

def create_news_tweet(title, description, url):
    # create a tiny url to save characters in the tweet
    tiny_url = create_tiny_url(url)

    # define prompt for tweet creation
    instructions, task = create_tweet_prompt(title, description, tiny_url)
    tweet_text = openai_request(instructions, task)

    print(f'Creating new tweet: {tweet_text}')
    # check tweet length and post the tweet
    if check_tweet_length(tweet_text):
        status = twitter_api.update_status(tweet_text)
        print(f"Tweeted: {title}")
        # save the title in the csv file so it is not posted again
        with open(CSV_NEWS_NAME, 'a', newline='') as file:
            writer = csv.writer(file)
            writer.writerow([title])
    else:
        status = 'error: tweet too long'
    return status


def create_tiny_url(url):
    response = requests.get(f'http://tinyurl.com/api-create.php?url={url}')
    return response.text
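A quick way to verify the shortener is to call it with any URL; TinyURL returns the shortened link as plain text (the example output is illustrative).

# Shorten a URL before embedding it in a tweet
print(create_tiny_url("https://www.relataly.com"))  # e.g. https://tinyurl.com/abc123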

Step #5 Bringing it All Together – Running the Bot


Finally, we reach the exciting part where we can run our Twitter news bot. The code below utilizes the functions we’ve created earlier to compose a tweet and post it on Twitter.

First, we call the create_news_tweet(title, description, url) function, which takes the title, description, and URL of a news article. This function generates a shortened URL and prepares a prompt for the tweet. It then generates the actual content of the tweet from that prompt with the help of the openai_request(instructions, task) function.

Once the tweet is generated and assigned to the variable tweet_text, we check its length using the check_tweet_length(tweet_text) function. This step ensures that the tweet does not exceed the maximum character limit allowed by Twitter. If the tweet is within the allowed limit, we post it on Twitter using the twitter_api.update_status(tweet_text) function and log the posted title for reference.

If the tweet exceeds the character limit, we assign an error message to the status variable.

Finally, the function returns the status variable, which can be used to analyze any potential issues or handle errors.

This code segment brings everything together, allowing our Twitter news bot to automatically create and publish tweets based on news articles. You can wrap this function with a timer to run it at regular intervals, as sketched after the example output below.

#### Main Bot
def main_bot():
    # Read the titles of previously posted news from the CSV log
    # (the file is created if it does not exist yet)
    df_old_news = get_history_from_csv(CSV_NEWS_NAME)
    df_old_news = df_old_news.tail(16)
    # Fetch news data
    df = fetch_news(12)

    # Check the relevance of the news and filter out those that are not relevant
    relevant_topics = "[AI, Machine Learning, Data Science, OpenAI, Artificial Intelligence, Data Mining, Data Analytics]"
    instructions, task, sample = select_relevant_news_prompt(list(df['title']), relevant_topics, len(df))
    relevance = openai_request(instructions, task, sample)
    relevance_list = eval(relevance)

    s = 0  # flag that limits the bot to one news tweet per run
    df = df[relevance_list]
    if len(df) > 0:
        for index, row in df.iterrows():
            if s == 1:
                break
            logging.info('info:' + row['title'])
            title = row['title'].replace("'", "")
            description = row['description']
            url = row['url']

            if title not in df_old_news.title.values:
                # rate the overlap with recent posts (1 = full overlap, 10 = unrelated)
                duplicate_check = previous_post_check(title, list(df_old_news.tail(10)['title']))
                if duplicate_check > 3:
                    # Create a tweet
                    response = create_news_tweet(title, description, url)
                    s = 1
                else:
                    print(f"Already tweeted: {title}")
            else:
                print("No news articles found")
                create_fact_tweet(chance_for_tweet=0.5)

main_bot()
No news articles found
No fact tweet created
No news articles found
Creating fact tweet: Looking for a way to cluster your data? Try #KMeans! It is an unsupervised learning algorithm that groups similar data points together based on their distance to a centroid. #MachineLearning #DataScience
Creating tweet: {'KMeans': 'Looking for a way to cluster your data? Try #KMeans! It is an unsupervised learning algorithm that groups similar data points together based on their distance to a centroid. #MachineLearning #DataScience'}
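If you want to try the hourly schedule locally before moving to the cloud, a plain loop like the sketch below is enough. In production, a cloud scheduler such as an Azure Functions timer trigger is the better choice, as described in the next section.

import time

def run_forever(interval_seconds=3600):
    # call the bot once per interval and keep going even if a single run fails
    while True:
        try:
            main_bot()
        except Exception as err:
            logging.error(f"Bot run failed: {err}")
        time.sleep(interval_seconds)

# run_forever()  # uncomment to start the hourly loop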

Hosting the Bot Serverless on Azure Functions

If you want to run the bot fully automated, you will probably want to host it somewhere in the cloud. To host the OpenAI newsbot on Azure, you can follow these steps:

  1. Create an Azure Key Vault to store the credentials securely.
  2. Create an Azure Blob Storage account to keep track of past posts.
  3. Create two Azure Functions. The first function fetches news on a regular interval and checks that only relevant news is posted. This function also ensures that there are no duplicates with the help of OpenAI and a CSV file that keeps track of previous posts. The second function is an HTTP trigger and is used to create the post. It reaches out to OpenAI to generate the post and then makes the call to the Twitter API.
  4. Once the functions are created, deploy the code to Azure Functions.
  5. Configure the Azure Functions to use the credentials from the Key Vault and the Blob Storage account to store the CSV file.

The code for the newsbot is only slightly different from what was presented in the blog post. If you’re interested in the Azure-ready code, feel free to message me.
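To give a rough idea, here is a sketch of what the timer-triggered function could look like with the Azure Functions Python v1 programming model. The module name and schedule are assumptions, and the actual relataly code differs slightly from this.

# __init__.py of a timer-triggered Azure Function; the schedule lives in the
# accompanying function.json (e.g. "0 0 * * * *" for an hourly run)
import logging
import azure.functions as func

from newsbot import main_bot  # hypothetical module containing the bot logic

def main(mytimer: func.TimerRequest) -> None:
    if mytimer.past_due:
        logging.info('The timer is past due.')
    logging.info('Starting the news bot run.')
    main_bot()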

The Relataly News Bot runs on a serverless infrastructure based on Azure Functions.

Summary

In this article, we have developed a Twitter news bot that automates news updates on the platform. Our focus was on leveraging OpenAI’s advanced language model to create a powerful bot capable of generating and posting engaging tweets based on news articles. Throughout the discussion, we covered essential aspects such as tweet length validation, concise URL generation, crafting captivating tweet prompts, and seamless integration with the Twitter API for efficient posting.

By following the step-by-step guide presented in this blog, you can harness the capabilities of OpenAI and build your own Twitter news bot. This enables you to effortlessly deliver timely updates to your audience, expanding your reach and fostering meaningful engagement within the Twitter community.

Whether you have questions, suggestions, or insights, let us know in the comments below. Your feedback is invaluable to us as we strive to continuously improve.
