Tech Frontline Mar 19, 2026 5 min read

Prompt Chaining for Supercharged AI Workflows: Practical Examples

Discover how to link prompts together for powerful, multi-step AI workflows that save you hours.

Tech Daily Shot Team
Published Mar 19, 2026

Prompt chaining is a powerful technique for building advanced AI workflows. By linking the output of one AI prompt to the input of another, you can perform complex, multi-step reasoning, automate content creation, and build robust data pipelines. This tutorial provides a practical, hands-on guide to prompt chaining using Python, the OpenAI API, and the LangChain library.

Prerequisites

Before you begin, make sure you have:

  • Python 3.9 or later installed
  • An OpenAI account and API key
  • Basic familiarity with Python and the command line

Step 1: Set Up Your Project Environment

  1. Create and activate a virtual environment (optional but recommended):
    python -m venv ai-prompt-chaining
    cd ai-prompt-chaining
    source bin/activate  # On Windows use: .\Scripts\activate
  2. Install required packages:
pip install openai langchain langchain-openai python-dotenv
  3. Set your OpenAI API key as an environment variable:
    • Create a file named .env in your project directory:
    echo "OPENAI_API_KEY=sk-..." > .env
    • Replace sk-... with your actual API key.

Step 2: Understand the Prompt Chaining Concept

In prompt chaining, you use the output of one AI prompt as the input for the next. This enables complex, multi-step tasks, such as:

  • Summarizing a document, then generating questions from the summary
  • Extracting structured data, then reformatting or translating it
  • Drafting content, then reviewing and refining it in a second pass

We'll demonstrate prompt chaining by building a workflow that:

  1. Summarizes a news article
  2. Generates quiz questions from the summary
  3. Creates answers to those questions
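Before touching any API, the core idea can be sketched as plain function composition. This toy example involves no AI at all (`chain` is an illustrative helper, not a library function); it just shows how each step's output becomes the next step's input:

```python
def chain(*steps):
    """Compose steps so each step's output feeds the next step's input."""
    def run(text):
        for step in steps:
            text = step(text)
        return text
    return run

# Toy "prompts" as plain functions, just to show the data flow
summarize = lambda article: f"summary({article})"
make_questions = lambda summary: f"questions({summary})"

pipeline = chain(summarize, make_questions)
print(pipeline("article text"))  # → questions(summary(article text))
```

In the real workflow below, each step is an API call instead of a lambda, but the data flow is exactly the same.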

Step 3: Basic Prompt Chaining with OpenAI API

  1. Load your API key and set up the OpenAI client:
    
    import os
    from dotenv import load_dotenv
    from openai import OpenAI
    
    load_dotenv()  # reads OPENAI_API_KEY from .env into the environment
    client = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
          
  2. Define a helper function to call the OpenAI API:
    
    def ask_openai(prompt, model="gpt-3.5-turbo", temperature=0.7):
        response = client.chat.completions.create(
            model=model,
            messages=[{"role": "user", "content": prompt}],
            temperature=temperature,
            max_tokens=512,
        )
        return response.choices[0].message.content.strip()
          
  3. Step 1: Summarize a news article
    
    news_article = """
    NASA's Mars helicopter Ingenuity has completed its 50th flight on the Red Planet. 
    The helicopter, which arrived with the Perseverance rover, was originally designed 
    for just five flights. Ingenuity has far exceeded expectations, capturing aerial 
    images and assisting the rover in navigation. NASA engineers continue to push 
    the boundaries of what the small helicopter can achieve in the challenging Martian environment.
    """
    
    summary_prompt = f"Summarize this article in 3 sentences:\n{news_article}"
    summary = ask_openai(summary_prompt)
    print("Summary:\n", summary)
          

    Expected output (example):
    NASA's Mars helicopter Ingenuity has completed its 50th flight, far surpassing its original five-flight mission. The helicopter has provided valuable aerial images and navigation support for the Perseverance rover. Engineers are continuing to explore the helicopter's capabilities in Mars' harsh environment.

  4. Step 2: Generate quiz questions from the summary
    
    questions_prompt = f"Based on this summary, generate 3 quiz questions:\n{summary}"
    questions = ask_openai(questions_prompt)
    print("Quiz Questions:\n", questions)
          

    Example output: 1. How many flights has NASA's Mars helicopter Ingenuity completed? 2. What was the original mission goal for Ingenuity? 3. How has Ingenuity assisted the Perseverance rover?

  5. Step 3: Generate answers to the quiz questions
    
    answers_prompt = f"Provide concise answers to these questions:\n{questions}\nBased on the summary:\n{summary}"
    answers = ask_openai(answers_prompt)
    print("Answers:\n", answers)
          

    Example output: 1. Ingenuity has completed 50 flights. 2. The original goal was just five flights. 3. Ingenuity has provided aerial images and navigation support.
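For reuse, the three calls above can be bundled into a single function. `run_quiz_chain` below is an illustrative helper (not part of the OpenAI SDK): it accepts any prompt-to-text callable, such as the `ask_openai` helper, which also lets you verify the wiring with a stub instead of live API calls:

```python
def run_quiz_chain(article, ask):
    """Chain three prompts: summarize, generate questions, answer them.
    `ask` is any callable mapping a prompt string to a response string."""
    summary = ask(f"Summarize this article in 3 sentences:\n{article}")
    questions = ask(f"Based on this summary, generate 3 quiz questions:\n{summary}")
    answers = ask(
        f"Provide concise answers to these questions:\n{questions}\n"
        f"Based on the summary:\n{summary}"
    )
    return {"summary": summary, "questions": questions, "answers": answers}

# Stub `ask` (echoes the first prompt line) to check wiring without an API call
fake_ask = lambda prompt: prompt.splitlines()[0]
result = run_quiz_chain("Mars helicopter news.", fake_ask)
print(sorted(result))  # → ['answers', 'questions', 'summary']
```

With a real key configured, `run_quiz_chain(news_article, ask_openai)` runs the full chain.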

Step 4: Advanced Prompt Chaining with LangChain

LangChain is a popular Python library for building composable AI workflows. It makes prompt chaining more robust and reusable.

  1. Import LangChain components:
    
    from langchain_openai import OpenAI  # pip install langchain-openai
    from langchain.prompts import PromptTemplate
    from langchain.chains import LLMChain, SequentialChain
          
  2. Initialize the OpenAI LLM:
    
    llm = OpenAI(openai_api_key=os.getenv("OPENAI_API_KEY"), temperature=0.7)
          
  3. Define prompt templates for each step:
    
    summary_template = PromptTemplate(
        input_variables=["article"],
        template="Summarize this article in 3 sentences:\n{article}",
    )
    
    questions_template = PromptTemplate(
        input_variables=["summary"],
        template="Based on this summary, generate 3 quiz questions:\n{summary}",
    )
    
    answers_template = PromptTemplate(
        input_variables=["questions", "summary"],
        template="Provide concise answers to these questions:\n{questions}\nBased on the summary:\n{summary}",
    )
          
  4. Create LangChain chains for each step:
    
    summary_chain = LLMChain(llm=llm, prompt=summary_template, output_key="summary")
    questions_chain = LLMChain(llm=llm, prompt=questions_template, output_key="questions")
    answers_chain = LLMChain(llm=llm, prompt=answers_template, output_key="answers")
          
  5. Combine the chains into a sequential workflow:
    
    overall_chain = SequentialChain(
        chains=[summary_chain, questions_chain, answers_chain],
        input_variables=["article"],
        output_variables=["summary", "questions", "answers"],
        verbose=True,
    )
          
  6. Run the full prompt chaining workflow:
    
    inputs = {"article": news_article}
    results = overall_chain.invoke(inputs)
    print("Summary:\n", results["summary"])
    print("\nQuestions:\n", results["questions"])
    print("\nAnswers:\n", results["answers"])
          

    When run, the terminal displays the summary, followed by three quiz questions, and then concise answers, all generated automatically via chained prompts.
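Conceptually, SequentialChain threads a growing dictionary of variables through each chain, so every output_key becomes available as an input for later steps. Here is a stripped-down, plain-Python sketch of that idea (an illustration, not LangChain's actual implementation):

```python
def run_sequential(steps, inputs):
    """Each step is (fn, input_keys, output_key). Outputs accumulate in
    one dict, so later steps can read any earlier result by name."""
    state = dict(inputs)
    for fn, input_keys, output_key in steps:
        state[output_key] = fn(**{k: state[k] for k in input_keys})
    return state

# Toy stand-ins for the three LLM chains
steps = [
    (lambda article: f"summary of: {article}", ["article"], "summary"),
    (lambda summary: f"questions on: {summary}", ["summary"], "questions"),
    (lambda questions, summary: f"answers to: {questions}", ["questions", "summary"], "answers"),
]
state = run_sequential(steps, {"article": "Mars news"})
print(state["answers"])  # → answers to: questions on: summary of: Mars news
```

This is why the `answers` step can declare both `questions` and `summary` as input variables: everything produced earlier stays in scope.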

Step 5: Customizing and Expanding Prompt Chains

Prompt chaining is highly flexible. Here are a few ways to extend your workflow:

  • Insert additional steps, such as translation, sentiment analysis, or formatting
  • Validate each step's output before passing it along to the next prompt
  • Use different models or temperature settings for different steps

Example: Adding a translation step (English to Spanish):


from langchain.prompts import PromptTemplate

translate_template = PromptTemplate(
    input_variables=["summary"],
    template="Translate this summary to Spanish:\n{summary}",
)

translate_chain = LLMChain(llm=llm, prompt=translate_template, output_key="spanish_summary")

overall_chain = SequentialChain(
    chains=[summary_chain, translate_chain, questions_chain, answers_chain],
    input_variables=["article"],
    output_variables=["summary", "spanish_summary", "questions", "answers"],
    verbose=True,
)
results = overall_chain.invoke({"article": news_article})
print("Spanish Summary:\n", results["spanish_summary"])
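Each added step is also a chance to check the model's work before the chain continues. Here is a toy guardrail (an illustrative helper, not a LangChain feature) that verifies the quiz step produced the expected number of numbered questions:

```python
import re

def validate_questions(questions, expected=3):
    """Raise if the text does not contain `expected` numbered questions."""
    found = re.findall(r"^\s*\d+[.)]", questions, flags=re.MULTILINE)
    if len(found) != expected:
        raise ValueError(f"expected {expected} questions, got {len(found)}")
    return questions

sample = "1. How many flights?\n2. What was the goal?\n3. How did it help?"
print(validate_questions(sample) == sample)  # → True
```

Inserting a check like this between chain steps catches malformed output early, instead of letting it propagate into later prompts.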
  

Common Issues & Troubleshooting

  • Authentication errors: confirm that your .env file exists, contains OPENAI_API_KEY, and that load_dotenv() runs before the client is created.
  • Rate limit errors: each run of the chain makes several API calls, so add delays or retries if you hit limits.
  • Import errors: the openai and langchain packages have had breaking releases; make sure your imports match the versions you installed.
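Rate limits are the most common failure in long chains, since every run makes several API calls in sequence. A minimal backoff wrapper (an illustrative helper, not part of the openai package) can wrap any prompt-calling function:

```python
import time

def with_retries(fn, attempts=3, base_delay=1.0):
    """Retry `fn` with exponential backoff on any exception."""
    def wrapped(prompt):
        for attempt in range(attempts):
            try:
                return fn(prompt)
            except Exception:
                if attempt == attempts - 1:
                    raise
                time.sleep(base_delay * 2 ** attempt)
    return wrapped

# Demo with a stub that fails once, then succeeds
calls = []
def flaky(prompt):
    calls.append(prompt)
    if len(calls) == 1:
        raise RuntimeError("rate limited")
    return "ok"

safe_ask = with_retries(flaky, base_delay=0.01)
print(safe_ask("hello"))  # → ok
```

In the real workflow, `with_retries(ask_openai)` gives you a drop-in replacement for the helper defined earlier.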

Next Steps

  • Run the workflow on your own articles or documents
  • Experiment with different models, temperatures, and prompt wording
  • Explore the LangChain documentation for more chain types and integrations

Prompt chaining empowers you to build sophisticated AI workflows with simple code. With practice, you'll be able to automate research, content creation, and data analysis tasks that would otherwise take hours of manual effort. Happy chaining!

