Product updates
– 5 min read
Writer is now available for LangChain: Build powerful AI applications with ease

The Writer LangChain integration enables developers to build sophisticated AI applications by bringing Writer’s enterprise-grade AI capabilities into the LangChain ecosystem. Key features include ChatWriter for high-quality text generation; tool calling for accessing external functions, Knowledge Graph, and no-code applications; and document processing tools for parsing PDFs and splitting text, all of which can be smoothly integrated into complex workflows. An example use case demonstrates how to query Knowledge Graph, Writer’s graph-based RAG system, for product feature details and then generate release notes and a blog article grounded in that information.
As AI applications become more sophisticated, developers need tools that make it easier to build complex workflows without sacrificing flexibility or performance. Our Writer LangChain integration bridges this gap, allowing you to use Writer’s enterprise-grade AI capabilities within the popular LangChain ecosystem. This powerful combination lets you quickly build production-ready applications with advanced features like Knowledge Graph integration, PDF parsing, and context-aware text splitting.
Key features
The Writer LangChain integration consists of several key components that work together to enable you to build sophisticated AI applications.
- ChatWriter is a LangChain chat model that connects to Writer Palmyra models for high-quality text generation with support for:
  - Streaming responses
  - Batch processing for efficient throughput
  - Asynchronous operations for non-blocking applications
  - Customizable parameters like temperature and max tokens
- Tool calling enables Palmyra to access external functions and built-in Writer features, including:
  - Easy-to-use decorator syntax for defining tools
  - GraphTool for Knowledge Graph integration
  - LLMTool for model delegation
  - NoCodeAppTool for using AI Studio no-code applications as tools
  - Smooth integration with LangChain’s tool ecosystem
- Document processing tools let you parse PDFs and intelligently split text:
  - PDFParser for extracting text from PDFs in plain text or markdown format
  - WriterTextSplitter for context-aware document chunking with three strategies:
    - llm_split for precise semantic chunking
    - fast_split for quick heuristic-based splitting
    - hybrid_split for balancing speed and quality
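To make the trade-off between these strategies concrete, here is a hedged, stdlib-only sketch of what a fast, heuristic-based splitting strategy looks like. This is not the WriterTextSplitter implementation (which runs server-side); it is a toy illustration: split on paragraph boundaries, then greedily pack paragraphs into chunks under a size budget, trading semantic precision for speed.

```python
# Toy illustration of heuristic splitting -- NOT the actual
# WriterTextSplitter, just a sketch of the fast, rule-based approach.
def fast_split_sketch(text: str, max_chars: int = 200) -> list:
    """Greedily pack paragraphs into chunks of at most max_chars."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    chunks = []
    current = ""
    for para in paragraphs:
        candidate = f"{current}\n\n{para}" if current else para
        if len(candidate) <= max_chars:
            # Paragraph still fits in the current chunk
            current = candidate
        else:
            # Budget exceeded: close the chunk and start a new one
            if current:
                chunks.append(current)
            current = para
    if current:
        chunks.append(current)
    return chunks

doc = ("First paragraph about features.\n\n"
       "Second paragraph with more detail.\n\n"
       "Third paragraph wrapping things up.")
chunks = fast_split_sketch(doc, max_chars=70)
```

A semantic strategy like llm_split would instead ask a model where the meaningful boundaries are, which is slower but keeps related content together; hybrid_split sits between the two.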
These components can be used individually or combined to create powerful workflows. For example, you can:
- Connect your chat application to Knowledge Graph for accurate, grounded responses
- Create complex, multi-step workflows that involve domain-specific models and no-code text generation or research assistant applications
- Parse a PDF document and split it into semantic chunks
The integration follows LangChain’s design patterns, making it familiar to LangChain users while adding Writer’s unique capabilities.
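One of those familiar patterns is the decorator syntax for defining tools. The real integration uses langchain_core's tool decorator; the stdlib-only sketch below is a toy registry that only illustrates the shape of the pattern: a plain function plus its docstring becomes a named, described capability the model can request.

```python
# Toy registry illustrating the decorator pattern for tools.
# The actual integration uses langchain_core's @tool decorator.
TOOLS = {}

def tool(fn):
    """Register a plain function as a callable 'tool' with a description."""
    TOOLS[fn.__name__] = {
        "description": (fn.__doc__ or "").strip(),
        "callable": fn,
    }
    return fn

@tool
def get_word_count(text: str) -> int:
    """Return the number of whitespace-separated words in text."""
    return len(text.split())

# A tool-calling model receives the name and description, then asks
# the application to run the matching callable with its arguments.
result = TOOLS["get_word_count"]["callable"]("Writer LangChain integration")
```

In the real integration, binding tools to a ChatWriter instance hands these name/description schemas to Palmyra, which decides when to call them.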
Example use case: Building product release notes and blog articles
Let’s look at an example that queries a Knowledge Graph for software product feature details and then uses LangChain’s ChatPromptTemplate to create release notes with Palmyra X 004 and a blog article using Palmyra Creative:
from langchain_writer import ChatWriter
from langchain_writer.tools import GraphTool, LLMTool
from langchain_core.messages import HumanMessage
from langchain_core.prompts import ChatPromptTemplate
from dotenv import load_dotenv

load_dotenv()

# Initialize the chat model; defaults to Palmyra X 004
chat = ChatWriter()

# Create a graph tool with your Knowledge Graph ID
graph_tool = GraphTool(graph_ids=["41eaf692-2bb6-46dc-a1e3-b09d4efe7b86"])

# Create an LLM tool with Palmyra Creative for content generation
creative_llm_tool = LLMTool(
    model_name="palmyra-creative",
    description="A creative model that can generate engaging blog articles about software products."
)

# Bind each Writer tool to a separate chat model instance
chat_with_graph = chat.bind_tools([graph_tool])
chat_with_creative = chat.bind_tools([creative_llm_tool])

# Define content generation templates
RELEASE_NOTES_TEMPLATE = ChatPromptTemplate.from_messages([
    ("system", "You are a technical writer specializing in software release notes."),
    ("human", "Create professional release notes for version 1.0 of our product based on these features: {features}. "
              "Include sections for new features, improvements, bug fixes, and known issues. "
              "Use a clear, concise style appropriate for technical users.")
])

BLOG_ARTICLE_TEMPLATE = ChatPromptTemplate.from_messages([
    ("system", "You are a creative product marketing writer who creates engaging blog content."),
    ("human", "Write an engaging blog article announcing version 1.0 of our product with these features: {features}. "
              "The article should be creative, highlight the benefits of the new features, and have a compelling call-to-action. "
              "Keep it under 600 words and make it exciting for potential customers.")
])

# Query the knowledge graph for product features
def get_product_features():
    kg_query = [
        HumanMessage(
            "What are the key features of our monitoring and observability product? Please provide detailed descriptions."
        )
    ]
    kg_response = chat_with_graph.invoke(kg_query)
    return kg_response.content

# Format a template with the retrieved features and generate content
def generate_product_content(template, features):
    formatted_prompt = template.format_messages(features=features)
    response = chat_with_creative.invoke(formatted_prompt)
    return response.content

def generate_all_product_content():
    features = get_product_features()
    release_notes = generate_product_content(RELEASE_NOTES_TEMPLATE, features)
    blog_article = generate_product_content(BLOG_ARTICLE_TEMPLATE, features)
    return {
        "features": features,
        "release_notes": release_notes,
        "blog_article": blog_article
    }

if __name__ == "__main__":
    generate_all_product_content()
This example shows how easily you can query your Knowledge Graph for accurate, up-to-date information when needed, while delegating requests to the right model for each type of content.
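Because ChatWriter also supports asynchronous operations, the two generation steps above are a natural fit for concurrent execution rather than running back to back. Here is a hedged sketch of that pattern using stub coroutines; the stubs stand in for the real asynchronous model calls, which require a Writer API key and network access.

```python
import asyncio

# Stub coroutines stand in for asynchronous chat model calls;
# the real calls require a Writer API key and network access.
async def generate_release_notes(features: str) -> str:
    await asyncio.sleep(0)  # placeholder for the real network call
    return f"Release notes covering: {features}"

async def generate_blog_article(features: str) -> str:
    await asyncio.sleep(0)  # placeholder for the real network call
    return f"Blog article highlighting: {features}"

async def generate_all_concurrently(features: str) -> dict:
    # Run both generations concurrently instead of sequentially
    notes, article = await asyncio.gather(
        generate_release_notes(features),
        generate_blog_article(features),
    )
    return {"release_notes": notes, "blog_article": article}

content = asyncio.run(generate_all_concurrently("alerting, dashboards"))
```

With real model calls, running the two independent generations concurrently roughly halves the wall-clock time of the content step.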
Check out the full guide on using Writer with LangChain for more details on each component of the integration.
Get started today
The Writer LangChain integration makes it easy to build powerful AI applications that leverage the best of both platforms. Ready to start building? Install langchain-writer using:
pip install langchain-writer
Check out the GitHub repository for full documentation and examples. We’re excited to see what you build!