
AI Agent

The AI Agent enables natural language pipeline creation and management through a chat interface.

Instead of manually configuring pipelines, describe what you want in plain English:

“Create a pipeline that syncs HubSpot contacts to BigQuery every 6 hours”

The agent will:

  1. Identify the source and destination
  2. Suggest configuration options
  3. Create the pipeline with your approval

To create a pipeline with the agent:

  1. Open the Chat Interface

    Click “AI Agent” in the sidebar.

  2. Describe Your Pipeline

    Examples:

    • “I want to sync Kafka messages to BigQuery”
    • “Set up a daily export from HubSpot to our data warehouse”
    • “Create a pipeline for streaming orders to analytics”
  3. Answer Clarifying Questions

    The agent may ask about:

    • Specific streams to sync
    • Authentication credentials
    • Schedule preferences
  4. Review and Confirm

    Before creating, the agent shows:

    • Full configuration preview
    • Estimated sync frequency
    • Resource requirements
  5. Pipeline Created

    The agent creates the pipeline (disabled by default) and provides a link to view it.
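The create-disabled behavior in step 5 can be sketched with a hypothetical pipeline record; the class name and fields below are illustrative assumptions, not the platform's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Pipeline:
    """Hypothetical pipeline record; the real platform model may differ."""
    name: str
    source: str
    destination: str
    schedule: str           # cron expression
    enabled: bool = False   # created disabled so the user can review it first

    def enable(self) -> None:
        self.enabled = True

# The agent creates the pipeline disabled; the user enables it after review.
p = Pipeline("hubspot-contacts", "hubspot.contacts",
             "bigquery.my-analytics.crm_data", "0 */6 * * *")
print(p.enabled)  # False
p.enable()
print(p.enabled)  # True
```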

Example conversation:

User: Create a HubSpot contacts pipeline to BigQuery
Agent: I'll help you create that pipeline. I need a few details:
1. Which HubSpot stream? (contacts, companies, deals)
2. What's your BigQuery project ID and dataset?
3. How often should it run?
User: Contacts, project my-analytics, dataset crm_data, every 6 hours
Agent: Perfect! Here's the pipeline configuration:
- Source: HubSpot > contacts
- Destination: BigQuery > my-analytics.crm_data
- Schedule: Every 6 hours (0 */6 * * *)
Should I create this pipeline?

User: What sources are available?
Agent: Bizon Platform supports these sources:
- HubSpot (contacts, companies, deals, tickets)
- Kafka (streaming topics)
- Custom sources (build your own)
...

User: Change the HubSpot pipeline to run every 4 hours
Agent: I've updated the schedule from "0 */6 * * *" to "0 */4 * * *".
The pipeline will now run every 4 hours.
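The cron expressions exchanged above determine the run times; a minimal sketch of how an `*/N` hour field expands into hours of the day (the helper is illustrative and not a full cron parser):

```python
def run_hours(cron: str) -> list[int]:
    """Return the hours a simple '0 */N * * *' or '0 H * * *' cron fires."""
    minute, hour, *_ = cron.split()
    if hour.startswith("*/"):            # step expression: every N hours
        step = int(hour[2:])
        return list(range(0, 24, step))
    return [int(hour)]                   # fixed hour

print(run_hours("0 */6 * * *"))  # [0, 6, 12, 18]
print(run_hours("0 */4 * * *"))  # [0, 4, 8, 12, 16, 20]
```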

The agent uses LangGraph with:

  • PostgreSQL Checkpointer - Conversation state persists across sessions
  • SSE Streaming - Real-time token delivery for smooth chat experience
  • Tool Calls - Structured actions for connector discovery and pipeline creation
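The SSE streaming bullet above refers to the `text/event-stream` wire format, in which each token is framed as a `data:` event terminated by a blank line. A minimal stdlib sketch of that framing (the platform's actual server code is not shown in this doc):

```python
def sse_event(token: str, event: str = "message") -> str:
    """Format one token as a Server-Sent Events frame (blank line ends the event)."""
    return f"event: {event}\ndata: {token}\n\n"

# Streaming a response token by token concatenates one frame per token.
stream = "".join(sse_event(t) for t in ["I'll", " help", " you"])
print(stream)
```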
The agent exposes these tools:

Tool                Description
list_sources        Get available source connectors
list_destinations   Get available destinations
create_pipeline     Create a new pipeline
get_pipeline        Get pipeline details
update_pipeline     Modify existing pipeline
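The tools above can be pictured as a name-to-function registry that the agent dispatches structured calls against; a hypothetical sketch with placeholder data (not the platform's real LangGraph tool schemas):

```python
from typing import Callable

TOOLS: dict[str, Callable] = {}

def tool(fn: Callable) -> Callable:
    """Register a function under its name so the agent can dispatch tool calls to it."""
    TOOLS[fn.__name__] = fn
    return fn

@tool
def list_sources() -> list[str]:
    # Placeholder data; the real connector catalog comes from the platform.
    return ["hubspot", "kafka"]

@tool
def create_pipeline(name: str, source: str, destination: str, schedule: str) -> dict:
    return {"name": name, "source": source, "destination": destination,
            "schedule": schedule, "enabled": False}  # disabled by default

# A structured tool call is just a name plus arguments.
call = {"tool": "list_sources", "args": {}}
print(TOOLS[call["tool"]](**call["args"]))  # ['hubspot', 'kafka']
```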

A specialized agent assists with custom source development:

User: Help me build a custom source for the Notion API
Agent: I'll help you create a Notion source connector.
Please provide the Notion API documentation URL.
User: https://developers.notion.com/reference
Agent: Analyzing the documentation...
I've generated a custom source with:
- Streams: pages, databases, users
- Authentication: API key (Bearer token)
[Shows generated code]
Would you like to preview it with your credentials?

Set your preferred AI model:

# OpenAI
LLM_PROVIDER=openai
LLM_MODEL=gpt-4o-mini
OPENAI_API_KEY=sk-xxx
# Anthropic
LLM_PROVIDER=anthropic
LLM_MODEL=claude-3-sonnet
ANTHROPIC_API_KEY=sk-ant-xxx
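A sketch of how these variables might be resolved into a provider configuration; the `llm_config` helper and its defaults are illustrative assumptions, not the platform's actual loader:

```python
import os

def llm_config(env=None) -> dict:
    """Resolve LLM provider, model, and API key from environment variables."""
    env = env if env is not None else dict(os.environ)
    provider = env.get("LLM_PROVIDER", "openai")  # assumed default
    key_var = {"openai": "OPENAI_API_KEY",
               "anthropic": "ANTHROPIC_API_KEY"}[provider]
    return {"provider": provider,
            "model": env.get("LLM_MODEL", ""),
            "api_key": env.get(key_var, "")}

cfg = llm_config({"LLM_PROVIDER": "anthropic",
                  "LLM_MODEL": "claude-3-sonnet",
                  "ANTHROPIC_API_KEY": "sk-ant-xxx"})
print(cfg["provider"], cfg["model"])  # anthropic claude-3-sonnet
```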

The agent shares the platform database for checkpointing:

DATABASE_URL=postgresql+asyncpg://user:pass@host:5432/bizon

The agent enforces several safeguards:

  • Pipelines created disabled - Review before enabling
  • Confirmation required - Agent asks before making changes
  • Audit trail - All agent actions are logged
  • Credential handling - Sensitive values masked in responses
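Credential masking could be implemented by pattern-matching key-shaped tokens; a hypothetical sketch (the platform's real masking logic may differ):

```python
import re

# Matches API-key-shaped tokens like "sk-xxx" or "sk-ant-xxx" (assumed pattern).
SECRET_RE = re.compile(r"\b(sk-[A-Za-z0-9-]+)\b")

def mask_secrets(text: str) -> str:
    """Replace key-shaped tokens, keeping a short prefix for recognizability."""
    return SECRET_RE.sub(lambda m: m.group(1)[:5] + "****", text)

print(mask_secrets("Using key sk-abc123def"))  # Using key sk-ab****
```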

Sessions persist across browser refreshes:

  • View past conversations
  • Continue where you left off
  • Reference previous pipeline configurations
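Persistence works because conversation state is checkpointed under a session id, so a refreshed browser can re-read the same history. A minimal in-memory stand-in for the PostgreSQL checkpointer (illustrative only; the real store lives in the platform database):

```python
class CheckpointStore:
    """In-memory sketch of a checkpointer: conversation state keyed by session id."""

    def __init__(self) -> None:
        self._store: dict[str, list[str]] = {}

    def append(self, session_id: str, message: str) -> None:
        self._store.setdefault(session_id, []).append(message)

    def history(self, session_id: str) -> list[str]:
        return self._store.get(session_id, [])

store = CheckpointStore()
store.append("s1", "Create a HubSpot pipeline")
# After a browser refresh, the same session id recovers the conversation:
print(store.history("s1"))  # ['Create a HubSpot pipeline']
```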