
Creates a chat object configured with the workflow's system prompt (derived from workflow content) and all referenced tools registered. Optionally attaches validation callbacks to track job execution.

Usage

mcpflow_instantiate(
  workflow,
  chat = NULL,
  chat_provider = c("ollama", "openai", "claude", "gemini", "cortex", "azure_openai",
    "bedrock", "databricks", "github", "groq", "perplexity", "snowflake", "vllm"),
  chat_args = list(),
  additional_tools = NULL,
  on_tool_request = NULL,
  on_tool_result = NULL,
  use_job_validator = FALSE,
  validator_strict = FALSE,
  validator_verbose = TRUE,
  state_env = NULL,
  ...
)

Arguments

workflow

A 'ravepipeline_mcp_workflow' object, or a path/identifier that can be read via mcpflow_read.

chat

An existing 'ellmer' chat object to configure. If NULL (default), a new chat is created using chat_provider.

chat_provider

Character. The chat provider to use when creating a new chat. One of "ollama" (default), "openai", "claude", "gemini", "cortex", "azure_openai", "bedrock", "databricks", "github", "groq", "perplexity", "snowflake", "vllm". Only used when chat is NULL.

chat_args

A named list of additional arguments passed to the 'ellmer' constructor (e.g., model, api_key, base_url). Only used when chat is NULL.

additional_tools

A list or character vector of MCP tools to register on the chat in addition to those referenced by the workflow; entries can be tool identifier strings or MCP tool objects. Default is NULL.
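For example (a minimal sketch; the tool identifier below is a hypothetical placeholder, and wf is a workflow read via mcpflow_read as in the Examples):

# "ravepipeline::example_tool" is a hypothetical identifier; substitute a tool
# that actually exists in your MCP registry
chat <- mcpflow_instantiate(
  wf,
  additional_tools = c("ravepipeline::example_tool")
)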

on_tool_request

Optional callback function for tool request events. Passed to chat$on_tool_request(). Receives a "content tool request" object. Can call tool_reject to prevent execution.

on_tool_result

Optional callback function for tool result events. Passed to chat$on_tool_result(). Receives a "content tool result" object.
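For example, simple logging callbacks can be attached (a minimal sketch, assuming wf is a workflow read via mcpflow_read as in the Examples; the logging is illustrative only):

# Callback signatures are assumed to follow ellmer's
# on_tool_request()/on_tool_result() conventions
log_request <- function(request) {
  # `request` is the "content tool request" object described above;
  # per the argument description, a request callback may call tool_reject()
  # here to block execution
  message("Tool call requested")
}
log_result <- function(result) {
  message("Tool call finished")
}
chat <- mcpflow_instantiate(
  wf,
  on_tool_request = log_request,
  on_tool_result = log_result
)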

use_job_validator

Logical. If TRUE, jobs are automatically validated via mcpflow_job_validator. Default is FALSE. Ignored if on_tool_request or on_tool_result is provided.

validator_strict

Logical. If TRUE and use_job_validator is TRUE, the validation tool will reject out-of-order tool calls. Default is FALSE (advisory warnings only).

validator_verbose

Logical. If TRUE (default) and use_job_validator is TRUE, print progress messages.

state_env

Environment or NULL (default). The environment in which the MCP tools share and store data; see mcptool_state_factory.
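For example (a minimal sketch, assuming tools persist data through this environment as described for mcptool_state_factory):

# A shared environment lets the registered tools store data across calls
shared_state <- new.env(parent = emptyenv())
chat <- mcpflow_instantiate(wf, state_env = shared_state)
# Inspect anything the tools have stored after a chat session
ls(shared_state)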

...

Additional arguments passed to mcptool_instantiate for each tool.

Value

An 'ellmer' chat object with:

  • System prompt set from workflow content (via convert_workflow_to_markdown)

  • All referenced MCP tools registered

  • Optional validation callbacks attached

Details

The system prompt is generated by convert_workflow_to_markdown (a short inspection example follows this list), which includes:

  • Workflow name and description

  • Tool list

  • Settings (dangerous, requires_approval, estimated_duration)

  • Sections with content

  • Jobs with dependencies, conditions, and step details

  • Examples, warnings, and best practices
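
To preview the prompt before instantiating a chat, the markdown can be generated directly (a minimal sketch, assuming convert_workflow_to_markdown() accepts the workflow object and returns text):

# Inspect the markdown that will become the system prompt
md <- convert_workflow_to_markdown(wf)
cat(md, sep = "\n")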

Examples

# Load workflow and create chat with Ollama
wf <- mcpflow_read("ravepipeline::rave_pipeline_class_guide")

# This example requires connecting to external service providers
if (FALSE) { # \dontrun{
# Ollama (default) might require explicit model
chat <- mcpflow_instantiate(wf, chat_args = list(model = "qwen3:8b"))

# Create chat with OpenAI and custom model
chat <- mcpflow_instantiate(
  wf,
  chat_provider = "openai",
  chat_args = list(model = "gpt-4")
)

# Use existing chat object
existing_chat <- ellmer::chat_claude()
chat <- mcpflow_instantiate(wf, chat = existing_chat)

# Enable job validation (advisory mode)
chat <- mcpflow_instantiate(wf, use_job_validator = TRUE)

# Enable strict job validation
chat <- mcpflow_instantiate(wf, use_job_validator = TRUE,
                            validator_strict = TRUE)

# Use chat
chat$chat("Help me set up a power analysis pipeline")
} # }