Testing Automations

Testing is a critical part of building reliable automations. Agent Studio provides tools to test individual nodes and complete flows before activating them.

Node-Level Testing

Every node in Agent Studio can be tested individually using the Play button.

How to Test a Node

  1. Select the node by clicking on it
  2. Click the Play button (▶) that appears on the node
  3. Provide test input in the dialog that opens
  4. Review the output to verify the node works as expected

Test Input Options

When testing a node, you can provide input in several ways:
  • Manual Input: type in JSON data directly
  • Select Entity: choose an existing account, task, or other entity
  • Use Sample Data: use predefined sample data for testing
  • From Previous Node: use output from a previously tested upstream node

Example: Testing a Condition Node

  1. Click on the Condition node
  2. Click Play (▶)
  3. Enter test input:
{
  "account_data": {
    "name": "Acme Corp",
    "health_score": 45,
    "arr": 150000
  }
}
  4. Click Run
  5. Review which output (True/False) received the data
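The routing in the steps above can be sketched in Python. This is a minimal illustration, assuming a simple "health_score < 50" rule; `evaluate_condition` and the payload shape are examples for this page, not the product's internal API.

```python
# Sketch of how a Condition node might route the sample test payload.
# Assumes the rule "health_score < 50"; names are illustrative.

def evaluate_condition(payload: dict, field: str, threshold: int) -> bool:
    """Return True when the nested field is below the threshold."""
    value = payload["account_data"][field]
    return value < threshold

test_input = {
    "account_data": {"name": "Acme Corp", "health_score": 45, "arr": 150000}
}

branch = "True" if evaluate_condition(test_input, "health_score", 50) else "False"
print(branch)  # score 45 is below 50, so the data routes to the True output
```

With the sample payload (health_score of 45), the data should arrive on the True output.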

Inspecting Output

After running a test:
  • Output Data: See the exact data the node produces
  • Execution Path: For logic nodes, see which path was taken
  • Errors: View any error messages if the test failed
  • Logs: See detailed execution logs

Testing Trigger Nodes

Trigger nodes require special consideration since they normally wait for events.

Testing a Trigger

  1. Click on the trigger node
  2. Click Play (▶)
  3. Select a test entity (e.g., choose an account for Account Segment trigger)
  4. The trigger will execute as if that entity triggered it

What Trigger Tests Verify

  • Entity data is loaded correctly
  • Flow context is established
  • Output data structure is correct
  • Connected downstream nodes receive data

Note: Testing a trigger does not verify that the trigger conditions (segment criteria) are correct; it only tests the data loading and output. Verify your segment criteria separately.

Testing Complete Flows

Beyond individual nodes, you can test the entire flow end-to-end.

Run a Test Flow

  1. Save your automation
  2. Click Test Flow in the toolbar
  3. Select test data for the trigger
  4. Click Run Test
  5. Watch execution as it progresses through nodes
  6. Review results for each node

Test Flow Visualization

During a test run, you’ll see:
  • Highlighted paths: Active execution path lights up
  • Node status indicators:
    • ✓ Green: Executed successfully
    • ✗ Red: Failed with error
    • ○ Gray: Not executed (skipped path)
  • Data flow: View data at each connection point

Reviewing Test Results

After a test completes:
  • Execution Summary: overall success/failure and time taken
  • Node Details: input/output for each executed node
  • Path Taken: which branches were followed
  • Errors: detailed error information

Testing Best Practices

Test Early, Test Often

Don’t wait until the flow is complete. Test each node as you add it to catch issues early.
Test with Real Data

Use actual entities from your Statisfy data when possible. This catches issues that sample data might miss.

Test Edge Cases

Include tests for:
  • Empty or null values
  • Maximum/minimum values
  • Unusual characters in strings
  • Missing optional fields
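The edge cases above can be turned into a small set of test payloads. This is a hypothetical sketch: the base payload mirrors the condition-node example earlier on this page, and which fields are optional is an assumption.

```python
# Hypothetical edge-case payloads for node tests, covering empty/null
# values, extremes, unusual characters, and a missing optional field.

base = {"name": "Acme Corp", "health_score": 45, "arr": 150000}

edge_cases = [
    {**base, "name": ""},                            # empty string
    {**base, "health_score": None},                  # null value
    {**base, "health_score": 0},                     # minimum score
    {**base, "health_score": 100},                   # maximum score
    {**base, "name": "Ämca & Söns <test>"},          # unusual characters
    {k: v for k, v in base.items() if k != "arr"},   # missing optional field
]

for case in edge_cases:
    print(case)  # feed each payload through the node's test dialog
```

Running each payload through a node's test dialog surfaces problems that a single happy-path input would hide.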

Test All Paths

For every condition node, run tests that exercise both outputs.
For example, if a condition checks health_score > 50, test with scores of 49, 50, and 51.
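The boundary values above can be checked directly. A strict greater-than comparison excludes the threshold itself, which is the subtlety these three test scores are designed to catch:

```python
# Boundary-value checks for a "health_score > 50" condition:
# test just below, exactly at, and just above the threshold.

def passes(score: int) -> bool:
    return score > 50

assert passes(49) is False  # below the threshold
assert passes(50) is False  # at the threshold: strict > excludes 50
assert passes(51) is True   # above the threshold
```

If you intended scores of exactly 50 to pass, the operator should be >= instead.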

Document Your Tests

Keep notes on what test inputs you used and what results you expected.
Consider creating dedicated test accounts/contacts that you can use repeatedly without affecting production data.

Debugging Failed Tests

Common Issues and Solutions

Node receives no input data

Cause: The node isn't receiving data from upstream.
Solutions:
  • Check that connections are properly made
  • Verify the upstream node produces output
  • Ensure the connection types are compatible
Field not found

Cause: The referenced field doesn't exist in the data.
Solutions:
  • Inspect the actual input data using node testing
  • Check for typos in field names
  • Verify the data structure matches your expectations
  • Check if the field is nested differently than expected
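A quick way to confirm whether a field exists where you think it does is to walk the path through the node's actual test output. This helper is a sketch for local debugging; the dotted-path syntax is illustrative, not a product feature.

```python
# Debugging sketch: walk a dotted field path through nested node output
# and return None if any step of the path is missing.

def lookup(data, path):
    for key in path.split("."):
        if not isinstance(data, dict) or key not in data:
            return None  # field missing, or nested differently than expected
        data = data[key]
    return data

output = {"account_data": {"name": "Acme Corp", "health_score": 45}}

print(lookup(output, "account_data.health_score"))  # 45
print(lookup(output, "health_score"))  # None: the field sits one level deeper
```

Here a reference to health_score at the top level fails because the field actually lives under account_data.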
Condition gives unexpected results

Cause: Data type mismatch or incorrect operator.
Solutions:
  • Inspect the actual values being compared
  • Check if comparing string “100” to number 100
  • Verify the operator matches your intent
  • Check case sensitivity for string comparisons
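The type-mismatch and case-sensitivity pitfalls above are easy to demonstrate. These comparisons behave the same way in most languages your condition data might pass through:

```python
# Why comparisons can silently go wrong:

print("100" == 100)       # False: string vs. number never matches
print("9" > "100")        # True: strings compare character by character
print(int("100") == 100)  # True once the string is coerced to a number

print("acme" == "Acme")                  # False: comparison is case-sensitive
print("acme".lower() == "Acme".lower())  # True after normalizing case
```

If a condition always takes the False branch, inspect the raw values: a number stored as a string, or a capitalization difference, is often the culprit.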
Action node fails

Cause: Invalid configuration or external service issue.
Solutions:
  • Check all required fields are populated
  • Verify @ references resolve to valid values
  • Check integration credentials and permissions
  • Look for rate limits or service outages

Using Logs for Debugging

  1. Enable verbose logging in flow settings (if available)
  2. Check execution logs after test runs
  3. Look for error messages with specific details
  4. Trace data flow through each node

Pre-Deployment Checklist

Before activating your automation, verify:
  • Each node has been tested with realistic data
  • All outputs produce expected results
  • Full flow executed successfully end-to-end
  • All conditional paths have been exercised
  • Edge cases handled correctly
  • Email/Slack messages contain correct content
  • Tasks created with right assignments
  • Field updates work as expected
  • Flow handles missing data gracefully
  • No unintended side effects on failure
  • Errors are logged appropriately
  • No infinite loops possible
  • SQL queries are efficient
  • External API calls are reasonable

Monitoring After Deployment

After activating your automation:

Track Execution

  • Run History: View all executions with status
  • Success Rate: Monitor how often the flow completes successfully
  • Error Trends: Watch for increasing failure rates

Set Up Alerts

Consider adding monitoring within your flow:
Main Flow → Catch Errors → Slack Alert to #dev-alerts
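The catch-and-alert pattern above can be sketched as a try/except around the main flow that builds a Slack-style message. This is a hedged illustration: the failing flow and channel name are placeholders, the payload follows Slack's incoming-webhook `{"text": ...}` shape, and the actual webhook call is omitted.

```python
# Sketch of "Main Flow → Catch Errors → Slack Alert": on any failure,
# build an alert payload for a channel like #dev-alerts.

import json

def run_main_flow():
    # Placeholder for the automation body; simulate a failure here.
    raise RuntimeError("upstream API timed out")

def build_alert(error: Exception) -> str:
    # Slack incoming webhooks accept a JSON body with a "text" field.
    return json.dumps({"text": f"Automation failed: {error}"})

try:
    run_main_flow()
except Exception as exc:
    payload = build_alert(exc)
    print(payload)  # in a real flow, POST this to the Slack webhook URL
```

Routing failures to a dedicated channel means you learn about broken automations from an alert rather than from a customer.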

Review Periodically

  • Check that the automation is still relevant
  • Verify integrations haven’t changed
  • Update conditions if business logic changes

Next Steps

AI Components

Add AI-powered nodes to your flows

Agents Overview

Build intelligent AI agents