# Working with Interfaces
## Overview

This section of the Turbine user guide describes the interfaces available in Turbine solutions. Interfaces are standard data formats that enable components to work together seamlessly. Use interfaces to standardize data transformation across security operations workflows.

When you apply an interface to a component, it automatically configures the component's input and output data structures. This standardization allows you to easily swap components in your playbooks without manual reconfiguration, as long as they use the same interface.

This guide covers:

- What interfaces are and how they work
- How to use interfaces when building components
- Available interface catalogs for SOC, AI SOC, and VRM workflows
- Best practices for working with interfaces

### Interface Catalogs

Full interface contracts for each solution area live in these guides:

- docid\lkj7c4qday5w52sv1sq2 — 22 Classic SOC interface contracts
- docid\5ply7fiylfuyiniusllcy — 4 AI SOC interface contracts
- docid\2uzi3jfnbojho1ql0ovpi — 6 VRM interface contracts

For complete data model field definitions, see docid\i0yap22xufzu9tbegkare, docid\7efe4hvzv8uzngfkwhjtr, and docid\pqh1nonmkovd2baaipjxm.

## What Are Interfaces?
An interface defines the data structure that a component expects to receive (inputs) and the data structure it produces (outputs). Think of it as a standard template that ensures components can work together seamlessly.

### Key Concepts

- **Component**: A reusable automation flow that performs a specific task. Components are used within playbooks to build automation workflows.
- **Interface**: A standard data format that components can use. When multiple components use the same interface, they can easily be swapped with each other because they all accept and produce data in the same format.
- **Input schema**: Defines what data your component needs to receive to work properly.
- **Output schema**: Defines what data your component will produce when it runs.

### Example

Imagine you have multiple threat intelligence enrichment components:

- Enrich via VirusTotal
- Enrich via Recorded Future
- Enrich via URLhaus

If they all use the same "Simple Observable to Enrichment" interface, they all:

- Accept the same input format (an observable, such as an IP address)
- Produce the same output format (enriched observable data)

This means you can swap between these components in your playbook without changing any other parts of your workflow.

> **Note**: You can still create components without applying an interface, but they will not benefit from standardization and easy swapping.

## Benefits of Using Interfaces

Interfaces provide powerful benefits that make your automation workflows more flexible and easier to manage.

- **Easy component swapping**: Components that use the same interface can be swapped in and out of playbooks with a single click. No manual reconfiguration or re-mapping of data is needed.
- **Search by functionality**: You can search for components based on what they do, regardless of which vendor technology they use. For example, find all enrichment components that work with observables, even if they use different threat intelligence sources.
- **Guaranteed compatibility**: Components built with interfaces are guaranteed to work seamlessly with playbooks and
other components that use the same interface.
- **Standardized data flow**: Interfaces ensure that data flows correctly between components, reducing errors and making your playbooks more reliable.
- **Vendor flexibility**: You can easily switch between different vendor technologies (such as VirusTotal, Recorded Future, or URLhaus) without changing your playbook structure, as long as the components use the same interface.

## How to Use Interfaces

Interfaces are available in Turbine Canvas when building components. When you create or edit a component:

1. Open the component builder in Turbine Canvas.
2. Navigate to the **Data** tab in the side panel.
3. Select an interface from the dropdown list of available interfaces.
4. The interface automatically configures the component's input and output schemas.

Once an interface is applied, your component has standardized inputs and outputs that match other components using the same interface, making them easily swappable in playbooks.

### Understanding Interface Schemas

Each interface defines two key parts:

- **Input schema**: Specifies what data your component expects to receive.
- **Output schema**: Specifies what data your component will produce.

When you apply an interface to a component, these schemas are automatically configured, ensuring your component accepts and produces data in the correct format.

## Usage Patterns

**Pattern 1: Data ingestion pipeline**

Source data → ingestion interface → normalized data → processing interface → output

Example: email → Email to Email → processed email → Extract Observables → observable array

**Pattern 2: Enrichment workflow**

Observable → Simple Observable to Enrichment → enriched observable → alert creation

**Pattern 3: Remediation workflow**

Vulnerability finding → Remediation Item to Ticket → ticket created → Remediation Item Check → status updated

**Pattern 4: Bulk processing**

Array of objects → array-to-array interface → normalized array → individual processing
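The enrichment pattern above also illustrates why a shared interface makes components swappable. The following Python sketch is illustrative only, not Turbine functionality: the function names, the field names (`observable_type`, `observable_value`), and the verdict values are assumptions standing in for the real "Simple Observable to Enrichment" contract.

```python
"""Sketch: two enrichment components honoring one hypothetical interface."""
from datetime import datetime, timezone


def enrich_via_virustotal(observable: dict) -> dict:
    # Accepts the shared input shape, returns the shared output shape.
    return {
        "observable": observable,
        "enrichment": {
            "enrichment_type": "reputation",
            "enrichment_provider": "virustotal",  # hypothetical provider value
            "enrichment_verdict": "malicious",
            "enrichment_timestamp": datetime.now(timezone.utc).isoformat(),
        },
    }


def enrich_via_urlhaus(observable: dict) -> dict:
    # Same contract, different provider: a drop-in replacement.
    return {
        "observable": observable,
        "enrichment": {
            "enrichment_type": "reputation",
            "enrichment_provider": "urlhaus",  # hypothetical provider value
            "enrichment_verdict": "suspicious",
            "enrichment_timestamp": datetime.now(timezone.utc).isoformat(),
        },
    }


def run_enrichment(component, observable: dict) -> dict:
    # Because both components honor the same interface, the calling
    # logic never changes when one is swapped for the other.
    required = {"observable_type", "observable_value"}
    missing = required - observable.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return component(observable)


observable = {"observable_type": "ip", "observable_value": "192.168.1.1"}
result_a = run_enrichment(enrich_via_virustotal, observable)
result_b = run_enrichment(enrich_via_urlhaus, observable)
```

Swapping providers here means changing only which function is passed to `run_enrichment`; the surrounding workflow is untouched, which is the behavior interfaces give you in playbooks.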
## Best Practices

Follow these guidelines when working with interfaces:

- **Use the latest version**: Always use the latest version of interfaces when available. Check the version field to ensure compatibility.
- **Include required fields**: Include all required fields in input schemas. Missing required fields cause transformation failures.
- **Validate data before transformation**: Validate input data against the interface schema before transformation to catch errors early.
- **Handle errors**: Implement error handling for transformation failures, especially in automated workflows.
- **Choose the right interface**: Select interfaces that match your data flow. Use "to None" interfaces for ingestion and logging, transformation interfaces for data conversion, and remediation interfaces for automated actions.
- **Extract observables from text**: Use "Text to Array of Observables" to extract IOCs from unstructured text.
- **Process bulk data efficiently**: Use array interfaces to process multiple items efficiently.
- **Validate remediation actions**: Validate action parameters before executing remediation actions to prevent unintended consequences.

## Playbook Integration

### How Interfaces Work in Playbooks

Interfaces define the input and output schemas for playbook transformations. When you create a playbook component using a builderIntent interface:

- **Input schema** defines what data the playbook expects.
- **Output schema** defines what data the playbook produces.
- **Validation** ensures data matches the schemas at runtime.
- **Type safety** provides type information for the UI.

### Creating a Playbook with an Interface

Example playbook YAML:

```yaml
schema: playbook/2
name: observable_enrichment_playbook
title: Observable Enrichment Playbook
description: Enriches observables with threat intelligence
# Reference to the interface schema
inputSchemaReferenceId: simple_observable_to_enrichment.v1_0_2.06fbe
actions:
  enrich_observable:
    actionType: jsonata
    inputs:
      expression: |
        {
          "observable": $.observable,
          "enrichment": {
            "enrichment_type": "reputation",
            "enrichment_provider": "threat_intel",
            "enrichment_verdict": "malicious",
            "enrichment_timestamp": $now(),
            "enrichment_context": "enriched via playbook"
          }
        }
    data:
      observable:
        observable_type: string
        observable_value: string
publish:
  enrichment: $.enrich_observable.enrichment
```

### Connecting Multiple Interfaces

You can chain multiple interfaces together in a playbook:

```yaml
actions:
  # Step 1: Extract observables from text
  extract_observables:
    actionType: jsonata
    inputs:
      expression: |
        { "observables": $split($.text_value, " ") }
    data:
      text_value: string
  # Step 2: Enrich each observable
  enrich_observable:
    actionType: jsonata
    next: create_alert
    inputs:
      expression: |
        {
          "enrichment": {
            "enrichment_type": "reputation",
            "enrichment_provider": "threat_intel",
            "enrichment_verdict": "suspicious"
          }
        }
    data:
      observable: object
  # Step 3: Create alert from enriched observable
  create_alert:
    actionType: jsonata
    inputs:
      expression: |
        {
          "alert": {
            "alert_title": "Threat detected",
            "alert_severity": "high",
            "observables": [$.enrich_observable.observable]
          }
        }
    data:
      enrichment: object
```

## Troubleshooting and Common Pitfalls

### Issue 1: Schema Validation Failures

**Symptom**: Transformation fails with a validation error.

**Causes**:
- Missing required fields
- Wrong data types
- Invalid enum values
- Extra fields not in the schema (if `additionalProperties: false`)

**Solution**:
- Validate input data against the schema before transformation
- Use schema validation tools
- Check the interface documentation for required fields
- Remove or map extra fields

**Example fix**:

```jsonc
// Before (fails validation)
{
  "observable": {
    "type": "ip",          // wrong field name
    "value": "192.168.1.1" // wrong field name
  }
}

// After (passes validation)
{
  "observable": {
    "observable_type": "ip",          // correct field name
    "observable_value": "192.168.1.1" // correct field name
  }
}
```

### Issue 2: Transformation Timeouts

**Symptom**: Transformation fails with a timeout error.

**Causes**:
- External API calls taking too long
- Large data processing
- Network latency
- Insufficient timeout configuration

**Solution**:
- Increase the timeout for slow operations
- Optimize transformation logic
- Use asynchronous processing for long operations
- Implement retry logic with exponential backoff
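Retry with exponential backoff, the last mitigation listed for timeouts, can be sketched as follows. This is an illustrative Python sketch, not Turbine functionality; `call_enrichment_api` is a hypothetical stand-in for whatever external call a transformation makes.

```python
"""Sketch: retry a flaky external call with exponential backoff."""
import time


def with_retries(fn, max_attempts=4, base_delay=0.1):
    # Retry transient failures, doubling the wait between attempts.
    for attempt in range(max_attempts):
        try:
            return fn()
        except TimeoutError:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the error
            time.sleep(base_delay * (2 ** attempt))


attempts = {"count": 0}


def call_enrichment_api():
    # Hypothetical upstream call: fails twice, then succeeds,
    # simulating a flaky threat-intel service.
    attempts["count"] += 1
    if attempts["count"] < 3:
        raise TimeoutError("upstream timed out")
    return {"enrichment_verdict": "suspicious"}


result = with_retries(call_enrichment_api)
```

In production you would also cap the total wait, add jitter to avoid synchronized retries, and retry only errors that are actually transient.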
### Issue 3: Incorrect Interface Selection

**Symptom**: Data does not match the expected format.

**Causes**:
- Using the wrong interface for the data type
- Confusing similar interfaces
- Version mismatch

**Solution**:
- Review the interface documentation
- Check the input/output schemas
- Verify interface version compatibility
- Test with sample data first

**Example**:

```jsonc
// Wrong: using "Simple Observable to Enrichment" for an array
// Correct: use "Array of Simple Observable to None" or process items individually
```

### Issue 4: Remediation Action Failures

**Symptom**: Remediation actions return error messages.

**Causes**:
- Invalid action parameters
- Insufficient system permissions
- Target system unavailable
- Invalid observable format

**Solution**:
- Validate action parameters before execution
- Check system connectivity
- Verify permissions
- Review error messages for specific issues

**Example**:

```jsonc
// Invalid action
{
  "action": "ban" // should be "block" or "unblock"
}

// Valid action
{
  "action": "block"
}
```

### Issue 5: Data Type Mismatches

**Symptom**: Numbers passed as strings, or dates in the wrong format.

**Causes**:
- Data source provides the wrong types
- Missing type conversion
- Schema expects a specific format

**Solution**:
- Convert data types before transformation
- Use transformation functions for type conversion
- Validate that data types match the schema

**Example**:

```jsonc
// Before (may fail)
{
  "alert": {
    "alert_risk_score": "85" // string
  }
}

// After (correct)
{
  "alert": {
    "alert_risk_score": 85 // integer
  }
}
```

## Frequently Asked Questions

**Where can I see a list of available interfaces that I can use?**

Currently, the only way to view the list of available interfaces is through the component builder in Turbine Canvas. Select a component or create a new one, go to the **Data** tab in the side panel, and use the dropdown there to view the list of supported interfaces.

**Can I create my own interfaces?**

No, currently you can only use the interfaces that Swimlane provides. Custom interface creation is on the roadmap as a future feature enhancement.

**How do I know which interface to use for my component?**
Choose an interface that matches your component's purpose:

- **Ingestion components**: Use interfaces that end with "to None" (such as "Alert to None").
- **Enrichment components**: Use interfaces like "Simple Observable to Enrichment".
- **Transformation components**: Use interfaces that transform one data type to another (such as "Email to Email").
- **Remediation components**: Use remediation action interfaces (such as "Block/Unblock Observable Remediation Action").

**What happens if I do not use an interface?**

You can still create components without applying an interface. However, you will need to manually configure inputs and outputs, and the component will not benefit from standardization or easy swapping with other components that use the same interface.

**Can I use multiple interfaces on the same component?**

No, each component can have only one interface applied to it. The interface defines both the input and output schemas for that component.

## References

- https://json-schema.org/
- https://attack.mitre.org/