# Databricks
The Databricks connector allows users to execute SQL queries on Databricks clusters or SQL warehouses directly from the Swimlane platform, facilitating automated data analysis and reporting.

Databricks is a unified data analytics platform that accelerates innovation by unifying data science, engineering, and business. The Databricks connector for Swimlane Turbine allows users to execute SQL queries directly on Databricks clusters or SQL warehouses, streamlining data analysis and decision-making processes. By integrating with Databricks, Swimlane Turbine users can automate complex data workflows, enhance incident response with real-time data insights, and leverage the power of big data analytics within their security operations.

## Limitations

None to date.

## Supported versions

This Databricks connector uses the latest version.

## Additional docs

- https://docs.databricks.com/aws/en/dev-tools/python-sql-connector

## Prerequisites

To utilize the Databricks connector for Turbine, ensure you have the following prerequisites:

- Custom authentication with the following parameters:
  - Server Hostname: the address of your Databricks server
  - Client ID: your Databricks client identifier
  - Client Secret: your Databricks client secret key
  - HTTP Path: the specific path for HTTP requests

## Authentication Methods

- Authentication with the following parameters:
  - Server Hostname: the address of your Databricks server
  - Client ID: your Databricks client identifier
  - Client Secret: a secret key associated with your Client ID for authentication
  - HTTP Path: the HTTP path to the target server

To check the Server Hostname and HTTP Path: https://docs.databricks.com/aws/en/integrations/compute-details

To create the Client ID and Client Secret: https://docs.databricks.com/aws/en/dev-tools/auth/oauth-m2m

## Capabilities

This Databricks connector provides the following capabilities:

- Run Query

### Run Query

This action calls the Databricks SQL Connector for Python to run a basic SQL command on a cluster or SQL warehouse. See https://docs.databricks.com/aws/en/dev-tools/python-sql-connector#query-data.
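For reference, the sketch below shows roughly what such a call looks like when made directly with the Databricks SQL Connector for Python, authenticating as an OAuth machine-to-machine (M2M) service principal with a client ID and client secret, as described in the Databricks documentation linked above. The hostname, HTTP path, environment variable names, and query are placeholders; this illustrates the underlying library, not the connector's exact implementation.

```python
import os

from databricks import sql
from databricks.sdk.core import Config, oauth_service_principal

# Placeholder values; use your workspace's Server Hostname and HTTP Path.
SERVER_HOSTNAME = "dbc-a1b2345c-d6e7.cloud.databricks.com"
HTTP_PATH = "/sql/1.0/warehouses/abcdef1234567890"


def credential_provider():
    # Build OAuth M2M credentials from the service principal's client ID and secret.
    config = Config(
        host=f"https://{SERVER_HOSTNAME}",
        client_id=os.environ["DATABRICKS_CLIENT_ID"],
        client_secret=os.environ["DATABRICKS_CLIENT_SECRET"],
    )
    return oauth_service_principal(config)


with sql.connect(
    server_hostname=SERVER_HOSTNAME,
    http_path=HTTP_PATH,
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        # Run a basic SQL command on the cluster or SQL warehouse.
        cursor.execute("SELECT * FROM samples.nyctaxi.trips LIMIT 2")
        for row in cursor.fetchall():
            print(row)
```

This sketch assumes the `databricks-sql-connector` and `databricks-sdk` packages are installed in the environment running the query.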
## Configurations

### Databricks Authentication

Authenticates using a client ID and client secret.

Configuration parameters:

| Parameter | Description | Type | Required |
| --------- | ----------- | ---- | -------- |
| Server Hostname | A hostname to the target server | string | required |
| Client ID | The client ID to use for authentication | string | required |
| Client Secret | The client secret to use for authentication | string | required |
| HTTP Path | The HTTP path to the target server | string | required |
| Verify SSL | Verify SSL certificate | boolean | optional |
| HTTP Proxy | A proxy to route requests through | string | optional |

## Actions

### Run Query

Executes a specified SQL query on a Databricks cluster or SQL warehouse, requiring an input of the SQL query.

Endpoint method: GET

#### Input argument

| Name | Type | Required | Description |
| ---- | ---- | -------- | ----------- |
| sql_query | string | required | The SQL query to be executed. This can be any valid SQL command, such as SELECT, INSERT, UPDATE, DELETE, etc. The query should be a string and should not contain any special characters or formatting. |

Input example:

```json
{"sql_query": "SELECT * FROM samples.nyctaxi.trips LIMIT 2"}
```

#### Output

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| result | array | Result of the operation |
| result.tpep_pickup_datetime | string | Result of the operation |
| result.tpep_dropoff_datetime | string | Result of the operation |
| result.trip_distance | number | Result of the operation |
| result.fare_amount | number | Result of the operation |
| result.pickup_zip | number | Result of the operation |
| result.dropoff_zip | number | Result of the operation |

Output example:

```json
{
  "result": [
    {
      "tpep_pickup_datetime": "2016-02-13T21:47:53Z",
      "tpep_dropoff_datetime": "2016-02-13T21:57:15Z",
      "trip_distance": 1.4,
      "fare_amount": 8,
      "pickup_zip": 10103,
      "dropoff_zip": 10110
    },
    {
      "tpep_pickup_datetime": "2016-02-13T18:29:09Z",
      "tpep_dropoff_datetime": "2016-02-13T18:37:23Z",
      "trip_distance": 1.31,
      "fare_amount": 7.5,
      "pickup_zip": 10023,
      "dropoff_zip": 10023
    }
  ]
}
```

#### Response headers

| Header | Description | Example |
| ------ | ----------- | ------- |
| Content-Type | The media type of the resource | application/json |
| Date | The date and time at which the message was originated | Thu, 01 Jan 2024 00:00:00 GMT |
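The `result` array above contains one object per returned row, keyed by column name. As a minimal sketch (not the connector's actual code), rows fetched with the Python connector can be shaped into that structure using the cursor's column metadata; the helper name `rows_to_result` is hypothetical.

```python
from typing import Any


def rows_to_result(cursor) -> dict[str, list[dict[str, Any]]]:
    """Shape fetched rows into {"result": [{column: value, ...}, ...]}."""
    # Column names come from the DB-API cursor description metadata.
    columns = [desc[0] for desc in cursor.description]
    return {"result": [dict(zip(columns, row)) for row in cursor.fetchall()]}


# Example, after cursor.execute("SELECT * FROM samples.nyctaxi.trips LIMIT 2"):
# rows_to_result(cursor)
# -> {"result": [{"tpep_pickup_datetime": ..., "fare_amount": 8, ...}, ...]}
```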