
When this happens...

Automatically do this!
Create Anomaly Score
Create Centroid
Create Prediction
Find Resource
List Sources
List Datasets
List Samples
List Models
List Time Series
List Evaluations
List OptiMLs
List Clusters
List Anomaly Detectors
List Predictions
List Batch Predictions
List Forecasts
List Anomaly Scores
List Batch Anomaly Scores
List Topic Distributions
List Batch Topic Distributions
List Projections
List Batch Projections
List Scripts
List Executions
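For reference, most of the BigML triggers and actions above correspond one-to-one to calls in BigML's official Python bindings (pip install bigml). A minimal sketch, assuming BIGML_USERNAME and BIGML_API_KEY are set in the environment and using a hypothetical local CSV and field name in place of data arriving from Databricks:

```python
# Sketch only: mirrors "Create Anomaly Score" and the "List ..." actions
# using BigML's official Python bindings. File and field names are hypothetical.
from bigml.api import BigML

api = BigML()  # reads BIGML_USERNAME / BIGML_API_KEY from the environment

# Create Source -> Dataset -> Anomaly Detector -> Anomaly Score
source = api.create_source("data.csv")   # hypothetical input file
api.ok(source)                           # block until the resource is ready
dataset = api.create_dataset(source)
api.ok(dataset)
anomaly = api.create_anomaly(dataset)
api.ok(anomaly)
score = api.create_anomaly_score(anomaly, {"amount": 1250.0})  # hypothetical field
print(score["object"]["score"])

# The "List ..." actions map onto the corresponding list calls, e.g.:
print(api.list_sources())
print(api.list_datasets())
```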
Triggers
A trigger is an event that starts a workflow.
Triggers when a new resource is created
Actions
An action is the task that runs automatically in your Databricks integrations once a trigger fires.
Creates a new Databricks cluster for processing tasks.
Creates a directory at the specified path.
Deletes a job by its ID.
Creates a repo in the workspace and links it to the specified remote Git repo.
Deletes a repo from the workspace.
Calculates the anomaly score of a data instance.
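The Databricks actions above correspond to endpoints in the public Databricks REST API. A minimal sketch of those calls; the workspace URL, token, runtime version, node type, job ID, and Git URL are all placeholders:

```python
# Sketch only: the Databricks actions above, expressed as raw REST calls.
# Endpoints are from the public Databricks REST API; values are placeholders.
import requests

HOST = "https://example-workspace.cloud.databricks.com"  # hypothetical workspace
HEADERS = {"Authorization": "Bearer <personal-access-token>"}

# Create Cluster
requests.post(f"{HOST}/api/2.0/clusters/create", headers=HEADERS, json={
    "cluster_name": "processing-cluster",
    "spark_version": "13.3.x-scala2.12",  # hypothetical runtime version
    "node_type_id": "i3.xlarge",          # hypothetical node type
    "num_workers": 2,
})

# Create Directory (by path)
requests.post(f"{HOST}/api/2.0/workspace/mkdirs", headers=HEADERS,
              json={"path": "/Shared/viasocket"})

# Delete Job (by ID)
requests.post(f"{HOST}/api/2.1/jobs/delete", headers=HEADERS, json={"job_id": 123})

# Create Repo linked to a remote Git repo
repo = requests.post(f"{HOST}/api/2.0/repos", headers=HEADERS, json={
    "url": "https://github.com/example/repo.git",  # hypothetical remote
    "provider": "gitHub",
    "path": "/Repos/viasocket/repo",
}).json()

# Delete Repo
requests.delete(f"{HOST}/api/2.0/repos/{repo['id']}", headers=HEADERS)
```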

Gain insights into how viaSocket functions through our detailed guide. Understand its key features and benefits to maximize your experience and efficiency.

Unlock your team's potential with 5 straightforward automation hacks designed to streamline processes and free up valuable time for more important work.

Workflow automation is the process of using technology to execute repetitive tasks with minimal human intervention, creating a seamless flow of activities.
To start, connect both your Databricks and BigML accounts to viaSocket. Once connected, you can set up a workflow where an event in Databricks triggers actions in BigML (or vice versa).
Absolutely. You can customize how Databricks data is recorded in BigML. This includes mapping individual Databricks fields to the BigML fields they should populate, setting up custom formats, and filtering out unwanted information.
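As a rough illustration of what such a mapping does, here is a plain-Python sketch; every field name and format rule below is hypothetical:

```python
# Sketch only: map a Databricks record onto BigML input fields,
# apply a custom format, and drop unwanted fields. All names are hypothetical.
FIELD_MAP = {"txn_amount": "amount", "txn_ts": "timestamp"}  # Databricks -> BigML

def to_bigml_input(row: dict) -> dict:
    # Keep only mapped fields, renaming them to their BigML counterparts
    mapped = {FIELD_MAP[k]: v for k, v in row.items() if k in FIELD_MAP}
    mapped["amount"] = round(float(mapped["amount"]), 2)  # custom format
    return mapped

print(to_bigml_input({"txn_amount": "1250.456", "txn_ts": "2024-01-01", "debug": True}))
# -> {'amount': 1250.46, 'timestamp': '2024-01-01'}  ("debug" is filtered out)
```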
Data syncs between Databricks and BigML in real time when you use instant triggers; with scheduled triggers, the delay can be up to 15 minutes.
Yes, viaSocket allows you to add custom logic or use built-in filters to modify data according to your needs.
Yes, you can set conditional logic to control the flow of data between Databricks and BigML. For instance, you can specify that data should only be sent if certain conditions are met, or you can create if/else statements to manage different outcomes.
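Expressed as code, that conditional logic is simply an if/else around the hand-off. A minimal sketch with hypothetical conditions and a placeholder send_to_bigml function:

```python
# Sketch only: conditional routing between Databricks and BigML.
# The conditions and send_to_bigml are placeholders, not viaSocket internals.
def send_to_bigml(record: dict) -> None:
    print("forwarding to BigML:", record)  # stand-in for the real hand-off

def route(record: dict) -> None:
    if record.get("amount", 0) > 1000:         # only send when the condition holds
        send_to_bigml(record)
    elif record.get("status") == "flagged":    # if/else branch for another outcome
        send_to_bigml({**record, "priority": "high"})
    # otherwise: the record is not forwarded

route({"amount": 1250.0})
```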
Databricks is a cloud-based data platform that provides a collaborative environment for data engineering, data science, and machine learning. It offers a unified analytics platform that simplifies data processing and enables organizations to build and deploy AI models at scale. With its powerful integration capabilities, Databricks allows teams to work together seamlessly, accelerating innovation and driving data-driven decision-making.
Learn More
BigML is a leading machine learning platform that provides a wide range of tools and services for creating, deploying, and managing machine learning models. It offers an intuitive interface and powerful APIs to help businesses and developers harness the power of machine learning for predictive analytics, anomaly detection, clustering, and more.
Learn More