Microservice-based Streaming and Batch data processing for Cloud Foundry and Kubernetes.
Spring Cloud Data Flow provides tools to create complex topologies for streaming and batch data pipelines. The data pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks.
Spring Cloud Data Flow supports a range of data processing use cases, from ETL to import/export, event streaming, and predictive analytics.
The Spring Cloud Data Flow server uses Spring Cloud Deployer to deploy data pipelines made of Spring Cloud Stream or Spring Cloud Task applications onto modern platforms such as Cloud Foundry and Kubernetes.
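As a rough sketch of how the deployer abstraction surfaces to users, platform-specific settings can be passed through at deployment time as `deployer.<appName>.*` properties; the stream name and values below are illustrative, not a reference.

```
dataflow:> stream deploy --name mystream --properties "deployer.log.count=2,deployer.log.kubernetes.limits.memory=512Mi"
```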
A selection of pre-built stream and task/batch starter apps for various data integration and processing scenarios facilitates learning and experimentation.
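For example, the pre-built applications can be registered in bulk from the Data Flow shell; the URI below is assumed to point at one of the published app-starter property files for Kafka-based Maven artifacts (see the microsite for the current lists).

```
dataflow:> app import --uri https://dataflow.spring.io/kafka-maven-latest
dataflow:> app list
```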
Custom stream and task applications, targeting different middleware or data services, can be built using the familiar Spring Boot style programming model.
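A minimal sketch of such a custom processor using the functional programming model supported by Spring Cloud Stream (the class and function names are illustrative): with the framework on the classpath, the single `Function` bean is bound to an input and an output destination on the configured middleware.

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Illustrative custom stream processor; Spring Cloud Stream binds the
// Function bean to an input and output destination (e.g. on Kafka or RabbitMQ).
@SpringBootApplication
public class UppercaseProcessorApplication {

    public static void main(String[] args) {
        SpringApplication.run(UppercaseProcessorApplication.class, args);
    }

    @Bean
    public Function<String, String> uppercase() {
        // Transform each incoming payload and pass it downstream.
        return payload -> payload.toUpperCase();
    }
}
```

A custom task application follows the same Spring Boot style, typically enabling Spring Cloud Task with `@EnableTask` and doing its work in a `CommandLineRunner` or a Spring Batch job.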
A simple stream pipeline DSL makes it easy to specify which apps to deploy and how to connect outputs and inputs. The composed task DSL is useful when a series of task apps needs to be run as a directed graph.
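For illustration, the first definition below is a stream that pipes an `http` source through a `filter` processor to a `log` sink (the names reference pre-built starter apps); the second is a composed task that runs one task, then two in parallel, then a final one, with all task names as placeholders.

```
http | filter | log

ingest && <loadA || loadB> && report
```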
The dashboard offers a graphical editor for building data pipelines interactively, views of the deployable apps, and monitoring of deployed pipelines with metrics via Wavefront, Prometheus, InfluxDB, or other monitoring systems.
The Spring Cloud Data Flow server exposes a REST API for composing and deploying data pipelines. A separate shell makes it easy to work with the API from the command line.
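A minimal sketch of both surfaces, assuming the server runs on its default port 9393 and that the `http` and `log` starter apps are registered; the stream name is a placeholder.

```
# Using the Data Flow shell
dataflow:> stream create --name clicks --definition "http | log"
dataflow:> stream deploy --name clicks

# Equivalent calls against the REST API
curl --data-urlencode "name=clicks" --data-urlencode "definition=http | log" \
     http://localhost:9393/streams/definitions
curl -X POST http://localhost:9393/streams/deployments/clicks
```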
The Spring Cloud Data Flow Microsite is the best place to get started.
Bootstrap your application with Spring Initializr.