This repository was archived by the owner on May 14, 2025. It is now read-only.

Commit 8a4fc3a

mminella authored and cppwfs committed

Documentation for Task Continuous Deployment

This commit adds documentation for how the continuous deployment process works for 2.3. Updates per code review.

1 parent 24f4e4c commit 8a4fc3a

File tree

5 files changed: +70 −3 lines changed

spring-cloud-dataflow-docs/src/main/asciidoc/index.adoc

Lines changed: 2 additions & 1 deletion

@@ -23,7 +23,8 @@ Sabby Anandan; Marius Bogoevici; Eric Bottard; Mark Fisher; Ilayaperumal Gopinat
 :github-code: https://github.com/{github-repo}
 :microsite-version: master
-:dataflow-asciidoc: https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/master/spring-cloud-dataflow-docs/src/main/asciidoc
+//:dataflow-asciidoc: https://raw.githubusercontent.com/spring-cloud/spring-cloud-dataflow/master/spring-cloud-dataflow-docs/src/main/asciidoc
+:dataflow-asciidoc: http://localhost:8000/src/main/asciidoc
 :docker-http-source-rabbit-version: 2.1.0.RELEASE
 :docker-time-source-rabbit-version: 2.1.0.RELEASE
spring-cloud-dataflow-docs/src/main/asciidoc/tasks.adoc

Lines changed: 68 additions & 2 deletions

@@ -15,7 +15,7 @@ A task application is short lived, meaning it stops running on purpose, and can
 A use case might be to scrape a web page and write to the database.
 The https://cloud.spring.io/spring-cloud-task/[Spring Cloud Task] framework is based on Spring Boot and adds the capability for Boot applications to record the lifecycle events of a short-lived application, such as when it starts, when it ends, and the exit status.
 The https://docs.spring.io/spring-cloud-task/docs/{spring-cloud-task-version}/reference/htmlsingle/#features-task-execution-details[TaskExecution] documentation shows which information is stored in the database.
-The entry point for code execution in a Spring Cloud Task application is most often an implementation of Boot's CommandLineRunner interface as shown in this https://docs.spring.io/spring-cloud-task/docs/{spring-cloud-task-version}/reference/htmlsingle/#getting-started-writing-the-code[example].
+The entry point for code execution in a Spring Cloud Task application is most often an implementation of Boot's `CommandLineRunner` interface, as shown in this https://docs.spring.io/spring-cloud-task/docs/{spring-cloud-task-version}/reference/htmlsingle/#getting-started-writing-the-code[example].

 The Spring Batch project is probably what comes to mind for Spring developers writing short-lived applications.
 Spring Batch provides a much richer set of functionality than Spring Cloud Task and is recommended when processing large volumes of data.
@@ -36,6 +36,7 @@ Before we dive deeper into the details of creating Tasks, we need to understand
 . <<spring-cloud-dataflow-task-launch>>
 . <<spring-cloud-dataflow-task-review-executions>>
 . <<spring-cloud-dataflow-task-definition-destroying>>
+. <<spring-cloud-dataflow-task-cd>>

 [[spring-cloud-dataflow-create-task-apps]]
 === Creating a Task Application
@@ -1159,4 +1160,69 @@ Spring Cloud Data Flow will have a suffix appended to it consisting of `-scdf-<t
For example, if I create a schedule name of `my-schedule` for a task definition of `tstamp`, the
schedule name will look like `my-schedule-scdf-tstamp`.

NOTE: The `scdf-` separator is configurable by setting the `spring.cloud.dataflow.task.scheduleNamePrefix` property to the value you prefer.

[[spring-cloud-dataflow-task-cd]]
== Continuous Deployment

As task applications evolve, you want to get your updates to production. This section walks through the capabilities that Spring Cloud Data Flow provides for updating task applications.

When a task application is registered (<<spring-cloud-dataflow-register-task-apps>>), a version is associated with it. A task application can have multiple versions associated with it, with one selected as the default. The following image illustrates an application with multiple versions associated with it (see the timestamp entry).

image::{dataflow-asciidoc}/images/dataflow-task-application-versions.png[Task Application Versions, scaledwidth="50%"]
Versions of an application are managed by registering multiple applications with the same name and coordinates, _except_ the version. For example, if you were to register an application with the following values, you would get one application registered with two versions (2.0.0.RELEASE and 2.1.0.RELEASE):

* Application 1
** Name: `timestamp`
** Type: `task`
** URI: `maven://org.springframework.cloud.task.app:timestamp-task:2.0.0.RELEASE`
* Application 2
** Name: `timestamp`
** Type: `task`
** URI: `maven://org.springframework.cloud.task.app:timestamp-task:2.1.0.RELEASE`
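The two registrations above can be sketched in the Data Flow shell with the `app register` command; a minimal sketch, assuming a running Data Flow server:

```
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:2.0.0.RELEASE
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:2.1.0.RELEASE
```

Because the name and type match, the second registration adds a version rather than a new application.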
Besides having multiple versions, Spring Cloud Data Flow needs to know which version to run on the next launch. This is indicated by setting one version to be the default. Whichever version of a task application is configured as the default is the one run on the next launch request. You can see which version is the default in the UI, as this image shows:

image::{dataflow-asciidoc}/images/dataflow-task-default-version.png[Task Application Default Version, scaledwidth="50%"]
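The default version can also be switched from the shell with the `app default` command; a minimal sketch, where the `--id` value follows the `<type>:<name>` convention:

```
dataflow:>app default --id task:timestamp --version 2.1.0.RELEASE
```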
=== Task Launch Lifecycle

In previous versions of Spring Cloud Data Flow, when the request to launch a task was received, Spring Cloud Data Flow would deploy the application (if needed) and run it. If the application was being run on a platform that did not need to have the application deployed every time (CloudFoundry, for example), the previously deployed application was used. This flow has changed in 2.3. The following image shows what happens when a task launch request comes in now:

image::{dataflow-asciidoc}/images/dataflow-task-launch-flow.png[Flow For Launching A Task, scaledwidth="50%"]

There are three main flows to consider in the preceding diagram: launching the first time or launching with no changes, launching when there are changes and no instance is running, and launching when there are changes while another instance is running. We look at the flow with no changes first.
==== Launch a Task With No Changes

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is not required, since nothing has changed (no properties, deployment properties, or versions have changed since the last execution).

[start=5]
5. On platforms that cache a deployed artifact (CloudFoundry at the writing of this documentation), Data Flow checks whether the application was previously deployed.
6. If the application needs to be deployed, Data Flow deploys the task application.
7. Data Flow launches the application.

That flow is the default behavior and occurs every time a request comes in when nothing has changed. It is important to note that this is the same flow that Data Flow has always executed for launching tasks.
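This no-change path is what a plain launch exercises; a minimal sketch, assuming the `timestamp` application registered earlier and a hypothetical definition name `mytask`:

```
dataflow:>task create mytask --definition "timestamp"
dataflow:>task launch mytask
```

Repeating the `task launch` with no property or version changes follows the same flow on every request.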
==== Launch a Task With Changes That Is Not Currently Running

The second flow to consider is launching a task when the task application version, application properties, or deployment properties have changed. In this case, the following flow is executed:

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is required, since there was a change in the task application version, application properties, or deployment properties.
2. Data Flow checks to see whether another instance of the task definition is currently running.

[start=4]
4. If there is not another instance of the task definition currently running, the old deployment is deleted.
5. On platforms that cache a deployed artifact (CloudFoundry at the writing of this documentation), Data Flow checks whether the application was previously deployed (in this flow, the answer is always no, because the previous deployment was deleted).
6. Data Flow deploys the task application with the updated values (new application version, new merged properties, and new merged deployment properties).
7. Data Flow launches the application.

This flow is what fundamentally enables continuous deployment for Spring Cloud Data Flow.
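A change that triggers this upgrade path can be as simple as launching with a changed application property; a minimal sketch, where the property name `app.timestamp.format` is hypothetical and shown only to illustrate passing changed values:

```
dataflow:>task launch mytask --properties "app.timestamp.format=yyyy-MM-dd"
```

Because the merged properties differ from the previous execution, Data Flow deletes the old deployment and redeploys before launching.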
==== Launch a Task With Changes While Another Instance Is Running

The last main flow is when a launch request comes to Spring Cloud Data Flow to do an upgrade but the task definition is currently running. In this case, the launch is blocked due to the requirement to delete the current application. On some platforms (CloudFoundry at the writing of this document), deleting the application causes all currently running instances of the application to be shut down. This feature prevents that from happening. The following process describes what happens when a task changes while another instance is running:

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is required, since there was a change in the task application version, application properties, or deployment properties.
2. Data Flow checks to see whether another instance of the task definition is currently running.
3. Data Flow prevents the launch from happening, because other instances of the task definition are running.

NOTE: Any launch that requires an upgrade of a task definition that is running at the time of the request is blocked from executing, due to the need to delete any currently running tasks.