A task application is short-lived, meaning it stops running on purpose, and can be run on demand or scheduled for later execution.
A use case might be to scrape a web page and write to the database.
The https://cloud.spring.io/spring-cloud-task/[Spring Cloud Task] framework is based on Spring Boot and adds the capability for Boot applications to record the lifecycle events of a short-lived application, such as when it starts, when it ends, and the exit status.
The https://docs.spring.io/spring-cloud-task/docs/{spring-cloud-task-version}/reference/htmlsingle/#features-task-execution-details[TaskExecution] documentation shows which information is stored in the database.
The entry point for code execution in a Spring Cloud Task application is most often an implementation of Boot's `CommandLineRunner` interface, as shown in this https://docs.spring.io/spring-cloud-task/docs/{spring-cloud-task-version}/reference/htmlsingle/#getting-started-writing-the-code[example].

The Spring Batch project is probably what comes to mind for Spring developers writing short-lived applications.
Spring Batch provides a much richer set of functionality than Spring Cloud Task and is recommended when processing large volumes of data.
The name of a schedule created through Spring Cloud Data Flow has a suffix appended to it, consisting of `-scdf-` followed by the task definition name.
For example, if you create a schedule named `my-schedule` for a task definition named `tstamp`, the
schedule name will look like `my-schedule-scdf-tstamp`.
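
For example, the following Data Flow shell command is a minimal sketch of creating such a schedule (it assumes a task definition named `tstamp` already exists and that a scheduler is configured for your platform; the cron expression is only illustrative):

[source,console]
----
# Hypothetical schedule: run the tstamp task definition every ten minutes
dataflow:>task schedule create --definitionName tstamp --name my-schedule --expression '*/10 * * * *'
----
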

NOTE: The `scdf-` separator is configurable: set the `spring.cloud.dataflow.task.scheduleNamePrefix` property to the value you prefer.
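
As a minimal sketch, the property could be set when starting the Data Flow server (the jar name is a placeholder, and any other Spring Boot configuration mechanism, such as an environment variable, works just as well):

[source,console]
----
# Hypothetical prefix value; it replaces the default scdf- separator in generated schedule names
java -jar spring-cloud-dataflow-server.jar \
    --spring.cloud.dataflow.task.scheduleNamePrefix=mycompany-
----
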
[[spring-cloud-dataflow-task-cd]]
== Continuous Deployment

As task applications evolve, you want to get your updates to production. This section walks through the capabilities that Spring Cloud Data Flow provides for updating task applications.

When a task application is registered (<<spring-cloud-dataflow-register-task-apps>>), a version is associated with it. A task application can have multiple versions, with one selected as the default. The following image illustrates an application with multiple versions (see the timestamp entry).

Versions of an application are managed by registering multiple applications with the same name and coordinates, _except_ the version. For example, if you were to register an application with the following values, you would get one application registered with two versions (2.0.0.RELEASE and 2.1.0.RELEASE):
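
As a sketch, using the Data Flow shell and the Maven coordinates of the sample timestamp task (the coordinates and versions are only illustrative; substitute your own application):

[source,console]
----
# Register the same task application name twice, once per version
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:2.0.0.RELEASE
dataflow:>app register --name timestamp --type task --uri maven://org.springframework.cloud.task.app:timestamp-task:2.1.0.RELEASE
----
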
Besides having multiple versions, Spring Cloud Data Flow needs to know which version to run on the next launch. This is indicated by setting one version to be the default: whichever version of a task application is configured as the default is the one run on the next launch request. You can see which version is the default in the UI, as this image shows:
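
In the shell, assuming the `app default` command is available in your Data Flow version, a sketch of changing the default version of the (hypothetical) `timestamp` task application registered above:

[source,console]
----
# Make version 2.1.0.RELEASE the one used by the next launch request
dataflow:>app default --id task:timestamp --version 2.1.0.RELEASE
----
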
In previous versions of Spring Cloud Data Flow, when the request to launch a task was received, Spring Cloud Data Flow would deploy the application (if needed) and run it. If the application was being run on a platform that did not need the application to be deployed every time (CloudFoundry, for example), the previously deployed application was used. This flow changed in 2.3. The following image shows what happens when a task launch request comes in now:

image::{dataflow-asciidoc}/images/dataflow-task-launch-flow.png[Flow For Launching A Task, scaledwidth="50%"]

There are three main flows to consider in the preceding diagram: launching a task for the first time or with no changes, launching when there are changes and no other instance is running, and launching when there are changes while another instance is running. We look at the flow with no changes first.

==== Launch a Task With No Changes

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is not required, since nothing has changed (no properties, deployment properties, or versions have changed since the last execution).

[start=5]
5. On platforms that cache a deployed artifact (CloudFoundry, at the time of this writing), Data Flow checks whether the application was previously deployed.
6. If the application needs to be deployed, Data Flow deploys the task application.
7. Data Flow launches the application.

That flow is the default behavior and occurs every time a request comes in if nothing has changed. Note that this is the same flow that Data Flow has always used for launching tasks.
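
As a sketch (the definition name and the `timestamp` application are illustrative), launching the same definition twice without changing any versions or properties follows this path, and on platforms that cache the artifact the second launch reuses the existing deployment:

[source,console]
----
dataflow:>task create my-tstamp --definition "timestamp"
dataflow:>task launch my-tstamp
dataflow:>task launch my-tstamp
----
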

==== Launch a Task With Changes That Is Not Currently Running
The second flow to consider is launching a task when there is a change in the task application version, the application properties, or the deployment properties. In this case, the following flow is executed:

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is required, since there was a change in the task application version, the application properties, or the deployment properties.
2. Data Flow checks to see whether another instance of the task definition is currently running.

[start=4]
4. If there is not another instance of the task definition currently running, the old deployment is deleted.
5. On platforms that cache a deployed artifact (CloudFoundry, at the time of this writing), Data Flow checks whether the application was previously deployed (in this flow, the check always finds that it was not, because the old deployment was deleted).
6. Data Flow deploys the task application with the updated values (new application version, new merged properties, and new merged deployment properties).
7. Data Flow launches the application.

This flow is what fundamentally enables continuous deployment for Spring Cloud Data Flow.
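
For example (continuing the hypothetical `timestamp` registrations and the `my-tstamp` definition used earlier in this section), changing the default version and launching again triggers the upgrade flow:

[source,console]
----
# The next launch detects the version change, deletes the old deployment, and deploys 2.1.0.RELEASE
dataflow:>app default --id task:timestamp --version 2.1.0.RELEASE
dataflow:>task launch my-tstamp
----
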

==== Launch a Task With Changes While Another Instance Is Running
The last main flow is when a launch request that requires an upgrade comes to Spring Cloud Data Flow while the task definition is currently running. In this case, the launch is blocked, because the upgrade requires deleting the currently deployed application. On some platforms (CloudFoundry, at the time of this writing), deleting the application causes all of its currently running instances to be shut down. This feature prevents that from happening. The following process describes what happens when a task changes while another instance is running:

1. A launch request comes into Data Flow. Data Flow determines that an upgrade is required, since there was a change in the task application version, the application properties, or the deployment properties.
2. Data Flow checks to see whether another instance of the task definition is currently running.
3. Data Flow prevents the launch from happening because other instances of the task definition are running.

NOTE: Any launch that requires an upgrade of a task definition that is running at the time of the request is blocked from executing, due to the need to delete any currently running tasks.
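
Before requesting a launch that implies an upgrade, you can check whether executions of the definition are still running, for example from the shell (a sketch; the `--name` filter is assumed here):

[source,console]
----
# List recent executions of the my-tstamp definition and inspect their end times and exit codes
dataflow:>task execution list --name my-tstamp
----
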