Spring Batch - Job, Flow, Split

Job creation and invocation

After a job is successfully created, Spring Batch by default executes every configured job at application startup. In normal business processing, however, we often need to trigger jobs manually or on a schedule, which introduces the two launchers: JobLauncher and JobOperator.

JobLauncher Configuration

Here we invoke the JobLauncher through a web API, passing the job's parameters in the request. The job to run is identified by the bean name given when the job was created.

    @Bean
    public JobLauncher jobLauncher() {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository());
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor()); // launch jobs asynchronously
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
   @Autowired
    private JobLauncher jobLauncher;
 
    @Autowired
    private Job jobLaunchDemoJob;
 
    @GetMapping("/{job1param}")
    public String runJob1(@PathVariable("job1param") String job1param) throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
        System.out.println("Request to run job1 with param: " + job1param);
        JobParameters jobParameters = new JobParametersBuilder()
                .addString("job1param",job1param)
                .toJobParameters();
        jobLauncher.run(jobLaunchDemoJob,jobParameters);
        return "Job1 success.";
 
    }

JobOperator Configuration

    @Bean
    public JobOperator jobOperator(){
        SimpleJobOperator operator = new SimpleJobOperator();
 
        operator.setJobLauncher(jobLauncher);
        operator.setJobParametersConverter(new DefaultJobParametersConverter());
        operator.setJobRepository(jobRepository);
        operator.setJobExplorer(jobExplorer);
        operator.setJobRegistry(jobRegistry);
        return operator;
    }
    @Autowired
    private JobOperator jobOperator;
 
    @GetMapping("/{job2param}")
    public String runJob2(@PathVariable("job2param") String job2param) throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobInstanceAlreadyExistsException, NoSuchJobException {
        System.out.println("Request to run job2 with param: " + job2param);
 
        jobOperator.start("jobOperatorDemoJob","job2param="+job2param);
 
        return "Job2 success.";
 
    }
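Note that `jobOperator.start` takes the parameters as a single String rather than a `JobParameters` object; Spring Batch's converter turns Properties-style `key=value` pairs into job parameters. As a rough illustration of that format (this is a plain-Java sketch, not the Spring Batch converter, and the exact separators Spring Batch accepts may vary):

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch (not Spring Batch API): split a parameter string such as
// "job2param=abc,run.id=5" into key/value pairs, the shape that
// JobOperator.start(jobName, parameters) expects.
public class ParamStringSketch {
    public static Map<String, String> parse(String parameters) {
        Map<String, String> result = new HashMap<>();
        if (parameters == null || parameters.isEmpty()) {
            return result;
        }
        for (String pair : parameters.split("[,\n]")) {
            int eq = pair.indexOf('=');
            if (eq > 0) { // ignore malformed entries without a key
                result.put(pair.substring(0, eq).trim(), pair.substring(eq + 1).trim());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(parse("job2param=abc,run.id=5"));
    }
}
```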

Finally, the job can be triggered by a scheduled task; @Scheduled runs it at a fixed delay or whenever a cron expression matches:

    @Scheduled(fixedDelay = 5000)
    public void scheduler() throws JobInstanceAlreadyCompleteException, JobExecutionAlreadyRunningException, JobParametersInvalidException, JobRestartException, JobParametersNotFoundException, NoSuchJobException {
        jobOperator().startNextInstance("jobScheduledDemoJob");
    }


Job Nesting


In some cases one job needs to execute another job, and this is done through a step: wrap the child job in a step, then include that step in the parent job.

1. Jobs can be nested: the embedded Job is called the child Job, and the Job that embeds it is the parent Job;

2. A parent Job can have more than one child Job;

3. A child Job cannot run independently; it is started by its parent Job.

As shown in the diagram

The program configuration to execute the job shown above is as follows:

    /* A simple step */
    @Bean
    public Step childStep(){
        return stepBuilderFactory.get("child").tasklet((StepContribution contribution, ChunkContext chunkContext) -> {
            System.out.println("child step executing...");
            return RepeatStatus.FINISHED;
        }).build();
    }

    /* The child job */
    @Bean
    public Job childJob(){
        return jobBuilderFactory.get("childJob")
                .start(childStep())
                .build();
    }

    /* A step that nests a job */
    /*
     * new StepBuilder("childStepJob") creates a factory for the Step; childStepJob is the Step's name.
     * new JobStepBuilder creates a factory for embedding a job inside a step.
     * repository stores execution information for jobs and steps; if a step has already been
     * executed and another job calls it again, it is reported as already executed and skipped.
     * In short: create a step named childStepJob that embeds and executes the job named childJob.
     * */
    @Bean
    public Step nestedStep(){
        return new JobStepBuilder(new StepBuilder("childStepJob"))
                .job(childJob())
                .repository(jobRepository)
                .build();
    }

    /* A step named parent */
    @Bean
    public Step parentStep(){
        return stepBuilderFactory.get("parent").tasklet((StepContribution contribution, ChunkContext chunkContext) -> {
            System.out.println("parent step executing...");
            return RepeatStatus.FINISHED;
        }).build();
    }

    /* The parent job. Note its name is "parent", not "parentJob"; parentJob is only the method name. */
    @Bean
    public Job parentJob(){
        return jobBuilderFactory.get("parent")  
                .start(nestedStep())
                .next(parentStep())
                .build();
    }


Add the following to application.properties: `spring.batch.job.names=parentJob`

Two jobs, childJob and parentJob, are defined in the program. childJob executes a step named child; parentJob executes the steps childStepJob and parent, and childStepJob in turn executes the job named childJob.

The log is printed as follows.

    Job: [SimpleJob: [name=parentJob]] launched with the following parameters: [{}]
    Executing step: [childStepJob]
    Job: [SimpleJob: [name=childJob]] launched with the following parameters: [{}]
    Executing step: [child]
    child step executing...
    Job: [SimpleJob: [name=childJob]] completed with the following parameters: [{}] and the following status: [COMPLETED]
    Executing step: [parent]
    parent step executing...
    Job: [SimpleJob: [name=parentJob]] completed with the following parameters: [{}] and the following status: [COMPLETED]

From the log you can see that only parentJob is launched: first its step childStepJob runs, which executes the embedded childJob; then its step parent runs.


Job parameters


1. JobParameters are used to pass information into a running Job;

2. Parameters are passed as key -> value pairs, and in code get("key") retrieves the value;

3. A job's parameters are available to every step throughout the job's life cycle, so the parameters required by the business logic can be passed in.
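Job parameters also identify the job instance: re-running a completed job with identical parameters is rejected (hence the JobInstanceAlreadyCompleteException in the handlers above). A plain-Java sketch of that identity idea (illustrative only, not Spring Batch internals):

```java
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch (not Spring Batch internals): a job instance is
// identified by the job name plus its parameters, which is why re-running
// a completed job with the same parameters is rejected.
public class JobInstanceKeySketch {
    private final String jobName;
    private final TreeMap<String, String> params; // sorted, so insertion order does not matter

    public JobInstanceKeySketch(String jobName, Map<String, String> params) {
        this.jobName = jobName;
        this.params = new TreeMap<>(params);
    }

    @Override
    public boolean equals(Object o) {
        if (!(o instanceof JobInstanceKeySketch)) return false;
        JobInstanceKeySketch other = (JobInstanceKeySketch) o;
        return jobName.equals(other.jobName) && params.equals(other.params);
    }

    @Override
    public int hashCode() {
        return jobName.hashCode() * 31 + params.hashCode();
    }
}
```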

A demo of passing and reading parameters:

    // The enclosing class implements StepExecutionListener, so it can capture
    // the job parameters in beforeStep.
    private Map<String, JobParameter> params;

    @Bean
    public Job myJobParametersDemoJob(){
        return jobBuilderFactory.get("myJobParametersDemoJob")
                .start(myJobParametersDemoStep())
                .build();
    }

    @Bean
    public Step myJobParametersDemoStep() {
        return stepBuilderFactory.get("myJobParametersDemoStep")
                .listener(this)
                .tasklet(((contribution, chunkContext) -> {
                    System.out.println("Parameter is : " + params.get("info"));
                    return RepeatStatus.FINISHED;
                })).build();
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        params = stepExecution.getJobParameters().getParameters();
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }


Creation and Use of Flow

A Step is a standalone, sequential processing unit containing complete input, processing, and output. In enterprise applications, however, steps usually have to run in a particular order, so maintaining the execution order between steps must be considered. Spring Batch provides the Flow to solve this problem.

Flow has the following characteristics:

  1. A Flow is a collection of Steps that specifies the transition relationships between them;

  2. A Flow can be reused across different Jobs;

  3. A Flow is created with FlowBuilder which, like a Job, uses start(), next(), and end();


Sequential Flow

    // Create a Flow object specifying which Steps it contains
    @Bean
    public Flow jobFlowDemoFlow1(){
        return new FlowBuilder<Flow>("jobFlowDemoFlow1")
                .start(jobFlowDemoTwoStep1())
                .next(jobFlowDemoTwoStep2())
                .build();
    }

    @Bean
    public Job jobFlowDemo(){
        return jobBuilderFactory.get("jobFlowDemo")
                .start(jobFlowDemoFlow1()).end()
                .build();
    }


 
## Decision Maker

A custom decider can determine the Flow execution order dynamically.

1. Decider: makes a conditional decision about which Step runs next;

2. JobExecutionDecider: the interface that supplies the decision.

1. Define a MyDecider that counts how many times it has been called and returns "ODD" on odd calls and "EVEN" on even calls;

    public class MyDecider implements JobExecutionDecider {

        private int count = 0;

        @Override
        public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
            count++;
            if (count % 2 == 0) {
                return new FlowExecutionStatus("EVEN");
            } else {
                return new FlowExecutionStatus("ODD");
            }
        }
    }



 
2. Use MyDecider in the job: run evenStep when it returns "EVEN" and oddStep when it returns "ODD".

    @Bean
    public Step oddStep(){
        return stepBuilderFactory.get("oddStep")
                .tasklet(((stepContribution, chunkContext) -> {
                    System.out.println("oddStep");
                    return RepeatStatus.FINISHED;
                })).build();
    }

    @Bean
    public Step evenStep(){
        return stepBuilderFactory.get("evenStep")
                .tasklet(((stepContribution, chunkContext) -> {
                    System.out.println("evenStep");
                    return RepeatStatus.FINISHED;
                })).build();
    }

    @Bean
    public JobExecutionDecider myDecider(){
        return new MyDecider();
    }

    @Bean
    public Job flowDecisonDemoJob(){
        return jobBuilderFactory.get("flowDecisonDemoJob").start(firstStep())
                .next(myDecider())
                .from(myDecider()).on("EVEN").to(evenStep())
                .from(myDecider()).on("ODD").to(oddStep())
                .from(oddStep()).on("*").to(myDecider())
                .end()
                .build();
    }
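The routing defined above can be traced in plain Java. This sketch (illustrative, not Spring Batch API) mimics the transition table: after firstStep the decider runs; "ODD" routes to oddStep, which loops back to the decider; "EVEN" routes to evenStep, which ends the job. With the counter starting at 0, the first decision is "ODD" and the second "EVEN":

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java trace of the decider-based flow (illustrative, not Spring Batch
// API): ODD routes to oddStep and back to the decider; EVEN routes to
// evenStep and ends the flow.
public class DeciderFlowSketch {
    private int count = 0;

    private String decide() {
        count++;
        return (count % 2 == 0) ? "EVEN" : "ODD";
    }

    public List<String> run() {
        List<String> executed = new ArrayList<>();
        executed.add("firstStep");
        while (true) {
            String status = decide();
            if ("EVEN".equals(status)) {
                executed.add("evenStep");
                break; // evenStep has no outgoing transition: the job ends
            }
            executed.add("oddStep"); // oddStep transitions back to the decider
        }
        return executed;
    }

    public static void main(String[] args) {
        System.out.println(new DeciderFlowSketch().run()); // [firstStep, oddStep, evenStep]
    }
}
```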



# Parallel Processing Split

Parallel processing matters when you face large amounts of data or when processing each item takes time. Parallel processing typically follows a split-partition-merge pattern, as in the diagram below.

![file](https://graph.baidu.com/resource/2223b0a952f98c501fd7701583220953.png)


    // tag::jobstep[]
    @Bean
    public Job readandwritedbJob() {
        Flow flow1 = new FlowBuilder<SimpleFlow>("asyncFlow1").start(step2()).build();
        Flow flow2 = new FlowBuilder<SimpleFlow>("asyncFlow2").start(step3()).build();
        return jobBuilderFactory.get("readandwritedbJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1())
                .split(new SimpleAsyncTaskExecutor()).add(flow1, flow2)
                .end()
                .build();
    }

    @Bean
    public Tasklet tasklet1() {
        return new ReadAndWriteDbTask();
    }

    @Bean
    public Tasklet tasklet2() {
        return new ReadAndWriteDbTask2();
    }

    @Bean
    public Tasklet tasklet3() {
        return new ReadAndWriteDbTask3();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .tasklet(tasklet1())
                .build();
    }

    @Bean
    public Step step2() {
        return stepBuilderFactory.get("step2")
                .tasklet(tasklet2())
                .build();
    }

    @Bean
    public Step step3() {
        return stepBuilderFactory.get("step3")
                .tasklet(tasklet3())
                .build();
    }
    // end::jobstep[]
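What split provides can be sketched with a plain ExecutorService (an illustration of the concurrency idea, not Spring Batch internals): several flows are handed to a task executor, run concurrently, and the job only moves on once every flow has completed.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Illustrative sketch of split semantics (not Spring Batch internals):
// run all flows concurrently and wait until every one has finished.
public class SplitSketch {
    public static List<String> runConcurrently(List<Callable<String>> flows) {
        ExecutorService executor = Executors.newFixedThreadPool(flows.size());
        try {
            List<String> results = new ArrayList<>();
            for (Future<String> f : executor.invokeAll(flows)) {
                results.add(f.get()); // blocks until that flow finishes
            }
            return results;
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            executor.shutdown();
        }
    }

    public static void main(String[] args) {
        List<Callable<String>> flows = new ArrayList<>();
        flows.add(() -> "asyncFlow1 done");
        flows.add(() -> "asyncFlow2 done");
        System.out.println(runConcurrently(flows));
    }
}
```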










# Listening for Job Execution

1. Listeners are a way to hook into Job execution;

2. Listeners can be implemented through interfaces or annotations;

3. Spring Batch provides listener interfaces at every level, from job down to item:

(1) JobExecutionListener (before.., after..);

(2) StepExecutionListener (before.., after..);

(3) ChunkListener (before.., after..);

(4) ItemReadListener, ItemWriteListener, ItemProcessListener (before.., after.., onError..);
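The nesting of these listener levels can be sketched in plain Java (simplified interfaces modeled loosely on, but not identical to, the Spring Batch ones): job-level callbacks wrap step-level callbacks, which wrap the step's own logic.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of listener nesting (illustrative interfaces, not the
// real Spring Batch ones).
public class ListenerSketch {

    interface JobListener { void beforeJob(); void afterJob(); }

    interface StepListener { void beforeStep(); void afterStep(); }

    // Run one step inside one job, firing the callbacks in order.
    static void run(JobListener jl, StepListener sl, Runnable stepLogic) {
        jl.beforeJob();
        sl.beforeStep();
        stepLogic.run();
        sl.afterStep();
        jl.afterJob();
    }

    public static List<String> demo() {
        final List<String> log = new ArrayList<>();
        JobListener jobListener = new JobListener() {
            public void beforeJob() { log.add("beforeJob"); }
            public void afterJob() { log.add("afterJob"); }
        };
        StepListener stepListener = new StepListener() {
            public void beforeStep() { log.add("beforeStep"); }
            public void afterStep() { log.add("afterStep"); }
        };
        run(jobListener, stepListener, () -> log.add("tasklet"));
        return log;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // [beforeJob, beforeStep, tasklet, afterStep, afterJob]
    }
}
```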



Some processing often needs to happen when a job starts or finishes; JobExecutionListener serves this purpose:

    public interface JobExecutionListener {

        void beforeJob(JobExecution jobExecution);

        void afterJob(JobExecution jobExecution);
    }

It is registered when building the job:

    @Bean
    public Job footballJob() {
        return this.jobBuilderFactory.get("footballJob")
                .listener(sampleListener()) // an implementation of JobExecutionListener
                ...
                .build();
    }

It is important to note that afterJob is executed regardless of whether the batch job succeeds or fails, so add a status check:

    public void afterJob(JobExecution jobExecution){
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            // job succeeded
        } else if (jobExecution.getStatus() == BatchStatus.FAILED) {
            // job failed
        }
    }

In addition to implementing the interface directly, you can use the @BeforeJob and @AfterJob annotations.





Posted on Mon, 09 Mar 2020 13:33:53 -0400 by stew