It’s fairly simple to create Quartz jobs and trigger them manually on the fly. In a Maven module, we’ll start by adding the Quartz dependency to our pom.xml:
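A typical dependency entry looks like this (the version shown is only an example; pick whichever 2.x release your project standardizes on):

```xml
<!-- Quartz scheduler core library -->
<dependency>
    <groupId>org.quartz-scheduler</groupId>
    <artifactId>quartz</artifactId>
    <version>2.3.2</version>
</dependency>
```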


Let’s create a class for our manual job:

import java.util.Map;
import java.util.logging.Logger;

import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDataMap;
import org.quartz.JobDetail;
import org.quartz.JobKey;

public class ManualJob {
    public static final Logger log = Logger.getLogger(ManualJob.class.getName());

    private String name;
    private String groupName;
    private JobDetail jobDetail;
    private JobDataMap jobDataMap = new JobDataMap();

    public ManualJob(String name, String groupName, Class<? extends Job> clazz, Map<String, ?> jobDataMap) {
        this.name = name;
        this.groupName = groupName;
        if (jobDataMap != null) this.jobDataMap.putAll(jobDataMap);
        this.jobDetail = buildJob(name, groupName, clazz);
    }

    public ManualJob(String name, String groupName, Class<? extends Job> clazz) {
        this(name, groupName, clazz, null);
    }

    private JobDetail buildJob(String name, String groupName, Class<? extends Job> clazz) {
        return JobBuilder
                .newJob(clazz)
                .withIdentity(JobKey.jobKey(name, groupName))
                .usingJobData(this.jobDataMap)
                .storeDurably() // required so the job can be stored without a trigger and fired manually
                .build();
    }

    public String getName() {
        return this.name;
    }

    public String getGroupName() {
        return this.groupName;
    }

    public JobDetail getJobDetail() {
        return this.jobDetail;
    }

    public JobKey getKey() {
        return JobKey.jobKey(this.name, this.groupName);
    }
}
The name and groupName together uniquely identify a job in the scheduler. Before firing up a job using our ManualJob, we need to provide the code that the Quartz job will execute. That’s done by implementing the Job interface:

public class JobExecutor implements Job {
    public static final Logger log = Logger.getLogger(JobExecutor.class.getName());

    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        StringBuilder dataString = new StringBuilder();
        for (Map.Entry<String, Object> e : jobExecutionContext.getMergedJobDataMap().entrySet()) {
            dataString.append(e.getKey()).append(" : ").append(e.getValue()).append(" || ");
        }
        log.info("============JOB RUNNING with data============ " + dataString);
    }
}


The execute method runs each time the job is triggered. Let’s now put our ManualJob into action:

public static void main(String... args) throws Exception {
        JobDataMap dataMap = new JobDataMap();
        dataMap.put("username", "pgupta");
        dataMap.put("app", "sample");
        ManualJob job1 = new ManualJob("job1", "manual", JobExecutor.class);
        ManualJob job2 = new ManualJob("job2", "manual", JobExecutor.class);
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        log.info("Starting the Scheduler... ");
        scheduler.start();
        log.info("Scheduler started... ");
        // register the durable jobs, then fire them on demand with the extra data
        scheduler.addJob(job1.getJobDetail(), true);
        scheduler.addJob(job2.getJobDetail(), true);
        scheduler.triggerJob(job1.getKey(), dataMap);
        scheduler.triggerJob(job2.getKey(), dataMap);
}


The JobDataMap can be used to pass parameters (key/value pairs) to the job; they can be retrieved through the JobExecutionContext in the execute method, as shown above via getMergedJobDataMap().