In my last post I took a gander at the coolness that is the Program Job Server (PGS). I also hinted that you could leverage this server to improve your scheduling of data integration (DI) jobs created with SAP Data Services. I really didn’t want to leave you hanging on that thread. Let’s take a look at how you can leverage the SAP BusinessObjects PGS to incorporate your DI jobs into your job flows.
There are great tools on the market to manage DI job flows. Consider Tidalsoft, for example. They’ve built a pretty impressive stack that allows you to centrally manage tons of job types across several different product suites. That obviously requires budget you perhaps weren’t counting on this fiscal year when your project ramped up.
Enter the awesome PGS. We can pretty easily integrate the job flows from DI with the report schedules in an existing BOE environment. This post assumes you already have a properly configured BOBJ Enterprise server with CMS, PGS, and Input FRS up and running, and a properly configured DI environment with at least one job ready to schedule in your repository.
- Within the DI admin console, click through to Management/CMS Connection and get ready to drop in a new connection to BOE.
- From there, click the Add button to invoke the screen for setting up a new CMS connection.
- Provide a single CMS name (and port if applicable), and provide an administrator user name for simplicity’s sake. Click Apply to continue (or Test if you want to give it a whirl first).
- Before we go any further, there is a subtle issue with DI and the way it looks at BOE. It looks for an Input FRS whose server name begins with the string “Input”. To make this work, you actually have to log into the CMC, go to Servers, and rename your Input File Repository server(s) so the name starts with “Input”. Silly…but necessary. It seems like a throwback to XI R2 and earlier, when the Input FRS was named as such.
- Browse to your list of available jobs for your repository and select Add Schedule.
- While scheduling the job, note that we can now select BOE Scheduler and the CMS Name that we already dropped into the admin console.
- To initially commit the job to the BOE scheduler, you do have to set an initial schedule. Once it is submitted, log into the CMC and note that a new Data Services folder has been created alongside all the existing reports, and two objects have been dropped in.
That’s it. We can now go crazy with all the cool scheduling capabilities already built into BOE. We can set events that fire when DI jobs complete, set dependencies between distinct DI jobs, or even have DI jobs fire reports. We can also leverage the PGS scheduling intervals as well as the custom calendars.
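If you haven’t used BOE events before, the dependency chaining mentioned above works much like a file event: one schedule signals completion, and a dependent schedule waits on that signal before it runs. The sketch below is a minimal, generic illustration of that pattern in Python — it is not BOE or the Data Services SDK, and the function and file names are made up for the example:

```python
import os
import tempfile
import time

def di_job_completes(marker_path):
    """Simulate a DI job signaling completion by touching a marker file,
    much like a schedule that raises a file event on success."""
    with open(marker_path, "w") as f:
        f.write("done\n")

def wait_for_event(marker_path, timeout=5.0, poll=0.1):
    """Poll for the marker file, the way an event server polls a file event.
    Returns True if the event fired before the timeout expired."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(marker_path):
            os.remove(marker_path)  # consume the event so it can fire again
            return True
        time.sleep(poll)
    return False

# Hypothetical marker path for the example only.
marker = os.path.join(tempfile.gettempdir(), "di_job_done.flag")
di_job_completes(marker)
fired = wait_for_event(marker)
print("dependent schedule triggered:", fired)
```

In BOE itself you would define the event in the CMC and attach it to the dependent schedule; the point of the sketch is just the fire-then-wait relationship between the two jobs.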
Hello,
I have gotten the scheduled job to show up in BOE. I go in and set the logon parameters. Then when I hit “Run Now”, it says “Pending”, then “Running”, then “Success” (BOE goes through these statuses within 30 seconds, and the job itself takes at least 15 minutes to run). But when I then go back to the Data Integrator server to check whether the job has run, nothing has actually happened.
What do I need to do?
Greg,
I have a few things for you to try:
1) Check your Job Server to see whether it’s running as Local System or a domain user. Make sure the domain account’s password hasn’t expired.
2) Are there any errors in the logs? Does it process rows?
3) Check your datastore to ensure the account being used can access your source/target tables.
4) On the server, run Server Manager, edit the Job Server configuration, and resync the local repository with the Job Server.
Hope these help.
Shawn
(EV Technologies)