If you’ve been following developments in the BI4 world over the last year, then you’ve undoubtedly noticed the long-awaited and much-needed addition of the Monitoring engine. For years I used to jibe at SAP about how BOBJ was really designed as a stand-alone, small-scale application. BI4 is the first major step toward a truly “enterprise” application, and the Monitoring engine is good evidence of that. Where system administrators used to be blind to the internal operations of the BI platform, we now have unprecedented visibility.
Those of you who know me know I’m not satisfied with just a pretty dashboard on the front. I want to take it apart, see what makes it tick, and figure out how it works. You don’t have to look far before you start scratching your head. I did as soon as I saw that the monitoring engine uses an Apache Derby database by default. Derby? Really?
If you aren’t familiar with Derby, it is basically an RDBMS that runs entirely inside a Java Runtime Environment. Derby is great for developers: if you’re already developing in Java, it doesn’t take much more knowledge to create a database for your application as well. It works how you want it to, and you have complete control over it.
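To see just how low that barrier is, here is a minimal sketch of using Derby’s embedded driver from plain JDBC. Nothing in it is BI4-specific; it assumes only that derby.jar is on the classpath, and the database and table names are made up for illustration. The whole engine runs inside the calling JVM, and the database is just a directory on disk.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class EmbeddedDerbyDemo {
        public static void main(String[] args) throws Exception {
            // ";create=true" creates the database directory on first connect.
            // There is no server process -- the engine runs inside this JVM.
            try (Connection conn = DriverManager.getConnection("jdbc:derby:demoDb;create=true");
                 Statement stmt = conn.createStatement()) {

                stmt.executeUpdate("CREATE TABLE metrics (name VARCHAR(64), val INT)");
                stmt.executeUpdate("INSERT INTO metrics VALUES ('cpu_pct', 42)");

                try (ResultSet rs = stmt.executeQuery("SELECT name, val FROM metrics")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1) + " = " + rs.getInt(2));
                    }
                }
            }

            // Derby signals a clean embedded shutdown by throwing an exception
            // with SQLState XJ015, so we catch and ignore it here.
            try {
                DriverManager.getConnection("jdbc:derby:;shutdown=true");
            } catch (SQLException expectedOnCleanShutdown) {
                // expected
            }
        }
    }

No DBA, no install beyond a jar on the classpath, and complete control from application code. That convenience is exactly why developers reach for it.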
But there’s the rub. In my experience, even the best application developers tend to be crummy data modelers. They build data structures that suit their application operations, and don’t necessarily think about what that data might be used for later on.
So when I saw that SAP decided to go with Derby as a part of the BI4 platform, I was really confused.
First of all, we already need two database schemas to run the BI4 platform, and we get to pick between several different flavors. BI4 comes with SQL Server Express by default for Windows, and DB2 for Unix or Linux. These typically work just fine for our CMS and Auditing Data Stores. But now we have Derby in the mix as well. In SAP’s defense, they did add the capability to move the Monitoring database so it co-exists in the same schema as Auditor; I’ll discuss that process in just a moment. So my first point of confusion is: why introduce another RDBMS platform into an already complex application mix? Why not just make Monitoring part of the Auditing Data Store out of the box?
My second major point of confusion is the data being stored in the Monitoring engine. If you’ve been around the BOBJ world as long as I have, then you know there is hardly a customer out there who doesn’t want to do something with their system data at some point. I mean, this is a BI tool. We’re data people to begin with. I think it is a fair bet that we’re going to want to report off of that monitoring data. I do.
So the flaw with Derby as a component of BI4 has a couple of facets here. Derby really stinks for reporting off of. Don’t believe me? Try it and let me know how that goes. The BI4 Admin Guide clearly says you cannot connect the provided universe directly to the operational data store for Monitoring. You have to report off of the backup. But the shortest backup interval you can specify is 1 hour.
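If you want to poke at that backup copy yourself, a rough sketch like the one below will do it, again with plain JDBC against the embedded driver. The backup path shown is a placeholder, not the real location, so point it at wherever your install actually writes the Monitoring backup; and rather than guess at table names, it simply asks the copy what tables it contains.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;

    public class MonitoringBackupPeek {
        public static void main(String[] args) throws Exception {
            // Placeholder path -- substitute the directory where your
            // BI4 install actually writes the Monitoring trending backup.
            String backupDir = "C:/MonitoringBackup/TrendDataDB";

            // Connect to the backup copy, not the live operational store.
            try (Connection conn = DriverManager.getConnection("jdbc:derby:" + backupDir)) {
                // List whatever tables the copy contains instead of assuming names.
                try (ResultSet tables = conn.getMetaData()
                        .getTables(null, null, "%", new String[] {"TABLE"})) {
                    while (tables.next()) {
                        System.out.println(tables.getString("TABLE_NAME"));
                    }
                }
            }
        }
    }

Whatever you find in there, remember you are looking at a copy that can be up to an hour stale.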
So with that one design choice, SAP cut off any hope of real-time reporting from the Monitoring DB. And by its very nature, doesn’t monitoring need to be timely? I suppose an hourly copy is fine for trending data, or for massive aggregate reports you might run on a monthly basis. But BI4 hasn’t even been out for a whole year, and I’ve already had customers asking for real-time reports and dashboards from Monitoring.
To make matters a tad more confusing, only the trending data from the monitoring engine is stored in the Derby database. The actual alerts that are thrown are already stored in the Auditing Data Store.
So I went on a mission, which I affectionately call “Demolition Derby”.
In Part 2 of this blog, I’ll go over the steps to move your monitoring data out of Derby and into an Enterprise-class database alongside your Auditing Data Store. Stay tuned!