Since data has been stored and retrieved for analysis, the concepts of the Data Mart and the Data Warehouse have been in vogue. The early pragmatic models in this area were the logical dimensional model proposed by Ralph Kimball and the CIF (Corporate Information Factory) proposed by Bill Inmon. Both were approaches to architecting a solution suitable for business decision making. Though they followed different approaches, both models initially operated on structured (relational) data and culminated in building an EDW (Enterprise Data Warehouse). As technologies and trends gained momentum, so did these approaches to modeling the data and the associated processes: integrating data from various sources, managing the sourced data through storage and analysis, and finally presenting the data to end users for decision making. At times a hybrid model combining the Kimball and Inmon approaches proved successful as a holistic path to the EDW.
Enter varied sources of data in a variety of formats (structured and unstructured), arriving at high volume and velocity (the speed at which the data is created or sourced). This created the need not only to source such data but also to process and store it in a timely manner, while at the same time making it available to the end user for deeper analysis. The paradigm of data was transformed into the notion of All Data (including Big Data, which referred to mostly unstructured data, or indeed any data arriving at high velocity and volume). The end user, in turn, became a business user who not only had to ensure that the right data was used for the right job but also that, for certain data, analysis and decision making happened in near real time. This posed the challenge of processing at least some of the data as it was being created.
The Corporate Information Factory approach of Inmon, which proceeded top-down, was not always best suited for such purposes, and the logical model of Kimball needed to be extended to incorporate these business requirements. For the purposes of this post, the terms "logical model", "logical dimensional model", and "logical data model" are used interchangeably.
2. Using the Logical Data Model for extending Essbase and OBIEE
Thus the logical (dimensional) data model evolved into one that formed a foundation of data that could be flexibly extended to increase the scalability, availability, and manageability of an EDW/BI solution. It encapsulates the data elements and the business processes that enable BI and analytics. This blog post describes such a solution using Oracle Essbase and OBIEE based on the logical dimensional data model. The logical data model can be designed once and transformed multiple times, and the author suggests that it be made the blueprint for a holistic BI solution. The pertinent technical facts of using this logical model as a blueprint for such a solution are as follows:
i. Whatever form such a solution takes, it is the pragmatics of architecting a logical data model and the related EDW in a ROLAP-based implementation, augmented by data marts and MDX cubes, that eventually delivers an extensible and flexible solution. The data is logically modeled and an EDW is created to feed the cubes. The data foundation is instantiated as a ROLAP star schema in the EDW.
ii. The translation of the data occurs in two major data groupings: the metrics and measurements needed for analysis, held in a fact table, and the various descriptors and criteria pertaining to those measurements, held in dimensions. This is essentially a star schema, which provides the necessary design for performance. Such a design also presents the data in groupings business users are familiar with, since the business groupings map to the data groupings.
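To make the fact/dimension split concrete, here is a minimal sketch of a ROLAP star schema using Python's built-in sqlite3. All table and column names (fact_sales, dim_product, dim_date, etc.) are illustrative assumptions, not taken from any particular EDW:

```python
import sqlite3

# In-memory sketch of a minimal star schema: one fact table holding the
# measures (units_sold, revenue) surrounded by descriptive dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY,
                          product_name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY,
                          calendar_date TEXT, fiscal_quarter TEXT);
CREATE TABLE fact_sales  (product_key INTEGER, date_key INTEGER,
                          units_sold INTEGER, revenue REAL);
""")

cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Hardware")])
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, "2024-01-01", "Q1"),
                 (20240401, "2024-04-01", "Q2")])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(1, 20240101, 10, 100.0), (2, 20240101, 5, 75.0),
                 (1, 20240401, 8, 80.0)])

# A typical star join: slice the fact measures by dimension attributes.
rows = cur.execute("""
    SELECT d.fiscal_quarter, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date    d ON d.date_key    = f.date_key
    GROUP BY d.fiscal_quarter, p.category
    ORDER BY d.fiscal_quarter
""").fetchall()
print(rows)  # [('Q1', 'Hardware', 175.0), ('Q2', 'Hardware', 80.0)]
```

The query shape is the point: every analytical question becomes a join from the central fact table out to the dimensions that describe it, which is what gives the star its performance and its familiarity to business users.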
iii. Modeled entities and attributes are established to create Related Logical Objects that contain both structured and unstructured objects.
iv. BI data stores for analytical and operational analysis are established on this foundation. ROLAP can be viewed as logically equivalent to MOLAP: the ROLAP star schema in an RDBMS corresponds to the dimensional model in Essbase. The physical implementation can be done using the logical objects (which have been modeled after business requirements), and these can include any related Big Data objects.
v. Data discovery can be established based on the above Big Data elements, which can encompass both unstructured and structured objects.
vi. A Data Integration layer is established that ties together all data sources.
The logical data foundation so established defines a logical layer of (business) data that can populate and "feed" analytical and operational BI systems downstream.
Now that the single data foundation is architected (ideally outside of Essbase Studio), we can use it to extend Essbase in the following ways:
i. Use the cube as a data mart and as a dimensional model specifying dimensions and their measures.
ii. Use a ROLAP star schema in an RDBMS, specifically one sourced from an EDW, as the dimensional model for analytical BI and analytics.
In either case, the logical data model needs to be used so as to incorporate the specific business requirements that map to the dimensional model. In the case of the star schema, this is needed to translate the star into business user requirements.
The pragmatics of extending Essbase using the logical model involve architecting an EDW based on OLTP requirements and instantiating the logical data foundation as a star schema. Essbase then sources data directly from the EDW via XOLAP or a ROLAP star. This allows for operational BI, analytical BI, and operational reporting. Note that both XOLAP and ROLAP require the dimensional model to feed the cube.
Use the cube as a data mart, as in (i) above, when a department has a unique need to perform MOLAP analytics rather than just flat reporting. XOLAP can be used to directly and transparently source the data from the OLTP/RDBMS and feed the cube via the logical model (in Essbase Studio or the Essbase Acceleration Wizard in OBIEE 12c).
Using a ROLAP star is beneficial for enterprise reporting and analytics, wherein the ROLAP star can subsequently be leveraged to feed the cube. Note that the ROLAP star is based on the logical dimensional model so architected; it can be built in Essbase Studio or the Essbase Acceleration Wizard in OBIEE 12c. This way it serves the purposes of analytical BI, hybrid ROLAP/MOLAP analytics, and data discovery off Big Data objects, and it can offer descriptive, predictive, and prescriptive analytics.
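The ROLAP-star-to-cube step above can be illustrated in miniature. The sketch below pre-aggregates rows retrieved from a (hypothetical) star into a cube-like structure with rolled-up totals per dimension member, which is conceptually what a MOLAP engine such as Essbase materializes on load. The row data and member names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical rows already retrieved from the ROLAP star:
# (fiscal_quarter, product_category, revenue).
star_rows = [
    ("Q1", "Hardware", 100.0), ("Q1", "Software", 40.0),
    ("Q2", "Hardware", 80.0),  ("Q2", "Software", 60.0),
]

# Pre-aggregate the star into cube cells, including roll-ups to an
# "ALL" member on each dimension, so every aggregate is answerable
# by a single cell lookup instead of a join-and-scan.
cube = defaultdict(float)
for quarter, category, revenue in star_rows:
    for q in (quarter, "ALL"):       # roll up the time dimension
        for c in (category, "ALL"):  # roll up the product dimension
            cube[(q, c)] += revenue

print(cube[("Q1", "ALL")])   # 140.0 -- quarter total
print(cube[("ALL", "ALL")])  # 280.0 -- grand total
```

This is only a toy: a real cube adds member hierarchies, sparse/dense storage, and calculation rules, but the essential transformation, from relational star rows to addressable pre-aggregated cells, is the one described above.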
3. Extending BI using logical dimensional model
For an Oracle BI implementation, the logical dimensional model described above translates to a Common Information Model encompassing the metadata and semantic layer, architected in and residing in the OBIEE RPD (repository file) via the Business Modeling and Mapping (BMM) layer of OBIEE. This in effect extends BI capabilities beyond the normally feasible domains of BI reporting, since the extended data model now lives in the BI semantic layer.
This blog post examined the logical data model as a foundation of data for a full-fledged BI solution from both Essbase and OBIEE perspectives, and offered suggestions for extending Essbase cubes and BI-based reporting and analytics.