Analytic Advisors - 18 years of direct experience

You get solid advice and services when you select us to help with your projects or with software and hardware purchases. Business Intelligence (BI) systems have proven over the years to provide strategic intelligence and standard reporting, and Alexicon provides the guidance and assistance needed when upgrading or enhancing these systems. We have 18 years of direct experience with enterprise analytic design, large deployments and supporting data models loaded by solid Extract, Transform and Load (ETL) capability. These enterprise data models support visualizations and reporting.

The key to a strong enterprise analytic system is integrating multiple data sources in a common database so users can contrast and compare data across areas of the business with visualization and BI tools. Intelligence is gained when managers, analysts, super-users and users can slice and dice dimensions in this integrated and tuned environment against key numeric values such as Revenue, COGS and Expenses, and KPIs and metrics such as dollars, quantities, time durations, counts and computations. Key discoveries can then be “baked in” or “integrated” with existing company reporting systems for mass use, which improves company operations through shared information.
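As a minimal sketch of this slice-and-dice idea, assuming Python with pandas and a small, hypothetical integrated sales dataset:

    import pandas as pd

    # Hypothetical integrated dataset: dimensions (region, product) with
    # numeric measures (revenue, quantity)
    sales = pd.DataFrame({
        "region":   ["East", "East", "West", "West", "West"],
        "product":  ["A", "B", "A", "A", "B"],
        "revenue":  [1200.0, 800.0, 1500.0, 700.0, 900.0],
        "quantity": [10, 8, 12, 6, 9],
    })

    # Slice and dice: pivot the dimensions against the measures
    print(sales.pivot_table(index="region", columns="product",
                            values=["revenue", "quantity"], aggfunc="sum"))

    # A derived metric: revenue per unit, by region
    totals = sales.groupby("region")[["revenue", "quantity"]].sum()
    totals["revenue_per_unit"] = totals["revenue"] / totals["quantity"]
    print(totals)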

Balance your Deployment to get the maximum benefit

Below is a quote about the beginnings of BI in strategic intelligence:

“The term ‘Business Intelligence’ was originally coined by Richard Millar Devens ... in 1865. Devens used the term to describe how the banker, Sir Henry Furnese, gained profit by receiving and acting upon information about his environment, prior to his competitors.”

Source: Wikipedia

Superfast databases have become the new norm for BI systems sourcing from the Enterprise Data Warehouse (EDW). These databases combine software and hardware platform components as Massively Parallel Processing (MPP) databases, built to run large query workloads and large computations across massive datasets at high speed. If a traditional database is a one-cylinder motor, an MPP database has multiple cylinders, sometimes exceeding 100. MPPs chew through large datasets and perform computations such as “Market Basket Analysis,” where retailers work to understand the purchase behavior of customers, in minutes rather than the hours a traditional one-cylinder database would take. Hadoop and Spark (large cluster computing frameworks) now allow even bigger calculations on tens of billions of records, letting businesses understand and analyze even more extreme datasets in minutes or seconds. Results can also be fed to the MPP/EDW for integration with structured company data and existing reporting systems, or explored with newer Big Data visualization tools.
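As a minimal sketch of a market basket computation, assuming a PySpark environment (Spark's FP-Growth implementation mines items frequently bought together; the transaction data here is hypothetical):

    from pyspark.sql import SparkSession
    from pyspark.ml.fpm import FPGrowth

    spark = SparkSession.builder.appName("market-basket").getOrCreate()

    # Each row is one (hypothetical) customer basket
    baskets = spark.createDataFrame(
        [(0, ["bread", "milk"]),
         (1, ["bread", "diapers", "beer"]),
         (2, ["milk", "diapers", "beer"]),
         (3, ["bread", "milk", "diapers", "beer"])],
        ["id", "items"],
    )

    # FP-Growth finds item combinations that appear together often enough
    fp = FPGrowth(itemsCol="items", minSupport=0.5, minConfidence=0.6)
    model = fp.fit(baskets)

    model.freqItemsets.show()      # frequently co-purchased item sets
    model.associationRules.show()  # rules such as diapers -> beer
    spark.stop()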

As datasets have grown, both Big Data systems and smaller databases need business logic, summarizations and calculations handled in the database layer. Many systems instead handle these important operations in the middle layer, the BI Server (2), and/or in the frontend Visualization (1) layer. Moving to bigger data and faster systems requires new and different data model builds; trying to compensate for slow existing data models with additional software and hardware purchases yields less than acceptable performance gains. Alexicon can provide help and advice to avoid performance problems while opening up analysis capabilities as data scales, by pushing Visualization and BI Server level work down to the database layer.

Below are major parts of a BI or Analytics system:

  1. Visualizations
  2. Business Intelligence (BI) Server
  3. Database

Big Data systems do the heavy lifting in the database and not at the Visualization or BI Server level.

1. Visualizations

Traditionally, BI has focused on users being able to run standard reports with prompt interactivity, plus “ad hoc” abilities or a “self-service facility” to build reports. Dashboards evolved from these systems and gained popularity in the late 2000s, with Mobile BI, BI Search and interactive visualization tools following in recent years. In addition, statistical client tools have been used throughout all these periods and remain a powerful complement to current “Big Data” efforts, assisting with in-database computations.

2. Business Intelligence (BI) Server

Frontend visualizations, dashboards and reports typically depend on the Business Intelligence Server (2), which sits in the middle of the overall BI landscape.

The BI Server is used to aggregate reasonably sized data sets and present them to frontend visualizations for OLAP or static use. The server acts as a broker between the users and the database, and provides a place to administer users and organize content or visualization objects (typically thousands of users and thousands of visualization objects).

The key is to use the BI Server as a pass-through for the hard work performed at the database layer, giving quick data throughput to users. The database should perform aggregations and computations wherever possible. In addition, BI Server metadata should follow the same rule: it should complement, or pass through to, the metadata at the data warehouse layer.
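As an illustrative sketch of this pass-through rule, a BI Server's semantic layer can map friendly names onto warehouse columns and generate SQL so the aggregation runs in the database; all names below are hypothetical:

    # Hypothetical semantic layer: BI Server names defined once against
    # data warehouse columns, not redefined with logic of their own
    SEMANTIC_LAYER = {
        "Revenue":    "fact_sales.revenue",
        "Units Sold": "fact_sales.quantity",
        "Region":     "dim_region.region_name",
    }

    def build_query(measures, dimensions, from_clause):
        # Generate SQL so sorting and aggregation run in the database;
        # the BI Server only passes the small result set through
        cols = ", ".join(SEMANTIC_LAYER[d] for d in dimensions)
        aggs = ", ".join(f"SUM({SEMANTIC_LAYER[m]})" for m in measures)
        return f"SELECT {cols}, {aggs} FROM {from_clause} GROUP BY {cols}"

    print(build_query(["Revenue"], ["Region"],
                      "fact_sales JOIN dim_region USING (region_key)"))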

3. Database

The Database is where all the sorting, aggregations and computations should be performed. This uses the database's computing power and passes summarized results to the BI Server layer, which passes the summarized data on to the Visualization layer. One example is a simple need to aggregate records for five regions across 20 billion base records: the database produces five rows with the region names and accompanying numeric values as columns and passes those to the BI Server, which renders those five rows in the Visualization layer.
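A minimal, runnable sketch of that pushdown using Python's built-in sqlite3 (the table and data are illustrative; a production system would run the same GROUP BY against the EDW or MPP database):

    import sqlite3

    # In-memory stand-in for the warehouse; real systems would aggregate
    # billions of rows in the EDW/MPP database
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("North", 10.0), ("North", 12.5), ("South", 7.0),
         ("East", 3.3), ("West", 9.9), ("Central", 4.2)],
    )

    # Right: the database reduces all base rows to one summary row per
    # region; only these few rows ever reach the BI Server layer
    for row in conn.execute(
            "SELECT region, SUM(amount) FROM sales GROUP BY region"):
        print(row)

    # Anti-pattern: "SELECT * FROM sales" and summing in the BI Server
    # would drag every base row across the wire -- ruinous at scale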

While this is an extreme case for a few rows, these types of aggregations are critical for databases to handle, big or small. Typically the BI Server is overworked because this basic rule is not followed. While 20 billion rows would definitely break most BI Servers, many systems still run large aggregations in the BI Server, and sometimes even in the Visualization layer, so users experience slow response times. With Big Data, as in our example, this becomes painfully clear quickly: BI Servers start failing as data size and velocity increase.

Often performance problems creep in over time through incremental builds with no Big Data strategy. Alexicon provides the needed guidance to transition traditional data warehouses to exploit Big Data computing power for efficient and effective reporting through the BI Server and Visualization layers.

Gartner points out the importance of Data Quality, Governance and Stewardship (Source: Gartner, Magic Quadrant for Data Quality Tools, 7 October 2013).

The "Data Model"

Data Warehouses depend on the right data model to produce data structures for advanced visualizations. Traditionally, this has been done with heavy transforms in the Extract, Transform and Load (ETL) process, and these are still needed in most cases. As data Volume, Variety and Velocity (the 3 V's) continue to increase, more advanced ways of ingesting data into large data structures will emerge through modern ETL features and tools that remove much of the manual work in the ETL process. Trifacta, for example, is used for this purpose when loading Hadoop.

Financial reporting is an example where data models must be accurate to the penny or decimal place. Many larger or Big Data sets are used for exploration and have been called Data Lakes (raw data). EDWs typically store data, prior to ETL work, in Operational Data Stores (ODS). ETL is used to clean and transform data from the ODS (raw data) and place clean data, conformed to Master Data (dimensions) and fact tables, in the EDW. This is where precise loading of the EDW matters for corporate reporting accuracy. Data Lakes, or raw data used directly, can provide valuable research insights; however, they are not clean enough for EDW use. There is still creative tension in the software industry between financial-grade accuracy and exploration or data science work on very large data sets.
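As a minimal sketch of that ODS-to-EDW conformance step, assuming Python; the dimension, table and customer names are hypothetical:

    from decimal import Decimal

    # Master data: the conformed customer dimension (natural key -> surrogate key)
    customer_dim = {"ACME": 1, "GLOBEX": 2}

    # Raw ODS rows: inconsistent casing and amounts stored as strings
    ods_rows = [
        {"customer": " acme ", "amount": "1200.50"},
        {"customer": "Globex", "amount": "99.99"},
        {"customer": "Unknown Co", "amount": "10.00"},  # fails conformance
    ]

    fact_rows, rejects = [], []
    for row in ods_rows:
        natural_key = row["customer"].strip().upper()
        surrogate = customer_dim.get(natural_key)
        if surrogate is None:
            rejects.append(row)  # route to data quality review, not the EDW
            continue
        fact_rows.append({
            "customer_key": surrogate,
            # Decimal keeps financial amounts accurate to the penny
            "amount": Decimal(row["amount"]),
        })

    print(fact_rows)
    print("rejected:", rejects)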

We have 18 years of progressive experience building advanced data models for enterprise BI and analytic systems across many industries and functional company areas. Our data models provide thorough analysis capability and quick response times. We have found that most large data warehouses do not have the KPI and metric analysis capabilities needed in today's fast-changing and competitive world.

It is surprising what performance gains can be achieved when a company develops the right analytic systems and unleashes a large number of users with dashboards and/or ad hoc BI tools (traditional and modern BI tools). These tools provide needed summary and drill-down views for standard reporting needs and act as a discovery facility for users to gain new insights (a self-service facility). These systems also need to be backed by the best-fit user deployment model, appropriate IT governance, and shared stewardship with business sponsors and users.

Contact Us to learn more about Products and Services