Published in TDAN.com January 2003
Next-Wave Mainstream Analytics: Big Problems, Big Data
Although 80% of current analytic (aka business intelligence) deployments involve basic query and reporting (Q&R), advanced analytic solutions will increase to 30% of deployments by
2004 and 50% by 2006. Advanced analytic needs come in a variety of forms, represented by various problem-type examples (e.g., simulation, prediction). Performance woes will be driven not only by more complex functionality but also by business problems involving "big data" (e.g., subtransactional, genetic, and weather analytics). Doug Laney
Business Intelligence 2003/04: Not Your Father’s Business Decision-Making Software
Increasingly, businesses are focusing on enterprise analytics strategies to leverage mega-investments in ERP, CRM, and SCM application packages. Yet fundamental technical and philosophical differences exist between transaction processing and business decision-making IT solutions, and the two must integrate and align for a business intelligence (BI) solution to deliver maximum business value. Just as clearly, if left unarchitected and unmanaged, such vital business analytics remain a huge shadow IT cost. As a strategic business initiative, BI projects can offer "breakaway"
competitive advantage. Aaron Zornes
Kicking Kicked-Back Consultants Out
In a tight economy, consultants’ buck-making ingenuity can turn disingenuous. Formal and unadvertised relationships with vendors can skew their ability to advise clients effectively. Therefore,
enterprises issuing RFPs for professional services work should consider demanding various disclosures (particularly if the RFP contains tasks involving technology selection). Such
disclosures should include 1) financial stakes in any vendors, 2) referral or reseller relationships with any vendors, 3) participation on any vendor’s board of directors or board of advisors, and
4) relevant familial relationships with any vendors. Doug Laney
Introducing “Subtransactional” Data
Information that enterprises capture, collect, manage, and leverage has evolved from summary and low-instance data (e.g., products, customers, financials) to actual business events or transactions
(e.g., sales, production). But the Web now offers a more granular window into what happens between business events, and enterprises have begun amassing “subtransactional” data reflecting
the atomic activities that comprise or precipitate transactions. Such monitor-generated data (e.g., Web/network traffic) offers further insight into transactions and the ability to
affect them. By 2005/06, subtransactional data will make up 80% of the data that enterprises manage. Doug Laney
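To make the distinction concrete, here is a minimal, purely illustrative sketch (in Python; every field name, identifier, and value is hypothetical and not drawn from the article) contrasting one traditional business transaction with the subtransactional, monitor-generated events that precede it:

```python
# Illustrative sketch only: one completed business transaction versus the
# "subtransactional" clickstream events that comprise or precipitate it.
from datetime import datetime

# The traditional unit of analysis: a single completed business event.
transaction = {
    "order_id": "SO-10045",
    "customer_id": "C-881",
    "sku": "WIDGET-7",
    "amount": 129.95,
    "completed_at": datetime(2003, 1, 14, 10, 42),
}

# The subtransactional trail: atomic Web/network activities captured by monitors,
# arriving at far higher volume than the transactions they lead to.
subtransactional_events = [
    {"session": "S-3321", "action": "search",       "detail": "widgets",  "ts": datetime(2003, 1, 14, 10, 31)},
    {"session": "S-3321", "action": "view_product", "detail": "WIDGET-7", "ts": datetime(2003, 1, 14, 10, 33)},
    {"session": "S-3321", "action": "view_product", "detail": "WIDGET-9", "ts": datetime(2003, 1, 14, 10, 35)},
    {"session": "S-3321", "action": "add_to_cart",  "detail": "WIDGET-7", "ts": datetime(2003, 1, 14, 10, 38)},
    {"session": "S-3321", "action": "checkout",     "detail": "SO-10045", "ts": datetime(2003, 1, 14, 10, 42)},
]

# A simple ratio shows why volumes balloon: several monitored activities per transaction.
print(f"{len(subtransactional_events)} subtransactional events behind 1 transaction")
```

Even in this toy case, several monitored activities stand behind a single transaction, which is why subtransactional capture drives such severe data volumes.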
Preparing for the Decision Disconnect
Corporate business decisions are made based on the current knowledge of the particular decision maker. Business intelligence tools are supposed to help support these decisions; however, in many
cases, the varying levels of aggregation and access to historical and strategic data can lead to conflicting decisions by different decision makers. For example, a micro-analytic
decision to reorder a low-running stock item may be in direct conflict with a macro-analytic decision to discontinue selling that particular item. Companies should reduce the risk of wrong
decisions by establishing an information architecture that includes strategic directions and processes to double-check high-impact decisions. Andreas Bitterer
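The reorder-versus-discontinue example can be made concrete with a small, hypothetical sketch (Python; the item, thresholds, and rules below are invented for illustration) showing how two rules at different levels of aggregation reach opposing conclusions about the same item:

```python
# Hypothetical sketch: decision rules at different aggregation levels conflict.

item = {
    "sku": "WIDGET-9",
    "on_hand": 12,
    "reorder_point": 25,            # operational (micro) threshold
    "trailing_12mo_margin": -0.04,  # strategic (macro) profitability view
}

def micro_decision(item):
    """Operational rule: reorder whenever stock falls below the reorder point."""
    return "REORDER" if item["on_hand"] < item["reorder_point"] else "HOLD"

def macro_decision(item):
    """Strategic rule: discontinue items with a negative trailing margin."""
    return "DISCONTINUE" if item["trailing_12mo_margin"] < 0 else "KEEP"

print("Buyer's decision:   ", micro_decision(item))   # REORDER
print("Planner's decision: ", macro_decision(item))   # DISCONTINUE
# Without a shared information architecture and a cross-check process for
# high-impact decisions, both actions could be taken simultaneously.
```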
SAP Business Information Warehouse “Means Business” But Needs Tuning
The SAP R/3 enterprise resource planning (ERP) solution captures a tremendous amount of information that is often difficult to analyze. SAP's Business Information Warehouse (BW) is a powerful set of enterprise analytic capabilities for accessing this information. The IT organization must optimize this investment via a solid understanding of the technical infrastructure pain points. As other enterprises' "best practice" experiences indicate, BW requires fine-tuning; doing so will enable businesses to leverage their SAP R/3 investments to achieve maximum value.
Aaron Zornes
Cranking Up Analytic Performance Incrementally Can Break the Bank
The current dour economy has most enterprises seeking to squeeze performance out of existing analytic platforms, gravitating toward generic, traditional, and ubiquitous platforms and applications.
Most are also averse to the risk of advanced analytic solutions that imply "changing the way we operate." However, this cost-saving approach of standardizing on infrastructure can result in large capital outlays for incremental traditional data processing technology that is not optimized for analytics (i.e., mainstream hardware and RDBMSs suited to transaction processing). Doug Laney
Agnosticism Toward Analytic Acceleration Alternatives
As user and data volumes increase, enterprises tend to polarize around how to improve or maintain the performance of analytic applications. That is, in lieu of specialized software
solutions (e.g., analytic databases, hierarchical storage management) designed to accelerate analytic performance, they either forgo functionality or information depth/breadth, or they load up on hardware. Doug Laney
Enterprise Applications’ Data Appetite Causes Information Constipation
Million-dollar investments in ERP, CRM, and SCM largely remain frustrated by an inability to generate aggregated, timely information out of these “data jailhouses.” Most painful
is that highly coveted 360-degree customer views are blocked by a lack of integrated data across legacy systems and discrete front-office applications. Moreover, the pace of business increasingly
dictates a unified, real-time view of information across systems. During 2003/04, IT and business processes will become even more codependent, with the "pragmatic" real-time enterprise becoming first a
reality and then a necessity. Aaron Zornes
Divergent Evolutionary Paths for Analytic and Operational Infrastructures
By 2005/06, severe data volumes from the capture of subtransactional business events will force enterprises to reconsider whether exhaustive aggregate analysis is worth its performance cost, and will lead them to accept statistical sampling for strategy derivation and operational quality assurance. By 2007/08, 75% of enterprises will maintain an analytic infrastructure stack
entirely distinct from the operational/transaction technology set. Doug Laney
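As a rough illustration of the sampling trade-off described above (a sketch over invented data, not the article's methodology), a small random sample can estimate an aggregate within a tolerance that is often acceptable for strategy derivation:

```python
# Illustrative sketch: estimate an aggregate from a random sample rather than
# scanning every subtransactional record. All data here is synthetic.
import random

random.seed(42)

# Stand-in for a large subtransactional event store: per-event revenue values.
full_population = [random.uniform(0, 5) for _ in range(1_000_000)]

sample = random.sample(full_population, 10_000)  # a 1% sample

full_avg = sum(full_population) / len(full_population)
sample_avg = sum(sample) / len(sample)

print(f"Full-scan average: {full_avg:.4f}")
print(f"Sampled average:   {sample_avg:.4f}")
print(f"Estimation error:  {abs(full_avg - sample_avg) / full_avg:.2%}")
```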
Data Cleansing Leaders Eating Challengers’ Dust
A complete data quality suite offers the ability to cleanse, identify/match, standardize, profile, and enrich data. Challengers in this market have been expanding their
capabilities while leaders have remained somewhat complacent. For instance, DataFlux (SAS) and Group 1 now offer real-time business/consumer intelligence augmentation, and Vality now integrates
data profiling as part of Ascential Software’s integrated platform. However, leaders Trillium Software and Firstlogic have incorporated little to no enrichment or profiling capabilities. Doug
Laney
Avoiding Mergers and Accusations Due to Information Mismatch
Although the M&A climate has moderated in the past year, enterprises still tend to focus primarily on financial, product, and personnel alignment during M&A due diligence. However, what
more often affects M&A success is the ability of the involved enterprises to integrate their information assets. Yet most do not have the tools (e.g., data profiling) or methods (e.g.,
information auditing) to determine the impact of differences in information semantics and structure. This results in the enterprises continuing to operate independently and
uncooperatively for an extended time, and M&As failing to achieve expected economies of scale. Doug Laney
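The kind of mismatch that data profiling surfaces can be sketched with a deliberately simple, hypothetical example (Python; both customer extracts and their field names are invented): profiling the "same" attribute in two merging companies' systems exposes differences in codes and formats before integration begins.

```python
# Hypothetical illustration: a rudimentary profiling pass over one customer
# attribute in two merging companies' systems.

company_a_customers = [
    {"cust_id": 1001, "status": "A", "postal": "60601"},
    {"cust_id": 1002, "status": "I", "postal": "60614"},
]
company_b_customers = [
    {"customer_no": "B-77", "status": "Active",   "postal": "60601-3120"},
    {"customer_no": "B-78", "status": "Inactive", "postal": "60614-1200"},
]

def profile(records, field):
    """Report the distinct values and value lengths observed for one field."""
    values = [r[field] for r in records]
    return {"distinct_values": sorted(set(values)),
            "lengths": sorted({len(str(v)) for v in values})}

for field in ("status", "postal"):
    print(field, "A:", profile(company_a_customers, field))
    print(field, "B:", profile(company_b_customers, field))
# Same business concept, different codes and formats: exactly the mismatch that
# data profiling and information auditing are meant to surface before the merger.
```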
Assimilating Analytic Output Goes Beyond Strategic Decision Making
Through 2003/04, we believe most analytic technology buyers' behavior will be based on their perception of analytics as a strategic solution rather than as something business-enabling via tactical or operational performance enhancement. Still, we will see early signs of emergent interest in automating business process performance through direct assimilation of analytic output. Doug Laney
Accelerating Analytics
Throwing incremental hardware and DBMS software at analytic performance problems may be no more than an expensive stop-gap solution – albeit one that ensures consistency with existing IT standards.
Enterprises should also consider infrastructure components designed for high-order analytics or utilities to optimize environments dynamically. When internal resources and skills are
limited, hosted solutions should be adopted. Doug Laney
Do You Know Where Your Software Licenses Are?
Global 3000 organizations have purchased thousands of software licenses over time, with most of them untracked and impossible to locate. In addition, many licenses that are tied to a particular
hardware serial number end up as shelfware because they get lost during hardware replacement, adding significant cost through the repurchase of software the organization already owns. Companies must closely
track hardware-bound software licenses to negotiate ways to move software from one system to another. Also, if software budgeting cannot be centralized, the purchasing function should
maintain a license database to obtain economies of scale and a clear understanding of which licenses are deployed and where they are. Andreas Bitterer
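A license database need not be elaborate to be useful. The following is a minimal, hypothetical sketch (Python; the registry schema, products, and costs are invented) of tracking which hardware serial number each license is bound to, so entitlements surface before a server is retired rather than after the software has been repurchased:

```python
# Minimal sketch of a hypothetical license registry: each entry records the
# hardware serial number a license is bound to, if any, and its cost.

licenses = [
    {"license_id": "L-0001", "product": "DBMS Enterprise", "bound_to_serial": "HW-9917", "cost": 48000},
    {"license_id": "L-0002", "product": "ETL Suite",       "bound_to_serial": "HW-9917", "cost": 30000},
    {"license_id": "L-0003", "product": "BI Server",       "bound_to_serial": None,      "cost": 22000},
]

def licenses_at_risk(registry, retiring_serial):
    """List licenses bound to a server that is about to be decommissioned."""
    return [lic for lic in registry if lic["bound_to_serial"] == retiring_serial]

at_risk = licenses_at_risk(licenses, "HW-9917")
print(f"{len(at_risk)} licenses worth ${sum(lic['cost'] for lic in at_risk):,} "
      "must be transferred or renegotiated before HW-9917 is retired")
```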
Enterprise Information Integration: “Clash of the Titans”
In addition to customer data integration (CDI) solutions, mega-application vendors are planning to extend data synchronization infrastructure to support employee and
supplier/partner master data as well – enterprise information integration (EII). Data/process model primacy, system-of-record dominance, multi-level integration points, and extreme scalability will
be crucial to such EII initiatives during 2003/04. Aaron Zornes
Enterprise Information Integration: System-of-Record Hegemony
A “single point of truth” is desired, if not legally required, for many application areas. Furthermore, to reduce processing costs and service/sales/marketing errors, it is necessary to have a
single system act as the “system of record.” Systems such as DWL and Siebel provide a master console capability to manage and control “master-level” access. Unfortunately, consolidating
master data in a single database can also lead to “hot spot” bottlenecks unless the DBMS optimizer technology specializes in real-time performance for certain reads and writes (good
examples are HP’s NonStop SQL and IBM’s Informix). Aaron Zornes
Care and Feeding of ERP/CRM “Adolescent” BI Solutions
Most enterprises face a challenge when provisioning enterprise analytics, given the multi-vendor enterprise application environment (e.g., PeopleSoft HR and Siebel sales automation) typical in most
organizations. A major challenge arises because each of these mega-vendors acts as if it is the ordained “center of gravity” for information (e.g., customer information) and
provides integrated free or low-cost business intelligence (BI) capabilities. SAP R/3 and PeopleSoft 8.0 users are fortunate because the BI offerings from these mega-vendors provide solid
functionality and integration. Aaron Zornes
Used by permission of Doug Laney, META Group, Inc.
Copyright 2002 © META Group, Inc.