
Monday, November 21, 2011

Parsing Business Intelligence (BI)

Business intelligence (BI) is now a vital imperative of 21st century enterprise. Yet, a unified view about just what BI is and does for enterprise is only now emerging. Prof Ronald K Klimberg and Prof Virginia Miori (2010) address the challenge by parsing BI as follows:
Organizations, both corporate and academic, have been rushing to the table with their own BI groups and programs. Though these groups share a common title, BI, they do not share a common understanding of all that BI comprises. The establishment of their functions follows from the specific strengths and expertise within all of these organizations. This shared limitation was not based on a lack of inclusiveness, but merely a lack of cohesive vision. Further, consider that industry’s definition of business intelligence is by and large quite different from academia’s definition. More so, within industries and within academia, these definitions also vary. The definition of BI seems to depend heavily upon your particular perspective or training. What then is business intelligence...?

Despite the appearance of BI in both academia and industry, until now the field has lacked a clear definition. Not all aspects of BI will be exploited in every situation, but it is still important to know what the future holds. Within this structure, BI was broken down into three significant areas: business information intelligence (BII), business statistical intelligence (BSI) and business modeling intelligence (BMI). Specialists exist in all of these areas, but the importance of the intersection and unions of these areas needs to be emphasized. True intelligence results from the melding of all of these technologies and tools....

Figure 1 presents a cohesive vision of business intelligence as a melding of technologies, models, techniques and practices. The three circles of the Venn diagram each represent areas of study and application that had previously been considered quite distinct: 1. information systems and technology, 2. statistics, and 3. OR/MS [Operations Research/Management Science]. It serves to encapsulate the broadening definition of BI. With this new vision, we may now characterize BI from each of three viewpoints as: business information intelligence (BII), business statistical intelligence (BSI) and business modeling intelligence (BMI). Each of the viewpoints has particular business aspects, and academically speaking, courses that are independent of the other viewpoints. Conversely, each viewpoint can work together or utilize techniques/skills from one or possibly two of the other disciplines. For example, data mining, which requires a high level of statistical knowledge as well as the availability of necessary data, may require significant IT skills and/or knowledge. Further, if data mining analysis demands a systematic process of analysis, modeling skills may be required.

Business analytics (BA), within our framework, is classified as a combination of business statistical intelligence (BSI) and business modeling intelligence (BMI): BA = BSI + BMI. BI is the union of BA and business information intelligence (BII): BI = BII + BA, or more specifically BII + (BSI + BMI). As evidenced in the data-mining example, black and white distinctions between disciplines can quickly become gray.
Figure 1: Business intelligence/business analytics breakdown
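The union relationships in Klimberg and Miori's framework can be sketched with Python sets. Note that the skill names below are illustrative placeholders of my own choosing, not taken from the article:

```python
# Toy illustration of the framework's set relationships; the skill
# names are hypothetical stand-ins for each discipline's content.
BII = {"data warehousing", "ETL", "query tools"}          # information intelligence
BSI = {"regression", "clustering", "hypothesis testing"}  # statistical intelligence
BMI = {"optimization", "simulation", "decision analysis"} # modeling intelligence

BA = BSI | BMI   # business analytics = BSI + BMI
BI = BII | BA    # business intelligence = BII + (BSI + BMI)

# Analytics is a proper part of the broader BI umbrella.
assert BA <= BI
assert BI == BII | BSI | BMI
```

The set-union formulation also captures the "gray" overlaps: a practice like data mining draws members from more than one circle at once.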

Note that modeling is a central practice in BI. Follow the link below to read the entire article.

Read More

Source: Klimberg, R K & Miori, V (2010, October), Back in Business, INFORMS, 37(5).

Related Posts

Sunday, February 12, 2012

Self-Service Business Intelligence and Analytics Means Just That

The mantra of "self-service" is now reaching a crescendo in the business intelligence and analytics community. Jorgen Heizenberg of CapGemini acknowledges budget constraints in the current economy make self-service business analytics an enterprise imperative. However, he cautions that security monitoring by information technology (IT) departments is still required.
The current state of our economy is also impacting IT budgets. That’s a fact that nobody can deny. At the same time the need for relevant information has increased considerably. Organizations are more and more focusing on their customer and need supporting data. That is another fact. As a result IT is reconsidering its position (back to the core?) whilst the business is waiting for the much needed report or analysis. This need for faster time to information and less IT involvement has given rise to something that is often called Business or Self Service Reporting (SSR). Traditionally BI reports are created by the IT department. SSR allows business users to do this for themselves using end user oriented query and reporting tools.
Read More

Jorgen Heizenberg

From where I sit, IT's supervisory involvement in the production of business intelligence and analytics is abating, though security monitoring will continue. The commoditization of IT means that budget constraints will limit IT's capacity to manage business analytics projects directly. Moreover, demand for analytics is expanding faster than IT can meet with existing or diminishing resources. Self-service business analytics are the future, which means that IT's involvement in producing business intelligence and analytics will flag with time.

Source: Heizenberg, J (2012, January 18), Self Service Reporting Good! Traditional BI Bad?, CapGemini.

Related Posts

Friday, August 20, 2010

Tame versus Wicked Business Intelligence Problems

I recently commented on a post in a business intelligence forum, and I wanted to share that comment here for others to consider. The forum was discussing the challenges of implementing business intelligence and analytics:


One of the concerns I have with automating business intelligence is that many problems that I am confronted with from my clients are simply not "tame" (in other words, problems we understand, and in which ample data is available). Solving "tame" problems is much different than solving "wicked" problems (in which we have little understanding of the variables or conceptual framework, and in which little pertinent or valid data is available). Clearly, "tame" problems should be automated as much as possible. However, "wicked" problems are often the "writ large" of business intelligence requirements. Not all problems can or should be automated as the first step in solving the problem at hand. Also, automated business intelligence must be transparent enough that validation and reliability testing can be performed manually. Business intelligence is a vital growth area for enterprise in the 21st century, but we must still take care to do the job right in accordance with best research practices.

Tuesday, June 08, 2010

Embedded versus Embodied Decision Support

Within the realm of decision support methodologies, two very different paradigms are vying for the attention of enterprise. The first is what I call the embedded approach to decision support. The embedded approach to decision support emphasizes scientific logic and rigor, and is grounded firmly in the traditional disciplines of operations research, systems analysis, decision analysis, and risk analysis[1].

Embedded Decision Support Methodologies

The second still emerging approach is what I call the embodied approach to decision support. The embodied approach to decision support traces its roots to the information technology movement and enjoys critical acclaim for its potential for automated performance monitoring, business intelligence, and business analytics.

Embodied Decision Support Methodologies

Note that the content-validity of the embedded approach is widely accepted amongst professional researchers and analysts as a body of knowledge. The literature underlying the embedded approach is vast and rich in empirical evidence supporting the validity and reliability of its methods. This extant literature regarding the embedded approach is synonymous with the disciplines of operations research, systems analysis, decision analysis, and risk analysis.

The content-validity of the still emerging embodied approach remains in question. Researchers and analysts are still debating many of the terms used in the embodied approach, and a broad consensus regarding what exactly business intelligence and business analytics entail is not yet evident. The existing literature supporting the effectiveness of embodied methods is mostly descriptive with scant empirical evidence to support its validity and reliability as a proven decision support methodology.

The significance of the differentiation between embedded and embodied methods lies in the warranties that each provide the decision maker. An impressive quality of the embedded approach is that all the terms and concepts used are clearly defined and widely accepted by professional researchers and analysts thus enabling users to articulate universally their findings and recommendations.

In contrast, the lack of consensus regarding the validity and reliability of embodied methodologies limits the utility of what we know to be business intelligence and business analytics. Indeed, the methodological frameworks for business intelligence and business analytics are still emerging in the form of dashboard reporting systems and other untested visualization methods that some researchers argue can lead to cognitive distortions of the evidence uncovered by such methods.

Future consilience between the practitioners of embedded and embodied methods is far from complete or even certain. More empirical evidence will be needed before the embodied approach can be fully converged or enjoined within the deeper conceptual foundations of embedded methodologies. In the meantime, decision makers are advised to take precautions to ensure that embedded methodologies take the lead in verifying and confirming the findings and recommendations of embodied technologies.

[1] Note that risk analysis is not to be confused with risk management, which is a different function and discipline altogether.

Friday, February 10, 2012

Analytics: Hotter Than Ever

Timo Elliott is predicting that 2012 will be the year that analytics takes the lead as a business driver in today's economy:
The real trend this year is not the technology. It’s about helping business people make better decisions, and actually change the way companies do business. Analytics has always been about transforming business, but the recent huge changes in analytic technology have created interesting new opportunities for business innovation....

In particular, companies want better visibility about what’s going on in their market, and increased organizational agility in order to be able to deal with change fast. It’s like driving in the fog without a map – in order to survive, you should invest in better visibility, brakes, and steering to be able to spot and avoid fast-moving objects looming out of the fog.

Analytics provides these capabilities: business intelligence to peer into the road ahead, risk-management to provide fast alerts to new obstacles, and flexible financial planning systems to help swerve around them....

Many companies are going beyond "just" improving their existing analytic capabilities, using analytics in new ways to change the way they do business. Instead of analytics being something that is used to monitor and eventually improve a business process, analytics is becoming a more fundamental part of the business process itself.
Read More

Timo Elliott

Let's face it, analytics are hotter than ever, especially in today's competitive economy.

Source: Elliott, T (2012, February 10), 2012: The Year Analytics Means Business, Business Analytics.

Related Posts

Thursday, May 17, 2012

Knime for Business Intelligence

For business analysts and firms seeking to expand their skills and capabilities from localized analytics into the broader realm of business intelligence processes and solutions, check out Knime. According to Knime's website:
Knime (Konstanz Information Miner) is a user-friendly and comprehensive open-source data integration, processing, analysis, and exploration platform. From day one, Knime has been developed using rigorous software engineering practices and is used by professionals in both industry and academia in over 60 countries.
Knime delivers robust features that encompass the full spectrum of business intelligence production requirements, including tools for: a) integrating multi-source data via open database connectivity (ODBC) and real-time processes; b) diverse analytic tools for data mining such as clustering, decision trees, rule induction, neural networks, association rules, scoring, meta-analysis, and more; and c) state-of-the-art presentation tools that easily integrate with existing ad-hoc reporting, automated dashboard, and systems actuation platforms. Knime is also actively supported by third-party extensions that integrate Knime with R (Project R), Excel (Microsoft), and other widely-used integration, analytics, and presentation platforms.
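To give a feel for the integration step that Knime's database nodes automate, here is a minimal sketch of querying and aggregating source data programmatically. Python's stdlib sqlite3 stands in for a real ODBC connection, and the table and column names are hypothetical:

```python
import sqlite3

# In-memory database as a stand-in for an ODBC-accessible warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EU", 120.0), ("US", 340.0), ("EU", 80.0)])

# Aggregate in the source system before handing results to analysis nodes.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('EU', 200.0), ('US', 340.0)]
conn.close()
```

In a Knime workflow this query would be a database-reader node whose output table flows downstream to mining and visualization nodes.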


Knime is the missing application that analysts have long sought to enable self-service production of business intelligence. Anyone seeking to understand and manage the entire business intelligence production process will find Knime to be didactically useful.

Follow the link below to learn more.

Knime

Related Posts

Tuesday, May 25, 2010

High-Level Systems Components and Integration Now the Priority

“High-level” and “low-level” are terms used to describe and classify systems. High-level systems are generally more abstract than low-level systems, which tend to focus on discrete data details within the system rather than on how the system produces information as a whole.

Typical Systems Stack for Bespoke Risk Analytics Production

The graphic above depicts a typical business systems stack that includes data warehousing, integration, and analytical components from multiple vendors. Note that the data warehousing and integration systems appear as low-level components, while the analytical systems appear as high-level components. A key objective of this system is to move data into the hands of analysts on a self-service basis.

Over the past decade, systems engineers have worked diligently to install the lower-level components of their systems stacks, including the hardware and software associated with data warehousing and rudimentary integration. However, progress on the upper-level components of these systems stacks has typically lagged. As a result, the realization of the business intelligence (BI) vision in firms has generally been limited to simple performance monitoring with only marginal successes in higher-order analytics production.

The emerging shift in priority from low-level to high-level systems components, and from performance monitoring to higher-order analytics production also means shifting certain decision prerogatives away from information technology (IT) departments toward subject matter experts and analysts. The fact is that BI is not only a production process that requires systems, but also a thinking process that requires both ad hoc and post hoc analysis and testing by subject matter experts. The future of BI requires restoration of the decision support function. Moreover, analysts rather than technologists must assume greater responsibility and leadership over the overall BI effort.

While the shifting emphasis from lower to higher-level systems components brings value-adding potential in the form of higher-level analytics production, this shift also introduces risks and responsibilities that firms must consider in order to better align IT investments with expanding BI requirements. High-level systems components and integration are now the priority.

Related Posts:

Performance Monitoring versus Analytics

Business Intelligence for the Masses Comes Alive

Business Intelligence Requires Thinkers

Saturday, March 12, 2011

The Compleat Business Intelligence (BI) Analyst

According to Gert H N Laursen and Jesper Thorlund (2010), the business intelligence analyst is "a bridge builder between the company and its technical environment" (p. 134).
[Business intelligence] analysts need to master three professional competencies to be successful: business, method, and data. We can add to this certain key personal competencies: the ability to listen and to convince. These are necessary if a task is to be understood, discussed with all involved parties, and delivered in such a way that it makes a difference to business processes and thereby becomes potentially value-adding.... All in all, it sounds as if we need a superman. And that might not be far off, considering the fact that this is the analytical age. (Laursen & Thorlund, p. 101)
Global enterprise is looking for more than a few good people who can fill this standing order for expertise.


Source: Laursen & Thorlund (2010), Business Analytics for Managers: Taking Business Intelligence Beyond Reporting, Hoboken, NJ: John Wiley & Sons.

Related Posts

Saturday, August 03, 2013

Taming Big Data: The Emergence of Self-Service Business Intelligence (BI)

The infographic below, created by IBM (2013), seeks to clarify key differences between so-called "small data" and "big data." The migration of data analytics from relational databases to in-memory systems is a vital step toward self-service production of business intelligence (BI).

[Click image to expand]

The migration of data from relational databases to in-memory database systems is good news for business intelligence (BI) analysts. Said another way, the era of self-service BI production has finally arrived.

Source: Taming Big Data: Small Data vs Big Data, Huffington Post.

Related Posts

Monday, March 15, 2010

Advanced Analytics Not Information Technology

I recently fielded a forum question about the cost-creation versus value-adding capabilities of information technology (IT) and advanced (i.e., bespoke) analytics in enterprise. Here is how I responded:

Regarding the linkages between information technology (IT), advanced analytics, and value, I would gently suggest that IT is a cost center, and advanced analytics are the value-adding proposition. In other words, don't go to the IT department if you are seeking to activate value-adding analytics (though I will concede that IT does have an effective role in business intelligence [BI] production, which is very different from advanced analytics in my view).

Unfortunately, IT solution providers know full well that advanced analytics is what creates value, and so IT firms will typically "bundle" various analytic offerings with a proposed IT solution in an effort to bamboozle the client into believing that scarce IT dollars can buy both transaction management and advanced analytical services together in one "big" IT installation deal. Buyers of IT solutions should therefore beware.

What is needed today is for IT managers to yield the analytics space to subject matter experts with analytical solutions that stand separate from the data warehousing infrastructure, while seeking to reduce costs in IT by exploiting the economies of scale that IT solutions typically contribute to the cost analysis.

Again, IT is a cost center, while advanced analytics (separate from BI) are the value-adding activity.

Tuesday, February 01, 2011

Self-Service Business Intelligence in 2011

The following is an excerpt from an interview between Mark Brunelli of Search Business Analytics and James Kobielus of Forrester Research:
How do you see the world of self-service BI progressing in 2011?

Self-service BI in 2011 will become the only BI approach that the new generation of information workers will ever encounter. Fundamentally, the way it’s going for us is that everybody wants to have the prestige clients on their desktop for BI -- an in-memory client like a Tibco or a PowerPivot or QlikView. So, what we’re going to see are these in-memory clients that support essentially [light] data mining with interactive visualization. [This will allow users] to bring millions and eventually billions of rows into memory and do some really sophisticated analyses. IT very much wants to go this route since IT doesn’t want to have to build cubes any longer. They don’t want to have to build all of the integration logic if the user can be given front-end tools that they can use to build their own visualizations and to pull data from the data warehouse.
I agree with James Kobielus that the widespread deployment and acceptance of self-service (i.e., "in-memory") business intelligence technologies have arrived...
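The cube-free, in-memory analysis Kobielus describes can be approximated in a few lines of stdlib Python: load the rows into memory once, then aggregate them ad hoc rather than against a pre-built cube. The data below is synthetic and purely illustrative:

```python
import random
from collections import defaultdict

random.seed(42)

# Synthetic fact table of one million rows held entirely in memory.
rows = [(random.choice(["North", "South", "East", "West"]),
         random.uniform(10, 500))
        for _ in range(1_000_000)]

# Ad hoc group-by aggregation, the kind an in-memory BI client runs
# interactively without any pre-computed cube.
totals = defaultdict(float)
for region, revenue in rows:
    totals[region] += revenue

for region in sorted(totals):
    print(region, round(totals[region], 2))
```

The point of the sketch is architectural: once the rows are resident in memory, each new slice or roll-up is a fast pass over the data rather than a request to IT for a new cube.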

Source: Brunelli, M (2011, January 28), The Top BI Trends and Analytics Technology Predictions for 2011, Search Business Analytics.

Tuesday, March 22, 2011

Business Intelligence versus Business Analytics


What’s the difference between Business Analytics and Business Intelligence? The correct answer is: everybody has an opinion, but nobody knows, and you shouldn’t care.

Read More

Tuesday, April 06, 2010

Excel in the Future

According to Nenshad Bardoliwalla (2009) of Enterprise Irregulars, Excel (Microsoft) will sustain its lead in the end-user business intelligence (BI) market through 2010 and beyond:
Excel will continue to provide the dominant paradigm for end-user BI consumption. For Excel specifically, the number one analytic tool by far with a home on hundreds of millions of personal desktops, Microsoft has invested significantly in ensuring its continued viability as we move past its second decade of existence, and its adoption shows absolutely no sign of abating any time soon. With Excel 2010’s arrival, this includes significantly enhanced charting capabilities, a server-based mode first released in 2007 called Excel Services, being a first-class citizen in SharePoint, and the biggest disruptor, the launch of PowerPivot, an extremely fast, scalable, in-memory analytic engine that can allow Excel analysis on millions of rows of data at sub-second speeds. While many vendors have tried in vain to displace Excel from the desktops of the business user for more than two decades, none will be any closer to succeeding any time soon. Microsoft will continue to make sure of that.
Excel remains my preferred financial modeling and risk analysis platform for all the reasons cited above (although the copy of Excel on my computer has been "souped-up" for professional use). Visit my website linked elsewhere on this page to learn more.

Source: Bardoliwalla, N (2009, December 1), The Top 10 Trends for 2010 in Analytics, Business Intelligence, and Performance Management, EnterpriseIrregulars.com.

Related Posts:

Visual Basic for Applications (VBA) 7.0

Why Spreadsheets?

Tuesday, April 08, 2014

Business Intelligence and the Analytics Leader

A succinct and instructive Venn diagram describing the business intelligence space, including the central (and essential) role of the analytics leader.

[click image to enlarge]

Related Posts

Sunday, September 05, 2010

Business Intelligence = Data + Analytics

We might all find inspiration from IBM's recent advertising regarding the future of business intelligence, data, and analytics.



For more information, visit:

A Smarter Planet

Related Posts:

The Art and Science of Data Enjoined

Nature by Numbers

The Fourth Paradigm: Data-Intensive Scientific Discovery

Thursday, June 07, 2012

Business Intelligence Process Integration via Knime

The business intelligence (BI) production path integrates data access, transformation, analysis, visualization, and exploitation into a unified process. Knime (Konstanz Information Miner) is a professional open-source software package that integrates all of these functions onto a single platform. According to the Knime website:
Knime, pronounced [naim], is a modern data analytics platform that allows you to perform sophisticated statistics and data mining on your data to analyze trends and predict potential results. Its visual workbench combines data access, data transformation, initial investigation, powerful predictive analytics and visualization. Knime also provides the ability to develop reports based on your information or automate the application of new insight back into production systems.
Knime is supported by an expanding number of third-party extensions that enable interfacing with Excel (Microsoft), R (Project R), BIRT (Eclipse), WEKA (University of Waikato), and more. Expand the graphic below to see how Knime integrates various functions and processes via its discrete process simulation features.


Knime Function Nodes [click to enlarge]

Follow the link below to learn more about Knime and its powerful BI production features for enterprise.

Knime

Related Posts

Friday, September 09, 2011

Prototypal Business Intelligence Stack for Ad Hoc Self-Service Risk Analytics

Although business intelligence (BI) stacks often mix solutions from various vendors, the following is a prototypal BI stack for ad hoc self-service risk analytics:


Learn More

Related Posts

Tuesday, September 06, 2011

Business Intelligence & Risk Analytics via Excel + ModelRisk

ModelRisk 4 (Vose) is the most advanced business intelligence and risk analytics software platform ever created. ModelRisk adds critical functionality to Excel (Microsoft), including tools for simulations, object modeling, distributions, correlations, forecasting, optimization, and more.


Learn More

Wednesday, May 15, 2013

Business Intelligence (BI) versus Data Science

David Smith at Revolutions (2013) compares business intelligence (BI) with data science as follows:


Read More

The discipline of analytics is constantly evolving, or so it seems...

Source: Smith, D (2013, May 15), Statistics vs Data Science vs BI, Revolutions.

Related Posts

Sunday, November 29, 2009

ModelRisk 3.0: Best in Class Solution for Excel-Based Risk Analysis

I have been a practicing risk modeler and analyst now for over fifteen years, and during that time, I have worked and trained with a variety of popular spreadsheet-based software tools, including Crystal Ball (Oracle) and @Risk (Palisade). Each of these applications enables users of Excel (Microsoft) to incorporate simulations and optimizations into models. However, neither Crystal Ball nor @Risk offers a comprehensive software solution that combines simulation and optimization with stochastic object modeling, time-series forecasting, and multivariate correlation. ModelRisk 3.0 from Vose Software (Ghent, Belgium) combines all of these features into a single application that works seamlessly with Excel.
The tools and techniques made available in ModelRisk have been developed from Vose Consulting’s experience in assessing risk in a broad range of industries over many years, and goes far beyond the Monte Carlo simulation tools currently available. ModelRisk has been designed to make risk analysis modeling simpler and more intuitive for the novice user, and at the same time provide access to the most advanced risk modeling techniques available. (Press release, Vose Software, May 11, 2009)
The latest version of ModelRisk is the most advanced spreadsheet-based risk-modeling platform ever developed and currently stands as the best in class software solution for quantitative risk analysis, forecasting, simulation, and optimization. ModelRisk enables users to build complex risk analysis models in a fraction of the time required to develop custom-coded applications. Open database connectivity further extends the business intelligence capabilities of this integrated platform by enabling access to essentially any data warehousing system in use today.
“Good risk analysis modeling doesn’t have to be hard, but the tools just weren’t there to make it easy and intuitive. So we asked, “If we could start from the beginning, what would the ideal risk analysis tool be like?” says David Vose, Technical Director of Vose Software. “ModelRisk is the result. Users of competing spreadsheet risk analysis tools will find all the features they are familiar with in ModelRisk, but ModelRisk throws open the doors to a far richer world of risk analysis modeling. Better still, ModelRisk has many visual tools that really help the user understand what they are modeling so they can be confident in what they do, and ModelRisk costs no more than the older tools available. We also have a training program second-to-none: the people teaching our courses are risk analysts with years of real-world experience, not just software trainers.”
ModelRisk 3.0 now includes:
  • Over 100 distribution types
  • Stochastic ‘objects’ for more powerful and intuitive modeling
  • Time-series forecasting tools such as ARMA, ARCH, GARCH, and more
  • Advanced correlations via copulas
  • Distribution fitting of time-series data, including correlation structures
  • Probability measures and reporting
  • Integrated optimization using the most advanced, proven methods available
  • Multiple visual interfaces for ModelRisk functions
  • User library for organizing models, assumptions, references, simulation results, and more
  • Direct linking to external databases
  • Extreme-value modeling
  • Advanced data visualization tools
  • Expert elicitation tools
  • Mathematical tools for numerical integration, series summation, and matrix analysis
  • Comprehensive statistical analytics
  • World class help file
  • Developers’ kit for programming using ModelRisk’s technology
Currently, no other competing software package on the market offers the same comprehensive list or range of features found in ModelRisk 3.0, which is now my primary risk modeling, forecasting, and business intelligence platform. For more information, follow the link below.
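As a rough, stdlib-only sketch of the Monte Carlo simulation that spreadsheet add-ins like ModelRisk automate, consider a toy project-cost model. The three cost components, their distributions, and all parameters below are invented for illustration:

```python
import random
import statistics

random.seed(1)

def project_cost():
    """One trial of a hypothetical three-component cost model."""
    labor = random.gauss(100_000, 15_000)        # normally distributed labor
    materials = random.triangular(40_000, 90_000, 60_000)  # min, max, mode
    overrun = random.expovariate(1 / 10_000)     # long-tailed overrun risk
    return labor + materials + overrun

# Run many trials and summarize the resulting cost distribution.
trials = [project_cost() for _ in range(50_000)]
mean_cost = statistics.mean(trials)
p95 = statistics.quantiles(trials, n=20)[-1]     # 95th percentile
print(f"mean cost ~ {mean_cost:,.0f}; 95th percentile ~ {p95:,.0f}")
```

Tools like ModelRisk, Crystal Ball, and @Risk wrap this same sample-and-summarize loop in spreadsheet functions, and add the distribution fitting, correlation structures, and visual diagnostics that a hand-rolled sketch lacks.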

Learn More