Category Archives: Case Study

Business Intelligence TCO

Interestingly, despite requiring fewer full-time equivalent employees (FTEs) to support BI deployments, the Best-in-Class are capable of completing BI projects, from start to finish, both on budget and within expected time frames. Additionally, they are delivering BI capabilities to more enterprise users than their counterparts.

This report investigates the key factors that organizations consider important in controlling the total cost of business intelligence implementations. Thirty-seven (37) organizations (20 percent) were found to be Best-in-Class at managing the total cost of ownership (TCO) of their business intelligence solutions. Thirty-one percent of the 26 Microsoft (Nasdaq: MSFT) BI users that took part in the survey achieved Best-in-Class status — a higher proportion of its installed base than any other major BI vendor. Figure 1 highlights the number of survey respondents that reported using each vendor’s software, together with the percentage of that vendor’s customers that achieved Best-in-Class.
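To make the TCO comparison concrete, here is a minimal back-of-the-envelope sketch of how such a cost model might look. The `bi_tco` function, the cost categories, and every figure below are illustrative assumptions, not numbers from the Aberdeen report; the point is only that fewer support FTEs dominate the recurring side of the equation.

```python
# Minimal back-of-the-envelope BI TCO model. All figures and cost
# categories are illustrative assumptions, not Aberdeen survey data.

def bi_tco(licenses, hardware, fte_count, fte_cost, years):
    """One-time costs plus recurring support-staff cost over the period."""
    one_time = licenses + hardware
    recurring = fte_count * fte_cost * years  # support FTEs over the period
    return one_time + recurring

# The Best-in-Class pattern: the same deployment supported by fewer FTEs.
best_in_class = bi_tco(licenses=250_000, hardware=100_000,
                       fte_count=2, fte_cost=90_000, years=3)
laggard = bi_tco(licenses=250_000, hardware=100_000,
                 fte_count=5, fte_cost=90_000, years=3)
print(best_in_class, laggard)  # 890000 1700000
```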

Interesting article on CRM Buyer

The Petabyte BI World – Wired

Sensors everywhere. Infinite storage. Clouds of processors. Our ability to capture, warehouse, and understand massive amounts of data is changing science, medicine, business, and technology. As our collection of facts and figures grows, so will the opportunity to find answers to fundamental questions. Because in the era of big data, more isn’t just more. More is different.

This month’s Wired magazine covers one of the scientific community’s most pressing concerns: the uncontrollable growth of data. Data is growing in so many directions that it is nearly killing off theory, as everything becomes more and more data-driven.

There is a series of articles, ranging from what data miners are digging into today, to elaborate algorithms that predict air ticket prices, to how we can monitor epidemics hour by hour.

Whether you are a BI enthusiast or not, this month’s Wired cover story will challenge all your predictions about science and technology, even if you have a petabyte of data to support them. Read it, like, right now!

Patient Satisfaction Enhanced With BI

Some progress on healthcare BI apps. Read more at dBusiness News.

In the newly released benchmark report “Business Intelligence in Healthcare: Have Providers Found a Cure?,” Aberdeen Group, a Harte-Hanks Company, found that Best-in-Class organizations achieved a 15% increase in patient satisfaction scores through the use of Business Intelligence (BI) and analytical tools. This study collected data from nearly 100 healthcare providers and found that these organizations are increasingly deploying BI tools in the hospital in order to combat the challenges of rising healthcare costs and the pressing need to enhance patient care.

Prior Aberdeen research revealed that healthcare organizations have been hesitant to deploy analytical tools, lagging behind industry norms in both adoption and maturity of BI implementations. The challenge many hospitals face is making sense of a tangled web of disparate back-end data sources, and showing a lucid connection between analytical capability and enhanced quality of care is often a complicated task. Through the use of BI and analytical tools, healthcare organizations have been able to leverage financial and clinical data to better manage patient flow, streamline their operations, and deliver an elevated standard of patient care. Best-in-Class organizations have achieved these performance improvements through an efficient combination of organizational capability and technology enablers such as Hospital Information Systems (HIS). Drawing on a solid foundation of organizational capability, the Best-in-Class were able to drive an 11% reduction in overtime incurred, in stark contrast to all other organizations, which experienced a 7% increase in overtime incurred.

How Oil Companies Use BI To Maximize Profits

Given the frantic rise in oil prices, here is some insight into how oil companies use Business Intelligence to maximize profits. A great, must-read piece.

Every Wednesday morning, the shouts and hand gestures that make the Nymex trading floor in New York frantic begin to calm. Petroleum traders are waiting for the release of data from the U.S. Energy Information Administration (EIA) on the country’s inventories of crude oil and gasoline, as well as world crude prices.

At 10:30 a.m., the EIA’s website sees a storm of activity: 1,000 page views per second for 15 seconds, says Charlie Riner, a lead analyst for the site. Oil companies, commodities traders, analyst firms, and government agencies in the United States and other countries have written bots to collect the data. Then traffic ebbs.
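In spirit, one of those data-collection bots might look something like the minimal polling sketch below. The `REPORT_URL`, the retry counts, and the release-time handling are placeholder assumptions for illustration, not details from the article or the actual EIA feed.

```python
# Sketch of a data-release polling bot, in the spirit of the article.
# The URL is a placeholder, not the real EIA endpoint.
import datetime
import time

import requests  # third-party: pip install requests

REPORT_URL = "https://example.gov/petroleum/weekly-status"  # hypothetical
RELEASE = datetime.time(10, 30)  # 10:30 a.m. release

def wait_for_release():
    """Sleep until the scheduled release time, then return."""
    now = datetime.datetime.now()
    target = datetime.datetime.combine(now.date(), RELEASE)
    if target > now:
        time.sleep((target - now).total_seconds())

def fetch_inventory_report(retries=30, delay=1.0):
    """Poll until the new report is up, pausing between attempts."""
    for _ in range(retries):
        resp = requests.get(REPORT_URL, timeout=5)
        if resp.status_code == 200:
            return resp.text  # parse crude/gasoline inventories downstream
        time.sleep(delay)
    raise RuntimeError("report not published within the polling window")

wait_for_release()
report = fetch_inventory_report()
```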

In the oil and gas business, you are what you own. The amount of crude waiting to be refined, or the already-processed liquid in storage tanks ready to be sold and delivered, represents much of a company’s value at a given moment. As a refiner, Valero buys barrels of oil to heat and pressurize into other products, such as diesel fuel, asphalt and lubricants. The $95 billion downstream company owns 17 refineries that together can produce 3.1 million barrels of product per day.

But Valero doesn’t sell that much in a given day, so it must store finished goods until they’re ready to be shipped to customers. The company tracks its own inventory movements the way a first-time mother studies her infant. How much of which products did we sell this morning? How about now? And now?

Market analysts run inventory reports “a few hundred times a day,” says Kirk Hewitt, vice president of accounting processing optimization. As the cost of crude fluctuates during trading hours, Valero sales and marketing staff want frequent updates so they can sell products at the most profitable price and buy crude to feed their refineries at the best price.

“We’re dealing with a commodity whose price changes every second,” Hewitt explains. “So our margins change every minute. Our costs change every minute.”
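One way to see why a per-second crude price matters to a refiner’s margin is a simple crack-spread calculation. The 3-2-1 ratio is a common industry rule of thumb for refining margins; the `crack_spread_321` function and the prices below are illustrative assumptions, not Valero’s actual figures or method.

```python
# Simple 3-2-1 crack spread: a common rough proxy for refining margin
# (3 barrels of crude -> 2 barrels of gasoline + 1 barrel of distillate).
# All prices are illustrative, not actual market data.

def crack_spread_321(crude, gasoline, distillate):
    """Per-barrel refining margin under the 3-2-1 assumption."""
    revenue = 2 * gasoline + 1 * distillate
    cost = 3 * crude
    return (revenue - cost) / 3

# Crude ticks up $0.50 between two inventory-report runs:
print(crack_spread_321(crude=100.00, gasoline=115.0, distillate=118.0))  # 16.0
print(crack_spread_321(crude=100.50, gasoline=115.0, distillate=118.0))  # 15.5
```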

Reduce Business Intelligence Costs – The Tata Nano Way

Recently Tata Motors, an India-based auto company, launched the world’s cheapest car, the Nano. The car is a four-door, five-seat hatch powered by a 30 HP Bosch 624 cc four-stroke engine, capable of 65 miles an hour, with a small trunk big enough for a duffel bag. The Nano, which costs just $2,500, will change the face of not only the Indian car market but the global auto industry.

It is a timely example for other industries, such as IT, at a critical moment when companies worldwide are straining to find ways to stay competitive in declining market conditions. This is a very interesting innovation, and these other industries should learn from it how to give their customers the best value at an unbelievable price.

The U.S. economy is slowing after years of exhilarating growth, and the IT industry faces a slowdown driven by economic worries triggered by the subprime mortgage crisis. In such a tight economy, it is crucial to innovate and find different ways to stay competitive. The IT industry needs to give its customers a dream-come-true experience, like the one the Tata Nano has given its customers, and create more offerings that customers can afford in weak market conditions. The IT industry has to strive to deliver the best value at low cost if it wants to survive in this economy.

What is the magic that could drastically bring down the cost of BI solutions without compromising quality? In this article, we will look at the levers Tata used to reduce the cost of its car and compare them with possible cost-reduction levers in business intelligence implementations.

DM Review has a great article by Shailesh Kosambia on reducing BI costs, using the case study of the Tata Nano, the sub-$2,500 car that grabbed the limelight a couple of months ago.

It’s certainly a great read for BI managers and project sponsors. Take a spin!

Deploying the Integrated Customer Database

An excellent case study by Andres Perez on how a company tried to deploy a single integrated customer database, and the practical challenges it faced, from financing to ROI questions. A must-read.

The demand for integrated information has created a vendor response that has spawned a market for what many call customer data integration (CDI) or master data management (MDM). These approaches are characterized in many ways; however, they are typically presented as a “federation” or “consolidation” of disparate databases and applications to present an “integrated” or “unified” view of the customer, product, supplier, etc. The vendors offering customer relationship management (CRM) tools, CDI, or MDM capabilities usually focus on facilitating and accelerating data movement from one or more databases or files to another using extract, transform and load (ETL), messaging (message queues), and other capabilities.

How are these “solutions” meeting customers’ expectations? In a previous article, I mentioned that data movement increases costs (it adds complexity to the information management environment), introduces information float or delays (whether batch or messaging), reduces semantic value (much semantic value is cast in the context of the existing applications), and significantly increases the opportunity for introducing information defects. Customers are realizing that these “solutions” focus on attacking the symptoms (e.g., moving data around faster) instead of the root cause (e.g., keeping the information integrated in one place in the first place).
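Perez’s point about data movement multiplying the opportunities for defects shows up even in a toy consolidation. The record layouts, the `normalize_key` rule, and the `consolidate` function below are invented for illustration, not taken from his article; each transform step is a place where meaning can silently be lost.

```python
# Toy ETL-style customer consolidation. Every transform step is a
# place where defects can creep in, which is the article's point about
# moving data around versus keeping it integrated at the source.
# Record layouts and the match key are invented for illustration.

crm_records = [{"cust_id": "A17", "name": "ACME Corp", "phone": "555-0101"}]
billing_records = [{"account": "a17", "name": "Acme Corporation", "city": "Austin"}]

def normalize_key(raw):
    """Transform step: case-fold the key so the two systems can join."""
    return raw.strip().upper()

def consolidate(crm, billing):
    """Load step: merge per-customer views; later sources win on conflict."""
    unified = {}
    for rec in crm:
        unified[normalize_key(rec["cust_id"])] = dict(rec)
    for rec in billing:
        key = normalize_key(rec["account"])
        # The silent overwrite here ("Acme Corporation" replaces
        # "ACME Corp") is exactly the kind of semantic loss the
        # article warns about when data is moved rather than integrated.
        unified.setdefault(key, {}).update(rec)
    return unified

print(consolidate(crm_records, billing_records))
```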