Monthly Archives: April 2009

Microsoft Unveils Apps for Crime-Fighting Data Mining

Once again, software is fighting crime. Microsoft unveiled a suite of tools and initiatives for law-enforcement groups “specifically designed to improve public security and safety,” the company said.
It’s also the latest example of law enforcement officials arming themselves with better technology to help fight crime. The FBI, for instance, said that new database and data-sharing efforts have resulted in solving a number of difficult highway serial killings.

Gathering that data is key. That’s why Microsoft this week said it is giving a free tool to INTERPOL called the Computer Online Forensic Evidence Extractor (COFEE), an application that “uses common digital forensics tools to help officers at the scene of the crime.”

The company is working on a mobile version for future release, Richard Domingues Boscovich, senior attorney for Microsoft’s Internet security program, told InternetNews.com in an e-mail.

A larger tool set for large-scale crimes is Microsoft Intelligence Framework, which is aimed at helping intelligence and law enforcement agencies coordinate information to detect and prevent terrorism, and to solve organized and major crime cases. The framework offers tools for storing and analyzing evidence and information across a variety of sources.

From EarthWeb article.

Rising Tide in the Data Warehouse vs. Data Mart Debate

Is building an enterprise data warehouse (EDW) the best path to business intelligence (BI)? It’s a perennially vexing question that — thanks to a couple of recent trends in BI and data warehousing (DW) — has taken on new life.

The value of the full-fledged EDW seems unassailable. Over the last half-decade, however, some of the biggest EDW champions have moderated their stances: they now acknowledge that alternatives exist and, under certain conditions, even admit those alternatives are useful. The result is that although the EDW is still seen as the Holy Grail of data warehousing, departmental (and even enterprise) data marts are now countenanced as well.

Active EDW giant Teradata Inc. is the foremost case in point, but other players are staking out similar ground, including relative newcomer Hewlett-Packard Co. (HP), which entered the high-end DW segment through its acquisition of Knightsbridge Solutions and markets Neoview, a DW appliance-like offering. (In addition to Neoview, HP also partners with both Microsoft Corp. and Oracle Corp. to market appliances in the 1 to 32 TB range.)

Interesting debate on TDWI.

VMware’s vSphere 4, OS for the Cloud


The cloud computing space is heating up. VMware’s latest major update to its existing ESX server will be an addition to its portfolio of cloud computing software.

VMware on April 21 launched vSphere 4, a major update to its ESX Server hypervisor, declaring it to be the first operating system specifically engineered for cloud computing. It is the first major upgrade to the product since 2006.

vSphere 4 amounts to a rebuild of VMware’s core virtualization platform. Fundamentally, it combines virtual resources in the data center into one centrally managed pool of computing power. It will be made available in the second quarter of 2009, the company said.

vSphere 4’s second purpose is to facilitate delivery of IT infrastructure as a service to enterprises, so IT departments can build their own private cloud systems to provide business services internally for the company and for trusted partners, supply chain participants and other business associates.

From eWeek’s article.

Small Business Intelligence Summit 2009

The Small Business Intelligence Summit (SBIS) will kick off the first of a nationwide series of events in Dallas, Texas on Thursday, May 14, with intensive sessions to help small businesses succeed in the current challenging economy. The events will allow existing and developing companies to connect and collaborate on solutions that support their business in today’s environment and well into the future.

The events will feature expert speakers, including sessions with:

– Chuck Wilsker, president and CEO of the Telework Coalition and co-author of numerous articles, including “Unleashing the Hidden Productivity of Your Small Business. Unified Communications: A Key Component of Your Telework Program”
– Bob Burg, corporate speaker and author of “Endless Referrals: Network Your Everyday Contacts Into Sales”
– Christopher Justice, CEO of Sparksight, a small-business marketing agency
– Small and mid-size business experts from Avaya, a global leader in communications technology.

From the Press Release. More from the official site – SBIS09.

Clearing the Air on Cloud Computing!

From New York Times BITS Blog, a post on McKinsey’s recent study on Cloud Computing.

The McKinsey study, “Clearing the Air on Cloud Computing,” concludes that outsourcing a typical corporate data center to a cloud service would more than double the cost. The study uses Amazon.com’s Web service offering as the price of outsourced cloud computing, since its service is the best known and it publishes its costs. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center.
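Those two figures can be sanity-checked with simple arithmetic. The sketch below just divides the per-unit costs quoted in the study; the dollar figures are McKinsey’s, the code is not:

```python
# Sanity-check the McKinsey cost comparison cited above.
cloud_cost_per_unit = 366.0       # $/month per unit of computing output (cloud)
datacenter_cost_per_unit = 150.0  # $/month per unit (conventional data center)

ratio = cloud_cost_per_unit / datacenter_cost_per_unit
print(f"Cloud is {ratio:.2f}x the cost of owning the hardware")  # 2.44x
```

A ratio of roughly 2.4 is indeed “more than double,” matching the study’s headline claim.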

“The industry has assumed the financial benefits of cloud computing and, in our view, that’s a faulty assumption,” said Will Forrest, a principal at McKinsey, who led the study.

Owning the hardware, McKinsey states, is actually cost-effective for most corporations when the depreciation write-offs for tax purposes are included. And the labor savings from moving to the cloud model have been greatly exaggerated, Mr. Forrest says. The care and feeding of a company’s software, regardless of where it’s hosted, and providing help to users both remain labor-intensive endeavors.

Clouds, Mr. Forrest notes, can make a lot of sense for small and medium-sized companies, typically with revenue of $500 million or less.

Next Generation Healthcare Analytics

Over at The Health Care Blog, Deb Bradley, Vice President, Client Solutions at Verisk Health in Waltham, Massachusetts, writes about some examples of next generation healthcare analytics. Most of us who do analytics engineering as part of our day jobs would agree that healthcare is one area where analytics should grow vastly. There is a lot of data that can be intelligently massaged to answer some of the most challenging health-related questions.

Medical claims, pharmacy claims, lab values, HRAs, genetic markers, biometrics – the abundance of data is having an immediate impact on how analytics shape healthcare. Next generation analytics are bringing attention to health and wellness rather than disease-specific guidelines, and generating novel approaches to value-based medicine and care management.

Traditionally, analytics, such as predictive modeling, have been used to identify individuals for chronic care management and to set rates. New predictive models, however, include financial and clinical algorithms, which allow healthcare organizations to implement advanced ways to identify, manage and measure risk across and within a population.
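As an illustration of what such a blended model might look like, a risk score can combine clinical and financial signals in one number. The weights and field names below are invented for the example, not taken from any real clinical model:

```python
def risk_score(member):
    """Toy risk score blending clinical and financial signals.
    Weights are invented for illustration; real models are fit
    to claims history, lab values, and cost data."""
    score = 0.0
    score += 2.0 * member.get("chronic_conditions", 0)        # clinical signal
    score += 1.5 * member.get("er_visits_last_year", 0)       # utilization signal
    score += member.get("claims_cost_last_year", 0) / 10000   # financial signal
    return score

member = {"chronic_conditions": 2,
          "er_visits_last_year": 1,
          "claims_cost_last_year": 25000}
print(risk_score(member))  # 8.0
```

Ranking a population by such a score is one way an organization could identify members for care management across the whole population rather than per disease.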

SPSS Rebrands Its Analytical Offerings

The new version of the SPSS modeling product — the erstwhile Clementine — is now known as PASW Modeler 13; its text analysis product (formerly Text Mining for Clementine) is now PASW Text Analytics 13. SPSS says that, over the course of the year, the rest of the SPSS product line will update under the PASW umbrella — including Statistics and Data Collection.

David Vergara, director of product marketing for SPSS, explains that the change was intended to help customers and prospects understand what the products do and how each offering fits within the broader portfolio.

Aside from the name change, the new versions of SPSS products focus on usability — and not just for data experts. Wettemann says that SPSS has “recognized that moving beyond the data analyst audience is where you get the real power.” PASW Modeler 13 features a drag-and-drop interface and functionality that will appeal to business users. Two notable additions are a “comments” tool, which lets users flag notes within the software, and automated data preparation, which mitigates human error and avoids common data quality issues.
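To make “automated data preparation” concrete, here is a minimal sketch of the idea: filling missing values with a column median instead of relying on manual, error-prone cleanup. This illustrates the concept only; it is not PASW Modeler’s actual algorithm:

```python
from statistics import median

def auto_prepare(values):
    """Toy automated data preparation: replace missing numeric values
    with the column median, a common default that removes a manual,
    error-prone cleanup step."""
    present = [v for v in values if v is not None]
    fill = median(present)
    return [fill if v is None else v for v in values]

print(auto_prepare([3, None, 5, 7, None]))  # [3, 5, 5, 7, 5]
```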

From Destination CRM.

Cross-Site Scripting takes over Twitter


Cross-site scripting is becoming a common security vulnerability in online services, and Twitter, which allows 140 characters per tweet, was no exception.

The worms exploit a common vulnerability in Web applications called cross-site scripting, which allows someone to inject code into Web pages others are viewing.

In this instance, Twitter users who clicked on the name or image of anyone sending the worm messages would get infected and then send the message on to all that person’s followers. Anyone viewing an infected user’s profile would also get infected and pass the worm on.

“What we’re seeing was it was possible for codes to be embedded, small pieces of JavaScript, into people’s profiles. This should be fairly elemental to filter out,” one security expert said.
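Filtering this class of attack is indeed straightforward in principle: any user-supplied text must be HTML-escaped before it is rendered into a page. A minimal Python sketch of the idea (not Twitter’s actual fix):

```python
import html

def render_profile_field(user_input: str) -> str:
    """Escape user-supplied text before embedding it in a page,
    so an injected <script> tag is rendered as inert text instead
    of being executed by viewers' browsers."""
    return html.escape(user_input)

payload = '<script>document.write("owned")</script>'
print(render_profile_field(payload))
# &lt;script&gt;document.write(&quot;owned&quot;)&lt;/script&gt;
```

Escaping on output is the standard defense; the worm worked precisely because profile fields reached other users’ pages unescaped.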

While the attacks were mostly a nuisance, they could have been dangerous if spyware or other malware had been downloaded onto Twitter users’ computers, Cluley said.

To avoid such JavaScript-based attacks, you can turn off JavaScript in your browser. Instructions for doing this are here. You can also use utilities such as NoScript, an open-source Firefox extension, Hayter recommended.

SAP Enhances SAP BusinessObjects Edge Solutions

SAP BusinessObjects Polestar Delivers the Value of BI to Everyone

With intuitive BI tools and increased support for customers using non-SAP applications, users gain better visibility into business information from across their organization, regardless of their skill level or IT system.

Business users within organizations of any size need simpler, more intuitive BI tools that allow them to quickly search, explore and retrieve business information.

With SAP BusinessObjects Polestar provided as part of SAP BusinessObjects Edge BI, employees can now use an easy-to-use keyword search to find information from any data source. The solution displays results as reports and easy-to-read dashboards, automatically creating visual representations of data such as charts and graphs.

SAP BusinessObjects Polestar has intuitive data exploration and visualization capabilities that let users drill down into a particular topic, like sales by region, by simply clicking on the report, data set or dashboard. No prior BI training or IT expertise is required.
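The keyword-search idea can be sketched naively: scan records from any source and return those matching every query term. Polestar’s indexing is of course far more sophisticated, and the records and fields below are invented for illustration:

```python
def keyword_search(records, query):
    """Naive keyword search across heterogeneous records: return any
    record whose fields, taken together, contain every query term
    (case-insensitive substring match)."""
    terms = query.lower().split()
    hits = []
    for rec in records:
        text = " ".join(str(v) for v in rec.values()).lower()
        if all(t in text for t in terms):
            hits.append(rec)
    return hits

data = [
    {"report": "Q1 Sales", "region": "EMEA", "revenue": 120000},
    {"report": "Q1 Costs", "region": "APAC", "revenue": 80000},
]
print(keyword_search(data, "sales emea"))  # matches the first record only
```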

From PR Newswire

The Flaws of the Classic Data Warehouse Architecture

The classic data warehouse architecture (CDWA) has served us well for the last twenty years. In fact, up to five years ago we had good reasons to use this architecture. The state of database, ETL, and reporting technology did not really allow us to develop something else. All the tools were aimed at supporting the CDWA. But the question right now is: twenty years later, is this still the right architecture? Is this the best possible architecture we can come up with, especially if we consider the new demands and requirements, and if we look at new technologies available in the market? My answer would be no! To me, we are slowly reaching the end of an era, an era where the CDWA was king. It is time for change. This article is the first in a series on the flaws of the CDWA and on an alternative architecture, one that fits the needs and wishes of most organizations for (hopefully) the next twenty years. Let’s start by describing some of the CDWA flaws.

The first flaw is related to the concept of operational business intelligence. More and more, organizations show interest in supporting operational business intelligence, which means the reports that decision makers use have to include more up-to-date data. Refreshing the source data once a day is not enough for those users. Decision makers who are close to the business processes especially need 100% up-to-date data. But how do you do this? You don’t have to be a technological wizard to understand that if data has to be copied four or five times from one data storage layer to another to get from the production databases to the reports, doing this in just a few seconds is close to impossible. We have to simplify the architecture to support operational business intelligence. The bottom line: we have to remove data storage layers and minimize the number of copy steps.
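A toy calculation shows why the number of copy steps matters for data freshness. The step counts and refresh intervals below are illustrative, not measured:

```python
def end_to_end_latency(copy_steps, minutes_per_step):
    """Worst-case staleness of report data when it must pass through
    several storage layers, each refreshed on its own batch cycle."""
    return copy_steps * minutes_per_step

# Classic architecture: ~5 copy steps (staging, ODS, EDW, mart, cube),
# each on, say, a 30-minute batch cycle.
print(end_to_end_latency(5, 30))  # 150 minutes behind the business
# Simplified architecture: a single copy step on the same cycle.
print(end_to_end_latency(1, 30))  # 30 minutes
```

Even with identical refresh technology, cutting layers cuts staleness proportionally, which is the article’s core argument for a simpler architecture.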

Great read from BEye Network. Part 1 and Part 2.