The Business Intelligence Blog

Slicing Business Dicing Intelligence

Archive for the ‘Data Integration’ tag

New Data Integration Option For Amazon’s EC2 Service  

From InformationWeek’s blog post by John Foley:

Open source software company SnapLogic has introduced a version of its data integration framework that’s tuned for Amazon.com’s (NSDQ: AMZN) Elastic Compute Cloud, or EC2, Web service. It gives developers and IT departments the option of doing their data integration work in Amazon’s cloud rather than on their own servers.

Two-year-old SnapLogic’s framework consists of a design tool, metadata repository, server, and connector modules for Apache, Oracle (NSDQ: ORCL), Salesforce (NYSE: CRM), and other data sources. In May, the company released SnapLogic 2.0 as a VMware appliance. The framework is available free under the General Public License (v2) or via two subscription license options with various levels of support. InformationWeek profiled SnapLogic as our Startup Of The Week on May 31.
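
The connector-module architecture described above is a common data integration pattern: each external system sits behind a uniform interface, and the framework merges whatever the connectors yield. A minimal sketch in Python — the `Connector` class and names here are illustrative, not SnapLogic’s actual API:

```python
from abc import ABC, abstractmethod
from typing import Dict, Iterator, List

class Connector(ABC):
    """Uniform interface every source module implements (hypothetical)."""
    @abstractmethod
    def read(self) -> Iterator[Dict]:
        ...

class InMemoryConnector(Connector):
    """Stand-in for an Oracle- or Salesforce-style connector module."""
    def __init__(self, rows: List[Dict]):
        self.rows = rows

    def read(self) -> Iterator[Dict]:
        return iter(self.rows)

def integrate(connectors: List[Connector]) -> List[Dict]:
    """Merge records from every registered connector into one result set."""
    merged: List[Dict] = []
    for c in connectors:
        merged.extend(c.read())
    return merged

crm = InMemoryConnector([{"id": 1, "source": "crm"}])
erp = InMemoryConnector([{"id": 2, "source": "erp"}])
result = integrate([crm, erp])
print(result)  # two records, one from each source
```

The point of the pattern is that adding a new data source means writing one new connector, not touching the integration logic — which is why frameworks like this ship connector modules as a product.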

Written by Guru Kirthigavasan

July 9th, 2008 at 8:04 am

Informatica Positioned In Leaders Quadrant In Data Quality Tools  

From an Informatica press release; this is an important win for Informatica.

Informatica Corporation (NASDAQ: INFA), the leading independent provider of data integration software and services, today announced that it has been positioned by Gartner, Inc. in the leaders’ quadrant in the 2008 Magic Quadrant for Data Quality Tools report.

Ted Friedman and Andreas Bitterer, authors of the report, state: “leaders in the market demonstrate strength across a complete range of data quality functionality, including profiling, parsing, standardization, matching, validation and enrichment. They exhibit a clear understanding and vision of where the market is headed, including recognition of noncustomer data quality issues and the delivery of enterprise-level data quality implementations. Leaders have an established market presence, significant size and a multinational presence.”
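
Two of the functions Gartner lists — standardization and matching — can be illustrated with a toy sketch. This is an assumption-laden simplification (real data quality tools use fuzzy scoring and large reference dictionaries, not a two-entry table):

```python
import re
from itertools import combinations

def standardize(name: str) -> str:
    """Normalize case and punctuation, expand common abbreviations."""
    name = re.sub(r"[^\w\s]", "", name.lower())
    abbrevs = {"inc": "incorporated", "st": "street"}  # toy dictionary
    return " ".join(abbrevs.get(w, w) for w in name.split())

def match(a: str, b: str) -> bool:
    """Exact match after standardization; real tools score similarity."""
    return standardize(a) == standardize(b)

records = ["Acme, Inc.", "ACME Incorporated", "Beta Corp"]
dupes = [(a, b) for a, b in combinations(records, 2) if match(a, b)]
print(dupes)  # [('Acme, Inc.', 'ACME Incorporated')]
```

Profiling, validation, and enrichment build on the same normalized representations, which is why the report treats these functions as one integrated toolset.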

According to the report, “growth, innovation and volatility (via mergers and acquisitions) continue to shape the market for data quality tools. Investment on the part of buyers and vendors is increasing as organizations recognize the value of these tools in master data management and information governance initiatives.” The complete report, including the quadrant graphic, is available on the Informatica web site at http://www.informatica.com/dq_mq/.

Written by Guru Kirthigavasan

June 20th, 2008 at 6:01 am

SmartStream Banks on Informatica to Accelerate Customer ROI  

From the press release:

Informatica Corporation (Nasdaq: INFA), the leading independent provider of data integration software, today announced that SmartStream Technologies, a leading provider of software to the financial services industry, is OEMing the Informatica PowerCenter data integration platform as part of its flagship Transaction Lifecycle Management (TLM) solutions.

In making Informatica PowerCenter the foundation of its SmartStream TLM Business Integration (TLM BI) offering, SmartStream is empowering those customers with complex data environments to accelerate the return on investment of their TLM deployments through the high-performance and cost-effective integration of data involved in transaction cycles.

“The increasing drive to streamline global banking practices means our software needs to manage highly complex and rapid transactions across platforms and different banks. By using Informatica, rather than continually creating bespoke data interfaces, we can enable a faster ROI while freeing our professional services teams to provide more value to customers,” said Neil Vernon, head of SmartStream’s Product Management Group. “We selected Informatica to power TLM BI following an evaluation where they scored highest against our key criteria of usability, reusability and performance. In addition, it was critical that TLM BI have the focus and support of a recognized best-of-breed vendor such as Informatica.”

Written by Guru Kirthigavasan

June 17th, 2008 at 6:02 am

ETI Does Hassle-Free Data Integration  

One data integration veteran may be worth a second look. Evolutionary Technologies International (ETI) was founded 18 years ago, and unlike many of its data integration competitors, ETI markets its own (homegrown) legacy connectivity solutions.

Getting at mainframe data, after all, is ETI’s bread-and-butter business: its ETI Solution V6 boasts connectors to a wide range of data sources, including legacy platforms of all kinds (e.g., MVS and VSE on the mainframe side, OS/400 and VMS in the minicomputer segment), Unix, Windows, Linux, and all major relational database management systems.

ETI Solution V6 bundles data profiling, data cleansing, and data monitoring facilities, along with change-data-capture capabilities.

ETI’s special sauce, says Wyatt Ciesielka, vice-president of North American sales, is its code-generation engine, which produces extracted, cleansed, and transformed data in the form of an executable compiled in a variety of languages — including C, Java, and COBOL.
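
The idea behind such a code-generation engine can be sketched in miniature: turn a declarative field mapping into source text, then compile it into an executable transformation. This toy version emits Python rather than C, Java, or COBOL, and the mapping format is invented for illustration — it is not ETI’s actual design:

```python
def generate_transform(mapping):
    """Emit the source of a transform function from a target->source
    field mapping (hypothetical format), analogous in spirit to
    generating a compiled extract/transform program."""
    lines = ["def transform(row):", "    out = {}"]
    for target, source in mapping.items():
        # toy transformation: trim whitespace and uppercase each field
        lines.append(f"    out[{target!r}] = row[{source!r}].strip().upper()")
    lines.append("    return out")
    return "\n".join(lines)

src = generate_transform({"CUST_NAME": "name", "CUST_CITY": "city"})
namespace = {}
exec(compile(src, "<generated>", "exec"), namespace)  # "compile" step
result = namespace["transform"]({"name": " acme ", "city": "austin"})
print(result)  # {'CUST_NAME': 'ACME', 'CUST_CITY': 'AUSTIN'}
```

Generating and compiling a standalone program, rather than interpreting the mapping at runtime, is what lets the output run natively on a mainframe with no engine installed — the selling point the article describes.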

Read more at ESJ.

Written by Guru Kirthigavasan

May 14th, 2008 at 6:51 pm

Deploying the Integrated Customer Database  

An excellent case study by Andres Perez on how a company tried to deploy a single integrated customer database, and the practical challenges it faced, from financial constraints to ROI questions. A must-read.

The demand for integrated information has created a vendor response that has spawned a market for what many call customer data integration (CDI) or master data management (MDM). These approaches are characterized in many ways; however, they are typically presented as a “federation” or “consolidation” of disparate databases and applications to present an “integrated” or “unified” view of the customer, product, supplier, etc. The vendors offering customer relationship management (CRM) tools, CDI or MDM capabilities usually focus on facilitating and accelerating data movement from one or more databases or files to another using extract, transform and load (ETL), messaging (message queues), and other capabilities. How are these “solutions” meeting the customers’ expectations? In a previous article, I mentioned that data movement increases costs (adding more complexity to the information management environment), increases information float or delays (whether batch or messaging), reduces semantic value (much semantic value is cast in the context of the existing applications), and significantly increases the opportunity for introducing information defects. Customers are realizing that these “solutions” are more focused on attacking the symptoms (e.g., moving data around faster) instead of attacking the root cause (e.g., keeping the information integrated in one place in the first place).
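
The ETL-style data movement Perez criticizes can be made concrete with a minimal sketch. Each hop below copies the data: the extract is a snapshot that is stale the moment it is taken (the “information float”), and the load creates a second copy of the same facts that must now be kept consistent. The function names and record shape are illustrative only:

```python
def extract(source):
    """Snapshot the source -- already stale the moment it is taken."""
    return list(source)

def transform(rows):
    """Derive a field; context (currency, timing) lives only in code."""
    return [{**r, "amount_usd": round(r["amount"] * r["rate"], 2)}
            for r in rows]

def load(rows, target):
    """Append to the target -- a second copy of the same facts."""
    target.extend(rows)
    return target

source = [{"amount": 100, "rate": 1.1}, {"amount": 50, "rate": 0.9}]
warehouse = load(transform(extract(source)), [])
print(len(warehouse))  # 2
```

Every such pipeline is another place where delays accumulate and defects can be introduced — which is Perez’s argument for keeping information integrated at the source rather than moving it faster.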

Written by Guru Kirthigavasan

February 13th, 2008 at 8:26 am