Category Archives: Cloud Computing

Gartner’s 2012 Hype Cycle for Emerging Technologies

This broad scenario portrays a world in which analytic insight and computing power are nearly infinite and cost-effectively scalable. Once enterprises gain access to these resources, many improved capabilities become possible, such as a better understanding of customers or improved fraud reduction. The enabling technologies and trends on the 2012 Hype Cycle include quantum computing, the various forms of cloud computing, big data, complex-event processing, social analytics, in-memory database management systems, in-memory analytics, text analytics and predictive analytics. The tipping-point technologies that will make this scenario accessible to enterprises, governments and consumers include cloud computing, big data and in-memory database management systems.

via Gartner’s 2012 Hype Cycle for Emerging Technologies Identifies “Tipping Point” Technologies That Will Unlock Long-Awaited Technology Scenarios.

Cloud computing major storing content by 2016: Gartner

“Gartner predicts that worldwide consumer digital storage needs will grow from 329 exabytes in 2011 to 4.1 zettabytes in 2016,” the study said.

This includes digital content stored in PCs, smartphones, tablets, hard-disk drives (HDDs), network attached storage (NAS) and cloud repositories.
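Those two figures imply a striking compound growth rate. As a quick sanity check on Gartner's numbers (assuming decimal units, i.e. 1 zettabyte = 1,000 exabytes):

```python
# Implied compound annual growth rate (CAGR) of consumer storage demand,
# per the Gartner figures quoted above: 329 EB (2011) -> 4.1 ZB (2016).
# The figures are Gartner's; the unit assumption (1 ZB = 1,000 EB) is ours.
start_eb = 329          # exabytes, 2011
end_eb = 4.1 * 1000     # 4.1 zettabytes, expressed in exabytes
years = 2016 - 2011

cagr = (end_eb / start_eb) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 66% per year
```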

via Business Line : Industry & Economy / Info-tech : Cloud computing major storing content by 2016: Gartner.

Is there sunshine ahead for cloud computing?

The global cloud-computing market is expected to reach $241 billion in 2020, up from $41 billion in 2010, according to Forrester Research. That long-term potential is reflected in the highflying stocks of companies actively involved in the concept.

A stumbling block, however, is concern over the security of data when a client firm can no longer control it on its own premises. Hackers and crashed systems are, after all, among a company’s worst nightmares.

And while the cloud is a definite boon to smaller firms, more established companies have already made significant investments in equipment and staffing. There is also confusion over what cloud computing really is and who provides it.

The field’s successful pioneer is Inc., a well-managed company that over the last decade effectively introduced this cost-saving business model. It offered a monthly subscription service that let firms simply go to their Web browsers, point to and begin using it. That turned out to be a good financial deal for its clients as well as for its shareholders.

Good Read…

Microsoft Gives the Cloud to Scientists

More in NYTimes

The software maker has started grafting popular scientific databases and analysis tools onto its Windows Azure cloud computing service. This basically means that researchers in various fields can get access to a fast supercomputer of their very own and pose queries to enormous data sets that Microsoft keeps up to date. For the time being, Microsoft will allow some research groups to perform their work free, while others will have to rent calculation time on Azure via a credit card.

These moves have turned Somsak Phattarasukol, a graduate student at the University of Washington in Seattle, into a big fan of Microsoft.

Mr. Phattarasukol, like many researchers, is accustomed to waiting in line for access to large, public computers and to twiddling his thumbs – sometimes for days – as the machines work on his requests. It’s a frustrating process, made only worse as the databases researchers deal with swell, and with them the time it takes to perform the analysis.

Microsoft officially opened access to the scientific bits of Azure this week, but Mr. Phattarasukol got early access to the system. He’s part of a team that’s trying to create a biofuel from bacteria that produce hydrogen gas. The work has required the research team to compare the makeup of various bacterial strains against an extensive protein database, as they try to figure out which bits of genetic code can prompt higher hydrogen gas production.

6 Security ‘Must Haves’ For Cloud Computing

According to Gartner, to achieve effective and safe private cloud computing deployments, security as it exists in virtualized data centers needs to evolve and become independent of the physical infrastructure – servers, Internet Protocol (IP) addresses, Media Access Control (MAC) addresses and more.

However, it must not be bolted on as an afterthought as companies move from enterprise deployments to virtualized data centers to private and public clouds.

While the basic components of security in information management remain the same — ensuring the confidentiality, integrity, authenticity, access and audit of information and workloads — a new, integrated approach to security will be required.
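To make the idea concrete, here is a minimal, hypothetical sketch (not from Gartner or any vendor) of what security that is independent of physical infrastructure can look like: an access rule keyed to logical workload labels rather than to IP or MAC addresses, so the rule survives when a virtual machine migrates and its network identity changes.

```python
# Hypothetical sketch: access rules keyed to logical workload labels,
# not to physical attributes (IP/MAC), so they survive VM migration.
POLICY = {
    ("web-tier", "db-tier"): "allow",  # web workloads may reach the database tier
}

def check_access(src_labels, dst_labels):
    """Allow if any (source, destination) label pair matches a rule."""
    for src in src_labels:
        for dst in dst_labels:
            if POLICY.get((src, dst)) == "allow":
                return "allow"
    return "deny"

# The decision is identical before and after a VM changes its IP address,
# because no physical identifier appears in the rule.
```

The label names and the two-tier rule are invented for illustration; the point is only that nothing in the policy breaks when the underlying hardware or addressing changes.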

More from CMSWire

In Interview – Consider Cloud Hosting Your Business Intelligence

// Jaspersoft’s experience with more than 100 successful cloud BI deployments has made us realize that a partnership, best-of-breed approach to cloud BI is the best way to go. BI-as-a-service offerings delivered through on-demand SaaS are generally singular offerings that are overstretched, offer limited flexibility, and generally need to be built from the ground up, resulting in costly downtime and high implementation costs. One of the best practices that we’ve established from our multiple launches is that customers need to have a cloud hosting-enhanced BI solution with a lean framework. Jaspersoft’s lean architecture, based on web-based open standards and coupled with experts in cloud management and BI consulting, results in a proven solution that can meet a myriad of business needs. ..

More from an interview with Karl Van den Bergh, vice president of product strategy at Jaspersoft.

Dominant Player in Cloud Computing?

ReadWriteWeb takes a look at cloud computing and analyzes who might become the future leader in this area. An interesting read.

In a way, this runs against the grain of the existing technology landscape and our history with successful innovations. Maybe that is why we love the idea of the cloud itself?

It’s too big to own: One big reason to doubt a single dominant force in the cloud is that it feels like owning the Internet. Even Cisco with its strengths can’t make such a claim. Perhaps the cloud is the perfect market, where the barriers of entry are low enough that continual evolution will occur.
It’s a movement, not a layer: Another argument against the cloud having a dominant player is its fuzzy definition. There are many parts and pieces to it, and it’s not clear today what it would mean to “win” the cloud computing market.

Portability will keep vendors in check: If customers demand solutions that let them move from vendor to vendor freely, it will shape the landscape. Customers could require companies with cloud solutions in the marketplace to remove barriers to moving data and services between different entities. Standards and best practices may also emerge that let companies and individuals move freely between providers. In that world, the cloud becomes a fluid market that prevents vendor lock-in and makes pricing and trust the brand differentiators.

Investing in the Cloud

Satish Dharmaraj, founder and former CEO of Zimbra, talks about the trends he is looking at in his role at Redpoint Ventures, a Silicon Valley venture firm.

There are two areas Redpoint is looking at for cloud computing and virtualization.

The first is taking applications that used to run behind the firewall and moving them to the cloud. This is a big trend for SMBs and an emerging one for enterprise-class applications. SMBs are enjoying this trend now because they don’t have large IT departments already in place. In some cases, Redpoint also thinks that large enterprises will adopt these applications, since doing so gives them more freedom of choice.

The second thing Redpoint is looking at is large enterprises whose data centers are becoming private clouds, running vendor software on their own infrastructure that has been packaged for a virtualized footprint. The new data center is an on-demand set of services that supports elastic computing. In the future, it will offer internal departments advantages similar to those of the public cloud: they will be able to order computing services with a Web form and expect delivery in hours, rather than weeks or months. With this will come applications for billing, provisioning and configuration management. Redpoint has already invested in one company in this space, VMOps, which is considered an IaaS (Infrastructure as a Service) company.
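To illustrate that "Web form" style of self-service provisioning, here is a hypothetical sketch of the kind of request such a private-cloud portal might submit. The field names, lease model and billing hook are invented for illustration, not any specific vendor's API:

```python
import json

def build_provision_request(cpus, memory_gb, lease_hours, billing_code):
    """Package a self-service compute request for an internal IaaS portal."""
    return json.dumps({
        "resource": "virtual-machine",
        "cpus": cpus,
        "memory_gb": memory_gb,
        "lease_hours": lease_hours,    # elastic: capacity returns to the pool
        "billing_code": billing_code,  # hook for the billing/provisioning apps
    })

# A department orders a machine for three days with one form submission.
request = build_provision_request(cpus=4, memory_gb=16, lease_hours=72,
                                  billing_code="dept-1234")
```

The point of the sketch is the workflow: a structured request an internal cloud can fulfill in hours, with billing and provisioning hanging off the same record.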

Additionally, there is a big trend among service providers with Web hosting operations (like 1&1 and Savvis). They are finding that they can cut costs to a tenth by moving their dedicated-server business to a virtual-server business. Most dedicated servers are busy only about 10% of the time, and it makes sense for them – and for their customers – to reduce the data center footprint and cost infrastructure.
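The economics follow directly from that utilization figure. A back-of-the-envelope sketch, assuming ~10% average utilization and the idealized case where consolidated workloads never peak at the same time:

```python
# Back-of-the-envelope consolidation math for the hosting trend above.
# Simplifying assumption: workloads at ~10% utilization never peak together.
utilization = 0.10
dedicated_servers = 100

# Each virtual host can absorb roughly 1/utilization workloads.
consolidation_ratio = round(1 / utilization)       # ~10 workloads per host
virtual_hosts = dedicated_servers / consolidation_ratio

cost_fraction = virtual_hosts / dedicated_servers  # ~0.1, i.e. a tenth
```

Real consolidation ratios are lower because peaks do overlap, but the direction of the saving is exactly what the dedicated-to-virtual shift is about.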

2010, Year of the Cloud

Whether you like it or not, if you consume tech juice on a regular basis, get ready to be swamped with news of the cloud this year – and likely through this decade. Yes, the cloud is king and we are its citizens.

So this blog will go out of its way to mention the juicier bits of cloud computing. There have been previous sightings of the cloud here, but let’s start with one more common man’s primer: What is cloud computing and how do I use it?.

Developer of Mass Opinion BI, Creates New Computational Framework

From the Press Release:

WiseWindow, developer of Mass Opinion Business Intelligence, the next generation of web measurement, today announced that company founder and chief technology officer, Rajiv Dulepet, has been named advisor and architect for a new project funded by the National Institutes of Health and executed by Caltech. The open-source project will develop a web-based bio-computational tool that allows bio-scientists and bio-computation engineers to “crunch data in the cloud” for large-scale tasks such as processing gene sequence data sets on a large cluster of computers. The new tool allows scientists to save considerable time that’s now spent waiting for computations on their desktops by moving these operations to the cloud, thereby freeing up their computers for other work.

“Working as a lead advisor to Caltech on cloud computing is both a privilege and passion for me,” said Dulepet. “It allows me to exercise skills in Internet data gathering and analysis as well as computational framework development.”