
Intel ‘Inside’ Big Data through Hadoop

DQI Bureau

The market for Hadoop continues to draw new players. Intel, known more for its processors than its software, has entered the Hadoop business, asserting that it wants to spur growth of big data analytics deployments in large data centers. Apache Hadoop is a software framework that supports data-intensive distributed applications. It has become one of the most important technologies for managing large volumes of data, and has given rise to a growing ecosystem of tools and applications that can store and analyze large data sets on commodity hardware.
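For readers unfamiliar with the framework, the canonical illustration is a word-count job: input data is split across the cluster, mapped in parallel, then reduced to aggregate results. The sketch below is generic Apache Hadoop code in Java, along the lines of the standard tutorial example; it is not specific to Intel's distribution, and the job name and paths are illustrative.

    // Minimal Hadoop MapReduce word count: the classic example of a
    // data-intensive distributed job run on commodity hardware.
    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper: emits (word, 1) for every token in its input split.
      public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, ONE);
          }
        }
      }

      // Reducer: sums the counts for each word across all mappers.
      public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // HDFS output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }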


Intel has released its distribution of Apache Hadoop, the third major revision of its work on the platform, citing significant performance benefits and saying it will open source much of its work and push it back upstream into the Hadoop project.

According to Intel, the open platform was built from the ground up on Apache Hadoop and will keep pace with the rapid evolution of big data analytics. Intel says its distribution is the first to provide complete encryption with support for Intel AES New Instructions (Intel AES-NI) in the Intel Xeon processor. Silicon-based encryption allows organizations to analyze their data sets more securely without compromising performance.
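Intel has not published API details for this encryption layer in the announcement, but the underlying idea, standard AES encryption that AES-NI capable Xeon processors accelerate in hardware transparently to the application, can be sketched with ordinary Java Cryptography Extension code such as the following. The class and payload names are illustrative, not Intel's Hadoop-specific interface.

    // Minimal AES-GCM encryption sketch using the standard Java Cryptography Extension.
    // On AES-NI capable Xeon processors, the JVM's AES intrinsics accelerate these calls
    // transparently; this is generic JCE code, not Intel's distribution-specific API.
    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.GCMParameterSpec;
    import java.nio.charset.StandardCharsets;
    import java.security.SecureRandom;

    public class AesNiSketch {
      public static void main(String[] args) throws Exception {
        // Generate a 256-bit AES key.
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // Random 96-bit IV for AES-GCM.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        // Encrypt a sample record; on AES-NI hardware this runs on dedicated instructions.
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext =
            cipher.doFinal("sample data block payload".getBytes(StandardCharsets.UTF_8));

        // Decrypt to verify the round trip.
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] plaintext = cipher.doFinal(ciphertext);
        System.out.println(new String(plaintext, StandardCharsets.UTF_8));
      }
    }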

Speaking at a briefing organized in Singapore to mark the occasion, Patrick Buddenbaum, director of Enterprise Computing, said, "Intel is committed to contributing its enhancements made to use all of the computing horsepower available to the open source community to provide the industry with a better foundation from which it can push the limits of innovation and realize the transformational opportunity of big data."

However, the firm said it will retain the source code for the Intel Manager for Apache Hadoop, the cluster management component of the distribution. Intel said it will use this to offer support services to data centers that deploy large Hadoop clusters.


How Intel is Building its Hadoop Stack

In the chart above, the portions marked in light blue are the areas where Intel has made substantial performance enhancements, while the modules in gray are pieces Intel is pulling directly from the Apache Hadoop project. Any code Intel has created to enhance the modules highlighted in light blue will eventually be contributed back to the Apache Hadoop community, though whether the community accepts these changes into the Hadoop stack is another matter.

The dark blue area at the top is Intel's Manager for Apache Hadoop, the portion that is not open sourced. This tool will be used to deploy, configure, manage, and monitor the Intel Hadoop stack, and it will integrate with other Intel tools, such as its data center manager for power utilization, among others.


Jason Fedder, APAC & PRC general manager, Datacenter Group, added that the Hadoop framework has enormous potential, saying, "Hadoop will be a foundational layer within organizations that they can build a variety of stacks on top of through a horizontal distribution. Many organizations are looking for a large, stable company to support them so they can invest for the long term. We feel we're a good bet for a variety of players. At the same time, deploying and managing should be simple for IT managers because it has management capabilities, which can automatically configure performance tuning."
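Intel Manager for Apache Hadoop is closed source and its auto-tuning interface is not described in the announcement, so the following is only a hypothetical sketch of the kind of standard Hadoop job parameters such a management layer would adjust; the parameter names are standard Apache Hadoop settings, but the values are illustrative.

    // Hypothetical sketch of per-job tuning that an automated cluster manager might apply.
    // These are standard Apache Hadoop 2.x parameters; values are illustrative only, and
    // Intel Manager for Apache Hadoop's actual interface is not public.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class TunedJobSketch {
      public static Job buildJob() throws Exception {
        Configuration conf = new Configuration();

        // Memory per map/reduce container (MB); a manager would size these to the cluster.
        conf.setInt("mapreduce.map.memory.mb", 2048);
        conf.setInt("mapreduce.reduce.memory.mb", 4096);

        // In-memory sort buffer for map-output spills.
        conf.setInt("mapreduce.task.io.sort.mb", 512);

        // Compress intermediate map output to cut shuffle traffic.
        conf.setBoolean("mapreduce.map.output.compress", true);

        Job job = Job.getInstance(conf, "tuned analytics job");
        job.setNumReduceTasks(16); // scaled to the cluster's reducer capacity
        return job;
      }
    }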

On the security side, Intel assured that its distribution is the first to offer ‘complete encryption’ using its Xeon processors, promising enterprise customers that their data sets will be secure without sacrificing performance. According to the company, with its enhancements and hardware, analysis of a terabyte of data that previously took up to four hours can now be completed in about seven minutes.

The go-to-market strategy behind this venture will have multiple routes: System Integrators (SIs), Independent Software Vendors (ISVs), Original Equipment Manufacturers (OEMs), and training partners. By partnering across these key segments of the technology ecosystem, Intel aims to equip these partners with the capability to address their customers' big data challenges and to create new revenue streams. By working with training partners, Intel plans to engage and educate customers about use-case scenarios and the opportunities big data holds for a range of businesses.


Patrick Buddenbaum said that Intel plans to work with ISVs whose core competence lies at the application and solutions level, while Intel takes care of the foundational layer. The company plans to offer this framework by partnering with Cisco, Flytxt, Revolution Analytics, and Quanta QCT, among others, with local support in the Asia-Pacific region. According to Intel, customers ranging from governments to telecom service providers will be the first to see hardware-optimized software offerings for their big data needs. The company also believes that organizations in markets such as Australia, India, Korea, and Singapore will capitalize on the Hadoop Manager ecosystem by driving transformational projects on several fronts.

Set to roll out worldwide, the Intel distribution for Apache Hadoop will be delivered through an annual subscription, with technical support provided through solution vendors and service providers.

While the firm's direct income from the Hadoop distribution will come from support services, one cannot discount the indirect income from Xeon processor sales. The fact is, Intel sees an opportunity in software, and it is not about to miss this one. Like everyone else, Intel knows it needs to be a software and services company, and Hadoop is a good way to go about it.
