Hadoop solidified for production duty

After nearly seven years of development and fine-tuning, the Apache Hadoop data processing framework is finally ready for full production use, the software’s developers announced Wednesday.

The project team behind Apache Hadoop has released version 1.0 of its platform. “Users can be much more confident that this release will be supported by the open source community,” said Apache Hadoop vice president Arun Murthy. “There is no more confusion over which version of Hadoop to use for which feature.”

Three additions in particular made this release worthy of the 1.0 designation, Murthy explained. End-to-end security is the chief feature: Hadoop can now be secured across an entire network using the Kerberos authentication protocol, so enterprises can trust their Hadoop deployments with sensitive and personal data. The second feature, the WebHDFS REST (representational state transfer) API, lets administrators and programmers interact with Hadoop’s file system using ordinary Web technologies they already understand, making Hadoop accessible to more organizations. Finally, this version is the first to fully support HBase, a column-oriented, non-relational database that runs on top of Hadoop and gives administrators a familiar database-like structure in which to store their data.
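
WebHDFS in particular lowers the barrier to entry, because any HTTP client can talk to the Hadoop Distributed File System. The following is a minimal sketch in Java of listing an HDFS directory through WebHDFS; the NameNode hostname, the 50070 port (the default NameNode HTTP port in the 1.0 era), the /user/demo path and the user.name value are all placeholders to be replaced with values from your own cluster.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal WebHDFS client: lists an HDFS directory over plain HTTP.
// The hostname, port, path and user.name below are placeholders.
public class WebHdfsList {
  public static void main(String[] args) throws Exception {
    // LISTSTATUS is a read-only WebHDFS operation answered by the NameNode.
    URL url = new URL("http://namenode.example.com:50070"
        + "/webhdfs/v1/user/demo?op=LISTSTATUS&user.name=demo");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("GET");

    // The response is a JSON FileStatuses object describing every file
    // and subdirectory under the requested path.
    try (BufferedReader in = new BufferedReader(
        new InputStreamReader(conn.getInputStream()))) {
      String line;
      while ((line = in.readLine()) != null) {
        System.out.println(line);
      }
    }
    conn.disconnect();
  }
}
```

On a Kerberos-secured cluster, the same request would authenticate via SPNEGO rather than the plain user.name query parameter.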

Lucene developer Doug Cutting, along with Mike Cafarella, created Hadoop in 2005 as an open-source implementation of Google’s MapReduce, a programming model for analyzing data spread across many servers. Cutting later went to work for Yahoo, helping the portal company apply the technology to its search service, a deployment that eventually grew to more than 40,000 servers.
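
To make the model concrete, here is a sketch of the canonical word-count job written against Hadoop’s Java MapReduce API: the map step emits a count of 1 for every word it sees, and the reduce step sums those counts per word across the cluster. The input and output HDFS paths are placeholders passed on the command line.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every word in this node's input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reduce phase: sum the counts emitted for each word across all mappers.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = new Job(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // local pre-aggregation
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // HDFS input dir
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // HDFS output dir
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

A job like this would typically be packaged into a jar and submitted to the cluster with the standard hadoop jar command, for example: hadoop jar wordcount.jar WordCount /input /output.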

Hadoop can be used to store and analyze large data sets, often called big data. Although originally designed to aid large search services, the technology is increasingly finding a home in enterprises as well, Murthy said. The project has at least 35 code committers and hundreds of other contributors.

Hadoop is handy for analyzing data sets that are too large for traditional relational databases, or for cases where an organization collects lots of data but doesn’t yet know what analysis it will need to run. JPMorgan Chase uses the technology for fraud detection and risk management. EBay is using it to build a new search engine for its auction service.

The technology has also attracted considerable commercial support. Startups Cloudera, Yahoo spinoff Hortonworks and MapR all offer commercial distributions of the software. IBM has incorporated Hadoop into its InfoSphere BigInsights data analysis package, and Microsoft has Hadoop running on its Windows Azure cloud service.
