Add-on Solution for Symantec Cluster File System
Making Apache Hadoop enterprise ready
This is a Press Release edited by StorageNewsletter.com on August 21, 2012, at 3:00 pm.

Symantec Corp. announced an add-on solution for Symantec's Cluster File System that enables customers to run big data analytics on their existing infrastructure by making it highly available and manageable.
Apache Hadoop offers customers value by helping analyze data for business insights that drive revenue. However, many existing data solutions lack the data management capabilities and built-in resilience needed to overcome the cost and complexity of growing storage and server sprawl.
Developed in close collaboration with Hortonworks, Inc., the new Symantec Enterprise Solution for Hadoop provides a scalable, resilient data management solution for big data workloads, helping make Apache Hadoop ready for enterprise deployment.
Symantec’s Cluster File System is an enterprise solution to address big data workloads.
With Symantec Enterprise Solution for Hadoop, organizations can:
- Leverage their existing infrastructure by scaling up to 16PB of structured and unstructured data
- Avoid over-provisioning both storage and compute capacity
- Run analytics wherever the data sits, eliminating expensive data moves
- Make Hadoop highly available without a potential single point of failure or a performance bottleneck
Leveraging Existing Infrastructure and Avoiding Over-Provisioning
IT administrators have spent considerable time and resources consolidating their data centers and reducing their footprint through virtualization and cloud computing. Big data analytics should build on this consolidation of storage and compute resources rather than require a parallel infrastructure. Symantec Enterprise Solution for Hadoop enables customers to run Hadoop while minimizing investment in new hardware – shrinking the storage footprint to cut cost and complexity.
Analyzing Data Where It Resides and Eliminating Expensive Data Moves
The first step in making a Hadoop infrastructure work is funneling data into it for analysis. By integrating existing storage assets into the Hadoop processing framework, organizations can avoid time-consuming and costly data movement. Symantec Enterprise Solution for Hadoop lets administrators leave data where it resides and run analytics on it without extracting, transforming and loading it into a separate cluster – avoiding expensive and painful data migrations.
Ensuring Hadoop is Highly Available
In an Apache Hadoop environment, data is distributed across nodes, but only one metadata server – the NameNode – knows where the data lives, creating a potential performance bottleneck and single point of failure that can lead to application downtime. To meet the need for timely insights, Symantec Enterprise Solution for Hadoop provides file system high availability for the metadata server while ensuring analytics applications continue to run as long as at least one node in the cluster is working. Because the Hadoop file system is replaced with Symantec's Cluster File System, every node in the cluster can access data simultaneously, eliminating both the performance bottleneck and the single point of failure.
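As an illustration only – the property name below is from stock Apache Hadoop 1.x, while the mount point and any vendor-specific connector are assumptions, not details from the release – substituting a shared POSIX file system for HDFS amounts to a core-site.xml change like this:

```xml
<!-- core-site.xml: illustrative sketch only.
     In Hadoop 1.x, fs.default.name selects the default file system.
     Pointing it at a file:// URI makes every node read the same shared
     Cluster File System mount (here /cfs, a hypothetical path) directly,
     so there is no separate HDFS NameNode to act as a single point of
     failure or metadata bottleneck. The vendor's actual connector may
     use its own URI scheme and implementation class. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>file:///</value>
  </property>
</configuration>
```

With every node mounting the same cluster file system at /cfs, MapReduce tasks can be scheduled on any surviving node and read the data in place, which is the mechanism behind the availability and bottleneck claims above.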
Pricing, Availability and Support
The Symantec Enterprise Solution for Hadoop is available to existing Cluster File System customers at no additional charge. It supports Hortonworks Data Platform (HDP) 1.0 and Apache Hadoop 1.0.2. Customers running HDP 1.0 will be able to get Hadoop support and training from Hadoop partner Hortonworks, a commercial vendor promoting the innovation, development and support of Apache Hadoop.
"Customers can’t afford to let the challenges of implementing big data translate into management challenges within the infrastructure they’ve worked so hard to build," said Don Angspatt, VP of product management, Storage and Availability Management Group, Symantec. "Our Enterprise Solution for Hadoop helps connect Hadoop’s business analytics to the existing storage environment while addressing key challenges of server sprawl and HA for critical applications. It’s now entirely possible to get the big data solution you want from the infrastructure you’ve got."
"Hortonworks is excited to partner with Symantec to provide customers robust business analytics without having to rebuild their IT infrastructure," said Mitch Ferguson, VP of business development, Hortonworks. "The Hortonworks Data Platform is built upon the most stable version of Apache Hadoop and Symantec provides the market leading storage management and HA software with their Cluster File System to enable seamless implementation. We look forward to providing best-in-class support and training to help customers run Hadoop in their existing environment and drive their businesses forward to the next level."
"Enterprises are looking to leverage the power of Hadoop analytics to see if Hadoop-supported applications can drive critical business decisions. Therefore, IT management will need the confidence that their Hadoop-related infrastructure is prepared to stand up to the demands of a production data center environment," said John Webster, senior partner of Evaluator Group. "Symantec Enterprise Solution for Hadoop is one example of a tool designed to smooth the transition from pilot project to production by addressing key data center challenges including HA, security, data protection, and data governance."