2020 Predictions From 50 Storage Vendors – Part One
Top 3: cloud, NVMe, Kubernetes
By Philippe Nicolas | January 9, 2020
A few days after publishing our 2019 vendors' facts and review, we publish our classic annual vendors' predictions for 2020.
We collected 50 opinions and consolidated them to identify some major trends and directions for the storage industry:
- Unsurprisingly, cloud ranks #1, in various flavors (multi, hybrid, on-premises, private, and we choose to add edge to that category)
- New devices and connectivity around Flash, NVMe(-oF), SCM/PM, QLC
- Kubernetes, of course, and containers
- Analytics and AI
- Object storage
Here is the first part of this report; the second will be published tomorrow.
Actifio (Ash Ashutosh, CEO)
1. AI will dominate new apps and make data re-use even more important
While the singularity has not happened yet, we're approaching it in software development. As data becomes more heavily governed, controlled and collected, ML will be a predominant way for it to be leveraged. Gartner says enterprise adoption of AI and ML has tripled in the last year alone, and that 37% of organizations have now embraced AI/ML. IDC predicts that by 2025, at least 90% of new enterprise apps will embed AI/ML. Data-powered decisions, including the use of robotic process automation (RPA), will eclipse human-powered decisions in terms of the volume of decisions. Cloud vendors like Google and IBM have dramatically lowered the barriers to adoption and use of AI/ML, leveling the playing field for organizations of any size to become data-driven. We've seen this phenomenon more recently with gene splicing, where costs fell from millions of dollars to around $20. MIT's new Schwarzman College of Computing will lay the foundation for computing as a basic skill, just as math is, and further enable exponential adoption of computing in everyday life. Similar to the rapid evolution of test data management (TDM) from an esoteric practice more than 10 years ago to a ubiquitous part of every organization today, I see the rise of analytics data management (ADM) as the new process for rapid adoption of AI/ML and improving the accuracy of data-driven decisions. One thing is certain: all this ML will play a major role in making data an even more strategic asset.
2. Hardware lock-in is back … pretending to be software
There is a new gen of deduplication appliance vendors looking to displace monolithic, previous-gen hardware platforms. But they bring some of the same dangers that customers came to regret with the legacy storage vendors of the previous era. The wolves are dressed in the sheep's clothing of buzzwords like "scale-out" and "software-based" but make no mistake: users are locked into expensive "certified" hardware. Nothing scales out more than cloud object storage (which is what Actifio leverages for storing backups while still providing instant mount and recoveries). A truly modern data management platform should run on any compute or storage, on-premises or in the cloud, and work with a variety of performance and capacity requirements. In 2020, more IT teams will recognize the trap and embrace multi-cloud copy data management and cloud object storage.
3. Another election year of data insecurity
Hacking, ransomware and data leaks are alleged to have played a central role in the US election process of 2016. Many candidates are back on the campaign trail in 2020, some for the first time and others with legacy election infrastructure. In order to capture and store sensitive donor information, these candidates will need to dig deep into their databases to find the personally identifiable information needed to reach out. To maintain the best donor experience and the most effective fund-raising and voter turnout operations while improving security and data privacy, campaign managers will turn to software platforms that help them ensure that campaign information stays secure from intrusion, private, up-to-date, and immediately accessible for strategic uses. In November we will see if DARPA's $10 million contract commitment to secure, open-source election system hardware prototypes will have a positive effect. Given the volumes of information that will be stored, it will need to be able to securely and rapidly manage the data – and recover it quickly and efficiently in the case of a breach. Confidence in the integrity of the voting process is the backbone of a functioning democracy.
Atempo (Luc d’Urso, CEO)
1. New Data Management solutions fueled by AI to tackle the challenge of the explosion of unstructured data volume
2. Cyberattacks will continue; tape should be a good choice
3. Local value-added MSPs to address data and digital sovereignty challenges
Bamboo Systems (Tony Craythorne, CEO)
The server industry is in constant growth mode, with data centers ever-expanding. This unabated growth stems from the industry's continued reliance on inefficient legacy server architecture, causing the cost of operating data centers to greatly exceed the cost of building them. Data center power consumption is now a global issue.
The inefficiencies in legacy server architecture design mean that today's data centers currently produce 2% of the world's greenhouse gas emissions and collectively consume over 3.5% of the world's power production. In fact, it is predicted that the server market will overtake the airline industry in greenhouse gas production next year. Something must be done, and in the near future. We predict we will soon see more energy-efficient data center solutions, with new, groundbreaking low-power chip designs, as this is now a pressing economic and ecological priority.
Caringo (Adrian Herrera, VP marketing)
1. Archives will provide increased benefits and value beyond just cheap and deep storage.
2. Object storage will blur the lines between primary and secondary storage for use cases that require enhanced throughput.
3. On-demand use cases will increase the need for a three-tier storage approach incorporating NAS, on-premises object storage, and cloud archive or on-premises tape.
Cloudian (Neil Stobart, VP global system engineering)
1. Edge computing will turn the hybrid cloud model on its side
The hybrid cloud storage model is characterised by the ability to seamlessly move data between on-premises and public cloud environments. This data mobility, combined with the flexibility and scalability hybrid storage provides, has made it the solution of choice for many businesses worldwide. However, the hybrid storage strategy will be turned on its side in 2020 as momentum behind Edge Computing continues to build. As a result of the rapid growth of the IoT and 5G networks, data collection and the processing of AI applications will be needed at the edge. Analytics will have to be applied near to where the data is being created and where sensors are located – e.g. traffic signals, surveillance cameras and smart cars. As such, there will be an increasing need for a different form of hybrid storage – one that fulfills the need for robust storage near the edge of the network and enables businesses to move analysed data sets between the edge and on-premises or public cloud systems.
2. Processing AI and ML workloads will require object storage
As data volumes continue to grow, one of the key challenges facing businesses is how to unlock the full strategic value of this data. This is especially true when dealing with AI and ML workloads. That’s why 2020 will see more organisations capitalising on object storage to create structured/tagged data from unstructured information and use metadata to make sense of the flood of data being generated.
With object storage, data is defined with unconstrained types of metadata and can be located through a single API. This is in contrast to traditional file storage, which defines data with limited metadata tags (such as the file name, date created and date last modified) and organises it into different folders – making it much less searchable and harder to analyse than object storage. For example, a traditional X-ray file would only include basic metadata like creation date, owner, location and size, while an X-ray object could include metadata that identifies the patient's name, age, injury details and which area of the body was X-rayed. This extra detail makes it much easier to locate via search. Simply put, object storage architectures make use of metadata in ways traditional file storage doesn't, making it instrumental in helping to process growing AI and ML workloads.
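To make the X-ray example concrete, here is a minimal sketch of attaching and retrieving user-defined metadata through the S3 API (the de facto object storage interface) using Python's boto3; the bucket, key and tag names are invented for illustration.

```python
# Minimal sketch: rich, user-defined metadata travels with the object.
# Bucket, key and tags are hypothetical.
import boto3

s3 = boto3.client("s3")  # any S3-compatible object store can be targeted

with open("patient-1234.dcm", "rb") as f:
    s3.put_object(
        Bucket="radiology-archive",
        Key="scans/2020/patient-1234.dcm",
        Body=f,
        Metadata={                      # unconstrained key/value tags
            "patient-id": "1234",
            "body-part": "left-wrist",
            "injury": "hairline-fracture",
        },
    )

# The tags come back on a HEAD request and can feed a search index.
head = s3.head_object(Bucket="radiology-archive",
                      Key="scans/2020/patient-1234.dcm")
print(head["Metadata"]["body-part"])  # -> "left-wrist"
```

A search layer can then index these user-defined tags, which is the searchability advantage described above.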
3. Momentum behind the use of private clouds in hybrid infrastructures will grow
Today, when people talk about the cloud, they usually mean the public cloud. This will change in 2020. The term cloud will become more nuanced as more businesses deploy private clouds and organisations increasingly pursue a hybrid cloud storage strategy that gives them the best of both worlds. Organisations in industries such as healthcare, media and scientific research that have large-scale storage requirements already face unique challenges in managing capacity-intensive workloads that can reach tens of petabytes. By providing the scale and flexibility benefits of public cloud platforms along with the performance, access, security and control advantages of on-premises storage, private cloud platforms address these challenges. Although the public cloud provides businesses with agility and convenience while decreasing infrastructure expenses, it can also come with significant bandwidth and accessibility costs. As such, in 2020 more organisations will take advantage of private clouds in a hybrid cloud infrastructure – storing frequently used data on-prem while continuing to utilize the public cloud for cloud-native applications and DR. Ultimately, businesses can't afford to rest on their cloud laurels over the coming year, as 2020 is likely to be another year of significant change. Whether it's recognising the need for edge computing, embracing object storage or adopting hybrid infrastructures, the onus is on businesses to prepare for what's coming their way.
Clumio (Chadd Kenney, VP and chief technologist)
1. Continued growth in SaaS
SaaS will continue to be the easy button for enterprises to remove the complexity of on-premises infrastructure without needing to rearchitect existing applications. Copying the complexity of on-premises infrastructure to the cloud is no longer a viable option. IDC expects global SaaS application revenues to exceed $185 billion by 2022.
2. New policies require new methodologies
Enterprises will be challenged to meet the demands of new data privacy policies such as GDPR and CCPA. There will be a heightened demand for next-gen data protection solutions and methodologies that support these policies and enable fine-grained control over their data assets.
3. Analytics will come from non-traditional sources
For many enterprises, backup is just an insurance policy, but in 2020 enterprises will demand valuable insights from backup data in ways that were never possible with traditional on-premises backup solutions.
Cohesity (Lynn Lucas, CMO)
1. More and more CIOs will rely on consumer technology to respond to critical data incidents smarter and faster
Consumerization of IT will become even more commonplace in 2020. IT teams in data infrastructure management will increasingly rely on smartphone apps to collaborate and respond to issues quickly and effectively. They will require mobile access to instances of data on-premises and in the cloud enabling them to have 24×7 visibility across their infrastructure. This infrastructure will enable IT leaders to take action faster, which is critical during time-sensitive occasions such as a potential outage or breach.
2. In 2020, the on-premises vs cloud debate will be much clearer
No doubt that cloud will remain a key part of business IT spend, as the public cloud market is expected to grow globally to $299.4 billion by 2022. However, at the start of the new decade, large enterprise organizations will have a much clearer roadmap of what workloads should be in the cloud and what should be on-premises – and will rely on software that can easily navigate between the two environments. And, for those who are storing data in public cloud environments, more organizations will ensure they are taking responsibility for backing up that data in the cloud to avoid data loss or disruption in service in the event of a cloud outage.
3. Poor data management practices will cause brand reputation nightmares for more large enterprises
Millions of consumers, businesses, and public sector agencies are generating an immense amount of data daily. While the bulk of this content is images and video, which contain massive amounts of information, not far behind is data created from machines, databases and application usage. Managing all of this data can be incredibly challenging, and oversights can lead to data loss, outages, service disruptions and compliance violations – all of which can tarnish brand reputations in a matter of minutes. In the year ahead, we can expect more enterprises to undergo PR nightmares if they have not taken the necessary steps to be exceptional stewards of their data.
Commvault
1. Expect CCPA to fuel a data collection and processing backlash (Nigel Tozer, solutions director EMEA)
The California Consumer Privacy Act (CCPA) will highlight data collection and monetization in the US, just as GDPR did in Europe. This will fuel a backlash on data collection and processing in the US, especially around political ad targeting during the 2020 election year. Companies such as Facebook and Google will come under greater pressure to distance themselves from this area, and data analysis companies that are now largely unheard of will be in the news for the wrong reasons.
2. Data analytics moves to the top of the companies’ priority list (Matt Tyrer, technology evangelist)
Data volumes continue to grow, but we know less and less about that data – which is a huge risk. In 2020, the focus on analytics will be driven by increased regulatory and compliance pressures, risks from data breaches and ransomware, and the need to properly classify data for AI and ML projects. Without "clean" data of value, these AI and ML projects will stumble. Data analytics will support intelligent decision making, feed AI and ML initiatives, and strengthen compliance stances within organizations. Expect more businesses to be hitting this point in their data maturity, where analytics projects take priority.
3. Multi-cloud adoption will increase demand for more diverse data protection capabilities (Penny Gralewski, solutions lead)
As organizations adopt more clouds for different organizational requirements, the need for fast, flexible data protection – able to protect a diverse set of data workloads – will increase. Organizations are choosing different clouds for different use cases, so today’s data protection platforms need to accommodate a variety of cloud use cases, including Platform as a Service (PaaS), containers, and massive databases like SQL Server, MySQL, PostgreSQL, Splunk, SAP HANA and Oracle.
CTera Networks (Aron Brand, CTO)
1. The decline of centralised cloud computing
The first-gen model of centralized cloud computing and storage has now run its course, and most of the new opportunities for enterprise data management reside at the edge.
2. Data growth outside the datacentre
Consider that 75% of enterprise data is created in branch offices, on mobile devices and by IoT-enabled smart devices. Such data growth outside the datacenter is the new reality, and it’s creating a need for enterprises to deploy computing power and storage capabilities at the network edge, aka edge computing.
3. Powerful edge compute will result in more desirable devices
Take your voice-based home virtual digital assistant from Amazon or Google: more powerful edge compute would make these devices even more desirable (see the sketch after this list), including:
• Improved response time: your Amazon Echo, Google Home or other smart speaker would perform speech recognition locally and deliver a faster answer
• Offline availability: whether there is a storm, a power cut or a cloud service outage, your device will still be up and running, or ‘always-on’ even when the network is offline
• Improved security and privacy: by minimising the need to send sensitive information to the cloud, data privacy risks are reduced
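A toy sketch of the local-first pattern behind these bullets, assuming a hypothetical on-device speech model; every function, name and threshold here is invented for illustration.

```python
# Toy sketch: handle requests at the edge first, fall back to the
# cloud only when needed and the network is up. All names invented.
def local_recognize(audio):
    # stand-in for an on-device speech model: returns (intent, confidence)
    return ("turn_on_lights", 0.92)

def cloud_recognize(audio):
    # stand-in for the provider's cloud API; only reached when needed
    return "turn_on_lights"

def handle(audio, network_up):
    intent, confidence = local_recognize(audio)  # fast, private, offline-capable
    if confidence >= 0.8 or not network_up:      # good enough locally, or offline
        return intent
    return cloud_recognize(audio)                # sensitive data leaves only here

print(handle(b"...", network_up=False))  # still answers during an outage
```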
Data Dynamics (Piyush Mehta, CEO)
1. Storage management will become part of a broader infrastructure management stack
Storage admins will need to learn more skills tied to compute and networking as management of these stacks consolidates.
2. Data management will continue to be key focus for storage companies
Legacy players and new entrants in the storage landscape have realized that a disk is a disk, and parity of features/functionality as it relates to storage has become a 'must' in order to be in the game. The differentiator is going to be those that can help customers with a better data management solution that can extract business value out of the data that's been created.
3. Multi-cloud will become more prevalent
Public cloud vendors will have to provide access and allow for an ecosystem in which data can be moved across providers based on usage. Egress charges will need to decrease, as data mobility will continue to be a requirement for enterprises for a variety of use cases.
DataCore (Gerardo Dada, CMO)
1. Storage Unification
For decades, the industry has had discrete storage systems for high performance and for lower-cost storage, often from different vendors – a very hardware-centric approach. 2020 will see broader adoption of unified software-defined systems that make intelligent decisions on each piece of data, placing it in the tier where it belongs given its performance and cost requirements, optimizing performance, utilization, and cost dynamically.
2. Intelligent Data Placement
Elements including richer metadata management and AI will empower data intelligence, which will increasingly enable software to drive automated decisions on the value of data and where it should be placed – including automatic movement to the cloud.
3. Shift from Hardware-Centric Model Will Accelerate Adoption of Software-Defined Storage
The hardware-centric mindset is not sustainable anymore. IT will increasingly stop thinking about storage systems as discrete, media-centric systems from multiple vendors, as these create islands of storage that become difficult to manage, inefficient in terms of capacity utilization, and nearly impossible to move data efficiently from one system to another. As the industry realizes that it's the data that matters, not the hardware it lives on, the shift to software-defined storage will accelerate.
Datrium (Sazzala Reddy, CTO)
1. Ransomware will innovate faster than mechanisms to prevent it
Due to its insane profitability and the proliferation of non-state and state actors, cybercrime (including ransomware attacks) will cost the world $6 trillion annually by 2021. Every business should investigate deploying a quick data recovery infrastructure that can help instantly roll back the IT environment to its pre-ransomware state and recover from an attack unharmed. Ransomware recovery will become a budget line item for the majority of CIOs in 2020.
2. Mainstream enterprises will finally embrace DR to the cloud
In 2020, mainstream businesses will become open to leveraging the cloud as a DR site and will start shutting down their physical DR sites, because new cloud DR technologies will make it possible to leverage on-demand cloud resources during a disaster while keeping cloud costs low during normal business operations.
3. SaaSification 2020
vSphere’s move to the cloud will trigger the rise of Cloud 3.0. All products in the enterprise are becoming SaaS products if they’re not already. The convenience and on-demand economics of SaaS products in the public cloud make it an unstoppable trend. VMware has 300,000+ customers with about 70+ million VMs deployed and has become available as a SaaS platform on most public clouds. With one of the world’s biggest enterprise on-prem platforms rapidly transitioning to the SaaS world, it’s inevitable that all the other third-party products in the VMware ecosystem will transform as well and “SaaSify.”
DDN (Alex Bouzari, CEO)
1. AI-enabled, intelligent infrastructures will get deployed at massive scale and disrupt the HPC industry.
2. Multi-cloud, flexible data center architectures will start to bring true operational simplicity, efficiency and agility to the Enterprise.
3. In 2020 DDN will deliver a full range of new intelligent infrastructure solutions ideally suited for enterprise and HPC customers’ demanding AI, IoT, multi-cloud, big data and mixed workloads at scale.
DriveScale (Tom Lyon, chief scientist, Brian Pawlowski, CTO and Denise Shiffman, CPO)
1. The new private cloud will be created with elastic bare-metal infrastructure
While VM-deployed applications will continue to move to the public cloud, more bare-metal private cloud data centers will be created in 2020 to run performance-sensitive bare-metal and Kubernetes-native applications. This is being driven by the growth and investment in data-intensive apps and the need for scale-out data center architectures.
2. 2020 will be the year of the Smart NIC
Smart NICs have been used for a few years by cloud providers to offload network switching and encapsulation, but in 2020 we’ll see SmartNICs for storage applications and in private data centers. Amazon’s Nitro SmartNIC was the first with NVMe storage capabilities, but now both Mellanox and Broadcom have announced similar capabilities in their chips, while high-flying start-ups Pensando and Fungible are building systems revolving around SmartNIC capabilities. Why SmartNICs now? Aren’t these just “front-end processors”? The most compelling feature of SmartNICs is that they provide a separate security domain that cleanly separates user code from provider code. In these days of CPU vulnerabilities like Spectre and Meltdown, it may not be safe to assume the correctness of any security boundaries within a CPU.
3. 2020 will be the year of Kubernetes
And people realize they have a lot more work to do to deploy Kubernetes for their data-intensive applications. The realities of adopting containerization for persistent storage will continue to drive Kubernetes evolution. While the introduction of the CSI API for Kubernetes makes it easier for storage providers to integrate into K8s, cost-efficient, scale-out application deployment on top of inexpensive commodity hardware remains a challenge.
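As a hedged illustration of the CSI integration point mentioned above, the sketch below requests a persistent volume from a CSI-backed StorageClass using the official Kubernetes Python client; the StorageClass name "csi-fast-nvme" is hypothetical.

```python
# Minimal sketch: claim persistent storage from a CSI-provisioned
# StorageClass. The class name "csi-fast-nvme" is hypothetical.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside a pod

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="analytics-data"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteOnce"],
        storage_class_name="csi-fast-nvme",  # served by a CSI driver
        resources=client.V1ResourceRequirements(
            requests={"storage": "100Gi"},
        ),
    ),
)

client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="default", body=pvc,
)
```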
ExaGrid (Bill Andrews, CEO)
1. Public clouds such as Amazon AWS and Microsoft Azure will continue to grow but not at the expense of the corporate data center.
Most corporate data centers have concluded that the public clouds cost more than running their own data centers; therefore, they are only doing specific projects with the public cloud providers, such as archive data (untouched data), development operations (DevOps) for burstable compute, etc.
2. The SMB will go to cloud solutions but not the public cloud.
They are turning to true MSPs that can provide more hands-on value-add. SMBs should not be operating data centers, as they don't have the IT expertise; it is better for SMB customers to outsource to MSPs.
3. The move to Kubernetes and containers will slowly begin to show up in production. To date, most deployments have been pilot tests. Over time, containers will be the wave of the future due to IoT and other distributed data.
FujiFilm Recording Media US (Peter Faulhaber, president)
1. Software-defined tape for object storage will emerge as a popular solution, providing the interface to download data from object storage systems to compatible tape systems using standard S3 APIs
Users will be able to write objects directly to tape in native form, in a self-describing, open format. As a result, object storage users can leverage the value proposition of tape, including lowest TCO, reliability and long-term archivability.
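A minimal sketch of what "standard S3 APIs" means for applications: an ordinary S3 client simply points at an S3-compatible tape gateway. The endpoint URL and bucket below are hypothetical; boto3 is used for illustration.

```python
# Minimal sketch: to the application, a tape-backed S3 target looks
# like any other S3 endpoint. Endpoint and bucket are hypothetical.
import boto3

tape = boto3.client(
    "s3",
    endpoint_url="https://tape-gateway.example.com",  # S3-to-tape tier
)

# The object lands on tape in native, self-describing form.
with open("final-cut.mov", "rb") as f:
    tape.put_object(Bucket="cold-archive",
                    Key="projects/2019/final-cut.mov",
                    Body=f)
```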
2. According to IDC, the rapidly growing Data Sphere means demand for persistent storage will be in the multi-zettabyte range in 2020
The big data growth drivers, including IoT, AI/ML, HD video (4K, 8K), surveillance, gaming and other apps, will demand cost-effective, long-term tape storage.
3. New hyperscale data center markets will emerge in 2020 led by Internet and cloud service providers in markets like China and India
This will cause a shift in regional storage from USA-centric to global demand and present new opportunities for tape market growth.
Hammerspace (Brendan Wolfe, VP product marketing)
1. People will move data out of HDFS and onto NFS environments as they look to update their approach to big data analytics.
2. We will see a lot more companies moving stateful applications into Kubernetes and look for technologies to help them protect and manage the associated persistent data.
3. The three big clouds will ramp up the fight for more footprint in on-premises data centers.
Hitachi Vantara (Hu Yoshida, CTO)
1. The demand for large scalable enterprise storage arrays will increase
Demand for core storage is increasing and changing in nature. Capacity will be driven by big data and data lakes as well as copies required for exploratory, training, and governance purposes. The trend toward containers will generate swarms of workloads which will require instant storage. IoT will drive real-time applications which will need low latencies. The faster speed of NVMe devices will expose bottlenecks in smaller storage controllers, and faster NVMe-oF fabrics will enable increased connectivity to servers. Core storage will also need to connect to edge and cloud and integrate as one pool of managed storage. Large scalable storage arrays with full enterprise capabilities will be the most economical way to address these requirements.
2. Edge to core to multi-cloud replaces discussions around public/private/hybrid cloud
By now the discussions around public/private/hybrid cloud have become moot as digital transformation has driven users to a combination of different cloud implementations for agility and scalability. Digital transformation has also driven smart applications to the edge for mobility and real-time processing. This has increased the complexity of storage management as users seek to balance their business requirements across this multi-vendor, multi-tier storage landscape in an exploding data environment. Business requirements have also increased in complexity due to looming cyber threats, increasing regulations around privacy and transparency, regional requirements, and democratization of data usage. The only way to manage this environment is to take an open, holistic, edge to core to multi-cloud approach to storage and data management.
3. Edge computing will see greater adoption with DataOps packages for the edge
The introduction of 5G and IoT will bring more focus to the edge, and this will be the new battleground for cloud service providers looking to extend their reach beyond the cloud. The arguments for edge computing are compelling: low latency for applications like autonomous vehicles and telemedicine, integration of OT (operational technology) and IT for improved business decisions, and a shorter control loop for better QoS. However, the edge is like the wild west. OT facilities like factories and manufacturing sites were run in a closed environment and lack the disciplines that have been honed by IT, where application and data had to exist in a shared and open environment. The protocols, networks and systems on the edge are very diverse and low cost, while IT systems are standardized and generally higher cost. An edge gateway must be used to integrate the two types of systems and data to realize the value of combining OT and IT. Tools like avatars (software representations of physical machines) will help to bridge the gap. While technology is important, edge success depends on the business use case. A software package like Lumada Edge Intelligence can reduce the time to value by providing actionable, real-time insights to help critical operations be more predictable and manageable. It provides the capabilities for local data operations, ML, streaming analytics and standalone IoT application solutions. When paired with the powerful services from experienced IT and OT professionals, customers can develop a comprehensive DataOps strategy that manages assets holistically from edge to core to multicloud.
HYCU (Simon Taylor, CEO)
1. Multi-clouds expand in use
It comes as no surprise that multi-cloud use is now becoming mainstream in enterprises globally. We are seeing as many as four or more cloud platforms in use with our customers, regardless of size and geographical location. And this goes for mission-critical and numerous enterprise workloads. Supporting this requires a multi-vendor and multi-disciplinary strategy for cloud usage in IT today. This echoes a prediction from last year, where we said more customers will need solutions that work across multiple clouds, be they on-premises or public. We will continue to see vendors marketing their solutions as multi-cloud, but customers are much savvier and truly understand the value of a cloud-native solution running as a service on any cloud. This will be the year where customers look for ways to help them manage their data the way they want it, in their cloud of choice, with a solution that makes it theirs to control.
2. Simplicity is more than just a buzzword
Customers are smarter than ever before. They understand the full value of eliminating complexity from their IT infrastructure. It's a major reason why companies like Nutanix are so successful: they have found a way to eliminate unnecessary redundancy and streamline IT infrastructure and operations into 1-click simplicity. That is the driving force behind innovation in the next-gen platforms in the industry today, which include AWS, Azure and Google Cloud Platform in the public cloud space and Nutanix and VMware in the on-premises enterprise cloud space. While customers continue to consolidate around the platforms that deliver the most value for the enterprise workloads they need managed, protected and recovered, they want to do this with simplicity in mind: no additional hardware, software or infrastructure required, and solutions that run natively as a service on the cloud platform of choice. There will be fewer standalone companies in 2020 if they don't address this fundamental customer need for simplicity.
3. Built for purpose makes a difference
When we introduced the idea of being purpose-built for a customer's cloud platform of choice, first for on-premises and now for public cloud, it was significant. No one data management and protection platform can do it all. We fundamentally believed, and still believe, that solutions that complement the underlying cloud platform of choice without compromising the reason the customer selected it are critical. As the majority of workloads move to the cloud over time, it makes a difference when customers select and deploy data management and protection solutions that are built for purpose. That means they do more than just look and feel like the platform they are helping to protect and support: they run natively, deploy as a service, are lightweight and keep application consistency intact. This in turn will help companies make the most of their IT investments and allow them to continue to focus on innovation and driving value for their own customers.
Infinidat (Stanley Zaffos, SVP product marketing)
1. 5G in 2020 accelerating the volume and velocity of data collected
2020 will be the year 5G enters the mainstream. The new wireless standard will start generating real value for companies deploying IoT projects. Gartner predicts 66% of organizations will deploy 5G, and 59% will include IoT communications in the use case for 5G. Companies deploying IoT projects will need to plan for the data deluge that is coming. They'll need to set up infrastructure and processes to filter the data, pre-analyze it, categorize it, store it and dispose of it. Companies that plan well and allow for some flexibility will execute successful projects at reasonable costs. Those that don't will spend only what they can and be limited in the value they unlock from the limited data they capture and store.
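A toy sketch of the filter / pre-analyze / categorize / store / dispose pipeline named above; thresholds, field names and categories are invented for illustration.

```python
# Toy sketch of the pipeline stages: filter, pre-analyze, categorize,
# store, dispose. All thresholds and names are invented.
from statistics import mean

def process_batch(readings, noise_floor=0.05, alert_level=0.9):
    kept = [r for r in readings if r["value"] > noise_floor]        # filter
    summary = {"count": len(kept),                                   # pre-analyze
               "avg": mean(r["value"] for r in kept) if kept else 0.0}
    for r in kept:                                                   # categorize
        r["category"] = "alert" if r["value"] >= alert_level else "normal"
    stored = [r for r in kept if r["category"] == "alert"]           # store
    return summary, stored  # everything else is disposed of at the edge

summary, stored = process_batch([{"value": 0.02}, {"value": 0.5}, {"value": 0.95}])
print(summary, stored)
```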
2. AI applications in 2020: ushering in the age of ‘smarter storage’
AI workloads will continue to generate business value in 2020. But, for organizations to increase their reliance on AI, storage vendors will need to make it easier for AI applications to access more data faster, in turn helping the systems learn faster and unlock the value of the data. As we enter 2020, data sets are getting bigger and demands for instantaneous decision making are becoming more prevalent. This puts stress on the training systems. Expect more demand for smarter storage systems to match the escalating intelligence of the applications themselves. We’ll see more investments in tools like software-defined switches to open up more pathways for hardcore analytics; QoS functions to dole out information more strategically; scale-out system architectures; and the ability to deliver data with lower latency.
3. Containers in 2020: creating a more competitive storage environment
In 2020, containers and multi-cloud implementations will continue to accelerate. More enterprises will push to create flexible computing environments where multiple clouds serve specific strategic purposes. They will embrace the flexibility containers promise, creating set-ups where containers can move freely between public cloud, private cloud and on-premises environments. Increased use of containers and Kubernetes will help create a more competitive storage environment. Being able to port workloads seamlessly among diverse environments will diminish the strength of vendor lock-in and put pressure on incumbent storage vendors to innovate in areas that improve financial and operational efficiency: lower acquisition and ownership costs; improved staff productivity via more autonomic operation and cloud integration; and new pricing and service offerings.
iXsystems (Mike Lauth, CEO)
1. SDS (storage software that can operate on third party server hardware) will be the basis for nearly all new storage products
The lines between block, file and object storage are increasingly blurred. Users are benefitting from the agility, expandability and cost structures of Software Defined Unified Storage.
2. Open Source storage will slash the storage costs of many organizations
Data is still growing, but SSD and HDD storage costs are not reducing as quickly. Users will increase their deployments of OpenZFS, HDFS, Gluster, Minio, etc. on industry standard hardware without the vendor lock-in and costs associated with proprietary stacks. Enterprise grade support of Open Storage software is the key enabler of this transition.
3. Hybrid Clouds will be the storage reality of every large organization
The performance of local storage and the long term data retention of geo-distributed clouds are necessary partners. Cloud services have replaced tape as the third copy of data. Data security, storage costs, and migration flexibility are critical.
Kaminario (Eyal David, CTO)
1. Companies will finally enable a true hybrid cloud strategy
Migrating mission-critical applications to the cloud is difficult. By leveraging a common set of shared services that enable companies to decouple the management and movement of data from the infrastructure it runs on, companies will be able to improve business agility and reduce the risk of moving to the cloud.
2. Data science will be democratized
With the explosion of data and the rise of AI and ML, IT organizations will have more ways to quickly analyze and make sense of infrastructure data. This will make it easier for stakeholders throughout the organization, who typically don’t have access to this information, to leverage it.
3. User experience will be vastly improved
5G will enable everything/everywhere edge computing that makes technologies such as AR/VR and smart, connected devices possible. Wi-Fi 6, meanwhile, will offer users more robust outdoor network operations and high performance in dense environments. The combination of the two means lagging/unavailable service will become a thing of the past.
Kaseya (Michael Sanders, GM of Unitrends and Spanning, both Kaseya companies)
1. Pricing and innovation
There will be a major plateau in storage and compute in the cloud as we are approaching a lot of the physical limits of our current technology faster than we thought. The upcoming storage crunch is speeding up innovation; Microsoft and other vendors are experimenting with innovative solutions like glass storage. If they don’t come up with a solution quickly, however, cloud storage prices might start to go up. In addition to the storage limits, there’s the CPU side. In 2020 expect more workloads to get pushed back to the edge. This means more devices (endpoints) will need to be protected in the coming years.
2. The rise of multi-cloud toolsets
Multi-cloud is already a reality in 2019, but it has taken a while for the management practices of operating multiple workloads across multiple services, regions, and cloud vendors to mature; meanwhile, pressure on businesses to do more with less continues to build, and companies still have to "DIY" it. As workloads start to dominate the decisions behind "which cloud" to make the best use of each vendor's offerings, expect increased demand for toolsets that can operate natively with different cloud vendors. Sadly, cloud management portals today are generally limited to discovery or cost optimization use cases and fail to tackle the day-to-day operational management pain experienced by today's CloudOps and DevOps teams. As an example of the rising complexity in the cloud, Amazon Web Services alone has 250 different services (May 2019), each with its own management console and set of APIs.
3. DRaaS is now mainstream
DR-as-a-Service (DRaaS) is now mainstream. Large organizations have adopted DRaaS at the highest rates; however, I expect adoption of DRaaS by small and mid-sized organizations to increase drastically in 2020 as they discover that not all DRaaS services require their IT departments to become experts in hyper-scale clouds. As a result, SMBs will outsource DRaaS to experts at a fixed price and with little requirement for their time or technical oversight.
Komprise (Krishna Subramanian, president and COO)
1. Edge buildout on the rise
Edge buildout is already happening, and its pace is accelerating with trends like IoT, self-driving cars, and biometrics. IDC has predicted that by 2023, 50% of all enterprise IT infrastructure will be deployed at the edge – up from 10% today. More apps are generating tons of data at the edge, raising the question of why data should not be better understood and managed directly at the edge. Imagine if you could analyze data, figure out what data is useful and needs to be brought back to the datacenter, and directly process the rest of the data at the edge itself without having to first move it all? This is why edge data management will rise in importance in 2020.
2. Multi-cloud gains traction
Most enterprises using the public cloud already have a hybrid/multi-cloud strategy, and enterprises are increasingly choosing to diversify their cloud strategy across two or more clouds. In fact, Forrester Research found this year that 62%+ of public cloud adopters already use 2 or more cloud platforms. As this trend continues, enterprises in 2020 will need a simple way to understand and manage their data sprawl across clouds and the hybrid enterprise, leading to greater demand for cross-cloud data management solutions that are not tied to any particular cloud or storage vendor.
3. AI-driven intelligence grows
For the last couple of years, AI and ML have been a big theme, and this trend continues to grow. While these started out as marketing buzzwords, the potential of AI in data management is clear – how can you manage something you don't understand? By using analytics to feed AI/ML, data management solutions can leverage a deeper understanding of data to drive better management. Watch for more exciting developments in this space that leverage deep analytics and data lakes across different storage repositories to help you better manage your data.
Lightbits Labs (Eran Kirzner, CEO)
1. Cloud providers, both large and small, will move to disaggregated storage, where storage is handled separately from compute, and away from traditional DAS architecture, where storage is located with the compute nodes. From the top-tier cloud providers to the enterprise, the trend will be moving from traditional AFAs or storage controllers to more of a cloud-like architecture. And more enterprises will adopt NVMe/TCP as opposed to NVMe over other fabrics.
2. Flash prices will continue to drop as they did in 2019, making flash increasingly affordable and pushing more and more companies to adopt it as their preferred storage media for its price/performance. Newer NVMe/TCP technologies will further enhance the appeal and benefits of flash by helping to reduce latency. Emerging flash solutions, including QLC, TLC, SLC and MLC flash, will also see increasing market traction. QLC was introduced in 2019 and therefore had only minimal market adoption. That will change in 2020, particularly among companies that have deployed global flash translation layers (GFTL) to overcome QLC's inherent issues.
3. As Kubernetes continues its successful assault on the data center, more companies will look for storage for containerized environments that move away from traditional Kubernetes deployments with DAS into more flexible and reliable solutions for persistent container storage that NVMe/TCP can provide.
MemVerge (Charles Fan, CEO)
1. Data center architecture redefined: storage class memory will make way for a memory-centric data center
With increasing demand from data center applications, paired with increased processing speed, there will be a huge push towards a memory-centric data center. Computing innovations are happening at a rapid pace, with more and more computation tech, from x86 to GPUs to ARM. This will continue to open up new topologies between CPUs and memory units. While architecture currently tends to be disaggregated between the computing layer and the storage layer, I believe we are headed towards a memory-centric data center very soon. A new MCI (memory-converged infrastructure) layer powered by Storage Class Memory (SCM) will offer applications a larger pool of memory along with persistence that makes the storage tier obsolete. SCM will be the most disruptive new hardware technology within the data center architecture next year. Over time we expect MCI to replace both the existing memory tier and the performance tier of storage, and to become a $10 billion+ market. This transition will take 5-10 years to complete, and 2020 will be the first year we start to see any impact.
2. SCM technology will evolve and the market will expand
This year marks year 0 of SCM, with Intel shipping Optane DC Persistent Memory in 2Q. Over the next 2-3 years, I believe there will be other major semiconductor vendors entering the market, while Intel continues to improve its Optane technology. We expect Intel Optane DC Persistent Memory's capacity to double and its cost to halve every 18 months. In addition to Intel, we will likely see early examples from additional players in this market in the next 2-3 years, effectively validating this new market for SCM. Intel, in particular, plans to release gen 2 of Optane by the end of 2020 – I expect we will see significant improvement in terms of density and speed.
3. Data reasoning will be one of the most critical skills next-gen IT staffers will need to possess
IT is having a coming-of-age moment. It is slowly improving in terms of deploying mature data science and ML within the enterprise. IT staffers will need to understand all facets of infrastructure and operationalize platforms well. As big data continues to grow, IT staffers will also need to be able to reason about data and pull actionable insights from it.
Minio (Anand Babu Periasamy, CEO)
1. Appliance vendors will hit the wall in 2020, resulting in plummeting valuations. Shifting to a software-only model is not window dressing around pricing; it requires a DNA change.
2. Modern databases (the revenue engine for SAN and NAS) go object storage native. This is a real threat to the traditional enterprise storage vendors.
3. The standards associated with the cloud are increasingly defined by open source, and it will become the primary strategy for the largest players in the space: VMware, Microsoft, Amazon and Google.