Reseller ME catches up with Mike Resseler, director, Product Management, Veeam Software, at the third edition of the firm's customer and partner conference, VeeamON 2017, to understand how and why customers must reconsider their data protection strategies.
What have been Veeam Software's priorities over the last few years?
Our priority is to understand customers' problems and what we believe will be business challenges in the future. Many new technologies and services are emerging, and we see that organisations' data is no longer all located on-premises. In the past, the challenge was that most information was stored on local laptops instead of in data centres. Today data resides in the data centre and we believe that will continue for a long time. But what we need to realise is that data will also sit with local service providers and other partners. The challenge of the future will be to keep that data available, know the location of different types of data and make it portable enough to move to different sites.
From an R&D perspective, we believe we can be an enabler to keep that data available and to move it to whichever platform is necessary. That is the long-term goal. Why is it not feasible today? Because the technology of the different required components is simply not there yet. As these technologies mature, we believe this will become feasible, and we want to be the enabler that allows businesses to do it. Our goal is to provide data management solutions for all firms.
Do you think this will happen by the time the company touches $1.5 billion in 2020?
Most probably not, because we don't believe that the technology will be ready by then. However, by 2020 we will certainly have made great strides towards enabling that story.
Are enterprises thinking of data in this manner yet?
Yes, they are. A part of my role is to talk to customers, from small firms to huge enterprises, and learn about their five-year strategies. Most enterprises have these types of strategies in place but smaller customers usually don't. It is interesting to see that most of the challenges they have, and expect to have, are around data and how they can ensure their data is available so they can continue operating in case of a security breach or other adversity. Data is that important for organisations across verticals.
How should enterprises change their disaster recovery strategies?
Not just enterprises but firms of all sizes should think about a 'layered concept' in their disaster recovery strategy. To protect the most valuable data, customers must invest in a layer that provides clustering and synchronisation. Then they should build up further layers, such as backup and replication, which will be components of the complete strategy. Our solutions will be a few layers within that complete plan. We will never pretend that we are everything. Even SMBs should think about this layered approach because they are going to have the same challenge – they need their data to be available 24/7.
Will this layered approach work for ransomware attacks?
When it comes to new threats such as ransomware, one part of that strategy should be what we call an 'air gap'. Air gap essentially means you take your data, store it on a platform, and keep that backup disconnected from the rest of the environment. Why is that? As ransomware attacks become much more sophisticated, what we have noticed is that eventually they end up reaching the backup server. Customers might think that since they have backups they have nothing to worry about; however, today we have ransomware that is intelligent enough to get to those backups and destroy them. We live by the 3-2-1 rule – three copies of data, stored on two different media such as tape, disk or cloud, with one copy offsite. We prefer the offsite copy to be air-gapped, so that no wire connects it to the network. Even if a customer's entire firm is attacked by ransomware, they can be assured that their data is safe offsite.