The rise of cloud computing has driven a surge in the volumes of data organisations have to support. Tackling the challenge effectively requires more than simply adding bandwidth; it requires an overhaul of the network infrastructure, says Mike Hemes, Vice President EMEA, Silver Peak.
Organisations today require a stable infrastructure to support the growing data volumes that accompany the rise of cloud computing. According to recent research by IDC, data volumes are expected to grow 10-fold by 2020, with data in the cloud set to double.
Organisations are often unaware of the debilitating effect that large volumes of data place on the underlying network, particularly when data is replicated or shared across a wide area network (WAN). This can jeopardise business-critical applications and waste large sums of money.
While many organisations regard the analysis and storage of this data as their top IT challenges, moving the data is equally challenging. Firstly, network stability and geographical distance play a large part in the success of IT initiatives and data migrations. The farther away the data centre, the higher the latency and the longer data transfers take.
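To see why distance matters, consider a back-of-envelope sketch. The figures here are illustrative assumptions: light travelling at roughly 200,000 km/s in fibre, and a single TCP flow limited by a 64 KiB window.

```python
# Back-of-envelope: how distance to the data centre inflates transfer time.
# Assumes signal propagation of ~200,000 km/s in fibre (about 2/3 of c).

FIBRE_KM_PER_S = 200_000

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds (ignoring queuing/processing)."""
    return 2 * distance_km / FIBRE_KM_PER_S * 1000

def tcp_throughput_mbps(window_bytes: int, rtt_milliseconds: float) -> float:
    """A TCP flow can send at most one window of data per round trip."""
    return window_bytes * 8 / (rtt_milliseconds / 1000) / 1e6

for km in (100, 2000, 10000):
    rtt = rtt_ms(km)
    print(f"{km:>6} km: RTT ~ {rtt:5.1f} ms, "
          f"64 KiB window ~ {tcp_throughput_mbps(65536, rtt):7.1f} Mbit/s")
```

At 100 km the window barely matters, but at intercontinental distances the same flow is throttled to a few Mbit/s purely by propagation delay.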
Secondly, insufficient bandwidth slows data transfer. Bandwidth is often limited and costly, and MPLS and Internet VPN connections to the cloud can drop packets in transit or deliver them out of order. The average large enterprise upgrades its bandwidth every two years to accommodate data growth and extend LAN-like performance over the WAN. However, this is both time-consuming and costly, and does not always address the application delivery problems caused by latency, packet loss and other common issues. Rather than merely adding bandwidth, organisations need to improve the underlying network infrastructure that hampers key business applications.
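The point that extra bandwidth cannot overcome latency and loss can be illustrated with the well-known Mathis approximation for standard TCP throughput. This is a simplified model, and the loss and latency figures below are hypothetical examples.

```python
import math

def mathis_throughput_mbps(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Mathis approximation: a standard TCP flow's throughput is capped at
    roughly MSS / (RTT * sqrt(p)) bytes/s, independent of link capacity."""
    return mss_bytes / (rtt_s * math.sqrt(loss_rate)) * 8 / 1e6

# A long-haul link with 80 ms RTT and 0.1% packet loss caps a single TCP
# flow (1460-byte MSS) at only a few Mbit/s, however much bandwidth is
# provisioned on the circuit.
print(mathis_throughput_mbps(1460, 0.08, 0.001))
```

Doubling the circuit from 100 Mbit/s to 200 Mbit/s changes nothing in this formula, which is why bandwidth upgrades alone so often disappoint.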
Thirdly, moving large volumes of data compromises speed. With data pouring into organisations and analysis requirements skyrocketing, incoming data must be analysed as quickly as possible. If transfer and analysis take too long, the results may be stale and outdated by the time they arrive.
Conquer the cloud
A business located in the same city as its data centre seldom has a problem accessing data hosted in the cloud. In practice, however, enterprise users are distributed around the world, and a data centre's exact location is often difficult to pin down.
As a result, optimising the WAN has become essential for accessing, analysing and migrating large volumes of data to and from the cloud. WAN optimisation techniques include byte-level deduplication to eliminate redundant data, packet-order correction to improve network quality, and accelerated IPsec encryption, together delivering a dramatic improvement in cloud performance.
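As a rough sketch of the deduplication idea, consider splitting a stream into fixed-size 4 KiB chunks and sending a short hash reference whenever a chunk has been seen before. This is a simplification: commercial appliances match at byte granularity and across flows.

```python
import hashlib

CHUNK = 4096  # fixed-size chunks for simplicity; real appliances match at byte level

def dedup_stream(data: bytes, seen: dict) -> list:
    """Sender side: replace chunks already sent with short hash references."""
    tokens = []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        digest = hashlib.sha256(chunk).digest()
        if digest in seen:
            tokens.append(("ref", digest))   # 32-byte reference instead of 4 KiB
        else:
            seen[digest] = chunk
            tokens.append(("raw", chunk))    # first sighting: send the bytes
    return tokens

def rehydrate(tokens) -> bytes:
    """Receiver side: rebuild the stream, caching raw chunks to resolve refs."""
    store, parts = {}, []
    for kind, payload in tokens:
        if kind == "raw":
            store[hashlib.sha256(payload).digest()] = payload
            parts.append(payload)
        else:
            parts.append(store[payload])
    return b"".join(parts)
```

With a repetitive stream, most chunks travel as 32-byte references rather than full payloads, which is where the WAN savings come from.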
With cloud computing gaining momentum in parallel with data growth, network managers must ensure optimum conditions for data to be on-boarded, accessed and secured as efficiently as possible. A real-time solution with the scalability to handle such large volumes of data is ultimately crucial to the success of any cloud implementation. By reducing the amount of data sent across the WAN, prioritising key traffic, and eliminating packet loss and data retransmissions, organisations improve cloud performance, keep end-users happy and reduce ongoing telco costs.
Organisations are becoming increasingly overwhelmed with data, and simply introducing more storage or adding bandwidth isn't sufficient. By taking a network-centric approach, organisations can achieve the scalability and flexibility needed to cope with growing data volumes in the age of cloud computing, ensuring maximum benefit to the business.