Firewalls in the cloud era

Wieland Alge, Vice-President and General Manager EMEA, Barracuda Networks, explains how firewalls improve the cloud while the cloud improves them.

The cloud is a much-discussed IT subject. Most recently, the discussion has moved to its impact on firewalls, and I recently heard someone say that on-premises firewalls will disappear and move into the cloud. Unlikely as this seemed, it made me wonder whether there was any truth in the statement.

Firewalls will always be needed, as they are the only devices that analyse and control the communication of data and applications. Firewall technology ensures networks run the way we want them to run. As a result, I came to the conclusion that the question is not whether on-premises firewalls will disappear, but how firewalls will be influenced by cloud technologies.

In order to address this, we have to look at some of the history.

Enter unified threat management

Ten years ago, the first perimeter architectures consisted of a fast packet processor (the firewall) and a battery of content-scanning servers. Each server was dedicated to a specific duty, such as spyware detection or virus scanning. Each was from a different vendor and each was managed separately – it was genuinely best-of-breed and, from a pure performance perspective, it was ideal. However, this design produced a complicated multi-component perimeter infrastructure, and managing it was a real challenge.

We then saw an evolution from a best-of-breed architecture to a Unified Threat Management (UTM) architecture. The UTM trend was largely driven by well-meaning industry analysts in a quest to solve these implementation issues. But it failed. There was simply no efficient and reliable way for one device to do everything and defend against every threat. For example, the anti-virus engines in UTM devices have typically been limited in comparison with stand-alone anti-virus solutions. So firewall implementation always ended up as a security compromise – a balance between network performance and network security, with the ratio mostly dictated by the organisation's sector of operation.

Sadly, both strategies had their limitations and alternatives were sought, which fortunately coincided with the growth of cloud computing. As acceptance of cloud computing grew, organisations looked at utilising the cloud for a variety of uses, including security.

The future is cloudy

Getting back to firewalls and the cloud, firewalls interact with “cloudy IT” in two ways:
1 – As a technology that benefits from and utilises the cloud
2 – As a solution that is not only a foe to attackers but also a friend to business-critical applications

The benefits of the cloud: Performance, efficiency and reduced costs
The performance issue: Freeing up on-site bandwidth from the asynchronous workload

As recently as 10 years ago, networks had a plethora of perimeter scanning servers, but consolidating them all in one box, in the hope of making management easier, killed performance. Firewalls struggled to keep up with the tasks they had to perform, such as analysing, prioritising and blocking network traffic. The underlying issue was the sheer amount of data firewalls have to process. Data volumes are growing faster than hardware can handle them, and the sheer availability of bandwidth compounds the problem: there is no such thing as unused network bandwidth. If you physically provide more bandwidth, it will be used, which simply adds to the load. All this real-time data flow puts a huge strain on a firewall's analysing capabilities.

Ideally, the most secure behaviour for a firewall would be to stop the traffic, analyse it and then send it on its way. However, this causes delays and is not practical – it is not the way end-users want to work, even though from a security perspective it is exactly what IT wants from a firewall.

So the main challenge for firewalls is to handle the massively increased data throughput without compromising security. The cloud helps to solve this performance problem by pulling the asynchronous workload out of the perimeter and redirecting it to cloud-based content filters. This allows the firewall infrastructure to scale in enterprise environments, as the computing power available in the cloud is nearly unlimited.
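The split described above – cheap, synchronous packet decisions stay on the box while CPU-intensive content scanning is handed off – can be sketched as follows. This is a minimal illustration, not any vendor's implementation: the rule set, packet fields and `cloud_content_scan` stub are all hypothetical, and in practice the scan would be a call to a remote scanning service rather than a local function.

```python
import concurrent.futures

# Hypothetical local fast path: cheap header-level checks stay on the appliance.
def fast_packet_check(packet: dict) -> bool:
    blocked_ports = {23, 445}               # e.g. telnet, SMB (illustrative)
    return packet["dst_port"] not in blocked_ports

# Stand-in for a cloud content filter; in a real deployment this would be a
# request to the vendor's scanning service, not a local function.
def cloud_content_scan(payload: bytes) -> str:
    return "malicious" if b"EICAR" in payload else "clean"

def process(packets):
    """Decide locally where possible; offload deep scanning asynchronously."""
    verdicts = {}
    with concurrent.futures.ThreadPoolExecutor(max_workers=4) as pool:
        pending = {}
        for i, pkt in enumerate(packets):
            if not fast_packet_check(pkt):
                verdicts[i] = "blocked"      # decided on-site, no cloud round-trip
            else:
                # CPU-intensive scan happens off-box, in parallel
                pending[pool.submit(cloud_content_scan, pkt["payload"])] = i
        for fut, i in pending.items():
            verdicts[i] = fut.result()       # asynchronous results resolve later
    return verdicts

packets = [
    {"dst_port": 443, "payload": b"GET / HTTP/1.1"},
    {"dst_port": 445, "payload": b"..."},
    {"dst_port": 80,  "payload": b"EICAR test"},
]
result = process(packets)
print(dict(sorted(result.items())))  # → {0: 'clean', 1: 'blocked', 2: 'malicious'}
```

The design point is that the thread pool models the asynchronous nature of the offloaded work: the perimeter device never burns its own CPU on content inspection.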

UTM device efficiency, but with far more performance

From an administration perspective, nothing changes compared with the UTM approach: administrators still have one management console from which they can manage the on-site firewall capabilities, such as fast packet processing, as well as the content-filtering capabilities taking place in the cloud.

Reduced costs

So cloud-based, scalable computing power can handle the asynchronous, CPU-intensive content-filtering part of a firewall's function, leaving a cleaner and more predictable environment for fast packet processing. From a cost perspective, this brings another benefit: cloud-based scanning is far cheaper and more efficient than current firewall architectures.

What the cloud offers users is the benefit of the "separation of duty" architecture without the associated cost. Eventually, firewalls will become far better devices and will finally resolve the 15-year-old dilemma that firewalls in a perimeter architecture have struggled with.

The firewall of the cloud era: Friend to the application and foe to your enemies

In a private cloud or a closed and simple IT architecture, the basic questions asked of the firewall are whether it blocks these attacks, whether it restricts access to that type of system, and whether it limits access to the outside world.
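Those three classic questions – block attacks, restrict access to a system, limit access to the outside world – amount to ordered rule evaluation. A minimal sketch, with entirely illustrative addresses and rule names (not from any real product), might look like this:

```python
import ipaddress

# Illustrative first-match-wins rule table: block a restricted subnet,
# allow internal hosts outbound, deny everything else.
RULES = [
    {"action": "block", "src": "any",        "dst": "10.0.5.0/24"},  # restricted system
    {"action": "allow", "src": "10.0.0.0/8", "dst": "any"},          # internal hosts out
    {"action": "block", "src": "any",        "dst": "any"},          # default deny
]

def matches(addr: str, pattern: str) -> bool:
    """True if the address falls inside the rule's pattern ('any' matches all)."""
    return pattern == "any" or ipaddress.ip_address(addr) in ipaddress.ip_network(pattern)

def decide(src: str, dst: str) -> str:
    # First matching rule wins, as in most firewall engines.
    for rule in RULES:
        if matches(src, rule["src"]) and matches(dst, rule["dst"]):
            return rule["action"]
    return "block"
```

For example, `decide("10.0.0.7", "10.0.5.9")` hits the restricted-subnet rule and returns `"block"`, while `decide("10.0.0.7", "8.8.8.8")` falls through to the allow rule. The point is how narrow this model is: it answers only "block or not", which is exactly the limitation the next paragraphs describe.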

In cloudy IT, the cloud is a strange entity: it is both internal and external. Parts of the data and applications now sit outside the organisation, so the questions change. Questions that were originally asked of application delivery controllers in the internal data centre are now asked of firewalls.

These questions are numerous. Can you accelerate access to that particular application? Can you prioritise the traffic from this user group to this data? Can you provide access to that particular data? Suddenly, the firewall becomes involved in many different ways – and many firewalls are not prepared for that.

Most firewall vendors try to answer the question of what they can block, but the modern firewall is no longer just a device that blocks malicious people and traffic and separates them from the controlled part of the network. From an application-architecture point of view, the firewall sits in the middle of everything.

The crucial question now is whether a firewall can contribute positively to data and application access. A firewall is traditionally thought of as a means of causing problems for the bad guys.

The downside of this approach is that it also causes problems for the good guys. Everyone knows the security administrator's excuse – "Apologies, but we are down for security reasons." That excuse is no longer acceptable.

Many people thought that application-detection capabilities were primarily used to block bad applications. In reality, they are used to identify applications in order to prioritise them for end-user access – for example, prioritising SAP traffic or applying WAN optimisation techniques to parts of a file-sharing network. This is the number-one reason why people use deep application detection. Even as workloads move to the cloud, there will be a firewall somewhere, which is why the role and capabilities of present and future firewalls are very different from what they were. The priorities have changed.
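The shift from "detect to block" to "detect to prioritise" can be made concrete with a small sketch. The payload signatures and QoS classes below are toy assumptions for illustration – real deep packet inspection uses far richer fingerprints – but the structure is the point: the detected application maps to a priority, not to a verdict.

```python
# Hypothetical payload signatures mapped to application names (toy examples;
# real application detection uses far more sophisticated fingerprints).
APP_SIGNATURES = {
    b"SAPGUI":     "sap",
    b"BitTorrent": "p2p",
    b"HTTP/1.1":   "web",
}

# Illustrative QoS classes: 1 = highest priority. Business-critical SAP is
# accelerated; bulk file-sharing is deprioritised, not necessarily blocked.
QOS_PRIORITY = {"sap": 1, "web": 5, "unknown": 7, "p2p": 9}

def classify(payload: bytes) -> str:
    """Identify the application carried in a payload, or 'unknown'."""
    for signature, app in APP_SIGNATURES.items():
        if signature in payload:
            return app
    return "unknown"

def priority(payload: bytes) -> int:
    """Map detected application to a queueing priority instead of a block verdict."""
    return QOS_PRIORITY[classify(payload)]
```

Here `priority(b"...SAPGUI session...")` yields 1 (accelerated), while a BitTorrent handshake yields 9 (deprioritised) – the same detection machinery, used to help the good traffic rather than only to stop the bad.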

Five points to consider

The end result is a "firewalls plus cloud" architecture: unified management, but separated engines. You don't want everything in one box, but you want to manage it as if it were one box. So to conclude:

1. Cloud-amended firewalls provide the ultimate solution to the performance and management dilemma that has plagued firewalls for the past 15 years.
2. Firewalls should support or accelerate access to applications and data.
3. Deep application analysis should be used to prioritise good applications and data access – not just to block the bad.
4. To accelerate cloud-based applications, you need scalable management to be effective and to maximise the benefit of modern firewall capabilities.
5. If things are too complicated, no one will use them.
