Bench Talk for Design Engineers | The Official Blog of Mouser Electronics


Next-generation Firewalls: More than ACL Filters

Jeff Fellinge

Network-based firewalls have provided an essential first line of defense for decades. These devices sit between trusted and untrusted networks and manage often complex rulesets that instruct them whether to allow or block traffic based on its source, destination, and type. To create effective rulesets, firewall administrators must have an accurate inventory of their environment’s devices and applications, as well as the protocols and ports on which they rely. Mistakes in these rulesets might cause a loss of network connectivity; hence the adage, when troubleshooting network connectivity between two devices, to “first check the firewall.” Errors could also leave sensitive systems exposed to the internet or block access to critical corporate functions. Beyond growing ruleset complexity, network-traffic profiles have changed over the past ten years. Most new applications communicate over Hypertext Transfer Protocol (HTTP) or Hypertext Transfer Protocol Secure (HTTPS), overloading port 80 (the port assigned to HTTP) and port 443 (the port used by websites secured with Secure Sockets Layer (SSL) or its successor, Transport Layer Security (TLS)). Malware attempts to hide on these ports, too. Even if you stay on top of ruleset management on a traditional firewall, it might not have the ability to inspect and block some of these modern threats. Fortunately, modern commercial and open-source firewalls include better management and intelligence that relieve you of some of the mundane rule-management tasks while providing much-needed policing of these high-traffic ports.

Firewalls began as simple filters that blocked or allowed network traffic based on its source, destination, and protocol. These firewalls were often physical devices with two or more network interfaces that logically separated networks into zones, for example, the internet and an internal corporate network. Over time, firewalls added support services like network address translation (NAT), dynamic host configuration protocol (DHCP), and domain name system (DNS) services in an attempt to position them as “all-in-one” devices. Even the simplest rules on these firewalls required administrators to understand basic infrastructure services and networking concepts such as network addressing, ports, and protocols. Firewall and ruleset complexity increased further when vendors began to add virtual private networks (VPNs), port forwarding, and server publishing. All these services and concepts remain important features in modern firewalls, but many now include more advanced capabilities like the inspection of encrypted communications as well as application intelligence to support security policy decisions at the application layer.
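A traditional first-match ruleset of this kind can be sketched as follows. The rule fields and helper names are illustrative, not any vendor's configuration language; the sketch only shows the classic match on source, destination, protocol, and port with an implicit default-deny.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

# Illustrative layer-3/4 filter: each rule matches on source network,
# destination network, protocol, and destination port; first match wins.
@dataclass
class Rule:
    action: str   # "allow" or "block"
    src: str      # source network, CIDR notation
    dst: str      # destination network, CIDR notation
    proto: str    # "tcp" or "udp"
    dport: int    # destination port

def evaluate(rules, src_ip, dst_ip, proto, dport):
    """Return the action of the first matching rule; default-deny otherwise."""
    for r in rules:
        if (ip_address(src_ip) in ip_network(r.src)
                and ip_address(dst_ip) in ip_network(r.dst)
                and proto == r.proto
                and dport == r.dport):
            return r.action
    return "block"  # implicit deny, the customary firewall default

rules = [
    Rule("allow", "10.0.0.0/8", "0.0.0.0/0", "tcp", 443),  # outbound HTTPS
    Rule("block", "10.0.0.0/8", "0.0.0.0/0", "tcp", 23),   # no Telnet
]
```

Note that nothing in this model looks past the port number, which is exactly the limitation the rest of the article addresses.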

Managing network traffic by protocol alone is no longer adequate for several reasons.

  • First, most cloud-based, software-as-a-service applications and Internet of Things (IoT) appliances communicate over HTTP and, if they are security savvy, over encrypted HTTPS. In the past, a firewall administrator might construct rules based solely on inbound or outbound HTTP and HTTPS in an attempt to manage internet web surfing. Similarly, they might create firewall rules addressing file transfer over FTP (ports 20/21) and remote access over Secure Shell (SSH) (port 22) or terminal services (port 3389). Most applications used a specific protocol on a corresponding well-known port, which enabled administrators to govern those applications by regulating the use of that port. A shift has occurred, and today many applications, whether for web surfing, file transfer, or remote access, use HTTP or HTTPS; a firewall administrator might see only traffic on ports 80/443 regardless of the activity and nature of the application.
     
  • Second, applications now frequently encrypt their payloads, making packet inspection nearly impossible without more advanced techniques.
     
  • Third, critical supporting protocols like DNS and Network Time Protocol (NTP) are routinely whitelisted and allowed to pass between network security zones unrestricted. Attackers know this, too, and have begun to hijack these ports, whether to hide their communications behind encryption or to send fake DNS “requests” that smuggle out stolen corporate secrets.
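The DNS abuse in the third point above is often spotted with simple heuristics: data smuggled out in query names tends to produce labels that are unusually long or random-looking. The sketch below is one such heuristic; the thresholds and function names are illustrative assumptions, not a production detector.

```python
import math
from collections import Counter

def shannon_entropy(s):
    """Bits of entropy per character in the string s."""
    counts = Counter(s)
    return -sum((c / len(s)) * math.log2(c / len(s)) for c in counts.values())

def suspicious_dns_name(qname, max_label=40, entropy_cutoff=4.0):
    """Flag query names whose leftmost label looks like encoded data.

    Thresholds are illustrative: very long labels, or moderately long labels
    with high character entropy, are typical of DNS tunneling/exfiltration.
    """
    label = qname.split(".")[0]
    if len(label) > max_label:
        return True
    return len(label) >= 16 and shannon_entropy(label) > entropy_cutoff

suspicious_dns_name("www.example.com")                    # ordinary lookup
suspicious_dns_name("q7f3k9x2m1p8w4z6a5b0.evil.example")  # random-looking label
```

Real next-generation firewalls combine signals like these with protocol validation and reputation feeds rather than relying on any single threshold.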

Firewall vendors have continually evolved their products to help address these concerns through advances in existing as well as new technologies.

Encryption

Modern firewall technologies take advantage of faster processors and distributed architectures to dynamically decrypt, inspect, and re-encrypt encrypted traffic passing through them. This allows HTTPS and SSH traffic to be decrypted on the firewall, the payload inspected against defined criteria, and a decision made whether to block the traffic or re-encrypt and deliver it. To alleviate privacy concerns and adhere to geopolitical requirements, look for firewall features that let you define specific policy elements so you can determine what types of communication can be decrypted and what must stay private.
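Such a decryption policy usually hinges on categorizing the destination before deciding whether to break open the session. A minimal sketch of that decision, assuming a hypothetical category table (real firewalls pull this from a URL-filtering feed keyed on the TLS Server Name Indication):

```python
# Assumed/illustrative category data; not a real vendor feed.
CATEGORY = {
    "bank.example.com": "finance",
    "clinic.example.org": "health",
    "social.example.net": "social-media",
}

# Categories exempted from decryption for privacy or regulatory reasons.
PRIVATE_CATEGORIES = {"health", "finance", "government"}

def decrypt_decision(sni_hostname):
    """Return 'bypass' (leave encrypted) or 'decrypt-inspect' for a TLS flow."""
    category = CATEGORY.get(sni_hostname, "uncategorized")
    return "bypass" if category in PRIVATE_CATEGORIES else "decrypt-inspect"
```

The point of the policy-element approach is visible here: privacy carve-outs are declared once, as categories, rather than maintained as lists of individual hostnames.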

Application-Aware Rules

Firewalls live and breathe by their ability to differentiate and regulate traffic by distinguishing characteristics contained within the network communication stream. Application-aware firewalls, including web-application firewalls, dive deeper into specific protocols, like HTTP, to determine whether the application using that protocol is legitimate and approved. These technologies can detect and block rogue applications that hijack other protocols for their own communication. Take, for example, a piece of malware that attempts to contact a command-and-control server over port 53 (DNS). A traditional firewall rule configured at layer 4, the transport layer, to allow port 53 traffic might let this malware through. However, an application-aware firewall inspecting the same packet at layer 7, the application layer, might determine that the traffic it detects on port 53 does not represent true DNS communication, flag it as illegitimate, and block it. Firewalls with application-detection capabilities can also distinguish between applications, deciding, for example, whether to block social media or peer-to-peer file-transfer applications even when they all use HTTP. Blacklisting and whitelisting websites based on IP and DNS information is not new, but now firewalls can recognize and categorize applications based on the payload itself instead of relying on just a database of destinations.
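To make the layer-7 idea concrete: a genuine DNS query begins with a fixed 12-byte header (per the DNS message format), so a simple sanity check can reject port-53 payloads that do not parse as one. This is a hedged sketch of the concept, far cruder than real application identification engines:

```python
import struct

def looks_like_dns_query(payload: bytes) -> bool:
    """Return True if the payload starts with a plausible DNS query header.

    The 12-byte DNS header holds six big-endian 16-bit fields: ID, flags,
    and four section counts. We require the QR bit to indicate a query,
    a standard opcode, and a sane question count.
    """
    if len(payload) < 12:
        return False
    _id, flags, qdcount, _an, _ns, _ar = struct.unpack("!6H", payload[:12])
    qr = flags >> 15             # 0 = query, 1 = response
    opcode = (flags >> 11) & 0xF # 0=QUERY, 1=IQUERY, 2=STATUS
    return qr == 0 and opcode in (0, 1, 2) and 1 <= qdcount <= 16

# A minimal well-formed query for www.example.com (A record)...
real_query = (struct.pack("!6H", 0x1234, 0x0100, 1, 0, 0, 0)
              + b"\x03www\x07example\x03com\x00\x00\x01\x00\x01")
# ...versus malware tunneling some other protocol over port 53:
tunneled = b"GET /beacon HTTP/1.1\r\nHost: c2.example\r\n\r\n"
```

An application-aware firewall applies far richer validation than this (full message parsing, behavioral signatures), but the principle is the same: judge the payload, not the port number.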

Object-based Rules

Defining network policies based on objects can help simplify the rulesets that you must directly manage. For example, it may be easier to define a new object called “web servers,” which itself contains all the individual web-server objects. You can then create a single rule that permits access from the “internet” zone to the “web servers” object over the “web protocols” object. For a human, this means perhaps only one firewall rule to manage. The object memberships can be quickly verified as can the list of appropriate “web protocols” included in that object. As web servers come and go or as their IP addresses change, you can update the inventory of the individual web server objects and rest assured the rules will automatically update to reflect the latest changes. Auditors like this too because they can see which objects are members of which groups and what rights that group has. This object-based approach might also make it easier to find gaps or mistakes compared to scrolling through lines of individual firewall rules defined by IP addresses and protocol port numbers.
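The mechanics described above can be sketched in a few lines. The object names ("web servers", "web protocols") follow the article's example; the dictionary layout and `resolve` helper are illustrative assumptions about how a group object flattens to its members at evaluation time.

```python
# Illustrative object store: hosts map to addresses, groups map to members.
OBJECTS = {
    "web-1": {"10.0.1.10"},
    "web-2": {"10.0.1.11"},
    "web servers": {"web-1", "web-2"},              # group of host objects
    "web protocols": {("tcp", 80), ("tcp", 443)},   # group of services
}

def resolve(name):
    """Flatten an object (possibly a group of groups) into its leaf members."""
    members = set()
    for m in OBJECTS[name]:
        if isinstance(m, str) and m in OBJECTS:
            members |= resolve(m)   # nested object: recurse
        else:
            members.add(m)          # leaf value: address or (proto, port)
    return members

# The single human-managed rule: internet -> "web servers" over "web protocols".
def allowed(dst_ip, proto, dport):
    return (dst_ip in resolve("web servers")
            and (proto, dport) in resolve("web protocols"))
```

Adding a new server means adding one host object and one group membership; the rule itself never changes, which is the maintenance and audit benefit the section describes.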

Identity

Traditional firewalls relied on IP addresses to build their rules. Though the IP address is still a popular attribute to identify the communication of a specific device, it might not be enough to correlate activity to a person. Wireless access, mobile users, virtualized containers, IPv4 vs. IPv6 addressing, and cloud deployments have all made attribution of an action to an IP address more difficult. More capable firewalls can recognize and integrate with federated identities and directory services to authenticate users before allowing access.
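A rough sketch of what identity integration buys: instead of writing rules against addresses, the firewall consults a session table (learned, for example, from VPN or captive-portal logins) and a directory service, and the rule names a group. The table contents and names below are hypothetical.

```python
# Assumed session table: which authenticated user currently holds which IP.
IP_TO_USER = {"10.2.0.7": "alice", "10.2.0.9": "bob"}
# Assumed directory-service group memberships.
USER_GROUPS = {"alice": {"engineering"}, "bob": {"contractors"}}

def allow_admin_portal(src_ip):
    """Permit only authenticated members of 'engineering' to the admin portal."""
    user = IP_TO_USER.get(src_ip)
    return user is not None and "engineering" in USER_GROUPS.get(user, set())
```

When alice roams to a new address, only the session table changes; the rule, written against the group, stays put.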

Metrics and Measurements

Business intelligence software built directly into the firewall helps firewall management and significantly extends reporting capabilities beyond the infamous “top talkers” reports. These tools allow the firewall administrator to dynamically pivot and slice network communication data into actionable reports and in many cases, these reports can be drilled down to reveal otherwise hidden insights.
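The "pivot and slice" idea reduces to re-aggregating the same flow records by different keys. A toy sketch, with illustrative field names and data:

```python
from collections import defaultdict

# Hypothetical flow records as a firewall's reporting engine might store them.
flows = [
    {"user": "alice", "app": "ssh",   "bytes": 1200},
    {"user": "alice", "app": "https", "bytes": 50000},
    {"user": "bob",   "app": "https", "bytes": 90000},
    {"user": "bob",   "app": "dns",   "bytes": 300},
]

def pivot(records, key):
    """Total bytes grouped by the given field, largest first."""
    totals = defaultdict(int)
    for r in records:
        totals[r[key]] += r["bytes"]
    return dict(sorted(totals.items(), key=lambda kv: -kv[1]))

by_app = pivot(flows, "app")                                   # slice by application
https_by_user = pivot([r for r in flows if r["app"] == "https"], "user")  # drill down
```

The drill-down is just a filtered re-pivot, which is why interactive reporting tools can answer "who is behind all that HTTPS?" without a second data collection pass.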

Cloud and Distributed Deployments

Virtualized firewall solutions are required as more organizations integrate infrastructure as a service (IaaS) into their traditional on-premises models. Major cloud vendors provide firewall and network security features; however, these may be lacking when compared to some standalone firewalls. Most firewall vendors now offer cloud-based firewalls that link with their on-premises network firewalls and host-based firewalls to create a single cross-platform firewall service managed by one security policy.

Firewalls will continue to play a central role in isolating networks. It is an exciting time to see the evolution of these core technologies as networks and cloud applications mature. You may pay a premium for cutting-edge firewall services through hefty subscription fees; however, these leading technologies will continue to spark innovation in open source and more broadly accessible firewall technologies.

Key Points:

  • Firewall technologies continue to evolve to keep up with new attacker techniques.
  • Application intelligence and user authentication services further help firewalls distinguish actual user traffic.
  • Next-generation firewalls abstract the firewall administrator from managing complex rules through flexible, powerful, and often intuitive object models and user interfaces.


Jeff Fellinge has over 25 years’ experience in a variety of disciplines ranging from Mechanical Engineering to Information Security. Jeff led information security programs for a large cloud provider to reduce risk and improve security control effectiveness at some of the world’s largest datacenters. He enjoys researching and evaluating technologies that improve business and infrastructure security and also owns and operates a small metal fabrication workshop. 
