Security Delusions Part 1: A History of Cloud Compunction

Organizations are unearthing the potential of digital transformation, but security often remains a gatekeeper to this path of promised potential, largely due to its own delusions about what modern infrastructure means. As Herman Melville wrote in Moby Dick, “Ignorance is the parent of fear” – and security is too frequently hindered by its fear of the new and the agile precisely because of its ignorance about blossoming technologies.

In this blog series, drawn from my QCon talk last year, I will explore the history of infosec’s errant gatekeeping in the face of new technologies, and how we can encourage security to embrace new technologies to enable the business, rather than get in its own way.

Let us take a trip down memory lane, back to the early 2010s, when “cloud transformation” reached sufficient significance to warrant concern among security professionals. These concerns ranged from fears as simplistic as “storing data online” to fears of shared resources, data loss, insider threats, denial of service attacks, inadequate systems security by cloud service providers (CSPs), and supply chain attacks.

However, the crux of the matter was rooted in a loss of control. No longer would security teams maintain the security of infrastructure themselves. No longer would their programs be anchored to firewalls. While the general IT question of the moment was usually, “What happens if our connectivity is interrupted?”, the question for IT security was, “How can we keep things secure if they aren’t directly under our control?”

For those of you who joined infosec more recently or who are interested observers from other disciplines, you may wonder why the prior model fostered such a sense of control. Traditional information security programs centered around the firewall – the first line of defense for the organization’s internal network perimeter, the anchor of the perimeter-based strategy, the key producer of netflow data that populated dashboards and provided signal for correlation across products.

Figure 1 – A typical security architecture with firewall and IDS (fittingly in Comic Sans)
Figure 2 – The Next-gen Firewall (NGFW) did not change things much

The Defense in Depth model became quite popular, one that advised a “multi-layered approach” to security (which is not wrong in the abstract). The first line of defense was always network security controls, starting with the firewall and its rules to block or allow network traffic. Intrusion prevention systems (IPS) worked just behind the firewall, ingesting data from it to analyze network traffic for potential threats. Fancier enterprise infosec programs segmented the network using multiple firewalls – what a SANS Institute paper called the “holy grail of network security.”1

But the transition to cloud erodes the traditional enterprise perimeter, and thus erodes the firewall’s position as center of the security universe. Thus, one can view the cloud transition as a Copernican Revolution for enterprise information security. And with such a shift, it is perhaps natural for enterprise infosec teams to reject it, wary of their relevance in this new world.

Survey data throughout the years covering infosec’s skepticism towards cloud helps fill in this picture. In 2012, Intel performed a survey on “What’s Holding Back the Cloud?”, discovering that the top three security concerns regarding private cloud all related to control2. 57% of respondents cited concern over their inability to measure the security measures implemented by CSPs, 55% cited lack of control over data, and 49% cited lack of confidence in the provider’s security capabilities.

Source: Intel

After loss of control came concerns about “lack of visibility into abstracted resources” and “uneasiness about adequate firewalling.” Hypervisor vulnerabilities were a widespread concern for both public and private cloud – though with the benefit of hindsight, they never materialized for the typical threat model (even today). And the concerns about adequate firewalling, more than anything, reveal the stickiness of the network perimeter security model.

By 2014, 66% of security professionals surveyed by Ponemon3 said their organization’s use of cloud resources diminished their ability to protect confidential or sensitive data. 64% said cloud makes it difficult to secure business-critical applications, and 51% said the likelihood of a data breach increases due to the cloud. In 2015, a survey by the Cloud Security Alliance (CSA) highlighted that 71% of respondents viewed the security of cloud data as a big red flag. 38% said their fear over loss of control kept them from moving data into cloud-based apps – thankfully fewer than in Intel’s 2012 poll.

Source: Ponemon

Distilling these fears, it seems absurd in hindsight that enterprise defenders could believe a few people maintaining a firewall would outmatch the security efforts and measures of Amazon, Google, or Microsoft. Only the endowment effect – people overvalue what they already possess – combined with a bit of sunk cost fallacy could lead to such a hubristic conclusion.

Looking back, the major CSPs were seldom hit by publicly-disclosed data breaches. Salesforce has no known major data breaches, outside of a 2014 disclosure that attackers were using fake websites to phish its customers. Heroku, a Salesforce subsidiary, disclosed a vulnerability in early 2013 that could potentially allow access to customer accounts – but there did not appear to be evidence of an actual breach. AWS, GCP, and Azure have no known breaches outside of customer misconfiguration.

The most notable CSP breaches include Dropbox in 2012 (68 million usernames + passwords), Evernote in 2013 (50 million usernames), and Slack in 2015 (500 thousand usernames). Yet, these were all breaches of user account databases, rather than evidence of customer accounts or storage repositories being breached themselves.

Despite these sentiments, cloud adoption inexorably marched onwards, and security teams mostly had to shut up and deal with it. The notion that security teams could secure infrastructure better than Amazon, Microsoft, or Google finally became fringe – but truthfully it only did so within the past two years or so, well after operations teams realized that the CSPs could provide more performant infrastructure than most could manage on their own.

The reality of cloud systems is that misconfigurations present the biggest concerns, such as an S3 bucket that is accidentally publicly exposed. Gartner indeed suggests that “through 2020, 80% of cloud breaches will be due to customer misconfiguration, mismanaged credentials or insider theft, not cloud provider vulnerabilities.”4 Luckily, there are considerable resources to assist with the misconfiguration problem – more on that in the third part of this series.
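To make the misconfiguration point concrete, here is a minimal sketch (my own heuristic, not an official AWS tool) that flags an S3-style bucket policy granting access to everyone – the kind of check that misconfiguration scanners automate:

```python
import json


def policy_allows_public_access(policy_json: str) -> bool:
    """Rough heuristic over an AWS-format policy document: a statement is
    'public' if its Effect is Allow and its Principal is the wildcard.
    Real tooling is far more thorough (conditions, ACLs, account settings)."""
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        if stmt.get("Effect") == "Allow" and principal in ("*", {"AWS": "*"}):
            return True
    return False


# A bucket policy that accidentally exposes objects to the whole internet
public_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})
print(policy_allows_public_access(public_policy))  # True
```

The point is not this particular check, but that cloud misconfigurations are machine-readable and therefore auditable at scale – something a rack of blinky boxes never offered.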

Another reality is that security operating expenses can decrease when using the CSP’s native security controls, according to McKinsey research5. This research suggests that an enterprise with an annual budget of $200 million would spend just under $12 million per year on security, which is $5 million less than they would spend if they did not use the CSP’s native security controls.

This is not surprising. One large security vendor’s web malware protection system sells for over $100,000, as does its email protection system, both of which are deployed as blinky boxes on the network. A larger security vendor’s next-gen firewall (NGFW) starts at $50,000, though higher throughput models quickly reach $150,000 or more. An even larger security vendor’s firewall (with five year support) is priced near the $200,000 mark.

One might assume that a transformation resulting in less hardware to manage and lower expenses would be a welcome one – but this was not the case for enterprise infosec teams and cloud transformation. Of course, some CISOs readily embraced the potency and efficiency of cloud adoption, but even today, you can still find CISOs reluctant to acknowledge cloud’s security benefits.


During this history lesson, we saw that the security industry’s palpable fears of cloud computing did not match the eventual reality. Infosec was not only late to the cloud party, but often unnecessarily stalled organizational transformation, too. 

Much of security’s reluctance was driven by status quo bias — the stickiness of the defense-in-depth perimeter security model that gave security considerable control, and thus a sense of comfort. Importantly, that sticky bias led (and still leads!) many security teams to attempt recreating the same old-school model in new environments. Such an approach defies resilience, and serves as a warning for how security will handle the adoption of other tech today and in the future.

Now that we are faced with the rapid adoption of APIs and containers in the enterprise, will history repeat itself? In the next part of the series, I will explore how infosec is currently responding to microservices (APIs and containers) and what delusions are being conjured…

Read post 2 in this series: Modern Monsters


[1]: Bridge, S. (2001). Achieving Defense-in-Depth with Internal Firewalls. SANS Institute. Retrieved May 2019.

[2]: Intel IT Center Peer Research. (2012). What’s Holding Back the Cloud?

[3]: Ponemon Institute LLC. (2014). Data Breach: The Cloud Multiplier Effect. Retrieved May 2019.

[4]: A bunch of vendors cite this quote, but I cannot find it directly via Gartner. I am assuming it is behind Gartner’s paywall.

[5]: Elumalai, A., et al., McKinsey Digital. (2018). Making a Secure Transition to the Public Cloud. (Note that these statistics only hold if the apps are rearchitected for the cloud in parallel.)