The IT industry has gone through multiple revolutions – client-server computing, the Internet’s rise, virtualization, mobility – but none rivals the impact of today’s digital transformation.
The implications for InfoSec professionals are broad, requiring that they adapt quickly to the profound changes brought about by digital transformation trends.
“Whether you’re ready or not, it’s coming at you, and it’s coming at you very fast,” Scott Crawford, Research Director of Information Security at 451 Research, told Qualys Security Conference 2017 attendees last week in Las Vegas.
Digital transformation – the innovative use of IT to better serve customers, partners and employees, improve business operations, disrupt markets, and invent business models – is being driven by three key elements:
“Cloud clearly plays a central role. Cloud makes it possible to take advantage of the scale and elasticity that virtualization technology has made possible,” Crawford said during his keynote speech “Not Just One Revolution: Extending Security Throughout Digital Transformation.”
However, cloud computing isn’t the only enabler of digital transformation. There are other technologies, such as software automation, and processes, such as DevOps, that also power these changes. While these capabilities have often arisen in concert with cloud, their real focus is speeding the delivery of technology at the increasingly rapid pace of business.
The impact of automation can be clearly seen in DevOps environments, where developers and IT operations staff work collaboratively to continuously and quickly deliver code in ways that extend agile concepts all the way through operations.
In DevOps software development pipelines, many of the steps involved in creating and deploying an application – including coding, testing, deploying and monitoring – are no longer manual.
“Automation has a fundamental impact on the way technology is conceived, developed and placed into operation,” Crawford said.
Intimately related to automation are APIs (application programming interfaces), which drive a lot of the benefits of automation by making it possible for heterogeneous tools to be integrated, interoperate and share data.
Thanks to APIs, traditional IT environments in which infrastructure is physically deployed and tools are purpose-built and siloed are giving way to programmable infrastructures with interoperable components.
Meanwhile, self-contained legacy assets – such as endpoints, servers, network devices and mobile devices – are being replaced by virtualized servers, containers, microservices, infrastructure as code and cloud service provider platforms.
The end result: “New IT” environments are characterized by better performance, more agility, higher resilience and increased convenience, according to Crawford.
Although InfoSec wasn’t necessarily a driver in this push to make IT more efficient, affordable and responsive, some thorny security problems have become easier to solve as a result.
For example, whereas it used to take hours for an organization to run a network scan for asset management purposes, the process now takes seconds thanks to API calls, he said.
Likewise, incident response and forensics, historically slow, manual processes, have been largely automated, as have other tasks such as gathering evidence for IT compliance audits and performing server maintenance.
New opportunities for improving InfoSec processes are also surfacing, such as inserting security into DevOps. By incorporating InfoSec pros into DevOps teams, and integrating automated security products with development and IT operations tools, security and compliance issues can be detected and addressed early and often during an application’s lifecycle.
In this manner, the pace of security comes closer to the pace of modern IT’s software development and deployment.
“We need to take more advantage of this and help break down the silos that have kept not just development apart from security, but IT operations apart from security and at odds with each other, and continue to shift to the left,” Crawford said, referring to the practice of getting security and IT ops pros involved as early as possible – thus, on the left – in the DevOps pipeline.
“The more you can do in the build process to reduce defects before they hit the street, the less expense you’re going to incur having to remediate those defects after the fact,” he said.
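A common way to put this “shift left” idea into practice is a build-time gate: a pipeline step reads the security scanner’s findings and fails the build when severe defects would otherwise reach production. The report format below is made up for illustration; real scanners emit their own schemas.

```python
# Hypothetical sketch of a "shift left" gate: a CI step that reads a
# scanner's JSON findings and fails the pipeline when severe defects
# slip through. The report format is invented for illustration.
from typing import Dict, List

SEVERITY_RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}


def build_should_fail(findings: List[Dict], threshold: str = "high") -> bool:
    """Return True if any finding meets or exceeds the severity threshold."""
    floor = SEVERITY_RANK[threshold]
    return any(SEVERITY_RANK[f["severity"]] >= floor for f in findings)


if __name__ == "__main__":
    report = [
        {"id": "DEP-101", "severity": "medium"},
        {"id": "DEP-204", "severity": "critical"},
    ]
    # In a real pipeline this would translate to a non-zero exit code,
    # stopping the deploy before the defect "hits the street."
    print(build_should_fail(report))  # True
```

Catching the critical finding here, during the build, is exactly the cost saving Crawford describes: remediation happens before release rather than after.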
The advantages of “New IT” come with challenges as well. For starters, the pace of technology development and deployment is much faster, and change is constant in these environments, requiring that IT and InfoSec pros acquire new skills and knowledge.
“We’re not going to be able to keep up using a lot of the tactics of the past,” Crawford said.
Then there is the growing universe of new tools coming out all the time. Just in the DevOps automation market, there are tons of products for tasks such as software configuration management, continuous integration and delivery, release management, repository management, logging, build, testing, containerization, and, of course, security.
“That’s a daunting set of tools,” Crawford said, looking up at one of his slides showing a “periodic table” with a sampling of about 90 DevOps products.
“If you come at the challenges and opportunities of “New IT” with the skills you’ve had up to this point, you’ll be facing a pretty significant re-education,” he said.
The trend that started in the early 2000s with on-premises virtualization, and continued with enterprises’ embrace of public cloud IaaS, PaaS and SaaS services, has yielded containers and microservices today, and promises so-called “serverless computing.”
Of course, “serverless computing” doesn’t do away with underlying computing capability, which the industry has historically conceived of as server functionality. Rather, the term refers to a model in which users simply supply the service provider with the application or business logic needed to perform a given task – which is why the model is sometimes called “Functions as a Service.”
The underlying compute infrastructure, availability and performance are all handled and assured by the provider, and abstracted from the user. These narrow slices of functionality are activated as needed and for a limited time, further enhancing the efficiency of the delivery platform.
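In this model, all the user writes is the function body itself. The sketch below shows the idea with a loosely Lambda-style handler signature; the event shape and field names are illustrative assumptions, not a real provider’s API.

```python
# Minimal sketch of the Functions-as-a-Service model: the user supplies
# only business logic, and the provider invokes it on demand while
# handling all infrastructure. The handler signature and event shape
# are illustrative (loosely Lambda-style), not a real provider API.
from typing import Any, Dict


def handler(event: Dict[str, Any], context: Any = None) -> Dict[str, Any]:
    """Business logic only: apply a 10% discount to an order total."""
    total = event["order_total"]
    discounted = round(total * 0.9, 2)
    return {"statusCode": 200, "body": {"discounted_total": discounted}}


if __name__ == "__main__":
    # In production the platform, not the user, constructs this call;
    # the function exists only for the duration of the invocation.
    print(handler({"order_total": 120.0}))  # {'statusCode': 200, 'body': {'discounted_total': 108.0}}
```

Everything outside the function – provisioning, scaling, availability – belongs to the provider, which is precisely the abstraction that raises the data-exposure and access-control questions Crawford poses next.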
For security, already tasked with protecting new web apps made up of multiple third-party components and containing a profusion of APIs, this increased abstraction of the IT infrastructure brings efficiencies but also creates further risks, according to Crawford.
“What’s the data underlying all these multiple functions? Do they expose the sensitive data you’re responsible for to added risk? How do you assess that? How do you control access to these functions?” he said.
To adapt to these changes, InfoSec will have to adopt new tools that are API-based and designed for use in these new IT environments, as well as flexible and scalable. Security pros will also need to embrace automation tools even more, not just for DevOps, but also for other tasks such as security orchestration, process automation and incident response.
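The security orchestration idea can be sketched as a simple playbook dispatcher: an incoming alert is matched to an automated response, with a fallback to human review. The playbook names, alert fields and actions below are invented for illustration; real orchestration platforms offer far richer workflows.

```python
# Hypothetical sketch of security orchestration: routing an alert to an
# automated response playbook based on its type and severity. Playbook
# names, alert fields and actions are invented for illustration.
from typing import Callable, Dict

Playbook = Callable[[Dict], str]


def isolate_host(alert: Dict) -> str:
    """Automated containment: cut the host off from the network."""
    return f"isolated {alert['host']}"


def open_ticket(alert: Dict) -> str:
    """Low-severity path: file a ticket for later follow-up."""
    return f"ticket opened for {alert['host']}"


PLAYBOOKS: Dict[str, Playbook] = {
    "malware:high": isolate_host,
    "malware:low": open_ticket,
}


def respond(alert: Dict) -> str:
    key = f"{alert['type']}:{alert['severity']}"
    # Fall back to a human-review queue when no playbook matches.
    playbook = PLAYBOOKS.get(key, lambda a: f"queued {a['host']} for review")
    return playbook(alert)


if __name__ == "__main__":
    print(respond({"type": "malware", "severity": "high", "host": "web-01"}))  # isolated web-01
```

The point of the pattern is the one Crawford makes: routine responses execute at machine speed, and only unmatched alerts consume analyst time.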
It will also be important for InfoSec teams to pick vendors whose security products have strong, useful data analytics capabilities, as this ability to analyze large amounts of security information will be key for protecting “New IT” infrastructures and processes, as well as traditional on-premises systems, which aren’t going away any time soon.
This data will come from legacy IT systems, “New IT” environments, IoT sensors and even non-IT sources, as well as from a variety of API connections, and it will need to be stored, managed and distributed to different users, analytics engines, tools and APIs.
Ultimately, the goal for InfoSec teams is to attain what Crawford calls “actionable situational awareness.” “We want to be able to take in all these data types and synthesize insights that are actionable for improving security, and we want to turn that into response,” he said.
Throughout it all, InfoSec pros need to remain engaged and informed about all the changes in IT, proactively asking questions and figuring out what the shifts mean for security and compliance.