Cloud Architecture and Responsibilities (Domain 3)

In this episode, we are shifting our focus to cloud security—specifically, how cloud architecture affects roles, responsibilities, and risk. As organizations increasingly rely on cloud providers, it becomes essential to understand where your responsibilities end and where your provider’s responsibilities begin. We'll cover cloud security concepts, Infrastructure as Code, and the security implications of serverless architecture and microservices.
Let’s begin with the concept of shared responsibility. In traditional on-premises environments, the organization is responsible for everything—from physical hardware and networking to applications and data. But in the cloud, some of those responsibilities shift to the provider.
This is where the shared responsibility matrix comes into play. The matrix outlines what the cloud provider is responsible for, and what the customer must still manage. In general, the provider handles the security of the cloud—things like physical infrastructure, virtualization, and hypervisor security. The customer is responsible for security in the cloud—such as managing access controls, securing workloads, and encrypting data.
The exact breakdown depends on the cloud service model. In Infrastructure as a Service, customers manage more elements—like virtual machines and operating systems. In Platform as a Service, the provider manages the platform and the customer focuses on applications and data. In Software as a Service, customers usually only manage user access and data, while the provider handles nearly everything else.
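The breakdown above is essentially a lookup table, and it can help to see it written down that way. The sketch below is illustrative only: the exact split varies by provider and by contract, so treat it as a study aid, not any vendor's actual responsibility matrix.

```python
# Illustrative sketch of the shared responsibility matrix by service model.
# The exact split varies by provider and contract; this mirrors the general
# breakdown described above, not any specific vendor's documentation.
CUSTOMER_RESPONSIBILITIES = {
    "IaaS": ["data", "access control", "applications",
             "operating systems", "virtual machines"],
    "PaaS": ["data", "access control", "applications"],
    "SaaS": ["data", "access control"],
}

def customer_manages(model: str, item: str) -> bool:
    """Return True if the customer (not the provider) manages this item."""
    return item in CUSTOMER_RESPONSIBILITIES.get(model, [])
```

Notice how the customer's list shrinks as you move from IaaS to SaaS, while "data" and "access control" never leave it: no matter the service model, those responsibilities stay with the customer.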
Hybrid cloud environments add complexity. In a hybrid model, some services are hosted on-premises, while others run in public or private clouds. Security teams must ensure that consistent policies apply across all environments. This includes identity management, data classification, and logging. Integration points—like VPN tunnels, cloud connectors, or APIs—must be secured just like internal network links.
Working with third-party vendors also introduces risk. Cloud providers may subcontract certain services, or customers may use third-party platforms within cloud environments. Each additional layer introduces potential vulnerabilities. Organizations must evaluate vendor security, ensure contracts define clear responsibilities, and maintain visibility into how data is stored, processed, and secured.
Now let’s move to Infrastructure as Code. Infrastructure as Code, or I A C, is a method of managing and provisioning computing infrastructure through machine-readable scripts or configuration files, rather than manual processes. Tools like Terraform, CloudFormation, and Ansible allow teams to define their infrastructure once and deploy it consistently across environments.
The security advantages of Infrastructure as Code are significant. First, it provides consistency. Instead of setting up each environment by hand—which can introduce errors—I A C ensures that every deployment follows the same configuration. This reduces human error and supports secure defaults.
Second, I A C supports scalability. You can rapidly provision new environments with hardened settings already in place. Changes can be versioned, audited, and rolled back, which adds to accountability and traceability.
But there are also risks. If a misconfiguration is written into the code, that misconfiguration is replicated across every environment. A single misplaced setting in a template file could expose dozens of systems. Similarly, automation errors—like accidentally deleting resources or misapplying permissions—can scale damage just as quickly as they scale deployment.
To mitigate these risks, I A C files should be reviewed just like application code. Use version control, peer reviews, and automated security scans to detect errors before deployment. Store sensitive values—like passwords or keys—in secure vaults, not hardcoded into configuration files. And use role-based access to control who can modify infrastructure code.
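The "no hardcoded secrets" practice can be sketched in a few lines. This example, which is a minimal illustration rather than a complete vault integration, reads a credential from the environment at deploy time; the variable name DB_PASSWORD is hypothetical, and in practice the value would be injected by a secrets manager or vault.

```python
import os

def get_db_password() -> str:
    """Fetch the database password from the environment instead of a
    hardcoded value in the configuration file. In a real pipeline the
    value would be injected from a secrets manager or vault; the
    variable name DB_PASSWORD is a placeholder for this sketch."""
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        # Fail closed: refuse to deploy with a missing secret rather
        # than falling back to a default credential.
        raise RuntimeError("DB_PASSWORD is not set; aborting deployment")
    return password
```

The key design choice is failing closed: a missing secret stops the deployment instead of silently continuing with a default credential that would then be replicated across every environment.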
Now let’s explore serverless architecture and microservices. Serverless computing refers to a model where developers deploy functions without managing servers. The cloud provider handles infrastructure, scalability, and runtime management. Common platforms include AWS Lambda, Azure Functions, and Google Cloud Functions.
Serverless architecture offers security benefits. Because there are no persistent servers, there are fewer patching requirements and less exposure to common server-based vulnerabilities. Functions are usually short-lived, meaning attackers have less opportunity to maintain a foothold. And isolation is built into the platform—functions run in separate containers by default.
But serverless also introduces new challenges. Since the provider controls the runtime, customers have limited visibility into the environment. Attackers may exploit misconfigured triggers, abuse excessive permissions, or chain together multiple functions in a complex attack. Serverless applications rely heavily on event-driven execution and API calls, so securing APIs and validating input is essential.
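To make the input-validation point concrete, here is a minimal sketch of an event-driven function that checks its input before acting on it. The handler signature follows the common Lambda-style pattern, but the event shape (a JSON body carrying a "user_id" field) is a hypothetical example; real triggers define their own schemas.

```python
import json

def handler(event: dict, context=None) -> dict:
    """Sketch of an event-driven function that validates untrusted input
    before processing it. The event shape (a JSON body with a 'user_id'
    field) is hypothetical; real triggers define their own schemas."""
    try:
        body = json.loads(event.get("body", ""))
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": "malformed JSON"}

    user_id = body.get("user_id")
    # Reject anything that is not a simple alphanumeric identifier,
    # rather than passing untrusted input to downstream services.
    if not isinstance(user_id, str) or not user_id.isalnum():
        return {"statusCode": 400, "body": "invalid user_id"}

    return {"statusCode": 200, "body": f"processed {user_id}"}
```

Because serverless functions are often chained through events and APIs, validating at each function boundary like this keeps a malformed or malicious payload from propagating through the whole chain.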
Microservices follow a similar trend. In a microservices architecture, applications are broken into smaller, independent services that communicate over the network. Each service can be developed and updated separately, which improves agility and scalability. But more services also mean more attack surfaces. Each endpoint, each communication channel, and each integration must be secured.
To protect microservices, isolation is key. Each service should run in its own container or environment, with minimal privileges. Communication should be encrypted using Transport Layer Security. Authentication and authorization should be enforced using tokens, certificates, or centralized identity providers. And all APIs should be protected against common web vulnerabilities like injection and cross-site scripting.
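The token-based authorization idea can be sketched with a simple HMAC check between services. This is an illustration of the principle only, under a stated assumption: the shared secret here is hardcoded for readability, whereas a real deployment would fetch it from a vault and would more likely use JWTs, mutual TLS certificates, or a centralized identity provider.

```python
import hashlib
import hmac

# Hypothetical shared secret for this sketch only; in production it
# would come from a vault or an identity provider, never source code.
SHARED_SECRET = b"example-secret"

def sign(service_name: str) -> str:
    """Issue a simple HMAC token for a calling service (illustrative;
    real systems would use JWTs, mTLS certificates, or an IdP)."""
    return hmac.new(SHARED_SECRET, service_name.encode(),
                    hashlib.sha256).hexdigest()

def authorize(service_name: str, token: str) -> bool:
    """Verify the caller's token with a constant-time comparison
    before allowing access to the service endpoint."""
    return hmac.compare_digest(sign(service_name), token)
```

Note the use of a constant-time comparison: comparing tokens with ordinary string equality can leak timing information, which is exactly the kind of subtle per-endpoint detail that multiplies as the number of services grows.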
As you prepare for the Security Plus exam, make sure you understand the shared responsibility model across different cloud service types. Be ready to explain the benefits and risks of Infrastructure as Code, and how it supports secure automation. You may also be asked about the trade-offs of serverless computing and how to secure microservice architectures through isolation, authentication, and strong API management. Think in terms of visibility, automation, and trust boundaries across everything you deploy.
