If you are looking for sample questions to practise with and get a feel for the logic of the AWS Certified Solutions Architect Professional exam, here is what you need. In this blog, we have brought together some interesting free questions for the AWS Certified Solutions Architect Professional exam, organised by category.
Free Practice Questions for the AWS CSA Professional Exam 2025
Management and Governance (2)
Question No: #1
Domain: Design Solutions for Organisational Complexity (26% of scored content)
Main Topic: Architect network connectivity strategies
Sub Topic: Network segmentation (e.g., subnetting, IP addressing, connectivity among VPCs)
Question Text: A company operates in a regulated industry and must design a network architecture that ensures:
- Full isolation of development, staging, and production environments.
- Centralised control of all outbound internet traffic with strict policy enforcement.
- Secure inter-VPC communication between environments.
- Compliance with organisational IP addressing standards and avoidance of overlapping CIDR blocks.
Which design meets these requirements?
- Create separate AWS accounts for each environment and use AWS Direct Connect for centralised internet access.
- Deploy a Transit Gateway with route tables configured to enforce inter-VPC communication rules and attach each environment’s VPC to the Transit Gateway. Deploy a centralised NAT gateway for outbound internet traffic.
- Use VPC Peering to connect all environments and deploy NAT instances in each VPC to control outbound internet traffic.
- Implement AWS Network Firewall in each VPC and configure route tables to enforce security rules.
Correct Answer: B
Explanation:
Option B is correct because AWS Transit Gateway provides scalable and secure inter-VPC communication. Centralised NAT gateways enable centralised outbound internet control, and route table configurations ensure compliance with segmentation requirements.
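To make this concrete, here is a minimal boto3 sketch of the hub-and-spoke setup. The VPC, subnet, and egress attachment IDs are hypothetical placeholders; a real deployment would wait for each resource to become available and repeat the attachment and association steps per environment:

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Create the Transit Gateway that acts as the central hub. Disabling the
# default route table forces explicit, auditable routing decisions.
tgw = ec2.create_transit_gateway(
    Description="Central hub for dev/staging/prod",
    Options={
        "DefaultRouteTableAssociation": "disable",
        "DefaultRouteTablePropagation": "disable",
    },
)["TransitGateway"]

# Attach one environment's VPC (IDs are hypothetical; repeat per environment,
# and in real code wait until the gateway is 'available' first).
attachment = ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw["TransitGatewayId"],
    VpcId="vpc-0prod1234",
    SubnetIds=["subnet-0abc1234"],
)["TransitGatewayVpcAttachment"]

# A dedicated route table controls which environments may communicate.
rtb = ec2.create_transit_gateway_route_table(
    TransitGatewayId=tgw["TransitGatewayId"]
)["TransitGatewayRouteTable"]

ec2.associate_transit_gateway_route_table(
    TransitGatewayRouteTableId=rtb["TransitGatewayRouteTableId"],
    TransitGatewayAttachmentId=attachment["TransitGatewayAttachmentId"],
)

# Default route sends outbound internet traffic to the egress VPC that
# hosts the centralised NAT gateway (attachment ID is hypothetical).
ec2.create_transit_gateway_route(
    DestinationCidrBlock="0.0.0.0/0",
    TransitGatewayRouteTableId=rtb["TransitGatewayRouteTableId"],
    TransitGatewayAttachmentId="tgw-attach-0egress123",
)
```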
Question No: #2
Domain: Design Solutions for Organisational Complexity (26% of scored content)
Main Topic: Determine cost optimisation and visibility strategies
Sub Topic: AWS cost and usage monitoring tools (for example, AWS Trusted Advisor, AWS Pricing Calculator, AWS Cost Explorer, AWS Budgets)
Question Text:
A large enterprise has several AWS accounts, each with a distinct department (e.g., Finance, Marketing, and Engineering). The company wants to manage its costs at both the department and organisation levels, ensuring that each department is charged appropriately. The finance team needs to track usage and costs by department and receive notifications when spending exceeds predefined thresholds.
Which combination of AWS tools would provide the most cost-effective and comprehensive solution for this requirement?
- Use AWS Cost Explorer for departmental cost analysis, set up AWS Budgets for cost alerts by department, and implement consolidated billing for cross-account visibility.
- Use AWS Trusted Advisor to identify savings opportunities across departments, set up AWS Budgets for alerts at the organisational level, and enable resource-level cost allocation tags.
- Use AWS Pricing Calculator to estimate department-wise costs, set up AWS Budgets at the account level, and leverage AWS Cost Explorer for ongoing cost analysis.
- Use AWS Cost Explorer for real-time cost tracking, implement AWS Budgets for organisation-wide thresholds, and set up consolidated billing for each department’s cost tracking.
Correct Answer: A
Explanation:
Option A is correct because AWS Cost Explorer allows for detailed cost analysis at both the department and organisation levels, and AWS Budgets can be used for setting alerts by department. Consolidated billing simplifies cost management by consolidating all accounts under one billing structure, providing visibility into costs for each department.
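As an illustration, the following boto3 sketch creates a monthly budget for one department, filtered by a `Department` cost allocation tag. The account ID, limit, tag value, and email address are hypothetical:

```python
import boto3

budgets = boto3.client("budgets")

budgets.create_budget(
    AccountId="111122223333",  # hypothetical management account
    Budget={
        "BudgetName": "finance-monthly",
        "BudgetLimit": {"Amount": "50000", "Unit": "USD"},
        "TimeUnit": "MONTHLY",
        "BudgetType": "COST",
        # Scope the budget to resources tagged Department=Finance.
        "CostFilters": {"TagKeyValue": ["user:Department$Finance"]},
    },
    NotificationsWithSubscribers=[{
        "Notification": {
            "NotificationType": "ACTUAL",
            "ComparisonOperator": "GREATER_THAN",
            "Threshold": 80.0,  # alert at 80% of the limit
            "ThresholdType": "PERCENTAGE",
        },
        "Subscribers": [{
            "SubscriptionType": "EMAIL",
            "Address": "finance-team@example.com",
        }],
    }],
)
```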
Migration (2)
Question No: #3
Domain: Accelerate Workload Migration and Modernisation (20% of scored content)
Main Topic: Determine the optimal migration approach for existing workloads
Sub Topic: Application migration tools (for example, AWS Application Discovery Service, AWS Application Migration Service)
Question Text:
A company is planning to migrate its monolithic on-premises application to AWS. The application consists of multiple components, including a web server, a database, and a caching layer. The company is looking for a solution that will allow it to migrate its application quickly with minimal downtime and minimise manual intervention.
Which combination of AWS services is the best approach to migrate the application to AWS while ensuring minimal downtime and reducing manual effort?
- Use AWS Application Migration Service (AWS MGN) to replicate the application and database in real-time, perform a cutover to AWS, and use AWS Application Discovery Service to analyse the application’s dependencies before migration.
- Use AWS Server Migration Service (AWS SMS) to replicate the application in real-time and use CloudEndure Migration for minimal downtime during the cutover.
- Use AWS Application Migration Service (AWS MGN) for a lift-and-shift migration with minimal downtime and implement AWS Elastic Load Balancing (ELB) to manage traffic distribution during the migration process.
- Use AWS Database Migration Service (AWS DMS) for database migration and use AWS Application Discovery Service to understand the application’s dependencies before performing a manual migration of the web and caching layers.
Correct Answer: A
Explanation:
Option A is correct because AWS MGN replicates the application and database in real-time with minimal downtime. AWS Application Discovery Service assesses the application’s dependencies, allowing for a smooth migration. This combination reduces manual effort and ensures that migration is efficient and well-planned.
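For reference, a minimal sketch of the MGN cutover flow with boto3 might look like the following, assuming replication agents are already installed on the source servers and reporting in:

```python
import boto3

mgn = boto3.client("mgn")

# List source servers whose replication agents are already reporting in.
servers = mgn.describe_source_servers(filters={})["items"]
ids = [s["sourceServerID"] for s in servers]

# Launch test instances first; validate them before committing.
mgn.start_test(sourceServerIDs=ids)
# ... validate the test instances, then perform the final cutover ...
mgn.start_cutover(sourceServerIDs=ids)
```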
Question No: #4
Domain: Accelerate Workload Migration and Modernisation
Main Topic: Planning and executing migrations for complex workloads
Sub-Topic: Using AWS Application Discovery Service, Application Migration Service, and Snow Family for application dependency discovery, server migration, and large-scale data transfer.
Question Text:
Your organisation is migrating a portfolio of applications to AWS from an on-premises data centre. The portfolio includes SQL databases, file storage systems, and legacy applications running on custom operating systems. The migration team wants to assess the application dependencies, automate server migration, and securely transfer large datasets to AWS. Which combination of services is most suitable for this requirement?
- AWS Application Discovery Service, AWS Application Migration Service, AWS Snowball
- AWS Migration Hub, AWS DMS, AWS DataSync
- AWS Application Migration Service, AWS SCT, AWS Transfer Family
- AWS Application Discovery Service, AWS Snowcone, AWS Transfer Family
Correct Answer: A
Explanation
- A is correct because:
- AWS Application Discovery Service: Helps discover application dependencies to plan migration.
- AWS Application Migration Service: Automates lift-and-shift migration of servers to AWS.
- AWS Snowball: Transfers large datasets securely when network bandwidth is insufficient.
Frontend Web and Mobile Development (2)
Question No: #5
Domain: Design for New Solutions
Main Topic: Frontend Web and Mobile Development
Question Text:
A multinational enterprise runs its customer-facing application on AWS across multiple regions. The application experiences:
- Cross-region write latency exceeding 2 seconds.
- Authentication service failures during US trading hours.
- Inconsistent session management across regions.
- Monthly data transfer costs exceeding $500,000.
- A regulatory requirement to maintain data sovereignty in specific regions.
Which solution best addresses these issues?
- Implement regional Cognito User Pools with identity federation, use DynamoDB Global Tables with DAX clusters, modify CloudFront to use price class 200, and implement S3 Cross-Region Replication with KMS multi-region keys.
- Migrate to Aurora Multi-Master, implement Application Load Balancers in each region, use ElastiCache Global Datastore, and configure AWS Global Accelerator for authentication endpoints.
- Deploy regional API Gateways with Lambda authorizers, use DynamoDB streams for cross-region replication, implement sticky sessions on ALB, and use Direct Connect for inter-region traffic.
- Implement regional Cognito User Pools with SAML federation, use DynamoDB Global Tables with DAX clusters, configure CloudFront price class 100, and implement Transit Gateway for inter-region connectivity.
Correct Answer: A
Explanation:
Option A is correct because:
- Regional Cognito User Pools reduce authentication latency and provide failover
- DynamoDB Global Tables with DAX provides consistent sub-10ms read latency
- CloudFront price class 200 reduces costs while maintaining performance in key markets
- KMS multi-region keys maintain data sovereignty requirements
- Solution addresses all current pain points while optimising costs
Question No: #6
Domain: Design for New Solutions
Main Topic: Frontend Web and Mobile Development
Question Text:
A travel agency is building a Progressive Web App (PWA) to allow customers to:
- Book trips and view their itineraries even when offline.
- Receive real-time updates on flight changes.
- Log in securely using social media or email accounts.
- Scale the solution globally for users in different regions.
- The agency has a small team and prefers managed services to simplify operations.
What solution best meets the requirements?
- Deploy the PWA using Amazon EC2, use Amazon DynamoDB Streams for real-time updates, and AWS IAM for secure login.
- Host the PWA on Amazon S3, use Amazon CloudFront for content delivery, integrate with AWS AppSync for real-time updates, and Amazon Cognito for authentication.
- Use AWS Amplify for both hosting and authentication, and Amazon SNS for flight update notifications.
- Deploy the app using AWS Elastic Beanstalk, store flight data in Amazon RDS, and use AWS Lambda for real-time updates.
Correct Answer: B
Explanation:
Option B is correct because:
- Amazon S3 with CloudFront: Efficiently hosts and delivers static PWA content globally with caching for low-latency access.
- AWS AppSync: Provides real-time updates about flight changes and offline synchronisation for itineraries.
- Amazon Cognito: Simplifies secure authentication, including social login.
- This serverless, managed architecture ensures scalability and reduces operational complexity.
Database (2)
Question No: #7
Domain: Design for New Solutions
Main Topic: Design a solution to ensure business continuity
Sub Topic: Configuring disaster recovery solutions using Aurora Global Database
Question Text:
You are designing a multi-region disaster recovery solution for an application using Amazon Aurora. The application must support a Recovery Time Objective (RTO) of less than 1 minute and a Recovery Point Objective (RPO) of less than 5 seconds. Which strategy should you recommend?
- Use Aurora Global Database with one primary Region and a secondary Region.
- Use Aurora read replicas across Regions with a replication lag of 5 seconds.
- Implement a pilot light DR strategy using Aurora backups in a secondary Region.
- Use AWS Database Migration Service (AWS DMS) for continuous replication between Regions.
Correct Answer: A
Explanation:
- A is correct because Aurora Global Database supports low-latency replication (typically under 1 second) and provides a fast failover mechanism, meeting both RTO and RPO requirements.
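A minimal boto3 sketch of this topology follows, with hypothetical cluster identifiers. Note that the secondary cluster joins through the global cluster identifier and therefore takes no master credentials of its own:

```python
import boto3

# Global cluster plus a primary cluster in us-east-1.
rds_primary = boto3.client("rds", region_name="us-east-1")
rds_primary.create_global_cluster(
    GlobalClusterIdentifier="bookings-global",
    Engine="aurora-mysql",
)
rds_primary.create_db_cluster(
    DBClusterIdentifier="bookings-primary",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="bookings-global",
    MasterUsername="admin",
    MasterUserPassword="REPLACE_ME",  # use Secrets Manager in practice
)

# Secondary, read-only cluster in eu-west-1 for fast regional failover.
rds_secondary = boto3.client("rds", region_name="eu-west-1")
rds_secondary.create_db_cluster(
    DBClusterIdentifier="bookings-secondary",
    Engine="aurora-mysql",
    GlobalClusterIdentifier="bookings-global",
)
```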
Question No: #8
Domain: Design for New Solutions
Main Topic: Design a solution to ensure business continuity
Sub Topic: Implementing DynamoDB Global Tables for high availability and low latency
Question Text:
You are designing a DynamoDB table to store session data for a gaming application. The table requires high availability and consistent, low-latency reads across multiple AWS Regions. Which solution meets these requirements?
- Use DynamoDB Streams and AWS Lambda to replicate data across Regions.
- Use DynamoDB Global Tables to automatically replicate data across Regions.
- Use an Amazon S3 bucket with CRR to replicate session data to another Region.
- Use Amazon ElastiCache for session storage with cross-region replication enabled.
Correct Answer: B
Explanation:
- B is correct because DynamoDB Global Tables provide fully managed, multi-region, and active-active replication, ensuring high availability and low-latency reads.
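As a sketch, converting a table into a Global Table (version 2019.11.21) with boto3 can be as simple as the following; the table name and replica region are hypothetical:

```python
import boto3

ddb = boto3.client("dynamodb", region_name="us-east-1")

# Create the base table; streams are required before adding replicas.
ddb.create_table(
    TableName="game-sessions",
    AttributeDefinitions=[{"AttributeName": "sessionId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "sessionId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
    StreamSpecification={
        "StreamEnabled": True,
        "StreamViewType": "NEW_AND_OLD_IMAGES",
    },
)
ddb.get_waiter("table_exists").wait(TableName="game-sessions")

# Adding a replica region turns the table into a Global Table.
ddb.update_table(
    TableName="game-sessions",
    ReplicaUpdates=[{"Create": {"RegionName": "ap-southeast-1"}}],
)
```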
Machine Learning (2)
Question No: #9
Domain: Design for New Solutions
Main Topic: Designing application solutions to meet business requirements
Sub-Topic: Selecting AWS services to build personalised, multi-language, and fraud detection systems.
Question Text:
Your company operates an e-commerce platform. You want to provide personalised product recommendations, enable multi-language chatbots for customer support, and automatically detect fraudulent transactions. Which combination of AWS services will you use to implement these features efficiently?
- Amazon Personalize, Amazon Lex, Amazon Translate, Amazon Fraud Detector
- Amazon SageMaker, Amazon Comprehend, Amazon Kendra, Amazon Fraud Detector
- Amazon Forecast, Amazon Polly, Amazon Rekognition, Amazon Fraud Detector
- Amazon Personalize, Amazon SageMaker, Amazon Textract, Amazon Fraud Detector
Correct Answer: A
Explanation:
A is correct because:
- Amazon Personalize: Provides personalised product recommendations.
- Amazon Lex: Builds chatbots for customer support.
- Amazon Translate: Supports multi-language translation for chatbots.
- Amazon Fraud Detector: Detects fraudulent activities in transactions.
Question No: #10
Domain: Continuous Improvement for Existing Solutions
Main Topic: Architecting intelligent, searchable workflows for audio data
Sub-Topic: Using Amazon Transcribe, Comprehend, and Kendra for customer support analysis and sentiment detection.
Question Text:
You are designing a workflow to analyse customer support calls for insights. The system must transcribe the audio, detect sentiment, and store the results in an indexed repository searchable by agents. Which services will you use?
- Amazon Transcribe, Amazon Comprehend, Amazon Kendra
- Amazon Polly, Amazon Rekognition, Amazon SageMaker
- Amazon Translate, Amazon Lex, Amazon Personalize
- Amazon Transcribe, Amazon Textract, Amazon Rekognition
Correct Answer: A
Explanation
- A is correct because:
- Amazon Transcribe: Converts audio recordings to text.
- Amazon Comprehend: Analyses the text for sentiment.
- Amazon Kendra: Provides intelligent search capabilities for indexed content.
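A minimal sketch of the first two stages with boto3 might look like this; the bucket, job name, and transcript text are placeholders (in practice you would poll for job completion and fetch the transcript), and indexing the result into Kendra would follow as a separate step:

```python
import boto3

transcribe = boto3.client("transcribe")
comprehend = boto3.client("comprehend")

# Transcribe a recorded call stored in S3 (bucket/key are hypothetical).
transcribe.start_transcription_job(
    TranscriptionJobName="support-call-0001",
    Media={"MediaFileUri": "s3://example-calls/call-0001.mp3"},
    MediaFormat="mp3",
    LanguageCode="en-US",
)

# Once the job completes, fetch the transcript text and score its sentiment.
transcript_text = "I waited forty minutes and nobody called me back."  # placeholder
sentiment = comprehend.detect_sentiment(Text=transcript_text, LanguageCode="en")
print(sentiment["Sentiment"], sentiment["SentimentScore"])
```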
Question No: #11
Domain: Continuous Improvement for Existing Solutions (25% of scored content)
Main Topic: Determine a strategy to improve performance.
Sub Topic: High-performing systems architectures (for example, auto scaling, instance fleets, placement groups)
Question Text:
You are managing a mission-critical, highly available application hosted on Amazon EC2 instances. The application experiences variable traffic patterns, with periods of predictable load and sudden, unpredictable spikes. You need to improve the application’s scalability, performance, and cost efficiency, ensuring minimal downtime and fast response times during peak usage.
Which of the following strategies would provide the best solution for improving the application’s performance, scalability, and cost optimisation while maintaining high availability?
- Deploy a CloudFront CDN to cache static content, configure Auto Scaling based only on CPU utilisation, and use Amazon RDS read replicas to handle database query spikes.
- Configure Auto Scaling with a combination of target tracking scaling policies and instance fleets across multiple Availability Zones. Use EC2 Reserved Instances for baseline traffic and Spot Instances to handle traffic spikes.
- Use EC2 Auto Scaling with application load balancing, scale based on CPU utilisation and network throughput and deploy your instances in a single Availability Zone to reduce costs.
- Set up EC2 Auto Scaling in multiple regions, use Route 53 for latency-based routing, and scale instances based on CPU utilisation to handle regional traffic fluctuations.
Correct Answer: B
Explanation:
Option B is correct because it combines Auto Scaling with target tracking scaling policies and uses instance fleets across multiple Availability Zones. This ensures high availability and efficient scaling based on real-time demand. Using Reserved Instances for baseline traffic and Spot Instances for unpredictable spikes optimises both cost and performance, addressing both predictable and sudden traffic changes while minimising downtime.
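To illustrate the scaling piece, here is a minimal boto3 sketch of a target tracking policy on a hypothetical Auto Scaling group; the Reserved/Spot mix itself would be configured separately on the group's mixed instances policy:

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Target tracking keeps average CPU near 50% by adding or removing
# instances automatically as demand changes.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier-asg",  # hypothetical group name
    PolicyName="cpu-target-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```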
Question No: #12
Domain: Design for New Solutions (29% of scored content)
Main Topic: Design a strategy to meet reliability requirements.
Sub Topic: Application integration (for example, Amazon Simple Notification Service [Amazon SNS], Amazon Simple Queue Service [Amazon SQS], AWS Step Functions)
Question Text:
A travel agency uses AWS Step Functions to orchestrate a booking workflow consisting of multiple microservices. Each step in the workflow calls an external API, some of which experience intermittent failures. The workflow must retry failed API calls with exponential backoff but skip non-critical steps if they continue to fail. The system must also log details of failed steps for debugging purposes. How can this be achieved?
- Use AWS Lambda for all steps, configure retries in the Lambda function, and invoke another Step Functions workflow to handle skipped steps.
- Use the Parallel state to execute all steps concurrently and handle failures using an external monitoring tool.
- Use the Wait state to add delays between retries and set a Fail state to terminate the workflow for non-critical steps.
- Use the Retry field in the state definition to configure retry attempts and set a Catch block for each step to log failures and proceed.
Correct Answer: D
Explanation:
Option D is correct because Step Functions’ built-in Retry and Catch fields provide exponential backoff retries and error handling, allowing you to skip non-critical steps while logging failures.
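To show what this looks like in practice, here is a sketch of a single Task state with `Retry` and `Catch`, written as a Python dict in Amazon States Language (the Lambda ARN and state names are hypothetical):

```python
import json

# Exponential-backoff retries; if retries are exhausted, the Catch routes
# to a logging state and the workflow continues past the non-critical step.
flight_hold_state = {
    "Type": "Task",
    "Resource": "arn:aws:lambda:us-east-1:111122223333:function:HoldFlight",
    "Retry": [{
        "ErrorEquals": ["States.TaskFailed"],
        "IntervalSeconds": 2,   # first retry after 2s
        "BackoffRate": 2.0,     # then 4s, 8s, ...
        "MaxAttempts": 3,
    }],
    "Catch": [{
        "ErrorEquals": ["States.ALL"],
        "ResultPath": "$.error",      # keep error details for debugging
        "Next": "LogFailureAndSkip",  # log, then proceed with the workflow
    }],
    "Next": "ConfirmBooking",
}
print(json.dumps(flight_hold_state, indent=2))
```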
Question No: #13
Domain: Design for New Solutions
Main Topic: Design Data Storage Solutions
Sub-topic: Select appropriate data storage services to meet business and technical requirements.
Question Text
Your organisation has implemented a data lake on Amazon S3 for storing structured and unstructured data. Analysts require ad-hoc querying capabilities without provisioning servers. Additionally, the queries should be optimised for cost, with most queries performed using SQL. Data volumes grow by 20% every month, and performance is a key requirement.
Which service or approach best fulfils this need?
- Use Amazon Athena to query the S3 data lake directly.
- Set up Amazon EMR with Presto for ad-hoc querying.
- Transfer data to Amazon Redshift using AWS Glue, then query the data in Redshift.
- Deploy a self-managed Spark cluster on EC2 instances for ad-hoc querying.
Correct Answer: A
Explanation:
A is correct because Amazon Athena is a serverless query service that directly queries data in Amazon S3 using SQL. It’s cost-efficient for ad-hoc queries and requires no infrastructure management. Its scalability handles the growing data volume effectively.
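For example, a minimal boto3 sketch of an ad-hoc Athena query might look like this (database, table, and result bucket are hypothetical). Storing the data as partitioned, columnar files such as Parquet keeps per-query scan costs down as volumes grow:

```python
import boto3

athena = boto3.client("athena")

# Run an ad-hoc SQL query directly against the S3 data lake.
execution = athena.start_query_execution(
    QueryString="""
        SELECT event_date, COUNT(*) AS events
        FROM clickstream
        WHERE event_date >= date '2025-01-01'
        GROUP BY event_date
    """,
    QueryExecutionContext={"Database": "datalake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print(execution["QueryExecutionId"])
```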
Question No: #14
Domain: Design for New Solutions
Main Topic: Design Data Processing Solutions
Sub-topic: Design solutions for processing data in real time or batches.
Question Text
Your company manages IoT sensors that continuously send temperature data. The data must be processed in real-time to detect anomalies, such as abrupt temperature changes, and trigger alerts. You also need a scalable, fault-tolerant, and serverless solution that integrates easily with other AWS services.
Which solution is the most appropriate to meet the requirements?
- Use Amazon Kinesis Data Streams with AWS Lambda for real-time processing and alerting.
- Use Amazon SQS with AWS Lambda for message processing.
- Deploy an Apache Kafka cluster on Amazon EC2 for real-time streaming.
- Use AWS Glue for real-time data transformation and triggering alerts.
Correct Answer: A
Explanation:
A is correct because Amazon Kinesis Data Streams provides a scalable, serverless solution for ingesting and processing streaming data in real-time. Integrating with AWS Lambda allows event-driven processing and alert generation without managing infrastructure.
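As an illustration, a Lambda handler consuming a Kinesis batch might look like the sketch below. The anomaly rule and field names are hypothetical, in-memory state is for illustration only, and a production version would publish alerts to SNS or EventBridge rather than print them:

```python
import base64
import json

TEMP_JUMP_THRESHOLD = 5.0  # degrees; hypothetical anomaly rule
last_seen = {}             # sensor_id -> last temperature (illustrative only)

def handler(event, context):
    """Lambda handler invoked with a batch of Kinesis records."""
    for record in event["Records"]:
        # Kinesis payloads arrive base64-encoded.
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        sensor, temp = payload["sensorId"], payload["temperature"]

        previous = last_seen.get(sensor)
        if previous is not None and abs(temp - previous) > TEMP_JUMP_THRESHOLD:
            # In a real system, publish to SNS/EventBridge instead.
            print(f"ALERT: {sensor} jumped from {previous} to {temp}")
        last_seen[sensor] = temp
```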
Question No: #15
Domain: Design for New Solutions
Main Topic: Design Data Storage Solutions
Sub-topic: Design solutions to optimise storage costs, latency, and availability.
Question Text
Your company operates globally and requires an analytics system that processes customer behaviour data in near real-time. The data is ingested and queried by teams located in different regions. The solution must provide low latency for regional teams while ensuring a unified view of data globally.
What architecture design is most suitable for this use case?
- Use Amazon Redshift in a single region and replicate queries across regions.
- Use Amazon S3 for data storage, with Amazon Athena in each region for querying.
- Deploy Amazon DynamoDB Global Tables and analyse data using AWS Glue.
- Use Amazon OpenSearch Service (formerly Elasticsearch) with cross-cluster replication enabled.
Correct Answer: B
Explanation:
B is correct because Amazon S3 provides global durability and cost-effective storage, while Amazon Athena can be deployed in multiple regions to enable low-latency queries for regional teams. This ensures performance and scalability.
Design New Solutions (2)
Question No: #16
Domain: Design for New Solutions
Main Topic: Design Data Processing Solutions
Sub-topic: Design solutions for processing data in real-time or in batches.
Question Text:
A logistics company operates a fleet of IoT-enabled vehicles that transmit location and sensor data every second. The company needs a system to process this data in real-time for anomaly detection (e.g., sudden temperature changes in refrigerated trucks) and route optimisation. The system must scale to handle increasing data volumes as the fleet grows and ensure high availability across multiple regions.
Which solution architecture best meets these requirements?
- Use Amazon Kinesis Data Streams for ingestion, process data with AWS Lambda, and store results in Amazon DynamoDB Global Tables.
- Deploy Apache Kafka on Amazon EC2 instances across regions, process the data with Spark Streaming on Amazon EMR, and store results in Amazon S3.
- Use AWS Glue for ETL processing, Amazon SQS for message queuing, and Amazon Redshift for analytics.
- Implement an Amazon OpenSearch Service cluster with cross-cluster replication for ingesting and analysing the data.
Correct Answer: A
Explanation:
A is correct because Amazon Kinesis Data Streams is a fully managed service designed for real-time data streaming. It can handle large-scale data ingestion with low latency. AWS Lambda enables serverless processing, which scales automatically. Storing results in DynamoDB Global Tables ensures low-latency access and high availability across regions. This architecture meets scalability and real-time processing needs.
Question No: #17
Domain: Design for New Solutions
Main Topic: Design Secure Solutions
Sub-topic: Design a secure and compliant application architecture.
Question Text:
A financial services company needs to deploy a customer-facing application that processes sensitive transaction data. The application must comply with regulatory requirements for data sovereignty, ensuring data remains in specific regions. It also needs automatic failover and data replication to maintain high availability during disasters. Additionally, the architecture should minimise operational overhead while enforcing security best practices. Which solution architecture best meets these requirements?
- Use Amazon RDS with Multi-AZ deployment, encrypt data using AWS KMS, and replicate backups across regions using Amazon S3.
- Deploy Amazon Aurora Global Database with region-specific endpoints, encrypt data at rest using KMS, and configure automated failover between regions.
- Use DynamoDB with Global Tables to store transaction data and encrypt it using customer-managed KMS keys.
- Set up an Amazon Redshift cluster in a single region, replicate snapshots to other regions, and use AWS CloudHSM for encryption.
Correct Answer: B
Explanation:
B is correct because Aurora Global Database supports multi-region deployments with low-latency replication. It ensures compliance by keeping data isolated in specific regions while offering automatic failover for high availability. KMS encryption protects sensitive data, meeting security requirements.
Database
Question No: 18
Domain: Accelerate Workload Migration and Modernisation
Category: Database
Topic: Determine opportunities for modernisation and enhancements
Subtopic: Purpose-built databases (for example, DynamoDB, Amazon Aurora Serverless, ElastiCache)
Type: Multiple Choice
A retail company is modernising its e-commerce platform to improve scalability and performance. They need a solution for their relational database that can handle fluctuating traffic during peak sales events without requiring continuous server management. Additionally, the company wants to offer personalised recommendations to customers based on recent shopping trends and to store shopping cart data for fast access.
Which AWS service would best meet these requirements?
- Amazon RDS for MySQL
- Amazon Aurora Serverless
- Amazon DynamoDB
- Amazon Redshift
Correct Answer: B
Explanation:
Option B is correct because it provides an auto-scaling relational database that automatically adjusts capacity based on demand, ideal for handling varying traffic. It also reduces management overhead, which meets the company’s need for a serverless option during peak sales.
Question No: 19
Domain: Accelerate Workload Migration and Modernisation
Category: Database
Topic: Determine a new architecture for existing workloads
Subtopic: Databases (for example, Amazon DynamoDB, Amazon OpenSearch Service, Amazon RDS, self-managed databases on Amazon EC2)
Type: Multi-Response
Question:
A media company is migrating its on-premises workloads to AWS to enhance scalability, searchability, and performance. They have the following requirements:
- Store structured relational data with support for ACID transactions.
- Support complex queries and analytics on large datasets.
- Provide a fast, scalable search functionality for their media catalogue.
- Minimise the operational overhead of managing database servers.
Which combination of AWS services would best meet these requirements? (Select TWO)
- Use Amazon RDS for PostgreSQL to manage relational data with transactional support and to handle structured data requirements.
- Use Amazon DynamoDB to handle relational data with complex queries and ensure ACID compliance.
- Use Amazon OpenSearch Service to provide scalable, low-latency search functionality for the media catalogue.
- Use self-managed MySQL on Amazon EC2 to control database configuration and handle transaction-heavy relational data.
- Use Amazon Redshift to handle relational data and analytics, but without ACID compliance support.
Correct Answers: A and C
Explanation:
Option A is correct: Amazon RDS for PostgreSQL is a managed relational database that supports ACID transactions, which meets the requirements for storing structured relational data with transactional support and reducing operational overhead.
Option C is correct: Amazon OpenSearch Service provides a fully managed search solution with low-latency, scalable search capabilities, making it ideal for implementing a search function on the media catalogue.
Option B is incorrect: Amazon DynamoDB is a NoSQL database, which is less suitable for relational data that requires complex queries and ACID transactions.
Question No: 20
Domain: Design for New Solutions
Category: Database
Topic: Design a solution to meet performance objectives
Subtopic: Purpose-built databases
Type: Multiple Choice
Question:
A financial services company needs to design a high-performance solution for processing real-time transactions. The solution must support low-latency read and write operations for handling thousands of simultaneous transactions per second. Additionally, the company requires multi-region replication to ensure data availability and resilience.
Which AWS database service would be the best fit for this use case?
- Amazon RDS for PostgreSQL
- Amazon Redshift
- Amazon DynamoDB
- Amazon DocumentDB
Correct Answer: C
Explanation:
Option C (Amazon DynamoDB) is correct because it is a NoSQL database designed for high-velocity, low-latency transactions, which makes it ideal for real-time processing in financial services. It also supports multi-region replication, ensuring high availability and resilience across regions.
Migration and Transfer
Question No: 21
Domain: Accelerate Workload Migration and Modernisation
Category: Migration and Transfer
Topic: Determine the optimal migration approach for existing workloads
Subtopic: Application migration tools (for example, AWS Application Discovery Service, AWS Application Migration Service)
Type: Multiple Choice
Question:
An organisation plans to migrate its on-premises applications to AWS to improve agility and scalability. The company wants to assess its current environment to understand application dependencies, utilisation patterns, and resource configurations before migration. Additionally, they want a streamlined process for rehosting applications onto AWS with minimal downtime.
Which AWS services would best address these requirements?
- Use AWS Application Discovery for assessment and AWS Application Migration for minimal-downtime rehosting.
- Use AWS Database Migration for discovery and S3 Transfer Acceleration for faster data migration.
- Use AWS Migration Hub for automated application migration and monitoring.
- Use EC2 Auto Scaling for resource assessment and CloudFormation for IaC migration.
Correct Answer: A
Explanation:
Option A is correct: AWS Application Discovery Service helps assess on-premises environments by collecting data on application dependencies, utilisation, and configurations. AWS Application Migration Service then enables rehosting applications on AWS with minimal downtime.
Question No: 22
Domain: Accelerate Workload Migration and Modernisation
Category: Migration and Transfer
Topic: Determine the optimal migration approach for existing workloads
Subtopic: Data migration options and tools (for example, AWS DataSync, AWS Transfer Family, AWS Snow Family, S3 Transfer Acceleration)
Type: Multiple Choice
Question:
A large healthcare organisation needs to migrate petabytes of sensitive patient data from its on-premises data centre to Amazon S3. Due to network limitations, transferring such a large volume of data over the internet would take an unacceptable amount of time. Additionally, they require a secure solution that complies with healthcare data regulations.
Which AWS data migration tool would best meet these requirements?
- AWS DataSync
- AWS Transfer Family
- AWS Snow Family
- S3 Transfer Acceleration
Correct Answer: C
Explanation:
Option C (AWS Snow Family) is correct because it provides a secure, offline method for transferring large amounts of data (up to an exabyte scale) by physically shipping AWS Snowball or Snowmobile devices to the customer. This approach is ideal for scenarios with limited network capacity and meets security and compliance requirements, making it well-suited for sensitive healthcare data.
Question No: 23
Domain: Accelerate Workload Migration and Modernisation
Category: Migration and Transfer
Topic: Determine the optimal migration approach for existing workloads
Subtopic: Application migration tools (for example, AWS Application Discovery Service, AWS Application Migration Service)
Type: Multi-Response
Question:
A retail company is preparing to migrate its on-premises applications to AWS. They need to understand the current environment, including application dependencies and resource usage, to ensure a smooth transition. Additionally, they are looking for a tool that can facilitate a straightforward “lift-and-shift” migration to rehost applications with minimal changes.
Which combination of AWS tools would best suit these requirements? (Select TWO)
- Use AWS Application Discovery to analyse dependencies and usage patterns.
- Use AWS Application Migration to replicate and rehost with minimal changes.
- Use AWS Database Migration to handle application migration and dependencies.
- Use AWS CloudFormation to rehost applications with infrastructure as code.
- Use AWS Migration Hub to track application migration progress and dependencies.
Correct Answers: A and B
Explanation:
Option A (AWS Application Discovery Service) is correct because it helps in assessing the on-premises environment by discovering application dependencies, utilisation patterns, and configurations, which is essential for planning a smooth migration.
Option B (AWS Application Migration Service) is correct because it enables lift-and-shift migrations, allowing applications to be replicated and rehosted on AWS with minimal changes. This approach minimises downtime and accelerates migration.
Management and Governance
Question No: 24
Domain: Design for New Solutions
Category: Management and Governance
Topic: Design a deployment strategy to meet business requirements
Subtopic: Infrastructure as code (IaC) (for example, AWS CloudFormation)
Type: Multiple Choice
Question:
A global financial services company is designing a deployment strategy for its cloud infrastructure on AWS. The company needs to ensure that its resources, such as EC2 instances, RDS databases, and VPCs, are consistently deployed across multiple regions to meet business requirements for high availability, disaster recovery, and compliance. The company is also focused on automating the process to minimise manual intervention and prevent configuration drift.
Which of the following solutions using AWS CloudFormation best meets these objectives, considering the need for scalability, automation, and consistency across regions?
- Use AWS CloudFormation StackSets with cross-region deployment capabilities, leveraging AWS Organisations for centralised management of stacks across multiple accounts and regions.
- Create separate CloudFormation templates for each region and manually deploy them across regions, ensuring region-specific customisations are included in each template.
- Deploy CloudFormation templates using AWS Lambda functions triggered by CloudWatch Events, allowing for automated deployment across regions based on certain triggers, but without StackSets for centralised management.
- Use AWS Elastic Beanstalk to manage the deployment of infrastructure resources, with CloudFormation handling the underlying resources, but not leveraging the full potential of IaC for multi-region, multi-account management.
Correct Answer: A
Explanation:
Option A (Using AWS CloudFormation StackSets) is correct because StackSets allows you to deploy and manage CloudFormation stacks across multiple AWS accounts and regions from a central location. This solution addresses the need for high availability and disaster recovery by ensuring that the infrastructure is deployed consistently in multiple regions. StackSets integrates with AWS Organisations, which centralises management and simplifies the deployment process across various regions and accounts, reducing manual intervention and the risk of configuration drift.
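A hedged boto3 sketch of this pattern follows; the template URL and OU ID are placeholders, and service-managed permissions assume the call is made from the Organisations management (or delegated administrator) account:

```python
import boto3

cfn = boto3.client("cloudformation")

# Create the stack set once, using service-managed permissions so AWS
# Organisations handles the cross-account roles.
cfn.create_stack_set(
    StackSetName="baseline-network",
    TemplateURL="https://example-bucket.s3.amazonaws.com/baseline.yaml",
    PermissionModel="SERVICE_MANAGED",
    AutoDeployment={"Enabled": True, "RetainStacksOnAccountRemoval": False},
)

# Deploy identical stacks to every account in an OU, across several regions.
cfn.create_stack_instances(
    StackSetName="baseline-network",
    DeploymentTargets={"OrganizationalUnitIds": ["ou-abcd-11112222"]},
    Regions=["us-east-1", "eu-west-1", "ap-southeast-2"],
)
```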
Question No: 25
Domain: Design Solutions for Organisational Complexity
Category: Management and Governance
Topic: Prescribe security controls
Subtopic: AWS security, identity, and compliance tools (for example, AWS CloudTrail, AWS Identity and Access Management Access Analyzer, AWS Security Hub, Amazon Inspector)
Type: Multiple Choice
Question:
A healthcare organisation is building an application on AWS and needs to ensure that it meets strict compliance requirements, including identifying potential vulnerabilities and ensuring that access policies follow least privilege principles. They also want a centralised view of their security posture across AWS accounts to monitor compliance and quickly respond to security threats.
Which combination of AWS services would best meet these security and compliance requirements?
- Use Security Hub for a centralised security view and Inspector for vulnerability detection.
- Use CloudTrail for API monitoring and S3 for access log storage.
- Use IAM Access Analyzer to restrict permissions and Inspector for vulnerability scans.
- Use KMS for data encryption and CloudTrail to monitor security configurations.
Correct Answer: A
Explanation:
Option A is correct because AWS Security Hub provides a centralised view of security findings and monitors compliance across multiple AWS accounts, while Amazon Inspector automatically scans for vulnerabilities in EC2 instances and container images. This combination addresses the organisation’s needs for compliance, vulnerability assessment, and centralised security management.
Question No: 26
Domain: Continuous Improvement for Existing Solutions
Category: Management and Governance
Topic: Determine a strategy to improve security
Subtopic: Automated monitoring and remediation strategies (for example, AWS Config rules)
Type: Multi-Response
Question:
A media company uses multiple AWS accounts to host its applications and is concerned about ensuring compliance with security policies across all environments. They want to implement automated checks to monitor configurations, enforce best practices, and automatically remediate non-compliant resources, such as public access to S3 buckets and unencrypted EBS volumes.
Which combination of AWS services and strategies would best achieve these objectives? (Select TWO)
- Use GuardDuty for real-time detection and remediation of non-compliant configurations.
- Use Systems Manager Automation for patching and resource management across accounts.
- Use AWS Config for custom compliance rules and automatic remediation.
- Use Config Conformance Packs to apply best-practice compliance rules across accounts.
- Use AWS CloudTrail to monitor user actions and ensure configuration compliance.
Correct Answers: C and D
Explanation:
Option C (AWS Config with custom rules) is correct because AWS Config can be used to create custom rules to monitor specific compliance requirements, such as ensuring S3 buckets are not publicly accessible and that EBS volumes are encrypted. AWS Config can also trigger AWS Systems Manager or Lambda to automatically remediate non-compliant resources.
Option D (AWS Config Conformance Packs) is correct as it allows deploying a set of AWS Config rules as a package, which simplifies monitoring compliance across multiple accounts and ensures alignment with security best practices. Conformance Packs are particularly useful for enforcing consistent policies across all environments.
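As a small illustration, the sketch below registers two AWS managed Config rules matching the scenario; wiring up automatic remediation would be a follow-on call such as `put_remediation_configurations`:

```python
import boto3

config = boto3.client("config")

# Managed rule: flag S3 buckets that allow public read access.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "s3-no-public-read",
        "Source": {
            "Owner": "AWS",
            "SourceIdentifier": "S3_BUCKET_PUBLIC_READ_PROHIBITED",
        },
    }
)

# Managed rule: flag unencrypted EBS volumes.
config.put_config_rule(
    ConfigRule={
        "ConfigRuleName": "ebs-encrypted",
        "Source": {"Owner": "AWS", "SourceIdentifier": "ENCRYPTED_VOLUMES"},
    }
)
```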
Frontend Web and Mobile
Question No: 27
Domain: Design for New Solutions
Category: Frontend Web and Mobile
Subtopic: AWS Device Farm
Type: Multiple Choice
Question:
A financial services company is developing a mobile banking app that needs to work seamlessly across a wide variety of mobile devices and operating system versions. They want to ensure the app performs well on different devices and provides a consistent user experience. The team also wants to automate testing to identify any issues early in the development process.
Which solution using AWS Device Farm would best help the company achieve its testing goals?
- Use Device Farm’s virtual environment for cost-effective simulated testing.
- Use Device Farm with CloudWatch for continuous performance monitoring in production.
- Use Device Farm to automate testing on real devices for functionality, performance, and compatibility.
- Use Device Farm for manual testing on a limited set of devices, relying on user feedback for broader compatibility.
Correct Answer: C
Explanation:
Option C (Automated testing on real devices with AWS Device Farm) is correct because AWS Device Farm allows for testing on a wide range of real mobile devices, helping ensure compatibility, performance, and a consistent user experience across different devices and OS versions. Automated testing enables the team to identify issues early and improves the efficiency of the testing process.
Question No: 28
Domain: Design for New Solutions
Category: Frontend Web and Mobile
Topic: Frontend Web and Mobile
Subtopic: Amazon API Gateway
Type: Multiple Choice
Question:
An e-commerce company is developing a new frontend web application to provide a seamless shopping experience across desktop and mobile devices. The application must interact with multiple backend microservices, including services for product recommendations, user profiles, and order processing. The company has chosen Amazon API Gateway to manage these interactions, but wants to optimise costs and ensure that only authenticated users can access sensitive endpoints, such as the checkout and user profile services.
Which configuration using Amazon API Gateway best meets the company’s requirements for security and cost optimisation?
- Enable API Gateway Caching with Low TTL (Time-to-Live)
- Set Up IAM Authorisation for All API Endpoints
- Use Amazon Cognito with Resource Policies and Configure Rate Limiting.
- Enable Lambda Authorizers with Throttling
Correct Answer: C
Explanation:
Option C (Using Amazon Cognito with Resource Policies and Rate Limiting) is correct because Amazon Cognito provides scalable user authentication, ensuring that only authorised users access sensitive endpoints. Resource policies restrict access to specific IP ranges, adding an extra layer of security, and rate limiting helps control costs by managing the request load on backend services.
Question No: 29
Domain: Design for New Solutions
Category: Frontend Web and Mobile
Topic: Frontend Web and Mobile
Subtopic: AWS Amplify
Type: Multiple Choice
Question:
A startup is building a mobile and web application that requires quick deployment and a streamlined development workflow. They want to integrate authentication, APIs, and storage easily into the app without managing backend infrastructure. The team also needs a solution that supports real-time data synchronisation between users across multiple devices.
Which AWS service and approach would best meet these requirements?
- Use CodePipeline for deployments, integrating services like Cognito for auth and RDS for storage.
- Use EC2 for a custom backend with auth and APIs, and AppSync for real-time sync.
- Use S3 for storage, Lambda for auth, and DynamoDB for real-time sync.
- Use Amplify to manage the backend, add auth, APIs, storage, and real-time sync.
Correct Answer: D
Explanation:
Option D (AWS Amplify) is correct because AWS Amplify provides a fully managed backend solution tailored for mobile and web applications. It supports quick integration of services like authentication, APIs, and storage and includes built-in support for real-time data synchronisation, which aligns with the startup’s requirements for a streamlined workflow and minimal backend management.
Machine Learning
Question No: 30
Domain: Design Solutions for Organisational Complexity
Category: Machine Learning
Topic: Machine Learning
Subtopic: Amazon Comprehend
Type: Multiple Choice
Question:
A global e-commerce company wants to analyse customer feedback from multiple regions to identify recurring issues and customer sentiment. The data consists of customer reviews in multiple languages, and the company needs a solution that can detect key phrases, sentiments, and language in the feedback. The team also wants to automate the process of categorising feedback based on specific topics, such as “delivery,” “product quality,” and “customer service.”
Which approach using Amazon Comprehend would best help the company achieve these goals?
- Use Translate to English, then Comprehend for sentiment and key phrases.
- Use Comprehend for language detection, then manually categorise by sentiment.
- Use Lex for language detection and topic categorisation.
- Use Comprehend for sentiment, language detection, key phrases, and custom entity categorisation.
Correct Answer: D
Explanation:
Option D is correct because Amazon Comprehend provides built-in features for sentiment analysis, language detection, and key phrase extraction. Additionally, it allows for custom entity recognition, which can be used to categorise feedback into specific topics, meeting the company’s requirements.
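To illustrate, here is a minimal boto3 sketch that chains language detection, sentiment analysis, and key-phrase extraction on one review. The sample text is hypothetical, and the topic categorisation step would additionally use a custom entity recogniser trained on labelled feedback:

```python
import boto3

comprehend = boto3.client("comprehend")
review = "La livraison a pris deux semaines et le produit était cassé."

# Detect the review language first, then reuse it for the other calls.
lang = comprehend.detect_dominant_language(Text=review)
code = lang["Languages"][0]["LanguageCode"]  # e.g. "fr"

sentiment = comprehend.detect_sentiment(Text=review, LanguageCode=code)
phrases = comprehend.detect_key_phrases(Text=review, LanguageCode=code)

print(code, sentiment["Sentiment"],
      [p["Text"] for p in phrases["KeyPhrases"]])
```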
Question No: 31
Domain: Design Solutions for Organisational Complexity
Category: Machine Learning
Topic: Machine Learning
Subtopic: Amazon Polly
Type: Multiple Choice
Question:
An e-learning platform wants to enhance its courses by adding audio narration for improved accessibility and user engagement. They are considering Amazon Polly to convert their written content into speech. However, they face challenges due to the diverse accents, languages, and tones preferred by their international user base. The platform also needs to ensure that the audio outputs sound as natural as possible, given the wide variety of course subjects.
Which of the following Amazon Polly features would best help the platform address these challenges?
- Speech Marks – Provides the timing of speech elements, allowing precise synchronisation with visual content.
- Neural Text-to-Speech (NTTS) – Offers high-quality, natural-sounding voices that can improve engagement across various languages and accents.
- Custom Lexicons – Allows adding new words or specific pronunciations to meet subject-specific terminology needs.
- Speech Synthesis Markup Language (SSML) – Enables customisation of speech output, such as controlling pauses and emphasising certain words.
Correct Answer: B
Explanation:
Option B (Neural Text-to-Speech, or NTTS) is correct because NTTS provides Amazon Polly’s most advanced, natural-sounding voices, which are essential for delivering high-quality audio narration to an international audience. This feature enhances the expressiveness and realism of voices, meeting the platform’s need for engaging, natural audio across various languages and accents.
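For example, requesting the neural engine in boto3 is a one-parameter change, as in this small sketch (the voice and text are arbitrary):

```python
import boto3

polly = boto3.client("polly")

# Request the neural engine and a voice that supports it.
response = polly.synthesize_speech(
    Engine="neural",
    VoiceId="Joanna",
    OutputFormat="mp3",
    Text="Welcome to Module 3: an introduction to thermodynamics.",
)
with open("narration.mp3", "wb") as f:
    f.write(response["AudioStream"].read())
```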
Question No: 32
Domain: Design Solutions for Organisational Complexity
Category: Machine Learning
Topic: Machine Learning
Subtopic: Amazon Lex
Type: Multiple Choice
Question:
A telecommunications company is developing a customer support chatbot to handle common inquiries, such as checking account balances, troubleshooting service issues, and providing information about new plans. The chatbot must understand natural language input from users and respond accurately while also maintaining seamless integration with the company’s existing customer service systems.
Which approach using Amazon Lex would best enable the company to achieve these objectives?
- Use Lex for a static FAQ with scripted responses, without intent understanding.
- Use Lex to build a chatbot with intent recognition and CRM integration for customer data.
- Use Lex for a chatbot responding only to predefined phrases for accuracy.
- Use Lex to create a chatbot requiring strict command formats to avoid misinterpretation.
Correct Answer: B
Explanation:
Option B (Using Amazon Lex to understand user intents and integrate with CRM) is correct because Amazon Lex is designed for building conversational interfaces that leverage natural language understanding (NLU) to interpret user inputs accurately. Integrating with the company’s CRM allows the chatbot to access customer data dynamically and provide personalised responses, enhancing the user experience.
To Conclude
Hope these questions have given you a brief overview of the question pattern and the key question segments in the AWS CSA Professional exam. We also have many other resources that can help you prepare for your AWS certification. Our practice tests, video courses, hands-on labs, and sandboxes are all curated to meet your needs and match the objectives of the AWS Certified Solutions Architect Professional exam. Get started now and make your preparation hassle-free, smooth, and well-equipped.