SAP 241-270

244

A company has developed a hybrid solution between its data center and AWS. The company uses Amazon VPC and Amazon EC2 instances that send application logs to Amazon CloudWatch. The EC2 instances read data from multiple relational databases that are hosted on premises.

The company wants to monitor which EC2 instances are connected to the databases in near-real time. The company already has a monitoring solution that uses Splunk on premises. A solutions architect needs to determine how to send networking traffic to Splunk.

How should the solutions architect meet these requirements?

A. Enable VPC flow logs, and send them to CloudWatch. Create an AWS Lambda function to periodically export the CloudWatch logs to an Amazon S3 bucket by using the pre-defined export function. Generate ACCESS_KEY and SECRET_KEY AWS credentials. Configure Splunk to pull the logs from the S3 bucket by using those credentials.

B. Create an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination. Configure a pre-processing AWS Lambda function with a Kinesis Data Firehose stream processor that extracts individual log events from records sent by CloudWatch Logs subscription filters. Enable VPC flow logs, and send them to CloudWatch. Create a CloudWatch Logs subscription that sends log events to the Kinesis Data Firehose delivery stream.

C. Ask the company to log every request that is made to the databases along with the EC2 instance IP address. Export the CloudWatch logs to an Amazon S3 bucket. Use Amazon Athena to query the logs grouped by database name. Export Athena results to another S3 bucket. Invoke an AWS Lambda function to automatically send any new file that is put in the S3 bucket to Splunk.

D. Send the CloudWatch logs to an Amazon Kinesis data stream with Amazon Kinesis Data Analytics for SQL Applications. Configure a 1-minute sliding window to collect the events. Create a SQL query that uses the anomaly detection template to monitor any networking traffic anomalies in near-real time. Send the result to an Amazon Kinesis Data Firehose delivery stream with Splunk as the destination.

👉

B : Tooling: for near-real-time delivery, use Kinesis Data Firehose; for real-time processing, use Kinesis Data Streams.

Architecture: the overall requirement is to analyze VPC Flow Logs.

  1. Enable VPC Flow Logs and send them to CloudWatch Logs.
  2. Create a Kinesis Data Firehose delivery stream with Splunk as the destination, and subscribe it to the CloudWatch Logs log group.
  3. Attach a Lambda function to the delivery stream to pre-process the records before they are sent to Splunk (see the sketch after the reference below).

REF: https://docs.aws.amazon.com/firehose/latest/dev/vpc-splunk-tutorial.html
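The pre-processing Lambda from step 3 could look like the minimal Python sketch below. It assumes the standard Firehose data-transformation contract (base64-encoded records, with CloudWatch Logs subscription payloads arriving as gzip-compressed JSON) and simply re-emits each VPC Flow Log line as newline-delimited text for Splunk; the handler name and output format are illustrative assumptions, and the tutorial linked above uses the ready-made CloudWatch Logs processor blueprint instead.

```python
# Minimal sketch of a Firehose data-transformation Lambda that unpacks the
# gzip-compressed CloudWatch Logs payloads produced by a subscription filter
# and emits one flow-log event per line for Splunk. Names are illustrative.
import base64
import gzip
import json


def lambda_handler(event, context):
    output = []
    for record in event["records"]:
        # Each Firehose record wraps a gzip-compressed CloudWatch Logs payload.
        payload = gzip.decompress(base64.b64decode(record["data"]))
        message = json.loads(payload)

        if message["messageType"] == "CONTROL_MESSAGE":
            # Subscription-filter health checks carry no log data.
            output.append({"recordId": record["recordId"], "result": "Dropped"})
            continue

        # Extract the individual VPC Flow Log events and re-encode them as
        # newline-delimited text that Splunk can index directly.
        events = "\n".join(e["message"] for e in message["logEvents"]) + "\n"
        output.append(
            {
                "recordId": record["recordId"],
                "result": "Ok",
                "data": base64.b64encode(events.encode("utf-8")).decode("utf-8"),
            }
        )
    return {"records": output}
```

The sketch keeps only the core decode/extract path; the AWS blueprint additionally handles re-ingestion when a transformed batch exceeds the Firehose size limits.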

252

A company uses an Amazon Aurora PostgreSQL DB cluster for applications in a single AWS Region. The company's database team must monitor all data activity on all the databases.

Which solution will achieve this goal?

A. Set up an AWS Database Migration Service (AWS DMS) change data capture (CDC) task. Specify the Aurora DB cluster as the source. Specify Amazon Kinesis Data Firehose as the target. Use Kinesis Data Firehose to upload the data into an Amazon OpenSearch Service cluster for further analysis.

B. Start a database activity stream on the Aurora DB cluster to capture the activity stream in Amazon EventBridge. Define an AWS Lambda function as a target for EventBridge. Program the Lambda function to decrypt the messages from EventBridge and to publish all database activity to Amazon S3 for further analysis.

C. Start a database activity stream on the Aurora DB cluster to push the activity stream to an Amazon Kinesis data stream. Configure Amazon Kinesis Data Firehose to consume the Kinesis data stream and to deliver the data to Amazon S3 for further analysis.

D. Set up an AWS Database Migration Service (AWS DMS) change data capture (CDC) task. Specify the Aurora DB cluster as the source. Specify Amazon Kinesis Data Firehose as the target. Use Kinesis Data Firehose to upload the data into an Amazon Redshift cluster. Run queries on the Amazon Redshift data to determine database activities on the Aurora database.

👉

C : Aurora natively provides Database Activity Streams, and the activity stream can be pushed to an Amazon Kinesis data stream; Kinesis Data Firehose can then consume that stream and deliver the events to Amazon S3 for analysis.

REF: https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/DBActivityStreams.Monitoring.html
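For illustration, a minimal boto3 sketch of option C's wiring: start the activity stream on the cluster (RDS creates the backing Kinesis data stream and returns its name), then create a Firehose delivery stream that consumes it and delivers to S3. All ARNs, names, and the IAM role below are placeholders, not values from the question.

```python
# Sketch: start an Aurora database activity stream and wire the auto-created
# Kinesis data stream into a Firehose delivery stream that lands in S3.
import boto3

rds = boto3.client("rds")
firehose = boto3.client("firehose")

CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:aurora-pg-cluster"
KMS_KEY_ID = "alias/aurora-das-key"  # activity streams are always KMS-encrypted
FIREHOSE_ROLE_ARN = "arn:aws:iam::123456789012:role/firehose-das-role"
BUCKET_ARN = "arn:aws:s3:::das-audit-bucket"

# 1. Start the activity stream on the DB cluster; RDS creates a Kinesis data
#    stream and returns its name in the response.
resp = rds.start_activity_stream(
    ResourceArn=CLUSTER_ARN,
    Mode="async",  # 'sync' trades throughput for stricter auditing guarantees
    KmsKeyId=KMS_KEY_ID,
    ApplyImmediately=True,
)
stream_name = resp["KinesisStreamName"]
region = boto3.session.Session().region_name
kinesis_stream_arn = f"arn:aws:kinesis:{region}:123456789012:stream/{stream_name}"

# 2. Create a Firehose delivery stream that consumes the Kinesis data stream
#    and delivers the activity events to S3 for further analysis.
firehose.create_delivery_stream(
    DeliveryStreamName="aurora-das-to-s3",
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": kinesis_stream_arn,
        "RoleARN": FIREHOSE_ROLE_ARN,
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": FIREHOSE_ROLE_ARN,
        "BucketARN": BUCKET_ARN,
        "Prefix": "aurora-activity-stream/",
    },
)
```

Note that activity-stream events remain encrypted with the chosen KMS key, so whatever analyzes the S3 objects still has to decrypt them (typically with the AWS Encryption SDK).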