Merged
8 changes: 4 additions & 4 deletions packages/@aws-cdk/aws-scheduler-targets-alpha/README.md
@@ -31,7 +31,7 @@ The following targets are supported:
6. `targets.EventBridgePutEvents`: [Put Events on EventBridge](#send-events-to-an-eventbridge-event-bus)
7. `targets.InspectorStartAssessmentRun`: [Start an Amazon Inspector assessment run](#start-an-amazon-inspector-assessment-run)
8. `targets.KinesisStreamPutRecord`: [Put a record to an Amazon Kinesis Data Stream](#put-a-record-to-an-amazon-kinesis-data-stream)
- 9. `targets.KinesisDataFirehosePutRecord`: [Put a record to an Amazon Data Firehose](#put-a-record-to-an-amazon-data-firehose)
+ 9. `targets.FirehosePutRecord`: [Put a record to an Amazon Data Firehose](#put-a-record-to-an-amazon-data-firehose)
10. `targets.CodePipelineStartPipelineExecution`: [Start a CodePipeline execution](#start-a-codepipeline-execution)
11. `targets.SageMakerStartPipelineExecution`: [Start a SageMaker pipeline execution](#start-a-sagemaker-pipeline-execution)
12. `targets.Universal`: [Invoke a wider set of AWS API](#invoke-a-wider-set-of-aws-api)
@@ -254,13 +254,13 @@ new Schedule(this, 'Schedule', {

## Put a record to an Amazon Data Firehose

- Use the `KinesisDataFirehosePutRecord` target to put a record to an Amazon Data Firehose delivery stream.
+ Use the `FirehosePutRecord` target to put a record to an Amazon Data Firehose delivery stream.

The code snippet below creates a schedule that EventBridge Scheduler invokes every hour,
putting a record with a custom payload to the delivery stream target.

```ts
- import * as firehose from '@aws-cdk/aws-kinesisfirehose-alpha';
+ import * as firehose from 'aws-cdk-lib/aws-kinesisfirehose';
declare const deliveryStream: firehose.IDeliveryStream;

const payload = {
@@ -269,7 +269,7 @@ const payload = {

new Schedule(this, 'Schedule', {
schedule: ScheduleExpression.rate(Duration.minutes(60)),
- target: new targets.KinesisDataFirehosePutRecord(deliveryStream, {
+ target: new targets.FirehosePutRecord(deliveryStream, {
input: ScheduleTargetInput.fromObject(payload),
}),
});
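```

The options exercised by the tests in this change (an explicit `role`, a `deadLetterQueue`, `retryAttempts`, and `maxEventAge`) can be combined on the same target. A hedged sketch, not part of the original README; construct names and property availability are assumed to follow `ScheduleTargetBaseProps` as used in the test file:

```ts
import * as firehose from 'aws-cdk-lib/aws-kinesisfirehose';
import * as sqs from 'aws-cdk-lib/aws-sqs';
declare const deliveryStream: firehose.IDeliveryStream;

// Dead-letter queue for events the scheduler fails to deliver.
const dlq = new sqs.Queue(this, 'DeadLetterQueue');

new Schedule(this, 'ScheduleWithRetry', {
  schedule: ScheduleExpression.rate(Duration.minutes(60)),
  target: new targets.FirehosePutRecord(deliveryStream, {
    deadLetterQueue: dlq,
    retryAttempts: 5,               // rejected when out of the allowed limits
    maxEventAge: Duration.hours(3), // must be between 1 minute and 1 day
  }),
});
```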
@@ -6,7 +6,7 @@ import { ScheduleTargetBase, ScheduleTargetBaseProps } from './target';
/**
* Use an Amazon Data Firehose as a target for AWS EventBridge Scheduler.
*/
- export class KinesisDataFirehosePutRecord extends ScheduleTargetBase implements IScheduleTarget {
+ export class FirehosePutRecord extends ScheduleTargetBase implements IScheduleTarget {
constructor(
private readonly deliveryStream: IDeliveryStream,
props: ScheduleTargetBaseProps = {},
2 changes: 1 addition & 1 deletion packages/@aws-cdk/aws-scheduler-targets-alpha/lib/index.ts
@@ -2,7 +2,7 @@ export * from './codebuild-start-build';
export * from './codepipeline-start-pipeline-execution';
export * from './event-bridge-put-events';
export * from './inspector-start-assessment-run';
- export * from './kinesis-data-firehose-put-record';
+ export * from './firehose-put-record';
export * from './kinesis-stream-put-record';
export * from './lambda-invoke';
export * from './sage-maker-start-pipeline-execution';
@@ -5,7 +5,7 @@ import { AccountRootPrincipal, Role } from 'aws-cdk-lib/aws-iam';
import * as firehose from 'aws-cdk-lib/aws-kinesisfirehose';
import * as sqs from 'aws-cdk-lib/aws-sqs';
import { Construct } from 'constructs';
- import { KinesisDataFirehosePutRecord } from '../lib';
+ import { FirehosePutRecord } from '../lib';

describe('schedule target', () => {
let app: App;
@@ -46,8 +46,8 @@ describe('schedule target', () => {
});
});

- test('creates IAM role and IAM policy for kinesis data firehose target in the same account', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream);
+ test('creates IAM role and IAM policy for Amazon Data Firehose target in the same account', () => {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream);

new Schedule(stack, 'MyScheduleDummy', {
schedule: expr,
@@ -119,7 +119,7 @@ describe('schedule target', () => {
assumedBy: new AccountRootPrincipal(),
});

- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
role: targetExecutionRole,
});

@@ -157,7 +157,7 @@ });
});

test('reuses IAM role and IAM policy for two schedules with the same target from the same account', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream);
+ const firehoseTarget = new FirehosePutRecord(firehoseStream);

new Schedule(stack, 'MyScheduleDummy1', {
schedule: expr,
@@ -218,7 +218,7 @@ });
});

test('creates IAM role and IAM policy for two schedules with the same target but different groups', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream);
+ const firehoseTarget = new FirehosePutRecord(firehoseStream);
const group = new Group(stack, 'Group', {
groupName: 'mygroup',
});
@@ -311,7 +311,7 @@ describe('schedule target', () => {
destination: mockS3Destination,
});

- const firehoseTarget = new KinesisDataFirehosePutRecord(anotherFirehose);
+ const firehoseTarget = new FirehosePutRecord(anotherFirehose);

new Schedule(stack, 'MyScheduleDummy', {
schedule: expr,
@@ -349,7 +349,7 @@ });
test('creates IAM policy for imported role for firehose in the same account', () => {
const importedRole = Role.fromRoleArn(stack, 'ImportedRole', 'arn:aws:iam::123456789012:role/someRole');

- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
role: importedRole,
});

@@ -398,7 +398,7 @@ });
});
const importedRole = Role.fromRoleArn(stack, 'ImportedRole', 'arn:aws:iam::123456789012:role/someRole');

- const firehoseTarget = new KinesisDataFirehosePutRecord(anotherFirehose, {
+ const firehoseTarget = new FirehosePutRecord(anotherFirehose, {
role: importedRole,
});

@@ -438,7 +438,7 @@ });
test('adds permissions to execution role for sending messages to DLQ', () => {
const dlq = new sqs.Queue(stack, 'DummyDeadLetterQueue');

- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
deadLetterQueue: dlq,
});

@@ -473,7 +473,7 @@ });
test('adds permission to execution role when imported DLQ is in same account', () => {
const importedQueue = sqs.Queue.fromQueueArn(stack, 'ImportedQueue', 'arn:aws:sqs:us-east-1:123456789012:queue1');

- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
deadLetterQueue: importedQueue,
});

@@ -504,7 +504,7 @@ });
});

test('renders expected retry policy', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
retryAttempts: 5,
maxEventAge: Duration.hours(3),
});
@@ -531,7 +531,7 @@ });
});

test('throws when retry policy max age is more than 1 day', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
maxEventAge: Duration.days(3),
});

@@ -543,7 +543,7 @@ });
});

test('throws when retry policy max age is less than 1 minute', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
maxEventAge: Duration.seconds(59),
});

@@ -555,7 +555,7 @@ });
});

test('throws when retry policy max retry attempts is out of the allowed limits', () => {
- const firehoseTarget = new KinesisDataFirehosePutRecord(firehoseStream, {
+ const firehoseTarget = new FirehosePutRecord(firehoseStream, {
retryAttempts: 200,
});


Some generated files are not rendered by default. Learn more about how customized files appear on GitHub.

Large diffs are not rendered by default.

@@ -3,11 +3,11 @@ import { AwsApiCall, ExpectedResult, IntegTest } from '@aws-cdk/integ-tests-alpha';
import * as cdk from 'aws-cdk-lib';
import * as firehose from 'aws-cdk-lib/aws-kinesisfirehose';
import { Bucket } from 'aws-cdk-lib/aws-s3';
- import { KinesisDataFirehosePutRecord } from '../lib';
+ import { FirehosePutRecord } from '../lib';

/*
* Stack verification steps:
- * A record is put to the kinesis data firehose stream by the scheduler
+ * A record is put to the Amazon Data Firehose stream by the scheduler
* Firehose delivers the record to the S3 bucket
* The assertion checks there is an object in the S3 bucket
*/
@@ -38,7 +38,7 @@ const firehoseStream = new firehose.DeliveryStream(stack, 'MyFirehoseStream', {

new scheduler.Schedule(stack, 'Schedule', {
schedule: scheduler.ScheduleExpression.rate(cdk.Duration.minutes(1)),
- target: new KinesisDataFirehosePutRecord(firehoseStream, {
+ target: new FirehosePutRecord(firehoseStream, {
input: scheduler.ScheduleTargetInput.fromObject(payload),
}),
});