Resource: awsKinesisAnalyticsApplication
Provides a Kinesis Analytics Application resource. Kinesis Analytics is a managed service that allows processing and analyzing streaming data using standard SQL.
For more details, see the Amazon Kinesis Analytics Documentation.
-> Note: To manage Amazon Kinesis Data Analytics for Apache Flink applications, use the awsKinesisanalyticsv2Application resource.
Example Usage
Kinesis Stream Input
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as aws from "./.gen/providers/aws";
const awsKinesisStreamTestStream = new aws.kinesisStream.KinesisStream(
  this,
  "test_stream",
  {
    name: "terraform-kinesis-test",
    shardCount: 1,
  }
);
new aws.kinesisAnalyticsApplication.KinesisAnalyticsApplication(
  this,
  "test_application",
  {
    inputs: {
      kinesisStream: {
        resourceArn: awsKinesisStreamTestStream.arn,
        roleArn: "${aws_iam_role.test.arn}",
      },
      namePrefix: "test_prefix",
      parallelism: {
        count: 1,
      },
      schema: {
        recordColumns: [
          {
            mapping: "$.test",
            name: "test",
            sqlType: "VARCHAR(8)",
          },
        ],
        recordEncoding: "UTF-8",
        recordFormat: {
          mappingParameters: {
            json: {
              recordRowPath: "$",
            },
          },
        },
      },
    },
    name: "kinesis-analytics-application-test",
  }
);
Starting An Application
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as aws from "./.gen/providers/aws";
const awsCloudwatchLogGroupExample =
  new aws.cloudwatchLogGroup.CloudwatchLogGroup(this, "example", {
    name: "analytics",
  });
const awsCloudwatchLogStreamExample =
  new aws.cloudwatchLogStream.CloudwatchLogStream(this, "example_1", {
    logGroupName: awsCloudwatchLogGroupExample.name,
    name: "example-kinesis-application",
  });
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsCloudwatchLogStreamExample.overrideLogicalId("example");
const awsKinesisFirehoseDeliveryStreamExample =
  new aws.kinesisFirehoseDeliveryStream.KinesisFirehoseDeliveryStream(
    this,
    "example_2",
    {
      destination: "extended_s3",
      extendedS3Configuration: {
        bucketArn: "${aws_s3_bucket.example.arn}",
        roleArn: "${aws_iam_role.example.arn}",
      },
      name: "example-kinesis-delivery-stream",
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsKinesisFirehoseDeliveryStreamExample.overrideLogicalId("example");
const awsKinesisStreamExample = new aws.kinesisStream.KinesisStream(
  this,
  "example_3",
  {
    name: "example-kinesis-stream",
    shardCount: 1,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsKinesisStreamExample.overrideLogicalId("example");
new aws.kinesisAnalyticsApplication.KinesisAnalyticsApplication(this, "test", {
  cloudwatchLoggingOptions: {
    logStreamArn: awsCloudwatchLogStreamExample.arn,
    roleArn: "${aws_iam_role.example.arn}",
  },
  inputs: {
    kinesisStream: {
      resourceArn: awsKinesisStreamExample.arn,
      roleArn: "${aws_iam_role.example.arn}",
    },
    namePrefix: "example_prefix",
    schema: {
      recordColumns: [
        {
          name: "COLUMN_1",
          sqlType: "INTEGER",
        },
      ],
      recordFormat: {
        mappingParameters: {
          csv: {
            recordColumnDelimiter: ",",
            recordRowDelimiter: "|",
          },
        },
      },
    },
    startingPositionConfiguration: [
      {
        startingPosition: "NOW",
      },
    ],
  },
  name: "example-application",
  outputs: [
    {
      kinesisFirehose: {
        resourceArn: awsKinesisFirehoseDeliveryStreamExample.arn,
        roleArn: "${aws_iam_role.example.arn}",
      },
      name: "OUTPUT_1",
      schema: {
        recordFormatType: "CSV",
      },
    },
  ],
  startApplication: true,
});
Argument Reference
The following arguments are supported:
name - (Required) Name of the Kinesis Analytics Application.
code - (Optional) SQL Code to transform input data and generate output.
description - (Optional) Description of the application.
cloudwatchLoggingOptions - (Optional) The CloudWatch log stream options to monitor application errors. See CloudWatch Logging Options below for more details.
inputs - (Optional) Input configuration of the application. See Inputs below for more details.
outputs - (Optional) Output destination configuration of the application. See Outputs below for more details.
referenceDataSources - (Optional) An S3 Reference Data Source for the application. See Reference Data Sources below for more details.
startApplication - (Optional) Whether to start or stop the Kinesis Analytics Application. To start an application, an input with a defined startingPosition must be configured. To modify an application's starting position, first stop the application by setting startApplication = false, then update startingPosition and set startApplication = true.
tags - (Optional) Key-value map of tags for the Kinesis Analytics Application. If configured with a provider defaultTags configuration block present, tags with matching keys will overwrite those defined at the provider level.
CloudWatch Logging Options
Configure a CloudWatch Log Stream to monitor application errors.
The cloudwatchLoggingOptions
block supports the following:
logStreamArn - (Required) The ARN of the CloudWatch Log Stream.
roleArn - (Required) The ARN of the IAM Role used to send application messages.
Inputs
Configure an Input for the Kinesis Analytics Application. Only one Input may be configured.
The inputs
block supports the following:
namePrefix - (Required) The Name Prefix to use when creating an in-application stream.
schema - (Required) The Schema format of the data in the streaming source. See Source Schema below for more details.
kinesisFirehose - (Optional) The Kinesis Firehose configuration for the streaming source. Conflicts with kinesisStream. See Kinesis Firehose below for more details.
kinesisStream - (Optional) The Kinesis Stream configuration for the streaming source. Conflicts with kinesisFirehose. See Kinesis Stream below for more details.
parallelism - (Optional) The number of Parallel in-application streams to create. See Parallelism below for more details.
processingConfiguration - (Optional) The Processing Configuration to transform records as they are received from the stream. See Processing Configuration below for more details.
startingPositionConfiguration - (Optional) The point at which the application starts processing records from the streaming source. See Starting Position Configuration below for more details.
Outputs
Configure Output destinations for the Kinesis Analytics Application. You can have a maximum of 3 destinations configured.
The outputs
block supports the following:
name - (Required) The Name of the in-application stream.
schema - (Required) The Schema format of the data written to the destination. See Destination Schema below for more details.
kinesisFirehose - (Optional) The Kinesis Firehose configuration for the destination stream. Conflicts with kinesisStream. See Kinesis Firehose below for more details.
kinesisStream - (Optional) The Kinesis Stream configuration for the destination stream. Conflicts with kinesisFirehose. See Kinesis Stream below for more details.
lambda - (Optional) The Lambda function destination. See Lambda below for more details.
Reference Data Sources
Add a Reference Data Source to the Kinesis Analytics Application. You can only have 1 Reference Data Source.
The referenceDataSources
block supports the following:
schema - (Required) The Schema format of the data in the streaming source. See Source Schema below for more details.
tableName - (Required) The in-application Table Name.
s3 - (Optional) The S3 configuration for the reference data source. See S3 Reference below for more details.
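As an illustrative sketch only (the bucket ARN, file key, role ARN, and resource names below are placeholders, not values from this page), a referenceDataSources block might be wired up like this:

```typescript
// Hypothetical sketch: attach an S3 reference data source to an application.
// The referenced S3 bucket, file key, and IAM role are placeholders.
new aws.kinesisAnalyticsApplication.KinesisAnalyticsApplication(this, "ref_example", {
  name: "kinesis-analytics-reference-example",
  referenceDataSources: {
    tableName: "REFERENCE_TABLE", // in-application table name
    s3: {
      bucketArn: "${aws_s3_bucket.example.arn}",
      fileKey: "reference-data.csv",
      roleArn: "${aws_iam_role.example.arn}",
    },
    schema: {
      recordColumns: [
        {
          name: "COLUMN_1",
          sqlType: "VARCHAR(16)",
        },
      ],
      recordFormat: {
        mappingParameters: {
          csv: {
            recordColumnDelimiter: ",",
            recordRowDelimiter: "\n",
          },
        },
      },
    },
  },
});
```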
Kinesis Firehose
Configuration for a Kinesis Firehose delivery stream.
The kinesisFirehose
block supports the following:
resourceArn - (Required) The ARN of the Kinesis Firehose delivery stream.
roleArn - (Required) The ARN of the IAM Role used to access the stream.
Kinesis Stream
Configuration for a Kinesis Stream.
The kinesisStream
block supports the following:
resourceArn - (Required) The ARN of the Kinesis Stream.
roleArn - (Required) The ARN of the IAM Role used to access the stream.
Destination Schema
The Schema format of the data in the destination.
The schema
block supports the following:
recordFormatType - (Required) The Format Type of the records on the output stream. Can be CSV or JSON.
Source Schema
The Schema format of the data in the streaming source.
The schema
block supports the following:
recordColumns - (Required) The Record Column mapping for the streaming source data element. See Record Columns below for more details.
recordFormat - (Required) The Record Format and mapping information to schematize a record. See Record Format below for more details.
recordEncoding - (Optional) The Encoding of the record in the streaming source.
Parallelism
Configures the number of Parallel in-application streams to create.
The parallelism
block supports the following:
count - (Required) The Count of streams.
Processing Configuration
The Processing Configuration to transform records as they are received from the stream.
The processingConfiguration
block supports the following:
lambda - (Required) The Lambda function configuration. See Lambda below for more details.
Lambda
The Lambda function that pre-processes records in the stream.
The lambda
block supports the following:
resourceArn - (Required) The ARN of the Lambda function.
roleArn - (Required) The ARN of the IAM Role used to access the Lambda function.
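As a sketch of how a pre-processing Lambda fits into an input (the Lambda function ARN, IAM role ARNs, stream ARN, and names below are placeholders, not values from this page):

```typescript
// Hypothetical sketch: a Lambda pre-processor on the application's input.
// All ARNs and names below are placeholders.
new aws.kinesisAnalyticsApplication.KinesisAnalyticsApplication(this, "preprocessed", {
  name: "kinesis-analytics-preprocess-example",
  inputs: {
    namePrefix: "example_prefix",
    kinesisStream: {
      resourceArn: "${aws_kinesis_stream.example.arn}",
      roleArn: "${aws_iam_role.example.arn}",
    },
    // Each record is passed through the Lambda function before
    // it reaches the in-application stream.
    processingConfiguration: {
      lambda: {
        resourceArn: "${aws_lambda_function.preprocessor.arn}",
        roleArn: "${aws_iam_role.example.arn}",
      },
    },
    schema: {
      recordColumns: [
        {
          name: "COLUMN_1",
          sqlType: "INTEGER",
        },
      ],
      recordFormat: {
        mappingParameters: {
          csv: {
            recordColumnDelimiter: ",",
            recordRowDelimiter: "\n",
          },
        },
      },
    },
  },
});
```

Note the schema describes the records as they look after Lambda pre-processing, not as they arrive on the source stream.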
Starting Position Configuration
The point at which the application reads from the streaming source.
The startingPositionConfiguration
block supports the following:
startingPosition - (Required) The starting position on the stream. Valid values: LAST_STOPPED_POINT, NOW, TRIM_HORIZON.
Record Columns
The Column mapping of each data element in the streaming source to the corresponding column in the in-application stream.
The recordColumns
block supports the following:
name - (Required) Name of the column.
sqlType - (Required) The SQL Type of the column.
mapping - (Optional) The Mapping reference to the data element.
Record Format
The Record Format and relevant mapping information that should be applied to schematize the records on the stream.
The recordFormat
block supports the following:
recordFormatType - (Required) The type of Record Format. Can be CSV or JSON.
mappingParameters - (Optional) The Mapping Information for the record format. See Mapping Parameters below for more details.
Mapping Parameters
Provides Mapping information specific to the record format on the streaming source.
The mappingParameters
block supports the following:
csv - (Optional) Mapping information when the record format uses delimiters. See CSV Mapping Parameters below for more details.
json - (Optional) Mapping information when JSON is the record format on the streaming source. See JSON Mapping Parameters below for more details.
CSV Mapping Parameters
Mapping information when the record format uses delimiters.
The csv
block supports the following:
recordColumnDelimiter - (Required) The Column Delimiter.
recordRowDelimiter - (Required) The Row Delimiter.
JSON Mapping Parameters
Mapping information when JSON is the record format on the streaming source.
The json
block supports the following:
recordRowPath - (Required) Path to the top-level parent that contains the records.
S3 Reference
Identifies the S3 bucket and object that contains the reference data.
The s3
block supports the following:
bucketArn - (Required) The S3 Bucket ARN.
fileKey - (Required) The File Key name containing reference data.
roleArn - (Required) The IAM Role ARN to read the data.
Attributes Reference
In addition to all arguments above, the following attributes are exported:
id - The ARN of the Kinesis Analytics Application.
arn - The ARN of the Kinesis Analytics Application.
createTimestamp - The Timestamp when the application version was created.
lastUpdateTimestamp - The Timestamp when the application was last updated.
status - The Status of the application.
version - The Version of the application.
tagsAll - A map of tags assigned to the resource, including those inherited from the provider defaultTags configuration block.
Import
A Kinesis Analytics Application can be imported using its ARN, e.g.,
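For illustration (the region, account ID, and application name in the ARN are placeholders):

```console
$ terraform import aws_kinesis_analytics_application.example arn:aws:kinesisanalytics:us-west-2:123456789012:application/example
```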