Resource: awsAppflowFlow
Provides an AppFlow flow resource.
Example Usage
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as aws from "./.gen/providers/aws";
const awsS3BucketExampleDestination = new aws.s3Bucket.S3Bucket(
  this,
  "example_destination",
  {
    bucket: "example_destination",
  }
);
const awsS3BucketExampleSource = new aws.s3Bucket.S3Bucket(
  this,
  "example_source",
  {
    bucket: "example_source",
  }
);
new aws.s3Object.S3Object(this, "example", {
  bucket: awsS3BucketExampleSource.id,
  key: "example_source.csv",
  source: "example_source.csv",
});
const dataAwsIamPolicyDocumentExampleDestination =
  new aws.dataAwsIamPolicyDocument.DataAwsIamPolicyDocument(
    this,
    "example_destination_3",
    {
      statement: [
        {
          actions: [
            "s3:PutObject",
            "s3:AbortMultipartUpload",
            "s3:ListMultipartUploadParts",
            "s3:ListBucketMultipartUploads",
            "s3:GetBucketAcl",
            "s3:PutObjectAcl",
          ],
          effect: "Allow",
          principals: [
            {
              identifiers: ["appflow.amazonaws.com"],
              type: "Service",
            },
          ],
          resources: [
            "arn:aws:s3:::example_destination",
            "arn:aws:s3:::example_destination/*",
          ],
          sid: "AllowAppFlowDestinationActions",
        },
      ],
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
dataAwsIamPolicyDocumentExampleDestination.overrideLogicalId(
  "example_destination"
);
const dataAwsIamPolicyDocumentExampleSource =
  new aws.dataAwsIamPolicyDocument.DataAwsIamPolicyDocument(
    this,
    "example_source_4",
    {
      statement: [
        {
          actions: ["s3:ListBucket", "s3:GetObject"],
          effect: "Allow",
          principals: [
            {
              identifiers: ["appflow.amazonaws.com"],
              type: "Service",
            },
          ],
          resources: [
            "arn:aws:s3:::example_source",
            "arn:aws:s3:::example_source/*",
          ],
          sid: "AllowAppFlowSourceActions",
        },
      ],
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
dataAwsIamPolicyDocumentExampleSource.overrideLogicalId("example_source");
const awsS3BucketPolicyExampleDestination =
  new aws.s3BucketPolicy.S3BucketPolicy(this, "example_destination_5", {
    bucket: awsS3BucketExampleDestination.id,
    policy: dataAwsIamPolicyDocumentExampleDestination.json,
  });
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsS3BucketPolicyExampleDestination.overrideLogicalId("example_destination");
const awsS3BucketPolicyExampleSource = new aws.s3BucketPolicy.S3BucketPolicy(
  this,
  "example_source_6",
  {
    bucket: awsS3BucketExampleSource.id,
    policy: dataAwsIamPolicyDocumentExampleSource.json,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsS3BucketPolicyExampleSource.overrideLogicalId("example_source");
const awsAppflowFlowExample = new aws.appflowFlow.AppflowFlow(
  this,
  "example_7",
  {
    destinationFlowConfig: [
      {
        connectorType: "S3",
        destinationConnectorProperties: {
          s3: {
            bucketName: awsS3BucketPolicyExampleDestination.bucket,
            s3OutputFormatConfig: {
              prefixConfig: {
                prefixType: "PATH",
              },
            },
          },
        },
      },
    ],
    name: "example",
    sourceFlowConfig: {
      connectorType: "S3",
      sourceConnectorProperties: {
        s3: {
          bucketName: awsS3BucketPolicyExampleSource.bucket,
          bucketPrefix: "example",
        },
      },
    },
    task: [
      {
        connectorOperator: [
          {
            s3: "NO_OP",
          },
        ],
        destinationField: "exampleField",
        sourceFields: ["exampleField"],
        taskType: "Map",
      },
    ],
    triggerConfig: {
      triggerType: "OnDemand",
    },
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
awsAppflowFlowExample.overrideLogicalId("example");
Argument Reference
The following arguments are supported:
name - (Required) Name of the flow.
destinationFlowConfig - (Required) A Destination Flow Config that controls how Amazon AppFlow places data in the destination connector.
sourceFlowConfig - (Required) The Source Flow Config that controls how Amazon AppFlow retrieves data from the source connector.
task - (Required) A Task that Amazon AppFlow performs while transferring the data in the flow run.
triggerConfig - (Required) A Trigger that determines how and when the flow runs.
description - (Optional) Description of the flow you want to create.
kmsArn - (Optional) ARN (Amazon Resource Name) of the Key Management Service (KMS) key you provide for encryption. This is required if you do not want to use the Amazon AppFlow-managed KMS key. If you don't provide anything here, Amazon AppFlow uses the Amazon AppFlow-managed KMS key.
tags - (Optional) Key-value mapping of resource tags. If configured with a provider defaultTags configuration block present, tags with matching keys will overwrite those defined at the provider-level.
tagsAll - Map of tags assigned to the resource, including those inherited from the provider defaultTags configuration block.
Destination Flow Config
connectorType - (Required) Type of connector, such as Salesforce, Amplitude, and so on. Valid values are salesforce, singular, slack, redshift, s3, marketo, googleanalytics, zendesk, servicenow, datadog, trendmicro, snowflake, dynatrace, infornexus, amplitude, veeva, eventBridge, lookoutMetrics, upsolver, honeycode, customerProfiles, sapoData, and customConnector.
destinationConnectorProperties - (Required) This stores the information that is required to query a particular connector. See Destination Connector Properties for more information.
apiVersion - (Optional) API version that the destination connector uses.
connectorProfileName - (Optional) Name of the connector profile. This name must be unique for each connector profile in the AWS account.
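For illustration, a minimal sketch of a destinationFlowConfig entry that targets Salesforce through a named connector profile. The profile name, API version, and Salesforce object are placeholders, the remaining required flow arguments are omitted (as in the scheduled-trigger example later on this page), and the enum casing follows the main example above rather than the lowercased values listed here:
new aws.appflowFlow.AppflowFlow(this, "salesforce_destination_sketch", {
  // name, sourceFlowConfig, task, and triggerConfig omitted for brevity.
  destinationFlowConfig: [
    {
      connectorType: "Salesforce",
      connectorProfileName: "example-salesforce-profile", // hypothetical profile
      apiVersion: "58.0", // hypothetical API version
      destinationConnectorProperties: {
        salesforce: {
          object: "Account", // hypothetical Salesforce object
        },
      },
    },
  ],
});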
Destination Connector Properties
customConnector - (Optional) Properties that are required to query the custom connector. See Custom Connector Destination Properties for more details.
customerProfiles - (Optional) Properties that are required to query Amazon Connect Customer Profiles. See Customer Profiles Destination Properties for more details.
eventBridge - (Optional) Properties that are required to query Amazon EventBridge. See Generic Destination Properties for more details.
honeycode - (Optional) Properties that are required to query Amazon Honeycode. See Generic Destination Properties for more details.
marketo - (Optional) Properties that are required to query Marketo. See Generic Destination Properties for more details.
redshift - (Optional) Properties that are required to query Amazon Redshift. See Redshift Destination Properties for more details.
s3 - (Optional) Properties that are required to query Amazon S3. See S3 Destination Properties for more details.
salesforce - (Optional) Properties that are required to query Salesforce. See Salesforce Destination Properties for more details.
sapoData - (Optional) Properties that are required to query SAPOData. See SAPOData Destination Properties for more details.
snowflake - (Optional) Properties that are required to query Snowflake. See Snowflake Destination Properties for more details.
upsolver - (Optional) Properties that are required to query Upsolver. See Upsolver Destination Properties for more details.
zendesk - (Optional) Properties that are required to query Zendesk. See Zendesk Destination Properties for more details.
Generic Destination Properties
EventBridge, Honeycode, and Marketo destination properties all support the following attributes:
object - (Required) Object specified in the flow destination.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
Custom Connector Destination Properties
entityName - (Required) Entity specified in the custom connector as a destination in the flow.
customProperties - (Optional) Custom properties that are specific to the connector when it's used as a destination in the flow. Maximum of 50 items.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the custom connector as destination. See Error Handling Config for more details.
idFieldNames - (Optional) Name of the field that Amazon AppFlow uses as an ID when performing a write operation such as update, delete, or upsert.
writeOperationType - (Optional) Type of write operation to be performed in the custom connector when it's used as destination. Valid values are insert, upsert, update, and delete.
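As a sketch, a destination properties object that could be assigned to destinationConnectorProperties.customConnector. The entity, properties, and bucket names are hypothetical, idFieldNames is shown as a list, and the enum casing follows the style of the main example rather than the lowercased values listed above:
const customConnectorDestination = {
  entityName: "orders", // hypothetical entity exposed by the connector
  customProperties: {
    region: "eu-west-1", // hypothetical connector-specific property
  },
  idFieldNames: ["orderId"], // hypothetical ID field used for upserts
  writeOperationType: "UPSERT",
  errorHandlingConfig: {
    bucketName: "example-appflow-errors", // hypothetical bucket for failed records
    failOnFirstDestinationError: false,
  },
};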
Customer Profiles Destination Properties
domainName - (Required) Unique name of the Amazon Connect Customer Profiles domain.
objectTypeName - (Optional) Object specified in the Amazon Connect Customer Profiles flow destination.
Redshift Destination Properties
intermediateBucketName - (Required) Intermediate bucket that Amazon AppFlow uses when moving data into Amazon Redshift.
object - (Required) Object specified in the Amazon Redshift flow destination.
bucketPrefix - (Optional) Object key for the bucket in which Amazon AppFlow places the destination files.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
S3 Destination Properties
bucketName - (Required) Amazon S3 bucket name in which Amazon AppFlow places the transferred data.
bucketPrefix - (Optional) Object key for the bucket in which Amazon AppFlow places the destination files.
s3OutputFormatConfig - (Optional) Configuration that determines how Amazon AppFlow should format the flow output data when Amazon S3 is used as the destination. See S3 Output Format Config for more details.
S3 Output Format Config
aggregationConfig - (Optional) Aggregation settings that you can use to customize the output format of your flow data. See Aggregation Config for more details.
fileType - (Optional) File type that Amazon AppFlow places in the Amazon S3 bucket. Valid values are csv, json, and parquet.
prefixConfig - (Optional) Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date. See Prefix Config for more details.
preserveSourceDataTyping - (Optional, Boolean) Whether the data types from the source system need to be preserved (only valid for the parquet file type).
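For instance, the S3 destination from the main example could declare an explicit output format. This is a sketch only: the value combination is illustrative and the enum casing follows the main example above (e.g. "PATH") rather than the lowercased values listed here:
const s3OutputFormat = {
  fileType: "PARQUET",
  preserveSourceDataTyping: true, // keep source data types in the Parquet output
  prefixConfig: {
    prefixType: "PATH_AND_FILENAME",
    prefixFormat: "DAY", // partition folders by day
  },
  aggregationConfig: {
    aggregationType: "SingleFile", // write one aggregated file per flow run
  },
};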
Salesforce Destination Properties
object - (Required) Object specified in the flow destination.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
idFieldNames - (Optional) Name of the field that Amazon AppFlow uses as an ID when performing a write operation such as update or delete.
writeOperationType - (Optional) This specifies the type of write operation to be performed in Salesforce. When the value is upsert, then idFieldNames is required. Valid values are insert, upsert, update, and delete.
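A sketch of a Salesforce destination that upserts on a hypothetical external ID field and routes failed records to a placeholder error bucket (this also exercises the Error Handling Config block documented further below; enum casing again follows the main example):
const salesforceDestination = {
  object: "Contact",
  idFieldNames: ["External_Id__c"], // hypothetical external ID field
  writeOperationType: "UPSERT",
  errorHandlingConfig: {
    bucketName: "example-appflow-errors", // hypothetical bucket
    bucketPrefix: "salesforce",
    failOnFirstDestinationError: false,
  },
};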
SAPOData Destination Properties
objectPath - (Required) Object path specified in the SAPOData flow destination.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
idFieldNames - (Optional) Name of the field that Amazon AppFlow uses as an ID when performing a write operation such as update or delete.
successResponseHandlingConfig - (Optional) Determines how Amazon AppFlow handles the success response that it gets from the connector after placing data. See Success Response Handling Config for more details.
writeOperation - (Optional) Possible write operations in the destination connector. When this value is not provided, this defaults to the insert operation. Valid values are insert, upsert, update, and delete.
Success Response Handling Config
bucketName - (Optional) Name of the Amazon S3 bucket.
bucketPrefix - (Optional) Amazon S3 bucket prefix.
Snowflake Destination Properties
intermediateBucketName - (Required) Intermediate bucket that Amazon AppFlow uses when moving data into Snowflake.
object - (Required) Object specified in the Snowflake flow destination.
bucketPrefix - (Optional) Object key for the bucket in which Amazon AppFlow places the destination files.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
Upsolver Destination Properties
bucketName - (Required) Upsolver Amazon S3 bucket name in which Amazon AppFlow places the transferred data. This must begin with upsolver-appflow.
bucketPrefix - (Optional) Object key for the Upsolver Amazon S3 bucket in which Amazon AppFlow places the destination files.
s3OutputFormatConfig - (Optional) Configuration that determines how Amazon AppFlow should format the flow output data when Upsolver is used as the destination. See Upsolver S3 Output Format Config for more details.
Upsolver S3 Output Format Config
aggregationConfig - (Optional) Aggregation settings that you can use to customize the output format of your flow data. See Aggregation Config for more details.
fileType - (Optional) File type that Amazon AppFlow places in the Upsolver Amazon S3 bucket. Valid values are csv, json, and parquet.
prefixConfig - (Optional) Determines the prefix that Amazon AppFlow applies to the folder name in the Amazon S3 bucket. You can name folders according to the flow frequency and date. See Prefix Config for more details.
Aggregation Config
aggregationType - (Optional) Whether Amazon AppFlow aggregates the flow records into a single file, or leaves them unaggregated. Valid values are none and singleFile.
Prefix Config
prefixFormat - (Optional) Determines the level of granularity that's included in the prefix. Valid values are year, month, day, hour, and minute.
prefixType - (Optional) Determines the format of the prefix, and whether it applies to the file name, file path, or both. Valid values are filename, path, and PATH_AND_FILENAME.
Zendesk Destination Properties
object - (Required) Object specified in the flow destination.
errorHandlingConfig - (Optional) Settings that determine how Amazon AppFlow handles an error when placing data in the destination. See Error Handling Config for more details.
idFieldNames - (Optional) Name of the field that Amazon AppFlow uses as an ID when performing a write operation such as update or delete.
writeOperationType - (Optional) This specifies the type of write operation to be performed in Zendesk. When the value is upsert, then idFieldNames is required. Valid values are insert, upsert, update, and delete.
Error Handling Config
bucketName - (Optional) Name of the Amazon S3 bucket.
bucketPrefix - (Optional) Amazon S3 bucket prefix.
failOnFirstDestinationError - (Optional, boolean) Whether the flow should fail after the first instance of a failure when attempting to place data in the destination.
Source Flow Config
connectorType - (Required) Type of connector, such as Salesforce, Amplitude, and so on. Valid values are salesforce, singular, slack, redshift, s3, marketo, googleanalytics, zendesk, servicenow, datadog, trendmicro, snowflake, dynatrace, infornexus, amplitude, veeva, eventBridge, lookoutMetrics, upsolver, honeycode, customerProfiles, sapoData, and customConnector.
sourceConnectorProperties - (Required) Information that is required to query a particular source connector. See Source Connector Properties for details.
apiVersion - (Optional) API version that the source connector uses.
connectorProfileName - (Optional) Name of the connector profile. This name must be unique for each connector profile in the AWS account.
incrementalPullConfig - (Optional) Defines the configuration for a scheduled incremental data pull. If a valid configuration is provided, the fields specified in the configuration are used when querying for the incremental data pull. See Incremental Pull Config for more details.
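A sketch of a sourceFlowConfig that reads from Marketo (a generic source that only needs an object) and enables incremental pulls keyed on a hypothetical timestamp field. The profile name, object, and field name are placeholders, the other required flow arguments are omitted, and the enum casing follows the main example:
new aws.appflowFlow.AppflowFlow(this, "marketo_source_sketch", {
  // name, destinationFlowConfig, task, and triggerConfig omitted for brevity.
  sourceFlowConfig: {
    connectorType: "Marketo",
    connectorProfileName: "example-marketo-profile", // hypothetical profile
    incrementalPullConfig: {
      datetimeTypeFieldName: "updatedAt", // hypothetical timestamp field
    },
    sourceConnectorProperties: {
      marketo: {
        object: "leads", // hypothetical Marketo object
      },
    },
  },
});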
Source Connector Properties
amplitude - (Optional) Information that is required for querying Amplitude. See Generic Source Properties for more details.
customConnector - (Optional) Properties that are applied when the custom connector is being used as a source. See Custom Connector Source Properties for more details.
datadog - (Optional) Information that is required for querying Datadog. See Generic Source Properties for more details.
dynatrace - (Optional) Information that is required for querying Dynatrace. See Generic Source Properties for more details.
inforNexus - (Optional) Information that is required for querying Infor Nexus. See Generic Source Properties for more details.
marketo - (Optional) Information that is required for querying Marketo. See Generic Source Properties for more details.
s3 - (Optional) Information that is required for querying Amazon S3. See S3 Source Properties for more details.
salesforce - (Optional) Information that is required for querying Salesforce. See Salesforce Source Properties for more details.
sapoData - (Optional) Information that is required for querying SAPOData as a flow source. See SAPOData Source Properties for more details.
serviceNow - (Optional) Information that is required for querying ServiceNow. See Generic Source Properties for more details.
singular - (Optional) Information that is required for querying Singular. See Generic Source Properties for more details.
slack - (Optional) Information that is required for querying Slack. See Generic Source Properties for more details.
trendMicro - (Optional) Information that is required for querying Trend Micro. See Generic Source Properties for more details.
veeva - (Optional) Information that is required for querying Veeva. See Veeva Source Properties for more details.
zendesk - (Optional) Information that is required for querying Zendesk. See Generic Source Properties for more details.
Generic Source Properties
Amplitude, Datadog, Dynatrace, Google Analytics, Infor Nexus, Marketo, ServiceNow, Singular, Slack, Trend Micro, and Zendesk source properties all support the following attributes:
object - (Required) Object specified in the flow source.
Custom Connector Source Properties
entityName - (Required) Entity specified in the custom connector as a source in the flow.
customProperties - (Optional) Custom properties that are specific to the connector when it's used as a source in the flow. Maximum of 50 items.
S3 Source Properties
bucketName - (Required) Amazon S3 bucket name where the source files are stored.
bucketPrefix - (Optional) Object key for the Amazon S3 bucket in which the source files are stored.
s3InputFormatConfig - (Optional) When you use Amazon S3 as the source, the format in which you provide the flow input data. See S3 Input Format Config for details.
S3 Input Format Config
s3InputFileType - (Optional) File type that Amazon AppFlow gets from your Amazon S3 bucket. Valid values are csv and json.
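For example, the S3 source from the main example could declare its input format explicitly. A sketch using the same placeholder bucket; the file-type casing follows the style of the main example:
const s3Source = {
  bucketName: "example_source",
  bucketPrefix: "example",
  s3InputFormatConfig: {
    s3InputFileType: "CSV",
  },
};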
Salesforce Source Properties
object - (Required) Object specified in the Salesforce flow source.
enableDynamicFieldUpdate - (Optional, boolean) Flag that enables dynamic fetching of new (recently added) fields in the Salesforce objects while running a flow.
includeDeletedRecords - (Optional, boolean) Whether Amazon AppFlow includes deleted files in the flow run.
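For example, a Salesforce source properties object (the Salesforce object name is a placeholder) that could be assigned to sourceConnectorProperties.salesforce:
const salesforceSource = {
  object: "Opportunity", // hypothetical Salesforce object
  enableDynamicFieldUpdate: true, // pick up recently added fields automatically
  includeDeletedRecords: false,
};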
SAPOData Source Properties
objectPath - (Required) Object path specified in the SAPOData flow source.
Veeva Source Properties
object - (Required) Object specified in the Veeva flow source.
documentType - (Optional) Document type specified in the Veeva document extract flow.
includeAllVersions - (Optional, boolean) Whether to include all versions of files in the Veeva document extract flow.
includeRenditions - (Optional, boolean) Whether to include file renditions in the Veeva document extract flow.
includeSourceFiles - (Optional, boolean) Whether to include source files in the Veeva document extract flow.
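A rough sketch of a Veeva document-extract source properties object; the object and document type values are hypothetical:
const veevaSource = {
  object: "documents", // hypothetical Veeva object
  documentType: "Promotional", // hypothetical document type
  includeAllVersions: false,
  includeRenditions: true,
  includeSourceFiles: true,
};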
Incremental Pull Config
datetimeTypeFieldName - (Optional) Field that specifies the date time or timestamp field as the criteria to use when importing incremental records from the source.
Task
sourceFields - (Required) Source fields to which a particular task is applied.
taskType - (Required) Particular task implementation that Amazon AppFlow performs. Valid values are arithmetic, filter, map, mapAll, mask, merge, passthrough, truncate, and validate.
connectorOperator - (Optional) Operation to be performed on the provided source fields. See Connector Operator for details.
destinationField - (Optional) Field in a destination connector, or a field value against which Amazon AppFlow validates a source field.
taskProperties - (Optional) Map used to store task-related information. The execution service looks for particular information based on the taskType. Valid keys are value, values, DATA_TYPE, UPPER_BOUND, LOWER_BOUND, SOURCE_DATA_TYPE, DESTINATION_DATA_TYPE, VALIDATION_ACTION, MASK_VALUE, MASK_LENGTH, TRUNCATE_LENGTH, MATH_OPERATION_FIELDS_ORDER, CONCAT_FORMAT, SUBFIELD_CATEGORY_MAP, and EXCLUDE_SOURCE_FIELDS_LIST.
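In addition to the NO_OP mapping task in the main example, a task entry can carry taskProperties. A sketch of a masking task over a hypothetical Salesforce field (field name, mask length, and value casing are illustrative, following the style of the main example):
const maskTask = {
  taskType: "Mask",
  sourceFields: ["ssn"], // hypothetical sensitive field
  destinationField: "ssn",
  connectorOperator: [
    {
      salesforce: "MASK_LAST_N",
    },
  ],
  taskProperties: {
    MASK_LENGTH: "4", // mask the last four characters
  },
};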
Connector Operator
amplitude - (Optional) Operation to be performed on the provided Amplitude source fields. The only valid value is between.
customConnector - (Optional) Operators supported by the custom connector. Valid values are projection, LESS_THAN, GREATER_THAN, contains, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
datadog - (Optional) Operation to be performed on the provided Datadog source fields. Valid values are projection, between, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
dynatrace - (Optional) Operation to be performed on the provided Dynatrace source fields. Valid values are projection, between, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
googleAnalytics - (Optional) Operation to be performed on the provided Google Analytics source fields. Valid values are projection and between.
inforNexus - (Optional) Operation to be performed on the provided Infor Nexus source fields. Valid values are projection, between, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
marketo - (Optional) Operation to be performed on the provided Marketo source fields. Valid values are projection, between, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
s3 - (Optional) Operation to be performed on the provided Amazon S3 source fields. Valid values are projection, LESS_THAN, GREATER_THAN, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
salesforce - (Optional) Operation to be performed on the provided Salesforce source fields. Valid values are projection, LESS_THAN, GREATER_THAN, contains, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
sapoData - (Optional) Operation to be performed on the provided SAPOData source fields. Valid values are projection, LESS_THAN, GREATER_THAN, contains, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
serviceNow - (Optional) Operation to be performed on the provided ServiceNow source fields. Valid values are projection, LESS_THAN, GREATER_THAN, contains, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
singular - (Optional) Operation to be performed on the provided Singular source fields. Valid values are projection, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
slack - (Optional) Operation to be performed on the provided Slack source fields. Valid values are projection, LESS_THAN, GREATER_THAN, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
trendmicro - (Optional) Operation to be performed on the provided Trend Micro source fields. Valid values are projection, EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
veeva - (Optional) Operation to be performed on the provided Veeva source fields. Valid values are projection, LESS_THAN, GREATER_THAN, contains, between, LESS_THAN_OR_EQUAL_TO, GREATER_THAN_OR_EQUAL_TO, EQUAL_TO, NOT_EQUAL_TO, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
zendesk - (Optional) Operation to be performed on the provided Zendesk source fields. Valid values are projection, GREATER_THAN, addition, multiplication, division, subtraction, MASK_ALL, MASK_FIRST_N, MASK_LAST_N, VALIDATE_NON_NULL, VALIDATE_NON_ZERO, VALIDATE_NON_NEGATIVE, VALIDATE_NUMERIC, and NO_OP.
Trigger Config
triggerType - (Required) Type of flow trigger. Valid values are scheduled, event, and onDemand.
triggerProperties - (Optional) Configuration details of a schedule-triggered flow as defined by the user. Currently, these settings only apply to the scheduled trigger type. See Scheduled Trigger Properties for details.
Scheduled Trigger Properties
The triggerProperties block only supports one attribute: scheduled, a block which in turn supports the following:
scheduleExpression - (Required) Scheduling expression that determines the rate at which the schedule will run, for example rate(5Minutes).
dataPullMode - (Optional) Whether a scheduled flow has an incremental data transfer or a complete data transfer for each flow run. Valid values are incremental and complete.
firstExecutionFrom - (Optional) Date range for the records to import from the connector in the first flow run. Must be a valid RFC3339 timestamp.
scheduleEndTime - (Optional) Scheduled end time for a schedule-triggered flow. Must be a valid RFC3339 timestamp.
scheduleOffset - (Optional) Optional offset that is added to the time interval for a schedule-triggered flow. Maximum value of 36000.
scheduleStartTime - (Optional) Scheduled start time for a schedule-triggered flow. Must be a valid RFC3339 timestamp.
timezone - (Optional) Time zone used when referring to the date and time of a schedule-triggered flow, such as America/New_York.
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as aws from "./.gen/providers/aws";
new aws.appflowFlow.AppflowFlow(this, "example", {
  // name, sourceFlowConfig, destinationFlowConfig, and task omitted for brevity.
  triggerConfig: {
    triggerType: "Scheduled",
    triggerProperties: {
      scheduled: [
        {
          scheduleExpression: "rate(1minutes)",
        },
      ],
    },
  },
});
Attributes Reference
In addition to all arguments above, the following attributes are exported:
arn - Flow's ARN.
Import
AppFlow flows can be imported using the arn, e.g.:
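For example, with a placeholder Region, account ID, and flow name:
terraform import aws_appflow_flow.example arn:aws:appflow:us-east-1:123456789012:flow/example-flow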