azurermDataFactoryTriggerBlobEvent
Manages a Blob Event Trigger inside an Azure Data Factory.
Example Usage
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
const azurermResourceGroupExample = new azurerm.resourceGroup.ResourceGroup(
  this,
  "example",
  {
    location: "West Europe",
    name: "example-resources",
  }
);
const azurermStorageAccountExample = new azurerm.storageAccount.StorageAccount(
  this,
  "example_1",
  {
    account_replication_type: "LRS",
    account_tier: "Standard",
    location: azurermResourceGroupExample.location,
    name: "example",
    resource_group_name: azurermResourceGroupExample.name,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermStorageAccountExample.overrideLogicalId("example");
const azurermDataFactoryExample = new azurerm.dataFactory.DataFactory(
  this,
  "example_2",
  {
    location: azurermResourceGroupExample.location,
    name: "example",
    resource_group_name: azurermResourceGroupExample.name,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryExample.overrideLogicalId("example");
const azurermDataFactoryPipelineExample =
  new azurerm.dataFactoryPipeline.DataFactoryPipeline(this, "example_3", {
    data_factory_id: azurermDataFactoryExample.id,
    name: "example",
  });
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryPipelineExample.overrideLogicalId("example");
const azurermDataFactoryTriggerBlobEventExample =
  new azurerm.dataFactoryTriggerBlobEvent.DataFactoryTriggerBlobEvent(
    this,
    "example_4",
    {
      activated: true,
      additional_properties: [
        {
          bar: "bar2",
          foo: "foo1",
        },
      ],
      annotations: ["test1", "test2", "test3"],
      blob_path_ends_with: ".txt",
      data_factory_id: azurermDataFactoryExample.id,
      description: "example description",
      events: [
        "Microsoft.Storage.BlobCreated",
        "Microsoft.Storage.BlobDeleted",
      ],
      ignore_empty_blobs: true,
      name: "example",
      pipeline: [
        {
          name: azurermDataFactoryPipelineExample.name,
          parameters: [
            {
              Env: "Prod",
            },
          ],
        },
      ],
      storage_account_id: azurermStorageAccountExample.id,
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryTriggerBlobEventExample.overrideLogicalId("example");
Argument Reference
The following arguments are supported:
- name - (Required) Specifies the name of the Data Factory Blob Event Trigger. Changing this forces a new resource to be created.
- dataFactoryId - (Required) The ID of the Data Factory in which to associate the Trigger. Changing this forces a new resource.
- storageAccountId - (Required) The ID of the Storage Account whose blob events will be listened for. Changing this forces a new resource.
- events - (Required) List of events that will fire this trigger. Possible values are Microsoft.Storage.BlobCreated and Microsoft.Storage.BlobDeleted.
- pipeline - (Required) One or more pipeline blocks as defined below.
- activated - (Optional) Specifies if the Data Factory Blob Event Trigger is activated. Defaults to true.
- additionalProperties - (Optional) A map of additional properties to associate with the Data Factory Blob Event Trigger.
- annotations - (Optional) List of tags that can be used for describing the Data Factory Blob Event Trigger.
- blobPathBeginsWith - (Optional) The pattern that the blob path must begin with for the trigger to fire.
- blobPathEndsWith - (Optional) The pattern that the blob path must end with for the trigger to fire.
~> Note: At least one of blobPathBeginsWith and blobPathEndsWith must be set.
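As a sketch of combining both patterns (reusing the bindings and resources from the example above; the container name, prefix, and trigger name here are illustrative, not from the original docs), a trigger that fires only for .csv blobs under a given container prefix could look like:

```typescript
// Hypothetical variant: fire only when .csv blobs are created under the
// "/records/blobs/" prefix. Reuses azurermDataFactoryExample,
// azurermStorageAccountExample, and azurermDataFactoryPipelineExample
// from the example above.
new azurerm.dataFactoryTriggerBlobEvent.DataFactoryTriggerBlobEvent(
  this,
  "csv_trigger",
  {
    name: "example-csv",
    data_factory_id: azurermDataFactoryExample.id,
    storage_account_id: azurermStorageAccountExample.id,
    events: ["Microsoft.Storage.BlobCreated"],
    blob_path_begins_with: "/records/blobs/", // container and prefix to match
    blob_path_ends_with: ".csv",              // file extension to match
    pipeline: [
      {
        name: azurermDataFactoryPipelineExample.name,
      },
    ],
  }
);
```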
- description - (Optional) The description for the Data Factory Blob Event Trigger.
- ignoreEmptyBlobs - (Optional) Specifies whether blobs with zero bytes are ignored.
A pipeline block supports the following:
- name - (Required) The name of the Data Factory Pipeline that the trigger will act on.
- parameters - (Optional) The Data Factory Pipeline parameters that the trigger will act on.
Attributes Reference
The following attributes are exported:
- id - The ID of the Data Factory Blob Event Trigger.
Timeouts
The timeouts block allows you to specify timeouts for certain actions:
- create - (Defaults to 30 minutes) Used when creating the Data Factory Blob Event Trigger.
- read - (Defaults to 5 minutes) Used when retrieving the Data Factory Blob Event Trigger.
- update - (Defaults to 30 minutes) Used when updating the Data Factory Blob Event Trigger.
- delete - (Defaults to 30 minutes) Used when deleting the Data Factory Blob Event Trigger.
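In CDKTF, custom timeouts are passed alongside the other arguments; a minimal sketch (the timeout values here are illustrative, and the attribute names follow the snake_case convention of the converted example above):

```typescript
// Hypothetical sketch: the trigger from the example above with custom
// timeouts, e.g. for slow-provisioning environments. All other required
// arguments (data_factory_id, storage_account_id, events, pipeline)
// would be supplied as in the main example.
new azurerm.dataFactoryTriggerBlobEvent.DataFactoryTriggerBlobEvent(
  this,
  "example_timeouts",
  {
    name: "example",
    data_factory_id: azurermDataFactoryExample.id,
    storage_account_id: azurermStorageAccountExample.id,
    events: ["Microsoft.Storage.BlobCreated"],
    pipeline: [{ name: azurermDataFactoryPipelineExample.name }],
    timeouts: {
      create: "45m", // default is 30m
      read: "10m",   // default is 5m
    },
  }
);
```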
Import
A Data Factory Blob Event Trigger can be imported using the resourceId
, e.g.
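A representative command (the subscription ID, resource group, factory, and trigger names below are placeholders; substitute the real resource ID of your trigger):

```
terraform import azurerm_data_factory_trigger_blob_event.example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/triggers/example
```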