azurermDataFactoryPipeline
Manages a Pipeline inside an Azure Data Factory.
Example Usage
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
const azurermResourceGroupExample = new azurerm.resourceGroup.ResourceGroup(
  this,
  "example",
  {
    location: "West Europe",
    name: "example-resources",
  }
);
const azurermDataFactoryExample = new azurerm.dataFactory.DataFactory(
  this,
  "example_1",
  {
    location: azurermResourceGroupExample.location,
    name: "example",
    resourceGroupName: azurermResourceGroupExample.name,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryExample.overrideLogicalId("example");
const azurermDataFactoryPipelineExample =
  new azurerm.dataFactoryPipeline.DataFactoryPipeline(this, "example_2", {
    dataFactoryId: azurermDataFactoryExample.id,
    name: "example",
  });
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryPipelineExample.overrideLogicalId("example");
Example Usage with Activities
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
new azurerm.dataFactoryPipeline.DataFactoryPipeline(this, "test", {
  activitiesJson:
    '[\n {\n "name": "Append variable1",\n "type": "AppendVariable",\n "dependsOn": [],\n "userProperties": [],\n "typeProperties": {\n "variableName": "bob",\n "value": "something"\n }\n }\n]\n',
  dataFactoryId: "${azurerm_data_factory.test.id}",
  name: "acctest%d",
  variables: {
    bob: "item1",
  },
});
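Hand-escaping the activitiesJson string is error-prone. A minimal sketch of an alternative, assuming plain TypeScript: build the activities as ordinary objects (the same Append Variable activity shown above) and serialize them with JSON.stringify.

```typescript
// Build the activities array as plain objects instead of an escaped string.
// The shape mirrors the Append Variable activity from the example above.
const activities = [
  {
    name: "Append variable1",
    type: "AppendVariable",
    dependsOn: [],
    userProperties: [],
    typeProperties: {
      variableName: "bob",
      value: "something",
    },
  },
];

// Pass this string as the activitiesJson property of the pipeline.
const activitiesJson = JSON.stringify(activities, null, 2);
console.log(activitiesJson);
```

This keeps the activity definitions readable and lets the compiler catch stray commas and quoting mistakes.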
Argument Reference
The following arguments are supported:
- name - (Required) Specifies the name of the Data Factory Pipeline. Changing this forces a new resource to be created. Must be globally unique. See the Microsoft documentation for all restrictions.
- dataFactoryId - (Required) The ID of the Data Factory in which to associate the Pipeline with. Changing this forces a new resource.
- description - (Optional) The description for the Data Factory Pipeline.
- annotations - (Optional) List of tags that can be used for describing the Data Factory Pipeline.
- concurrency - (Optional) The max number of concurrent runs for the Data Factory Pipeline. Must be between 1 and 50.
- folder - (Optional) The folder that this Pipeline is in. If not specified, the Pipeline will appear at the root level.
- moniterMetricsAfterDuration - (Optional) The TimeSpan value after which an Azure Monitoring Metric is fired.
- parameters - (Optional) A map of parameters to associate with the Data Factory Pipeline.
- variables - (Optional) A map of variables to associate with the Data Factory Pipeline.
- activitiesJson - (Optional) A JSON object that contains the activities that will be associated with the Data Factory Pipeline.
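A sketch of how the optional arguments above fit together, assuming the same generated azurerm bindings and the azurermDataFactoryExample resource from the first example; all values here are illustrative placeholders, not defaults:

```typescript
// Illustrative only: assumes bindings generated by `cdktf get` and the
// azurermDataFactoryExample resource defined earlier in this page.
new azurerm.dataFactoryPipeline.DataFactoryPipeline(this, "example_full", {
  name: "example",
  dataFactoryId: azurermDataFactoryExample.id,
  description: "Example pipeline with optional arguments set",
  annotations: ["prod", "etl"],
  concurrency: 5, // must be between 1 and 50
  folder: "examples",
  moniterMetricsAfterDuration: "0.00:10:00", // TimeSpan: 10 minutes (placeholder)
  parameters: {
    sourcePath: "input/", // hypothetical parameter
  },
  variables: {
    bob: "item1",
  },
});
```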
Attributes Reference
The following attributes are exported:
- id - The ID of the Data Factory Pipeline.
Timeouts
The timeouts block allows you to specify timeouts for certain actions:

- create - (Defaults to 30 minutes) Used when creating the Data Factory Pipeline.
- update - (Defaults to 30 minutes) Used when updating the Data Factory Pipeline.
- read - (Defaults to 5 minutes) Used when retrieving the Data Factory Pipeline.
- delete - (Defaults to 30 minutes) Used when deleting the Data Factory Pipeline.
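In the generated CDKTF bindings, the timeouts block is set through a timeouts property on the resource config, with each action given as a duration string. A sketch, assuming the same bindings as the examples above:

```typescript
// Illustrative only: overrides the default timeouts via the generated
// `timeouts` property (duration strings such as "30m" or "5m").
new azurerm.dataFactoryPipeline.DataFactoryPipeline(this, "example_timeouts", {
  name: "example",
  dataFactoryId: azurermDataFactoryExample.id,
  timeouts: {
    create: "30m",
    update: "30m",
    read: "5m",
    delete: "30m",
  },
});
```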
Import
Data Factory Pipelines can be imported using the resource id, e.g.
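The import command itself was cut off here; it typically takes the following shape, where the subscription GUID, resource group, factory, and pipeline names are placeholders to replace with your own values:

```shell
terraform import azurerm_data_factory_pipeline.example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/pipelines/example
```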