azurermDataFactoryCustomDataset
Manages a Dataset inside an Azure Data Factory. This is a generic resource that supports all Dataset types.
Example Usage
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
const azurermResourceGroupExample = new azurerm.resourceGroup.ResourceGroup(
this,
"example",
{
location: "West Europe",
name: "example-resources",
}
);
const azurermStorageAccountExample = new azurerm.storageAccount.StorageAccount(
this,
"example_1",
{
account_kind: "BlobStorage",
account_replication_type: "LRS",
account_tier: "Standard",
location: azurermResourceGroupExample.location,
name: "example",
resource_group_name: azurermResourceGroupExample.name,
}
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermStorageAccountExample.overrideLogicalId("example");
const azurermStorageContainerExample =
new azurerm.storageContainer.StorageContainer(this, "example_2", {
container_access_type: "private",
name: "content",
storage_account_name: azurermStorageAccountExample.name,
});
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermStorageContainerExample.overrideLogicalId("example");
const azurermDataFactoryExample = new azurerm.dataFactory.DataFactory(
this,
"example_3",
{
identity: [
{
type: "SystemAssigned",
},
],
location: azurermResourceGroupExample.location,
name: "example",
resource_group_name: azurermResourceGroupExample.name,
}
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryExample.overrideLogicalId("example");
const azurermDataFactoryLinkedCustomServiceExample =
new azurerm.dataFactoryLinkedCustomService.DataFactoryLinkedCustomService(
this,
"example_4",
{
data_factory_id: azurermDataFactoryExample.id,
name: "example",
type: "AzureBlobStorage",
type_properties_json: `{
"connectionString":"\${${azurermStorageAccountExample.primaryConnectionString}}"
}
`,
}
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryLinkedCustomServiceExample.overrideLogicalId("example");
const azurermDataFactoryCustomDatasetExample =
new azurerm.dataFactoryCustomDataset.DataFactoryCustomDataset(
this,
"example_5",
{
additional_properties: [
{
bar: "test2",
foo: "test1",
},
],
annotations: ["test1", "test2", "test3"],
data_factory_id: azurermDataFactoryExample.id,
description: "test description",
folder: "testFolder",
linked_service: [
{
name: azurermDataFactoryLinkedCustomServiceExample.name,
parameters: [
{
key1: "value1",
},
],
},
],
name: "example",
parameters: [
{
Bar: "Test2",
foo: "test1",
},
],
schema_json:
'{\n "type": "object",\n "properties": {\n "name": {\n "type": "object",\n "properties": {\n "firstName": {\n "type": "string"\n },\n "lastName": {\n "type": "string"\n }\n }\n },\n "age": {\n "type": "integer"\n }\n }\n}\n',
type: "Json",
type_properties_json: `{
"location": {
"container":"\${${azurermStorageContainerExample.name}}",
"fileName":"foo.txt",
"folderPath": "foo/bar/",
"type":"AzureBlobStorageLocation"
},
"encodingName":"UTF-8"
}
`,
}
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryCustomDatasetExample.overrideLogicalId("example");
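As an aside, the hand-escaped `schemaJson` string in the example above can instead be built from a plain object with `JSON.stringify`, which avoids escaping mistakes (this is an illustrative sketch, not part of the generated conversion):

```typescript
// Build the dataset schema as a plain object and serialize it,
// instead of hand-writing the escaped JSON string.
const datasetSchema = {
  type: "object",
  properties: {
    name: {
      type: "object",
      properties: {
        firstName: { type: "string" },
        lastName: { type: "string" },
      },
    },
    age: { type: "integer" },
  },
};

// Pass this as the schema_json argument of the dataset resource.
const schemaJson = JSON.stringify(datasetSchema, null, 2);
console.log(schemaJson);
```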
Argument Reference
- name - (Required) Specifies the name of the Data Factory Dataset. Changing this forces a new resource to be created. Must be globally unique. See the Microsoft documentation for all restrictions.
- dataFactoryId - (Required) The ID of the Data Factory in which to associate the Dataset. Changing this forces a new resource to be created.
- linkedService - (Required) A linkedService block as defined below.
- type - (Required) The type of dataset that will be associated with Data Factory. Changing this forces a new resource to be created.
- typePropertiesJson - (Required) A JSON object that contains the properties of the Data Factory Dataset.
- additionalProperties - (Optional) A map of additional properties to associate with the Data Factory Dataset.
- annotations - (Optional) List of tags that can be used for describing the Data Factory Dataset.
- description - (Optional) The description for the Data Factory Dataset.
- folder - (Optional) The folder that this Dataset is in. If not specified, the Dataset will appear at the root level.
- parameters - (Optional) A map of parameters to associate with the Data Factory Dataset.
- schemaJson - (Optional) A JSON object that contains the schema of the Data Factory Dataset.
A linkedService block supports the following:
- name - (Required) The name of the Data Factory Linked Service.
- parameters - (Optional) A map of parameters to associate with the Data Factory Linked Service.
Attributes Reference
The following attributes are exported:
- id - The ID of the Data Factory Dataset.
Timeouts
The timeouts block allows you to specify timeouts for certain actions:
- create - (Defaults to 30 minutes) Used when creating the Data Factory Dataset.
- update - (Defaults to 30 minutes) Used when updating the Data Factory Dataset.
- read - (Defaults to 5 minutes) Used when retrieving the Data Factory Dataset.
- delete - (Defaults to 30 minutes) Used when deleting the Data Factory Dataset.
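As a sketch, the timeouts above can be stated explicitly on the resource. This assumes the generated azurerm bindings from the example and that they expose a timeouts struct on the dataset resource; the argument values mirror the minimal required configuration and the documented defaults:

```typescript
// Illustrative only: setting the documented default timeouts explicitly.
new azurerm.dataFactoryCustomDataset.DataFactoryCustomDataset(this, "example_with_timeouts", {
  data_factory_id: azurermDataFactoryExample.id,
  linked_service: [
    {
      name: azurermDataFactoryLinkedCustomServiceExample.name,
    },
  ],
  name: "example-with-timeouts",
  type: "Json",
  type_properties_json: "{}",
  timeouts: {
    create: "30m",
    update: "30m",
    read: "5m",
    delete: "30m",
  },
});
```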
Import
Data Factory Datasets can be imported using the resourceId, e.g.
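An illustrative import command follows; the subscription ID, resource group, factory name, and dataset name are placeholders to replace with your own values:

```shell
terraform import azurerm_data_factory_custom_dataset.example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/datasets/example
```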