azurermDataFactoryDatasetSnowflake
Manages a Snowflake Dataset inside an Azure Data Factory.
Example Usage
/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
const azurermResourceGroupExample = new azurerm.resourceGroup.ResourceGroup(
  this,
  "example",
  {
    location: "West Europe",
    name: "example-resources",
  }
);
const azurermDataFactoryExample = new azurerm.dataFactory.DataFactory(
  this,
  "example_1",
  {
    location: azurermResourceGroupExample.location,
    name: "example",
    resource_group_name: azurermResourceGroupExample.name,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryExample.overrideLogicalId("example");
const azurermDataFactoryLinkedServiceSnowflakeExample =
  new azurerm.dataFactoryLinkedServiceSnowflake.DataFactoryLinkedServiceSnowflake(
    this,
    "example_2",
    {
      connection_string:
        "jdbc:snowflake://account.region.snowflakecomputing.com/?user=user&db=db&warehouse=wh",
      data_factory_id: azurermDataFactoryExample.id,
      name: "example",
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryLinkedServiceSnowflakeExample.overrideLogicalId("example");
const azurermDataFactoryDatasetSnowflakeExample =
  new azurerm.dataFactoryDatasetSnowflake.DataFactoryDatasetSnowflake(
    this,
    "example_3",
    {
      data_factory_id: azurermDataFactoryExample.id,
      linked_service_name: azurermDataFactoryLinkedServiceSnowflakeExample.name,
      name: "example",
      schema_name: "foo_schema",
      table_name: "foo_table",
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryDatasetSnowflakeExample.overrideLogicalId("example");
Argument Reference
The following arguments are supported:
- `name` - (Required) Specifies the name of the Data Factory Dataset Snowflake. Changing this forces a new resource to be created. Must be globally unique. See the Microsoft documentation for all restrictions.
- `dataFactoryId` - (Required) The ID of the Data Factory in which to associate the Dataset. Changing this forces a new resource to be created.
- `linkedServiceName` - (Required) The name of the Data Factory Linked Service with which to associate the Dataset.
- `schemaName` - (Optional) The schema name of the Data Factory Dataset Snowflake.
- `tableName` - (Optional) The table name of the Data Factory Dataset Snowflake.
- `folder` - (Optional) The folder that this Dataset is in. If not specified, the Dataset will appear at the root level.
- `schemaColumn` - (Optional) A `schemaColumn` block as defined below.
- `description` - (Optional) The description for the Data Factory Dataset Snowflake.
- `annotations` - (Optional) List of tags that can be used for describing the Data Factory Dataset Snowflake.
- `parameters` - (Optional) A map of parameters to associate with the Data Factory Dataset Snowflake.
- `additionalProperties` - (Optional) A map of additional properties to associate with the Data Factory Dataset Snowflake.
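For orientation, the arguments above can be sketched as a plain configuration object. The interface and placeholder values below are illustrative only, not the provider's generated types; in a real stack the ID would come from the `DataFactory` construct as in the example above:

```typescript
// Illustrative shape of a Snowflake dataset configuration, assembled
// from the argument reference above (camelCase per the cdktf convention).
interface DatasetSnowflakeConfig {
  name: string;                                   // required
  dataFactoryId: string;                          // required
  linkedServiceName: string;                      // required
  schemaName?: string;
  tableName?: string;
  folder?: string;
  description?: string;
  annotations?: string[];
  parameters?: { [key: string]: string };
  additionalProperties?: { [key: string]: string };
}

const config: DatasetSnowflakeConfig = {
  name: "example",
  dataFactoryId: "<data-factory-id>",             // placeholder; normally azurermDataFactoryExample.id
  linkedServiceName: "example",
  schemaName: "foo_schema",
  tableName: "foo_table",
  folder: "snowflake-datasets",                   // dataset appears under this folder instead of the root
  description: "Snowflake dataset managed by Terraform",
  annotations: ["snowflake", "example"],
  parameters: { env: "dev" },
};

console.log(config.folder);
```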
A `schemaColumn` block supports the following:

- `name` - (Required) The name of the column.
- `type` - (Optional) Type of the column. Valid values are `number`, `decimal`, `numeric`, `int`, `integer`, `bigint`, `smallint`, `float`, `float4`, `float8`, `double`, `doublePrecision`, `real`, `varchar`, `char`, `character`, `string`, `text`, `binary`, `varbinary`, `boolean`, `date`, `datetime`, `time`, `timestamp`, `timestampLtz`, `timestampNtz`, `timestampTz`, `variant`, `object`, `array`, `geography`. Please note these values are case sensitive.
- `precision` - (Optional) The total number of digits allowed.
- `scale` - (Optional) The number of digits allowed to the right of the decimal point.
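As a sketch of how these fields combine, the entries below model a fixed-precision numeric column and a text column. The interface is assumed from the fields documented above, not taken from the generated provider bindings:

```typescript
// Illustrative schemaColumn entries built from the fields documented above.
interface SchemaColumn {
  name: string;        // required: column name
  type?: string;       // one of the case-sensitive values listed above
  precision?: number;  // total number of digits allowed
  scale?: number;      // digits to the right of the decimal point
}

const schemaColumn: SchemaColumn[] = [
  // decimal(18, 2): 18 total digits, 2 after the decimal point
  { name: "amount", type: "decimal", precision: 18, scale: 2 },
  // plain text column; precision and scale do not apply
  { name: "customer_name", type: "varchar" },
];

console.log(schemaColumn.length);
```

`precision` and `scale` are only meaningful for numeric types, so they are omitted on the `varchar` column.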
Attributes Reference
The following attributes are exported:
- `id` - The ID of the Data Factory Snowflake Dataset.
Timeouts
The `timeouts` block allows you to specify timeouts for certain actions:

- `create` - (Defaults to 30 minutes) Used when creating the Data Factory Snowflake Dataset.
- `update` - (Defaults to 30 minutes) Used when updating the Data Factory Snowflake Dataset.
- `read` - (Defaults to 5 minutes) Used when retrieving the Data Factory Snowflake Dataset.
- `delete` - (Defaults to 30 minutes) Used when deleting the Data Factory Snowflake Dataset.
Import
Data Factory Snowflake Datasets can be imported using the `resourceId`, e.g.
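A representative invocation follows the usual azurerm resource-ID pattern; the subscription ID, resource group, factory, and dataset names below are placeholders:

```shell
terraform import azurerm_data_factory_dataset_snowflake.example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/datasets/example
```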