azurermDataFactoryDatasetBinary

Manages a Data Factory Binary Dataset inside an Azure Data Factory.

Example Usage

/*Provider bindings are generated by running cdktf get.
See https://cdk.tf/provider-generation for more details.*/
import * as azurerm from "./.gen/providers/azurerm";
/*The following providers are missing schema information and might need manual adjustments to synthesize correctly: azurerm.
For a more precise conversion please use the --provider flag in convert.*/
const azurermResourceGroupExample = new azurerm.resourceGroup.ResourceGroup(
  this,
  "example",
  {
    location: "West Europe",
    name: "example",
  }
);
const azurermDataFactoryExample = new azurerm.dataFactory.DataFactory(
  this,
  "example_1",
  {
    location: azurermResourceGroupExample.location,
    name: "example",
    resource_group_name: azurermResourceGroupExample.name,
  }
);
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryExample.overrideLogicalId("example");
const azurermDataFactoryLinkedServiceSftpExample =
  new azurerm.dataFactoryLinkedServiceSftp.DataFactoryLinkedServiceSftp(
    this,
    "example_2",
    {
      authentication_type: "Basic",
      data_factory_id: azurermDataFactoryExample.id,
      host: "http://www.bing.com",
      name: "example",
      password: "bar",
      port: 22,
      username: "foo",
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryLinkedServiceSftpExample.overrideLogicalId("example");
const azurermDataFactoryDatasetBinaryExample =
  new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
    this,
    "example_3",
    {
      data_factory_id: azurermDataFactoryExample.id,
      linked_service_name: azurermDataFactoryLinkedServiceSftpExample.name,
      name: "example",
      sftp_server_location: [
        {
          filename: "**",
          path: "/test/",
        },
      ],
    }
  );
/*This allows the Terraform resource name to match the original name. You can remove the call if you don't need them to match.*/
azurermDataFactoryDatasetBinaryExample.overrideLogicalId("example");

Arguments Reference

The following arguments are supported:

  • name - (Required) Specifies the name of the Data Factory Binary Dataset. Changing this forces a new resource to be created. Must be globally unique. See the Microsoft documentation for all restrictions.

  • dataFactoryId - (Required) The ID of the Data Factory in which to create the Binary Dataset. Changing this forces a new resource to be created.

  • linkedServiceName - (Required) The name of the Data Factory Linked Service with which to associate the Binary Dataset.


  • additionalProperties - (Optional) A map of additional properties to associate with the Data Factory Binary Dataset.

  • annotations - (Optional) List of tags that can be used for describing the Data Factory Binary Dataset.

  • compression - (Optional) A compression block as defined below.

  • description - (Optional) The description for the Data Factory Dataset.

  • folder - (Optional) The folder that this Dataset is in. If not specified, the Dataset will appear at the root level.

  • parameters - (Optional) Specifies a list of parameters to associate with the Data Factory Binary Dataset.
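
For illustration only, several of these optional arguments might be combined as in the following sketch. It assumes provider bindings generated with full schema information (so property names are camelCase and single-item blocks are plain objects, unlike the schema-less snake_case example above); the resource names are hypothetical.

const azurermDataFactoryDatasetBinaryAnnotated =
  new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
    this,
    "annotated",
    {
      dataFactoryId: azurermDataFactoryExample.id,
      linkedServiceName: azurermDataFactoryLinkedServiceSftpExample.name,
      name: "annotated",
      // Optional metadata for the dataset.
      description: "Raw binary files landed via SFTP",
      folder: "raw/binary",
      annotations: ["raw", "binary"],
      parameters: {
        sourceSystem: "sftp",
      },
      sftpServerLocation: {
        filename: "**",
        path: "/test/",
      },
    }
  );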

The following locations are supported for a Binary Dataset; exactly one of them must be specified:

  • httpServerLocation - (Optional) An httpServerLocation block as defined below.

  • azureBlobStorageLocation - (Optional) An azureBlobStorageLocation block as defined below.

  • sftpServerLocation - (Optional) An sftpServerLocation block as defined below.


A compression block supports the following:

  • type - (Required) The type of compression used during transport. Possible values are bZip2, deflate, gZip, tar, tarGZip and zipDeflate.

  • level - (Optional) The level of compression. Possible values are fastest and optimal.
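
For example, a Binary Dataset reading gzipped files might declare compression like this (a sketch under the same camelCase-bindings assumption as above; the resource name is hypothetical):

const azurermDataFactoryDatasetBinaryCompressed =
  new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
    this,
    "compressed",
    {
      dataFactoryId: azurermDataFactoryExample.id,
      linkedServiceName: azurermDataFactoryLinkedServiceSftpExample.name,
      name: "compressed",
      // gZip with the "optimal" level trades speed for a smaller payload.
      compression: {
        type: "gZip",
        level: "optimal",
      },
      sftpServerLocation: {
        filename: "*.gz",
        path: "/test/",
      },
    }
  );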


An httpServerLocation block supports the following:

  • relativeUrl - (Required) The base URL to the web server hosting the file.

  • path - (Required) The folder path to the file on the web server.

  • filename - (Required) The filename of the file on the web server.

  • dynamicPathEnabled - (Optional) Is the path using dynamic expression, function or system variables? Defaults to false.

  • dynamicFilenameEnabled - (Optional) Is the filename using dynamic expression, function or system variables? Defaults to false.
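
A sketch of an httpServerLocation follows (same camelCase-bindings assumption; azurermDataFactoryLinkedServiceWebExample stands in for a hypothetical web/HTTP linked service not shown above):

const azurermDataFactoryDatasetBinaryHttp =
  new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
    this,
    "http",
    {
      dataFactoryId: azurermDataFactoryExample.id,
      // Hypothetical linked service pointing at the web server.
      linkedServiceName: azurermDataFactoryLinkedServiceWebExample.name,
      name: "http",
      httpServerLocation: {
        relativeUrl: "/downloads",
        path: "binaries",
        filename: "release.bin",
      },
    }
  );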


An azureBlobStorageLocation block supports the following:

  • container - (Required) The container on the Azure Blob Storage Account hosting the file.

  • path - (Optional) The folder path to the file in the blob container.

  • filename - (Optional) The filename of the file in the blob container.

  • dynamicContainerEnabled - (Optional) Is the container using dynamic expression, function or system variables? Defaults to false.

  • dynamicPathEnabled - (Optional) Is the path using dynamic expression, function or system variables? Defaults to false.

  • dynamicFilenameEnabled - (Optional) Is the filename using dynamic expression, function or system variables? Defaults to false.
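
A sketch of an azureBlobStorageLocation with a dynamic filename (same camelCase-bindings assumption; azurermDataFactoryLinkedServiceAzureBlobStorageExample stands in for a hypothetical Blob Storage linked service, and the @concat(...) value is an ordinary Data Factory expression):

const azurermDataFactoryDatasetBinaryBlob =
  new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
    this,
    "blob",
    {
      dataFactoryId: azurermDataFactoryExample.id,
      // Hypothetical linked service for the storage account.
      linkedServiceName: azurermDataFactoryLinkedServiceAzureBlobStorageExample.name,
      name: "blob",
      azureBlobStorageLocation: {
        container: "raw",
        path: "binary",
        // The filename is a Data Factory expression, so flag it as dynamic.
        filename: "@concat(pipeline().RunId, '.bin')",
        dynamicFilenameEnabled: true,
      },
    }
  );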


An sftpServerLocation block supports the following:

  • path - (Required) The folder path to the file on the SFTP server.

  • filename - (Required) The filename of the file on the SFTP server.

  • dynamicPathEnabled - (Optional) Is the path using dynamic expression, function or system variables? Defaults to false.

  • dynamicFilenameEnabled - (Optional) Is the filename using dynamic expression, function or system variables? Defaults to false.

Attributes Reference

In addition to the Arguments listed above, the following Attributes are exported:

  • id - The ID of the Data Factory Dataset.

Timeouts

The timeouts block allows you to specify timeouts for certain actions:

  • create - (Defaults to 30 minutes) Used when creating the Data Factory Dataset.
  • read - (Defaults to 5 minutes) Used when retrieving the Data Factory Dataset.
  • update - (Defaults to 30 minutes) Used when updating the Data Factory Dataset.
  • delete - (Defaults to 30 minutes) Used when deleting the Data Factory Dataset.
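
In the CDKTF bindings these are typically set through a timeouts object on the resource configuration, e.g. (a sketch under the same camelCase-bindings assumption as the earlier examples):

new azurerm.dataFactoryDatasetBinary.DataFactoryDatasetBinary(
  this,
  "with_timeouts",
  {
    dataFactoryId: azurermDataFactoryExample.id,
    linkedServiceName: azurermDataFactoryLinkedServiceSftpExample.name,
    name: "withTimeouts",
    sftpServerLocation: {
      filename: "**",
      path: "/test/",
    },
    // Extend create/update to 45 minutes; read and delete keep the documented defaults.
    timeouts: {
      create: "45m",
      read: "5m",
      update: "45m",
      delete: "30m",
    },
  }
);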

Import

Data Factory Binary Datasets can be imported using the resourceId, e.g.

terraform import azurerm_data_factory_dataset_binary.example /subscriptions/00000000-0000-0000-0000-000000000000/resourceGroups/example/providers/Microsoft.DataFactory/factories/example/datasets/example