airbyte.DestinationDatabricks
DestinationDatabricks Resource
Example Usage
package generated_program;
import com.pulumi.Context;
import com.pulumi.Pulumi;
import com.pulumi.core.Output;
import com.pulumi.airbyte.DestinationDatabricks;
import com.pulumi.airbyte.DestinationDatabricksArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs;
import com.pulumi.airbyte.inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs;
import java.util.List;
import java.util.ArrayList;
import java.util.Map;
import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
public class App {
    public static void main(String[] args) {
        Pulumi.run(App::stack);
    }
    public static void stack(Context ctx) {
        var myDestinationDatabricks = new DestinationDatabricks("myDestinationDatabricks", DestinationDatabricksArgs.builder()
            .configuration(DestinationDatabricksConfigurationArgs.builder()
                .acceptTerms(false)
                .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
                    .oAuth2Recommended(DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs.builder()
                        .clientId("...my_client_id...")
                        .secret("...my_secret...")
                        .build())
                    .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                        .personalAccessToken("...my_personal_access_token...")
                        .build())
                    .build())
                .database("...my_database...")
                .hostname("abc-12345678-wxyz.cloud.databricks.com")
                .httpPath("sql/1.0/warehouses/0000-1111111-abcd90")
                .port("443")
                .purgeStagingData(false)
                .rawSchemaOverride("...my_raw_schema_override...")
                .schema("default")
                .build())
            .definitionId("fb6a88f5-a304-46f5-ab8b-4280a6d91f99")
            .workspaceId("2615758c-c904-459e-9fd6-c8a55cba9327")
            .build());
    }
}
resources:
  myDestinationDatabricks:
    type: airbyte:DestinationDatabricks
    properties:
      configuration:
        accept_terms: false
        authentication:
          oAuth2Recommended:
            clientId: '...my_client_id...'
            secret: '...my_secret...'
          personalAccessToken:
            personalAccessToken: '...my_personal_access_token...'
        database: '...my_database...'
        hostname: abc-12345678-wxyz.cloud.databricks.com
        http_path: sql/1.0/warehouses/0000-1111111-abcd90
        port: '443'
        purge_staging_data: false
        raw_schema_override: '...my_raw_schema_override...'
        schema: default
      definitionId: fb6a88f5-a304-46f5-ab8b-4280a6d91f99
      workspaceId: 2615758c-c904-459e-9fd6-c8a55cba9327
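The same example can be sketched in Python. Because the provider's Python SDK accepts nested inputs as plain dictionary literals, the configuration below is an ordinary dict; the resource construction is shown commented out so the fragment stands alone without the pulumi_airbyte package installed.

```python
# A Python sketch of the example above. The nested configuration is a plain
# dictionary literal, which the Python SDK accepts in place of argument classes.
configuration = {
    "accept_terms": False,
    "authentication": {
        "o_auth2_recommended": {
            "client_id": "...my_client_id...",
            "secret": "...my_secret...",
        },
        "personal_access_token": {
            "personal_access_token": "...my_personal_access_token...",
        },
    },
    "database": "...my_database...",
    "hostname": "abc-12345678-wxyz.cloud.databricks.com",
    "http_path": "sql/1.0/warehouses/0000-1111111-abcd90",
    "port": "443",
    "purge_staging_data": False,
    "raw_schema_override": "...my_raw_schema_override...",
    "schema": "default",
}

# Passing the dict to the resource (requires pulumi_airbyte):
# import pulumi_airbyte as airbyte
# my_destination = airbyte.DestinationDatabricks(
#     "myDestinationDatabricks",
#     configuration=configuration,
#     definition_id="fb6a88f5-a304-46f5-ab8b-4280a6d91f99",
#     workspace_id="2615758c-c904-459e-9fd6-c8a55cba9327",
# )
```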
Create DestinationDatabricks Resource
Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.
Constructor syntax
new DestinationDatabricks(name: string, args: DestinationDatabricksArgs, opts?: CustomResourceOptions);
@overload
def DestinationDatabricks(resource_name: str,
                          args: DestinationDatabricksArgs,
                          opts: Optional[ResourceOptions] = None)
@overload
def DestinationDatabricks(resource_name: str,
                          opts: Optional[ResourceOptions] = None,
                          configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
                          workspace_id: Optional[str] = None,
                          definition_id: Optional[str] = None,
                          name: Optional[str] = None)
func NewDestinationDatabricks(ctx *Context, name string, args DestinationDatabricksArgs, opts ...ResourceOption) (*DestinationDatabricks, error)
public DestinationDatabricks(string name, DestinationDatabricksArgs args, CustomResourceOptions? opts = null)
public DestinationDatabricks(String name, DestinationDatabricksArgs args)
public DestinationDatabricks(String name, DestinationDatabricksArgs args, CustomResourceOptions options)
type: airbyte:DestinationDatabricks
properties: # The arguments to resource properties.
options: # Bag of options to control resource's behavior.
Parameters
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- resource_name str
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts ResourceOptions
- Bag of options to control resource's behavior.
- ctx Context
- Context object for the current deployment.
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts ResourceOption
- Bag of options to control resource's behavior.
- name string
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- opts CustomResourceOptions
- Bag of options to control resource's behavior.
- name String
- The unique name of the resource.
- args DestinationDatabricksArgs
- The arguments to resource properties.
- options CustomResourceOptions
- Bag of options to control resource's behavior.
Constructor example
The following reference example uses placeholder values for all input properties.
var destinationDatabricksResource = new Airbyte.DestinationDatabricks("destinationDatabricksResource", new()
{
    Configuration = new Airbyte.Inputs.DestinationDatabricksConfigurationArgs
    {
        Authentication = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationArgs
        {
            OAuth2Recommended = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs
            {
                ClientId = "string",
                Secret = "string",
            },
            PersonalAccessToken = new Airbyte.Inputs.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs
            {
                PersonalAccessToken = "string",
            },
        },
        Database = "string",
        Hostname = "string",
        HttpPath = "string",
        AcceptTerms = false,
        Port = "string",
        PurgeStagingData = false,
        RawSchemaOverride = "string",
        Schema = "string",
    },
    WorkspaceId = "string",
    DefinitionId = "string",
    Name = "string",
});
example, err := airbyte.NewDestinationDatabricks(ctx, "destinationDatabricksResource", &airbyte.DestinationDatabricksArgs{
	Configuration: &airbyte.DestinationDatabricksConfigurationArgs{
		Authentication: &airbyte.DestinationDatabricksConfigurationAuthenticationArgs{
			OAuth2Recommended: &airbyte.DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs{
				ClientId: pulumi.String("string"),
				Secret:   pulumi.String("string"),
			},
			PersonalAccessToken: &airbyte.DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs{
				PersonalAccessToken: pulumi.String("string"),
			},
		},
		Database:          pulumi.String("string"),
		Hostname:          pulumi.String("string"),
		HttpPath:          pulumi.String("string"),
		AcceptTerms:       pulumi.Bool(false),
		Port:              pulumi.String("string"),
		PurgeStagingData:  pulumi.Bool(false),
		RawSchemaOverride: pulumi.String("string"),
		Schema:            pulumi.String("string"),
	},
	WorkspaceId:  pulumi.String("string"),
	DefinitionId: pulumi.String("string"),
	Name:         pulumi.String("string"),
})
var destinationDatabricksResource = new DestinationDatabricks("destinationDatabricksResource", DestinationDatabricksArgs.builder()
    .configuration(DestinationDatabricksConfigurationArgs.builder()
        .authentication(DestinationDatabricksConfigurationAuthenticationArgs.builder()
            .oAuth2Recommended(DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs.builder()
                .clientId("string")
                .secret("string")
                .build())
            .personalAccessToken(DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs.builder()
                .personalAccessToken("string")
                .build())
            .build())
        .database("string")
        .hostname("string")
        .httpPath("string")
        .acceptTerms(false)
        .port("string")
        .purgeStagingData(false)
        .rawSchemaOverride("string")
        .schema("string")
        .build())
    .workspaceId("string")
    .definitionId("string")
    .name("string")
    .build());
destination_databricks_resource = airbyte.DestinationDatabricks("destinationDatabricksResource",
    configuration={
        "authentication": {
            "o_auth2_recommended": {
                "client_id": "string",
                "secret": "string",
            },
            "personal_access_token": {
                "personal_access_token": "string",
            },
        },
        "database": "string",
        "hostname": "string",
        "http_path": "string",
        "accept_terms": False,
        "port": "string",
        "purge_staging_data": False,
        "raw_schema_override": "string",
        "schema": "string",
    },
    workspace_id="string",
    definition_id="string",
    name="string")
const destinationDatabricksResource = new airbyte.DestinationDatabricks("destinationDatabricksResource", {
    configuration: {
        authentication: {
            oAuth2Recommended: {
                clientId: "string",
                secret: "string",
            },
            personalAccessToken: {
                personalAccessToken: "string",
            },
        },
        database: "string",
        hostname: "string",
        httpPath: "string",
        acceptTerms: false,
        port: "string",
        purgeStagingData: false,
        rawSchemaOverride: "string",
        schema: "string",
    },
    workspaceId: "string",
    definitionId: "string",
    name: "string",
});
type: airbyte:DestinationDatabricks
properties:
    configuration:
        acceptTerms: false
        authentication:
            oAuth2Recommended:
                clientId: string
                secret: string
            personalAccessToken:
                personalAccessToken: string
        database: string
        hostname: string
        httpPath: string
        port: string
        purgeStagingData: false
        rawSchemaOverride: string
        schema: string
    definitionId: string
    name: string
    workspaceId: string
DestinationDatabricks Resource Properties
To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.
Inputs
In Python, inputs that are objects can be passed either as argument classes or as dictionary literals.
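Across the language tabs, the same property appears in different casings: purgeStagingData in TypeScript and YAML, PurgeStagingData in C# and Go, and purge_staging_data in Python. As an illustration only (this helper is not part of the provider SDK), the mapping from the camelCase spellings to the Python SDK's snake_case keys is mechanical:

```python
import re

def camel_to_snake(name: str) -> str:
    # Insert an underscore before each uppercase letter (except a leading one),
    # then lowercase the result.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

# camel_to_snake("purgeStagingData") -> "purge_staging_data"
# camel_to_snake("oAuth2Recommended") -> "o_auth2_recommended"
```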
The DestinationDatabricks resource accepts the following input properties:
- Configuration DestinationDatabricksConfiguration
- WorkspaceId string
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string
- Name of the destination e.g. dev-mysql-instance.
- Configuration DestinationDatabricksConfigurationArgs
- WorkspaceId string
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- Name string
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfiguration
- workspaceId String
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfiguration
- workspaceId string
- definitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name string
- Name of the destination e.g. dev-mysql-instance.
- configuration DestinationDatabricksConfigurationArgs
- workspace_id str
- definition_id str
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name str
- Name of the destination e.g. dev-mysql-instance.
- configuration Property Map
- workspaceId String
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- name String
- Name of the destination e.g. dev-mysql-instance.
Outputs
All input properties are implicitly available as output properties. Additionally, the DestinationDatabricks resource produces the following output properties:
- CreatedAt double
- DestinationId string
- DestinationType string
- Id string
- The provider-assigned unique ID for this managed resource.
- CreatedAt float64
- DestinationId string
- DestinationType string
- Id string
- The provider-assigned unique ID for this managed resource.
- createdAt Double
- destinationId String
- destinationType String
- id String
- The provider-assigned unique ID for this managed resource.
- createdAt number
- destinationId string
- destinationType string
- id string
- The provider-assigned unique ID for this managed resource.
- created_at float
- destination_id str
- destination_type str
- id str
- The provider-assigned unique ID for this managed resource.
- createdAt Number
- destinationId String
- destinationType String
- id String
- The provider-assigned unique ID for this managed resource.
Look up Existing DestinationDatabricks Resource
Get an existing DestinationDatabricks resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
public static get(name: string, id: Input<ID>, state?: DestinationDatabricksState, opts?: CustomResourceOptions): DestinationDatabricks
@staticmethod
def get(resource_name: str,
        id: str,
        opts: Optional[ResourceOptions] = None,
        configuration: Optional[DestinationDatabricksConfigurationArgs] = None,
        created_at: Optional[float] = None,
        definition_id: Optional[str] = None,
        destination_id: Optional[str] = None,
        destination_type: Optional[str] = None,
        name: Optional[str] = None,
        workspace_id: Optional[str] = None) -> DestinationDatabricks
func GetDestinationDatabricks(ctx *Context, name string, id IDInput, state *DestinationDatabricksState, opts ...ResourceOption) (*DestinationDatabricks, error)
public static DestinationDatabricks Get(string name, Input<string> id, DestinationDatabricksState? state, CustomResourceOptions? opts = null)
public static DestinationDatabricks get(String name, Output<String> id, DestinationDatabricksState state, CustomResourceOptions options)
resources:
  _:
    type: airbyte:DestinationDatabricks
    get:
      id: ${id}
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- resource_name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- name
- The unique name of the resulting resource.
- id
- The unique provider ID of the resource to lookup.
- state
- Any extra arguments used during the lookup.
- opts
- A bag of options that control this resource's behavior.
- Configuration DestinationDatabricksConfiguration
- CreatedAt double
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string
- Name of the destination e.g. dev-mysql-instance.
- WorkspaceId string
- Configuration DestinationDatabricksConfigurationArgs
- CreatedAt float64
- DefinitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- DestinationId string
- DestinationType string
- Name string
- Name of the destination e.g. dev-mysql-instance.
- WorkspaceId string
- configuration DestinationDatabricksConfiguration
- createdAt Double
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String
- Name of the destination e.g. dev-mysql-instance.
- workspaceId String
- configuration DestinationDatabricksConfiguration
- createdAt number
- definitionId string
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId string
- destinationType string
- name string
- Name of the destination e.g. dev-mysql-instance.
- workspaceId string
- configuration DestinationDatabricksConfigurationArgs
- created_at float
- definition_id str
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destination_id str
- destination_type str
- name str
- Name of the destination e.g. dev-mysql-instance.
- workspace_id str
- configuration Property Map
- createdAt Number
- definitionId String
- The UUID of the connector definition. One of configuration.destinationType or definitionId must be provided. Requires replacement if changed.
- destinationId String
- destinationType String
- name String
- Name of the destination e.g. dev-mysql-instance.
- workspaceId String
Supporting Types
DestinationDatabricksConfiguration, DestinationDatabricksConfigurationArgs      
- Authentication DestinationDatabricksConfigurationAuthentication
- Authentication mechanism for Staging files and running queries
- Database string
- The name of the unity catalog for the database
- Hostname string
- Databricks Cluster Server Hostname.
- HttpPath string
- Databricks Cluster HTTP Path.
- AcceptTerms bool
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- Port string
- Databricks Cluster Port. Default: "443"
- PurgeStagingData bool
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- RawSchemaOverride string
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- Schema string
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- Authentication DestinationDatabricksConfigurationAuthentication
- Authentication mechanism for Staging files and running queries
- Database string
- The name of the unity catalog for the database
- Hostname string
- Databricks Cluster Server Hostname.
- HttpPath string
- Databricks Cluster HTTP Path.
- AcceptTerms bool
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- Port string
- Databricks Cluster Port. Default: "443"
- PurgeStagingData bool
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- RawSchemaOverride string
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- Schema string
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication
- Authentication mechanism for Staging files and running queries
- database String
- The name of the unity catalog for the database
- hostname String
- Databricks Cluster Server Hostname.
- httpPath String
- Databricks Cluster HTTP Path.
- acceptTerms Boolean
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port String
- Databricks Cluster Port. Default: "443"
- purgeStagingData Boolean
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride String
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema String
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication
- Authentication mechanism for Staging files and running queries
- database string
- The name of the unity catalog for the database
- hostname string
- Databricks Cluster Server Hostname.
- httpPath string
- Databricks Cluster HTTP Path.
- acceptTerms boolean
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port string
- Databricks Cluster Port. Default: "443"
- purgeStagingData boolean
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride string
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema string
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication DestinationDatabricksConfigurationAuthentication
- Authentication mechanism for Staging files and running queries
- database str
- The name of the unity catalog for the database
- hostname str
- Databricks Cluster Server Hostname.
- http_path str
- Databricks Cluster HTTP Path.
- accept_terms bool
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port str
- Databricks Cluster Port. Default: "443"
- purge_staging_data bool
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- raw_schema_override str
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema str
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
- authentication Property Map
- Authentication mechanism for Staging files and running queries
- database String
- The name of the unity catalog for the database
- hostname String
- Databricks Cluster Server Hostname.
- httpPath String
- Databricks Cluster HTTP Path.
- acceptTerms Boolean
- You must agree to the Databricks JDBC Driver Terms & Conditions to use this connector. Default: false
- port String
- Databricks Cluster Port. Default: "443"
- purgeStagingData Boolean
- Defaults to 'true'. Switch it to 'false' for debugging purposes. Default: true
- rawSchemaOverride String
- The schema to write raw tables into (default: airbyte_internal). Default: "airbyte_internal"
- schema String
- The default schema tables are written to. If not specified otherwise, "default" will be used. Default: "default"
DestinationDatabricksConfigurationAuthentication, DestinationDatabricksConfigurationAuthenticationArgs        
DestinationDatabricksConfigurationAuthenticationOAuth2Recommended, DestinationDatabricksConfigurationAuthenticationOAuth2RecommendedArgs          
DestinationDatabricksConfigurationAuthenticationPersonalAccessToken, DestinationDatabricksConfigurationAuthenticationPersonalAccessTokenArgs              
- PersonalAccessToken string
- PersonalAccessToken string
- personalAccessToken String
- personalAccessToken string
- personalAccessToken String
Import
$ pulumi import airbyte:index/destinationDatabricks:DestinationDatabricks my_airbyte_destination_databricks ""
To learn more about importing existing cloud resources, see Importing resources.
Package Details
- Repository
- airbyte airbytehq/terraform-provider-airbyte
- License
- Notes
- This Pulumi package is based on the airbyte Terraform Provider.