ExternalDataConfiguration (BigQuery API v2-rev20220422-1.32.1)

com.google.api.services.bigquery.model

Class ExternalDataConfiguration

    • Constructor Detail

      • ExternalDataConfiguration

        public ExternalDataConfiguration()
    • Method Detail

      • getAutodetect

        public Boolean getAutodetect()
        Try to detect schema and format options automatically. Any option specified explicitly will be honored.
        Returns:
        value or null for none
      • setAutodetect

        public ExternalDataConfiguration setAutodetect(Boolean autodetect)
        Try to detect schema and format options automatically. Any option specified explicitly will be honored.
        Parameters:
        autodetect - autodetect or null for none
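
        For orientation, a minimal sketch of an external CSV source with schema autodetection, built only from setters documented on this page; the Cloud Storage URI is a placeholder.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Collections;

          public class AutodetectExample {
            public static ExternalDataConfiguration autodetectedCsv() {
              // Let BigQuery infer the schema and format options; any option that is
              // set explicitly on the configuration is still honored.
              return new ExternalDataConfiguration()
                  .setSourceFormat("CSV")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/data/*.csv"))
                  .setAutodetect(true);
            }
          }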
      • getAvroOptions

        public AvroOptions getAvroOptions()
        Additional properties to set if sourceFormat is set to Avro.
        Returns:
        value or null for none
      • setAvroOptions

        public ExternalDataConfiguration setAvroOptions(AvroOptions avroOptions)
        Additional properties to set if sourceFormat is set to Avro.
        Parameters:
        avroOptions - avroOptions or null for none
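
        A sketch of an Avro-backed configuration. AvroOptions itself is not documented on this page; the setUseAvroLogicalTypes setter and the URI are assumptions for illustration.

          import com.google.api.services.bigquery.model.AvroOptions;
          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Collections;

          public class AvroExample {
            public static ExternalDataConfiguration avroSource() {
              // Assumed setter: interpret Avro logical types (e.g. timestamp-micros)
              // as the corresponding BigQuery types rather than raw primitives.
              AvroOptions avroOptions = new AvroOptions().setUseAvroLogicalTypes(true);
              return new ExternalDataConfiguration()
                  .setSourceFormat("AVRO")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/data/*.avro"))
                  .setAvroOptions(avroOptions);
            }
          }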
      • getBigtableOptions

        public BigtableOptions getBigtableOptions()
        [Optional] Additional options if sourceFormat is set to BIGTABLE.
        Returns:
        value or null for none
      • setBigtableOptions

        public ExternalDataConfiguration setBigtableOptions(BigtableOptions bigtableOptions)
        [Optional] Additional options if sourceFormat is set to BIGTABLE.
        Parameters:
        bigtableOptions - bigtableOptions or null for none
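
        A sketch of a Bigtable-backed configuration. The BigtableOptions and BigtableColumnFamily setters, and the HTTPS table URL shape, are assumptions not documented on this page; project, instance, table, and column family names are placeholders.

          import com.google.api.services.bigquery.model.BigtableColumnFamily;
          import com.google.api.services.bigquery.model.BigtableOptions;
          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Collections;

          public class BigtableExample {
            public static ExternalDataConfiguration bigtableSource() {
              BigtableOptions bigtableOptions = new BigtableOptions()
                  // Assumed setter: expose the row key as a STRING column.
                  .setReadRowkeyAsString(true)
                  // Assumed setters: declare one column family and its value type.
                  .setColumnFamilies(Collections.singletonList(
                      new BigtableColumnFamily().setFamilyId("stats").setType("STRING")));
              return new ExternalDataConfiguration()
                  .setSourceFormat("BIGTABLE")
                  .setSourceUris(Collections.singletonList(
                      "https://googleapis.com/bigtable/projects/my-project/instances/my-instance/tables/my-table"))
                  .setBigtableOptions(bigtableOptions);
            }
          }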
      • getCompression

        public String getCompression()
        [Optional] The compression type of the data source. Possible values include GZIP and NONE. The default value is NONE. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups and Avro formats.
        Returns:
        value or null for none
      • setCompression

        public ExternalDataConfiguration setCompression(String compression)
        [Optional] The compression type of the data source. Possible values include GZIP and NONE. The default value is NONE. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups and Avro formats.
        Parameters:
        compression - compression or null for none
      • getConnectionId

        public String getConnectionId()
        [Optional, Trusted Tester] Connection for external data source.
        Returns:
        value or null for none
      • setConnectionId

        public ExternalDataConfiguration setConnectionId(String connectionId)
        [Optional, Trusted Tester] Connection for external data source.
        Parameters:
        connectionId - connectionId or null for none
      • getCsvOptions

        public CsvOptions getCsvOptions()
        Additional properties to set if sourceFormat is set to CSV.
        Returns:
        value or null for none
      • setCsvOptions

        public ExternalDataConfiguration setCsvOptions(CsvOptions csvOptions)
        Additional properties to set if sourceFormat is set to CSV.
        Parameters:
        csvOptions - csvOptions or null for none
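
        A sketch of a CSV-backed configuration. The CsvOptions setters (setFieldDelimiter, setQuote, setSkipLeadingRows) are assumed from the same model package; the URI is a placeholder.

          import com.google.api.services.bigquery.model.CsvOptions;
          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Collections;

          public class CsvExample {
            public static ExternalDataConfiguration csvSource() {
              CsvOptions csvOptions = new CsvOptions()
                  .setFieldDelimiter(",")
                  .setQuote("\"")
                  // Skip a single header row at the top of each file.
                  .setSkipLeadingRows(1L);
              return new ExternalDataConfiguration()
                  .setSourceFormat("CSV")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/exports/*.csv"))
                  .setAutodetect(true)
                  .setCsvOptions(csvOptions);
            }
          }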
      • getDecimalTargetTypes

        public List<String> getDecimalTargetTypes()
        [Optional] Defines the list of possible SQL data types to which the source decimal values are converted. This list and the precision and the scale parameters of the decimal field determine the target type. In the order of NUMERIC, BIGNUMERIC, and STRING, a type is picked if it is in the specified list and if it supports the precision and the scale. STRING supports all precision and scale values. If none of the listed types supports the precision and the scale, the type supporting the widest range in the specified list is picked, and if a value exceeds the supported range when reading the data, an error will be thrown. Example: Suppose the value of this field is ["NUMERIC", "BIGNUMERIC"]. If (precision,scale) is: (38,9) -> NUMERIC; (39,9) -> BIGNUMERIC (NUMERIC cannot hold 30 integer digits); (38,10) -> BIGNUMERIC (NUMERIC cannot hold 10 fractional digits); (76,38) -> BIGNUMERIC; (77,38) -> BIGNUMERIC (error if value exceeds supported range). This field cannot contain duplicate types. The order of the types in this field is ignored. For example, ["BIGNUMERIC", "NUMERIC"] is the same as ["NUMERIC", "BIGNUMERIC"] and NUMERIC always takes precedence over BIGNUMERIC. Defaults to ["NUMERIC", "STRING"] for ORC and ["NUMERIC"] for the other file formats.
        Returns:
        value or null for none
      • setDecimalTargetTypes

        public ExternalDataConfiguration setDecimalTargetTypes(List<String> decimalTargetTypes)
        [Optional] Defines the list of possible SQL data types to which the source decimal values are converted. This list and the precision and the scale parameters of the decimal field determine the target type. In the order of NUMERIC, BIGNUMERIC, and STRING, a type is picked if it is in the specified list and if it supports the precision and the scale. STRING supports all precision and scale values. If none of the listed types supports the precision and the scale, the type supporting the widest range in the specified list is picked, and if a value exceeds the supported range when reading the data, an error will be thrown. Example: Suppose the value of this field is ["NUMERIC", "BIGNUMERIC"]. If (precision,scale) is: (38,9) -> NUMERIC; (39,9) -> BIGNUMERIC (NUMERIC cannot hold 30 integer digits); (38,10) -> BIGNUMERIC (NUMERIC cannot hold 10 fractional digits); (76,38) -> BIGNUMERIC; (77,38) -> BIGNUMERIC (error if value exceeds supported range). This field cannot contain duplicate types. The order of the types in this field is ignored. For example, ["BIGNUMERIC", "NUMERIC"] is the same as ["NUMERIC", "BIGNUMERIC"] and NUMERIC always takes precedence over BIGNUMERIC. Defaults to ["NUMERIC", "STRING"] for ORC and ["NUMERIC"] for the other file formats.
        Parameters:
        decimalTargetTypes - decimalTargetTypes or null for none
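
        A sketch of the NUMERIC/BIGNUMERIC fallback described above. The "PARQUET" format string is an assumption inferred from the parquetOptions property rather than from the sourceFormat documentation below; the URI is a placeholder.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Arrays;
          import java.util.Collections;

          public class DecimalTargetTypesExample {
            public static ExternalDataConfiguration parquetWithWideDecimals() {
              return new ExternalDataConfiguration()
                  .setSourceFormat("PARQUET")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/decimals/*.parquet"))
                  // Prefer NUMERIC where (precision, scale) fits, otherwise fall back to
                  // BIGNUMERIC; the order of entries in this list is ignored.
                  .setDecimalTargetTypes(Arrays.asList("NUMERIC", "BIGNUMERIC"));
            }
          }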
      • getGoogleSheetsOptions

        public GoogleSheetsOptions getGoogleSheetsOptions()
        [Optional] Additional options if sourceFormat is set to GOOGLE_SHEETS.
        Returns:
        value or null for none
      • setGoogleSheetsOptions

        public ExternalDataConfiguration setGoogleSheetsOptions(GoogleSheetsOptions googleSheetsOptions)
        [Optional] Additional options if sourceFormat is set to GOOGLE_SHEETS.
        Parameters:
        googleSheetsOptions - googleSheetsOptions or null for none
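
        A sketch of a Google Sheets-backed configuration. The GoogleSheetsOptions setters (setRange, setSkipLeadingRows) are assumed from the same model package; the spreadsheet URL and range are placeholders.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import com.google.api.services.bigquery.model.GoogleSheetsOptions;
          import java.util.Collections;

          public class SheetsExample {
            public static ExternalDataConfiguration sheetsSource() {
              GoogleSheetsOptions sheetsOptions = new GoogleSheetsOptions()
                  // Assumed setters: read one tab/cell range and skip its header row.
                  .setRange("Sheet1!A1:F500")
                  .setSkipLeadingRows(1L);
              return new ExternalDataConfiguration()
                  .setSourceFormat("GOOGLE_SHEETS")
                  .setSourceUris(Collections.singletonList(
                      "https://docs.google.com/spreadsheets/d/EXAMPLE_SPREADSHEET_ID"))
                  .setAutodetect(true)
                  .setGoogleSheetsOptions(sheetsOptions);
            }
          }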
      • getHivePartitioningOptions

        public HivePartitioningOptions getHivePartitioningOptions()
        [Optional] Options to configure hive partitioning support.
        Returns:
        value or null for none
      • setHivePartitioningOptions

        public ExternalDataConfiguration setHivePartitioningOptions(HivePartitioningOptions hivePartitioningOptions)
        [Optional] Options to configure hive partitioning support.
        Parameters:
        hivePartitioningOptions - hivePartitioningOptions or null for none
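
        A sketch of hive partitioning support. The HivePartitioningOptions setters (setMode, setSourceUriPrefix) and the "AUTO" mode value are assumptions not documented on this page; bucket paths are placeholders.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import com.google.api.services.bigquery.model.HivePartitioningOptions;
          import java.util.Collections;

          public class HivePartitioningExample {
            public static ExternalDataConfiguration hivePartitionedSource() {
              HivePartitioningOptions hiveOptions = new HivePartitioningOptions()
                  // Assumed setters: infer partition key types from the
                  // key=value directory layout under the common prefix.
                  .setMode("AUTO")
                  .setSourceUriPrefix("gs://example-bucket/events/");
              return new ExternalDataConfiguration()
                  .setSourceFormat("PARQUET")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/events/*"))
                  .setHivePartitioningOptions(hiveOptions);
            }
          }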
      • getIgnoreUnknownValues

        public Boolean getIgnoreUnknownValues()
        [Optional] Indicates if BigQuery should allow extra values that are not represented in the table schema. If true, the extra values are ignored. If false, records with extra columns are treated as bad records, and if there are too many bad records, an invalid error is returned in the job result. The default value is false. The sourceFormat property determines what BigQuery treats as an extra value: CSV: trailing columns. JSON: named values that don't match any column names. Google Cloud Bigtable: this setting is ignored. Google Cloud Datastore backups: this setting is ignored. Avro: this setting is ignored.
        Returns:
        value or null for none
      • setIgnoreUnknownValues

        public ExternalDataConfiguration setIgnoreUnknownValues(Boolean ignoreUnknownValues)
        [Optional] Indicates if BigQuery should allow extra values that are not represented in the table schema. If true, the extra values are ignored. If false, records with extra columns are treated as bad records, and if there are too many bad records, an invalid error is returned in the job result. The default value is false. The sourceFormat property determines what BigQuery treats as an extra value: CSV: trailing columns. JSON: named values that don't match any column names. Google Cloud Bigtable: this setting is ignored. Google Cloud Datastore backups: this setting is ignored. Avro: this setting is ignored.
        Parameters:
        ignoreUnknownValues - ignoreUnknownValues or null for none
      • getMaxBadRecords

        public Integer getMaxBadRecords()
        [Optional] The maximum number of bad records that BigQuery can ignore when reading data. If the number of bad records exceeds this value, an invalid error is returned in the job result. This is only valid for CSV, JSON, and Google Sheets. The default value is 0, which requires that all records are valid. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups and Avro formats.
        Returns:
        value or null for none
      • setMaxBadRecords

        public ExternalDataConfiguration setMaxBadRecords(Integer maxBadRecords)
        [Optional] The maximum number of bad records that BigQuery can ignore when reading data. If the number of bad records exceeds this value, an invalid error is returned in the job result. This is only valid for CSV, JSON, and Google Sheets. The default value is 0, which requires that all records are valid. This setting is ignored for Google Cloud Bigtable, Google Cloud Datastore backups and Avro formats.
        Parameters:
        maxBadRecords - maxBadRecords or null for none
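
        A sketch combining the two error-tolerance settings above, using only setters documented on this page; the URI is a placeholder.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import java.util.Collections;

          public class ErrorToleranceExample {
            public static ExternalDataConfiguration tolerantCsv() {
              return new ExternalDataConfiguration()
                  .setSourceFormat("CSV")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/raw/*.csv"))
                  .setAutodetect(true)
                  // Ignore trailing columns that are not in the schema ...
                  .setIgnoreUnknownValues(true)
                  // ... and tolerate up to 10 otherwise unparseable records.
                  .setMaxBadRecords(10);
            }
          }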
      • getParquetOptions

        public ParquetOptions getParquetOptions()
        Additional properties to set if sourceFormat is set to Parquet.
        Returns:
        value or null for none
      • setParquetOptions

        public ExternalDataConfiguration setParquetOptions(ParquetOptions parquetOptions)
        Additional properties to set if sourceFormat is set to Parquet.
        Parameters:
        parquetOptions - parquetOptions or null for none
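
        A sketch of a Parquet-backed configuration. The ParquetOptions setters (setEnumAsString, setEnableListInference) are assumed from the same model package; the URI is a placeholder.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import com.google.api.services.bigquery.model.ParquetOptions;
          import java.util.Collections;

          public class ParquetExample {
            public static ExternalDataConfiguration parquetSource() {
              ParquetOptions parquetOptions = new ParquetOptions()
                  // Assumed setter: map Parquet ENUM logical types to STRING.
                  .setEnumAsString(true)
                  // Assumed setter: infer Parquet LIST logical types as repeated fields.
                  .setEnableListInference(true);
              return new ExternalDataConfiguration()
                  .setSourceFormat("PARQUET")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/tables/*.parquet"))
                  .setParquetOptions(parquetOptions);
            }
          }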
      • getSchema

        public TableSchema getSchema()
        [Optional] The schema for the data. Schema is required for CSV and JSON formats. Schema is disallowed for Google Cloud Bigtable, Cloud Datastore backups, and Avro formats.
        Returns:
        value or null for none
      • setSchema

        public ExternalDataConfiguration setSchema(TableSchema schema)
        [Optional] The schema for the data. Schema is required for CSV and JSON formats. Schema is disallowed for Google Cloud Bigtable, Cloud Datastore backups, and Avro formats.
        Parameters:
        schema - schema or null for none
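
        A sketch of an explicit schema for a CSV source. The TableSchema and TableFieldSchema setters are assumed from the same model package; field names and the URI are placeholders.

          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import com.google.api.services.bigquery.model.TableFieldSchema;
          import com.google.api.services.bigquery.model.TableSchema;
          import java.util.Arrays;
          import java.util.Collections;

          public class ExplicitSchemaExample {
            public static ExternalDataConfiguration csvWithSchema() {
              TableSchema schema = new TableSchema().setFields(Arrays.asList(
                  new TableFieldSchema().setName("id").setType("INTEGER").setMode("REQUIRED"),
                  new TableFieldSchema().setName("name").setType("STRING"),
                  new TableFieldSchema().setName("signup_date").setType("DATE")));
              return new ExternalDataConfiguration()
                  .setSourceFormat("CSV")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/users/*.csv"))
                  .setSchema(schema);
            }
          }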
      • getSourceFormat

        public String getSourceFormat()
        [Required] The data format. For CSV files, specify "CSV". For Google Sheets, specify "GOOGLE_SHEETS". For newline-delimited JSON, specify "NEWLINE_DELIMITED_JSON". For Avro files, specify "AVRO". For Google Cloud Datastore backups, specify "DATASTORE_BACKUP". [Beta] For Google Cloud Bigtable, specify "BIGTABLE".
        Returns:
        value or null for none
      • setSourceFormat

        public ExternalDataConfiguration setSourceFormat(String sourceFormat)
        [Required] The data format. For CSV files, specify "CSV". For Google Sheets, specify "GOOGLE_SHEETS". For newline-delimited JSON, specify "NEWLINE_DELIMITED_JSON". For Avro files, specify "AVRO". For Google Cloud Datastore backups, specify "DATASTORE_BACKUP". [Beta] For Google Cloud Bigtable, specify "BIGTABLE".
        Parameters:
        sourceFormat - sourceFormat or null for none
      • getSourceUris

        public List<String> getSourceUris()
        [Required] The fully-qualified URIs that point to your data in Google Cloud. For Google Cloud Storage URIs: Each URI can contain one '*' wildcard character and it must come after the 'bucket' name. Size limits related to load jobs apply to external data sources. For Google Cloud Bigtable URIs: Exactly one URI can be specified and it has to be a fully specified and valid HTTPS URL for a Google Cloud Bigtable table. For Google Cloud Datastore backups, exactly one URI can be specified. Also, the '*' wildcard character is not allowed.
        Returns:
        value or null for none
      • setSourceUris

        public ExternalDataConfiguration setSourceUris(List<String> sourceUris)
        [Required] The fully-qualified URIs that point to your data in Google Cloud. For Google Cloud Storage URIs: Each URI can contain one '*' wildcard character and it must come after the 'bucket' name. Size limits related to load jobs apply to external data sources. For Google Cloud Bigtable URIs: Exactly one URI can be specified and it has to be a fully specified and valid HTTPS URL for a Google Cloud Bigtable table. For Google Cloud Datastore backups, exactly one URI can be specified. Also, the '*' wildcard character is not allowed.
        Parameters:
        sourceUris - sourceUris or null for none
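
        Putting the pieces together, a sketch that attaches an ExternalDataConfiguration to a table definition and creates it through the Bigquery client. The Table, TableReference, and tables().insert(...) calls are assumed from the same library and are not documented on this page; client construction and authorization are omitted, and all names are placeholders.

          import com.google.api.services.bigquery.Bigquery;
          import com.google.api.services.bigquery.model.ExternalDataConfiguration;
          import com.google.api.services.bigquery.model.Table;
          import com.google.api.services.bigquery.model.TableReference;
          import java.io.IOException;
          import java.util.Collections;

          public class ExternalTableExample {
            /** Creates a table backed by newline-delimited JSON files in Cloud Storage. */
            public static Table createExternalTable(Bigquery bigquery, String projectId,
                String datasetId, String tableId) throws IOException {
              ExternalDataConfiguration external = new ExternalDataConfiguration()
                  .setSourceFormat("NEWLINE_DELIMITED_JSON")
                  .setSourceUris(Collections.singletonList("gs://example-bucket/logs/*.json"))
                  .setAutodetect(true);
              Table table = new Table()
                  .setTableReference(new TableReference()
                      .setProjectId(projectId)
                      .setDatasetId(datasetId)
                      .setTableId(tableId))
                  .setExternalDataConfiguration(external);
              // tables().insert(...) stores only the table definition; the Cloud
              // Storage files are read directly at query time.
              return bigquery.tables().insert(projectId, datasetId, table).execute();
            }
          }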

Copyright © 2011–2022 Google. All rights reserved.