PySparkBatch (Cloud Dataproc API v1-rev20240605-2.0.0)

com.google.api.services.dataproc.model

Class PySparkBatch

  • All Implemented Interfaces:
    Cloneable, Map<String,Object>


    public final class PySparkBatch
    extends com.google.api.client.json.GenericJson
    A configuration for running an Apache PySpark (https://spark.apache.org/docs/latest/api/python/getting_started/quickstart.html) batch workload.

    This is the Java data model class that specifies how to parse/serialize into the JSON that is transmitted over HTTP when working with the Cloud Dataproc API. For a detailed explanation see: https://developers.google.com/api-client-library/java/google-http-java-client/json

    Author:
    Google, Inc.
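
    A minimal usage sketch of this data model, assuming placeholder Cloud Storage (HCFS) URIs under gs://my-bucket/ that are not real resources; attaching the configured batch to an actual submission (for example via a Batch resource) involves other classes not documented on this page. It exercises the constructor and the typed setters described below.

        import com.google.api.services.dataproc.model.PySparkBatch;

        import java.util.Arrays;

        public class PySparkBatchExample {
          public static void main(String[] args) {
            // Configure a PySpark batch workload; every setter returns the same
            // PySparkBatch instance, so the calls can be chained.
            PySparkBatch batch = new PySparkBatch()
                .setMainPythonFileUri("gs://my-bucket/jobs/main.py")   // required, must be a .py file
                .setPythonFileUris(Arrays.asList("gs://my-bucket/libs/helpers.zip"))
                .setJarFileUris(Arrays.asList("gs://my-bucket/jars/connector.jar"))
                .setArchiveUris(Arrays.asList("gs://my-bucket/env/venv.tar.gz"))
                .setArgs(Arrays.asList("--input", "gs://my-bucket/data/", "--mode", "daily"));

            // Per the method documentation, getters return null (not an empty
            // list) for fields that were never set, so read them defensively.
            if (batch.getFileUris() == null) {
              System.out.println("No extra files placed in the executor working directories.");
            }
          }
        }
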
    • Constructor Detail

      • PySparkBatch

        public PySparkBatch()
    • Method Detail

      • getArchiveUris

        public List<String> getArchiveUris()
        Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
        Returns:
        value or null for none
      • setArchiveUris

        public PySparkBatch setArchiveUris(List<String> archiveUris)
        Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
        Parameters:
        archiveUris - archiveUris or null for none
      • getArgs

        public List<String> getArgs()
        Optional. The arguments to pass to the driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
        Returns:
        value or null for none
      • setArgs

        public PySparkBatch setArgs(List<String> args)
        Optional. The arguments to pass to the driver. Do not include arguments that can be set as batch properties, such as --conf, since a collision can occur that causes an incorrect batch submission.
        Parameters:
        args - args or null for none
      • getFileUris

        public List<String> getFileUris()
        Optional. HCFS URIs of files to be placed in the working directory of each executor.
        Returns:
        value or null for none
      • setFileUris

        public PySparkBatch setFileUris(List<String> fileUris)
        Optional. HCFS URIs of files to be placed in the working directory of each executor.
        Parameters:
        fileUris - fileUris or null for none
      • getJarFileUris

        public List<String> getJarFileUris()
        Optional. HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
        Returns:
        value or null for none
      • setJarFileUris

        public PySparkBatch setJarFileUris(List<String> jarFileUris)
        Optional. HCFS URIs of jar files to add to the classpath of the Spark driver and tasks.
        Parameters:
        jarFileUris - jarFileUris or null for none
      • getMainPythonFileUri

        public String getMainPythonFileUri()
        Required. The HCFS URI of the main Python file to use as the Spark driver. Must be a .py file.
        Returns:
        value or null for none
      • setMainPythonFileUri

        public PySparkBatch setMainPythonFileUri(String mainPythonFileUri)
        Required. The HCFS URI of the main Python file to use as the Spark driver. Must be a .py file.
        Parameters:
        mainPythonFileUri - mainPythonFileUri or null for none
      • getPythonFileUris

        public List<String> getPythonFileUris()
        Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
        Returns:
        value or null for none
      • setPythonFileUris

        public PySparkBatch setPythonFileUris(List<String> pythonFileUris)
        Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.
        Parameters:
        pythonFileUris - pythonFileUris or null for none
      • set

        public PySparkBatch set(String fieldName,
                                Object value)
        Overrides:
        set in class com.google.api.client.json.GenericJson
      • clone

        public PySparkBatch clone()
        Overrides:
        clone in class com.google.api.client.json.GenericJson
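
    Because PySparkBatch extends GenericJson and implements Map<String,Object>, the generic set(String fieldName, Object value) and clone() overrides shown above allow a batch to be edited by JSON field name or copied and then varied. A brief sketch, under the assumption that the JSON field names mirror the camelCase property names used by the typed accessors (for example "args" and "mainPythonFileUri"); the URIs are placeholders.

        import com.google.api.services.dataproc.model.PySparkBatch;

        import java.util.Arrays;

        public class PySparkBatchGenericAccess {
          public static void main(String[] args) {
            PySparkBatch nightly = new PySparkBatch()
                .setMainPythonFileUri("gs://my-bucket/jobs/main.py");   // placeholder URI

            // set(fieldName, value) writes a field by its (assumed) JSON name and
            // returns the same PySparkBatch, so it mixes with the typed setters.
            nightly.set("args", Arrays.asList("--mode", "nightly"));

            // clone() returns a PySparkBatch copy that can be adjusted without
            // touching the original, e.g. the same job with different arguments.
            PySparkBatch hourly = nightly.clone().setArgs(Arrays.asList("--mode", "hourly"));

            // The Map view also allows generic reads by (assumed) JSON field name.
            System.out.println(nightly.get("mainPythonFileUri"));
            System.out.println(hourly.getArgs());
          }
        }
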

Copyright © 2011–2024 Google. All rights reserved.