


Dataflow.Projects.Locations.Jobs (Dataflow API v1b3-rev20231112-2.0.0)

com.google.api.services.dataflow

Class Dataflow.Projects.Locations.Jobs

  • java.lang.Object
    • com.google.api.services.dataflow.Dataflow.Projects.Locations.Jobs
  • Constructor Detail

      • Jobs

        public Jobs()
  • Method Detail

      • create

        public Dataflow.Projects.Locations.Jobs.Create create(String projectId,
                                                              String location,
                                                              Job content)
                                                       throws IOException
        Creates a Cloud Dataflow job. To create a job, we recommend using `projects.locations.jobs.create` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.create` is not recommended, as your job will always start in `us-central1`. Do not enter confidential information when you supply string values using the API. Create a request for the method "jobs.create". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - The ID of the Cloud Platform project that the job belongs to.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains this job.
        content - the Job
        Returns:
        the request
        Throws:
        IOException
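        A minimal sketch of calling create through this class. The application name, project ID, and region are placeholders, and the request initializer that would normally attach credentials (typically an adapter from google-auth-library, a separate dependency) is elided and passed as null for illustration only.

        ```java
        import com.google.api.client.googleapis.javanet.GoogleNetHttpTransport;
        import com.google.api.client.json.gson.GsonFactory;
        import com.google.api.services.dataflow.Dataflow;
        import com.google.api.services.dataflow.model.Job;
        import java.io.IOException;
        import java.security.GeneralSecurityException;

        public class CreateJobExample {
          public static void main(String[] args) throws IOException, GeneralSecurityException {
            // Build the client once and reuse it. Real code supplies a request
            // initializer here so that each request carries credentials.
            Dataflow dataflow = new Dataflow.Builder(
                    GoogleNetHttpTransport.newTrustedTransport(),
                    GsonFactory.getDefaultInstance(),
                    /* httpRequestInitializer= */ null)
                .setApplicationName("dataflow-example") // placeholder
                .build();

            Job job = new Job();
            job.setName("example-job"); // placeholder job definition

            // create(...) returns a request object; execute() performs the RPC.
            Job created = dataflow.projects().locations().jobs()
                .create("my-project", "us-central1", job)
                .execute();
            System.out.println("Created job " + created.getId());
          }
        }
        ```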
      • get

        public Dataflow.Projects.Locations.Jobs.Get get(String projectId,
                                                        String location,
                                                        String jobId)
                                                 throws IOException
        Gets the state of the specified Cloud Dataflow job. To get the state of a job, we recommend using `projects.locations.jobs.get` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.get` is not recommended, as you can only get the state of jobs that are running in `us-central1`. Create a request for the method "jobs.get". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - The ID of the Cloud Platform project that the job belongs to.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains this job.
        jobId - The job ID.
        Returns:
        the request
        Throws:
        IOException
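        For example, polling a job's state might look like the following sketch, assuming a `Dataflow` client already built via `Dataflow.Builder` and placeholder project, region, and job ID values.

        ```java
        // `dataflow` is an already-configured Dataflow client; the string
        // arguments below are placeholders.
        Job job = dataflow.projects().locations().jobs()
            .get("my-project", "us-central1", "my-job-id")
            .execute();
        // currentState is a JobState name such as JOB_STATE_RUNNING or JOB_STATE_DONE.
        System.out.println(job.getCurrentState());
        ```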
      • getExecutionDetails

        public Dataflow.Projects.Locations.Jobs.GetExecutionDetails getExecutionDetails(String projectId,
                                                                                        String location,
                                                                                        String jobId)
                                                                                 throws IOException
        Request detailed information about the execution status of the job. EXPERIMENTAL. This API is subject to change or removal without notice. Create a request for the method "jobs.getExecutionDetails". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - A project id.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains the job specified by job_id.
        jobId - The job to get execution details for.
        Returns:
        the request
        Throws:
        IOException
      • getMetrics

        public Dataflow.Projects.Locations.Jobs.GetMetrics getMetrics(String projectId,
                                                                      String location,
                                                                      String jobId)
                                                               throws IOException
        Request the job status. To request the status of a job, we recommend using `projects.locations.jobs.getMetrics` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.getMetrics` is not recommended, as you can only request the status of jobs that are running in `us-central1`. Create a request for the method "jobs.getMetrics". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - A project id.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains the job specified by job_id.
        jobId - The job to get metrics for.
        Returns:
        the request
        Throws:
        IOException
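        A sketch of reading back metrics, assuming a pre-built `dataflow` client and a placeholder job ID. The response's metric list may be null when no metrics have been reported yet, so the example guards for that.

        ```java
        import com.google.api.services.dataflow.model.JobMetrics;
        import com.google.api.services.dataflow.model.MetricUpdate;

        JobMetrics metrics = dataflow.projects().locations().jobs()
            .getMetrics("my-project", "us-central1", "my-job-id")
            .execute();
        if (metrics.getMetrics() != null) {
          for (MetricUpdate m : metrics.getMetrics()) {
            // Each update carries a structured name and (for counters) a scalar value.
            System.out.println(m.getName().getName() + " = " + m.getScalar());
          }
        }
        ```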
      • list

        public Dataflow.Projects.Locations.Jobs.List list(String projectId,
                                                          String location)
                                                   throws IOException
        List the jobs of a project. To list the jobs of a project in a region, we recommend using `projects.locations.jobs.list` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). To list all jobs across all regions, use `projects.jobs.aggregated`. Using `projects.jobs.list` is not recommended, because you can only get the list of jobs that are running in `us-central1`. `projects.locations.jobs.list` and `projects.jobs.list` support filtering the list of jobs by name; filtering by name isn't supported by `projects.jobs.aggregated`. Create a request for the method "jobs.list". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - The project which owns the jobs.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains this job.
        Returns:
        the request
        Throws:
        IOException
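        Results are paginated, so callers typically loop on the page token. The sketch below assumes a pre-built `dataflow` client, placeholder project and region, and the optional pageSize/pageToken parameters on the List request.

        ```java
        import com.google.api.services.dataflow.model.Job;
        import com.google.api.services.dataflow.model.ListJobsResponse;

        String pageToken = null;
        do {
          ListJobsResponse resp = dataflow.projects().locations().jobs()
              .list("my-project", "us-central1")
              .setPageSize(50)          // optional parameter on the List request
              .setPageToken(pageToken)  // null on the first page
              .execute();
          if (resp.getJobs() != null) {
            for (Job j : resp.getJobs()) {
              System.out.println(j.getId() + "  " + j.getName());
            }
          }
          pageToken = resp.getNextPageToken();
        } while (pageToken != null);
        ```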
      • snapshot

        public Dataflow.Projects.Locations.Jobs.Snapshot snapshot(String projectId,
                                                                  String location,
                                                                  String jobId,
                                                                  SnapshotJobRequest content)
                                                           throws IOException
        Snapshot the state of a streaming job. Create a request for the method "jobs.snapshot". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - The project which owns the job to be snapshotted.
        location - The location that contains this job.
        jobId - The job to be snapshotted.
        content - the SnapshotJobRequest
        Returns:
        the request
        Throws:
        IOException
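        A sketch of snapshotting a streaming job, assuming a pre-built `dataflow` client and placeholder values; the TTL shown (seven days, expressed in seconds) is illustrative only.

        ```java
        import com.google.api.services.dataflow.model.Snapshot;
        import com.google.api.services.dataflow.model.SnapshotJobRequest;

        SnapshotJobRequest req = new SnapshotJobRequest();
        req.setTtl("604800s"); // how long the snapshot should live; illustrative value

        Snapshot snapshot = dataflow.projects().locations().jobs()
            .snapshot("my-project", "us-central1", "my-job-id", req)
            .execute();
        System.out.println("Snapshot id: " + snapshot.getId());
        ```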
      • update

        public Dataflow.Projects.Locations.Jobs.Update update(String projectId,
                                                              String location,
                                                              String jobId,
                                                              Job content)
                                                       throws IOException
        Updates the state of an existing Cloud Dataflow job. To update the state of an existing job, we recommend using `projects.locations.jobs.update` with a [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints). Using `projects.jobs.update` is not recommended, as you can only update the state of jobs that are running in `us-central1`. Create a request for the method "jobs.update". This request holds the parameters needed by the dataflow server. After setting any optional parameters, call the AbstractGoogleClientRequest.execute() method to invoke the remote operation.
        Parameters:
        projectId - The ID of the Cloud Platform project that the job belongs to.
        location - The [regional endpoint](https://cloud.google.com/dataflow/docs/concepts/regional-endpoints) that contains this job.
        jobId - The job ID.
        content - the Job
        Returns:
        the request
        Throws:
        IOException
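        One common use of update is requesting cancellation by setting the job's requested state. This is a sketch assuming a pre-built `dataflow` client and placeholder identifiers.

        ```java
        // Request cancellation of a running job by sending a Job whose
        // requestedState asks for cancellation.
        Job update = new Job();
        update.setRequestedState("JOB_STATE_CANCELLED");

        dataflow.projects().locations().jobs()
            .update("my-project", "us-central1", "my-job-id", update)
            .execute();
        ```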
      • debug

        public Dataflow.Projects.Locations.Jobs.Debug debug()
        An accessor for creating requests from the Debug collection.

        The typical use is:

            Dataflow dataflow = new Dataflow(...);
            Dataflow.Debug.List request = dataflow.projects().locations().jobs().debug().list(parameters ...)
         
        Returns:
        the resource collection
      • messages

        public Dataflow.Projects.Locations.Jobs.Messages messages()
        An accessor for creating requests from the Messages collection.

        The typical use is:

            Dataflow dataflow = new Dataflow(...);
            Dataflow.Messages.List request = dataflow.projects().locations().jobs().messages().list(parameters ...)
         
        Returns:
        the resource collection
      • snapshots

        public Dataflow.Projects.Locations.Jobs.Snapshots snapshots()
        An accessor for creating requests from the Snapshots collection.

        The typical use is:

            Dataflow dataflow = new Dataflow(...);
            Dataflow.Snapshots.List request = dataflow.projects().locations().jobs().snapshots().list(parameters ...)
         
        Returns:
        the resource collection
      • stages

        public Dataflow.Projects.Locations.Jobs.Stages stages()
        An accessor for creating requests from the Stages collection.

        The typical use is:

            Dataflow dataflow = new Dataflow(...);
            Dataflow.Stages.List request = dataflow.projects().locations().jobs().stages().list(parameters ...)
         
        Returns:
        the resource collection
      • workItems

        public Dataflow.Projects.Locations.Jobs.WorkItems workItems()
        An accessor for creating requests from the WorkItems collection.

        The typical use is:

            Dataflow dataflow = new Dataflow(...);
            Dataflow.WorkItems.List request = dataflow.projects().locations().jobs().workItems().list(parameters ...)
         
        Returns:
        the resource collection
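        As a concrete accessor example, the Messages collection can be used to read a job's log messages. This sketch assumes a pre-built `dataflow` client and placeholder project, region, and job ID.

        ```java
        import com.google.api.services.dataflow.model.ListJobMessagesResponse;

        // messages() hangs off this Jobs class, so the full chain goes
        // through projects().locations().jobs().
        ListJobMessagesResponse messages = dataflow.projects().locations().jobs()
            .messages()
            .list("my-project", "us-central1", "my-job-id")
            .execute();
        System.out.println(messages.getJobMessages());
        ```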

Copyright © 2011–2023 Google. All rights reserved.