PySparkJob class
A Dataproc job for running Apache PySpark (https://spark.apache.org/docs/0.9.0/python-programming-guide.html) applications on YARN.
Constructors
Properties
- archiveUris ↔ List<String>?
  HCFS URIs of archives to be extracted into the working directory of each executor.
  getter/setter pair
- args ↔ List<String>?
  The arguments to pass to the driver.
  getter/setter pair
- fileUris ↔ List<String>?
  HCFS URIs of files to be placed in the working directory of each executor.
  getter/setter pair
- hashCode → int
  The hash code for this object.
  no setter, inherited
- jarFileUris ↔ List<String>?
  HCFS URIs of jar files to add to the CLASSPATHs of the Python driver and tasks.
  getter/setter pair
- loggingConfig ↔ LoggingConfig?
  The runtime log config for job execution.
  getter/setter pair
- mainPythonFileUri ↔ String?
  The HCFS URI of the main Python file to use as the driver.
  getter/setter pair
- properties ↔ Map<String, String>?
  A mapping of property names to values, used to configure PySpark.
  getter/setter pair
- pythonFileUris ↔ List<String>?
  HCFS file URIs of Python files to pass to the PySpark framework.
  getter/setter pair
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
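As a minimal sketch of how these getter/setter properties might be used, the following configures a job via cascade assignments. It assumes the `googleapis` Dart package exposes this class from `package:googleapis/dataproc/v1.dart`; the bucket paths and Spark property values are hypothetical placeholders.

```dart
import 'package:googleapis/dataproc/v1.dart';

void main() {
  // All gs:// URIs below are hypothetical placeholders.
  final job = PySparkJob()
    ..mainPythonFileUri = 'gs://my-bucket/jobs/main.py' // HCFS URI of the driver script
    ..args = ['--date', '2024-01-01'] // arguments passed to the driver
    ..pythonFileUris = ['gs://my-bucket/jobs/helpers.py'] // extra Python files for PySpark
    ..jarFileUris = ['gs://my-bucket/libs/connector.jar'] // added to driver/task CLASSPATHs
    ..properties = {'spark.executor.memory': '4g'}; // PySpark configuration

  print(job.mainPythonFileUri);
}
```

Because every field is a nullable getter/setter pair, unset properties simply remain `null` and are omitted from the job configuration.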
Methods
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toJson() → Map<String, dynamic>
- toString() → String
  A string representation of this object.
  inherited
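A short sketch of `toJson` in use, again assuming the class comes from `package:googleapis/dataproc/v1.dart`; the URI is a hypothetical placeholder. `toJson()` returns a `Map<String, dynamic>` that can be passed directly to `jsonEncode`.

```dart
import 'dart:convert';
import 'package:googleapis/dataproc/v1.dart';

void main() {
  final job = PySparkJob()
    ..mainPythonFileUri = 'gs://my-bucket/jobs/main.py'; // hypothetical path

  // Convert the job to its JSON map form, then serialize it to a string.
  final encoded = jsonEncode(job.toJson());
  print(encoded);
}
```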
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited