SparkStandaloneAutoscalingConfig class
Basic autoscaling configurations for Spark Standalone.
Constructors
- SparkStandaloneAutoscalingConfig({String? gracefulDecommissionTimeout, bool? removeOnlyIdleWorkers, double? scaleDownFactor, double? scaleDownMinWorkerFraction, double? scaleUpFactor, double? scaleUpMinWorkerFraction})
- SparkStandaloneAutoscalingConfig.fromJson(Map json_)
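As a usage sketch, the default constructor can be called with any subset of its named parameters; all are optional. The import path and the concrete values below are assumptions for illustration (the class ships in the `googleapis` package's Dataproc v1 library):

```dart
// Assumed import path for the googleapis Dataproc v1 library.
import 'package:googleapis/dataproc/v1.dart';

void main() {
  // Illustrative values: wait up to 30 minutes for graceful
  // decommissioning, scale aggressively in both directions, and
  // only remove workers that are idle.
  final config = SparkStandaloneAutoscalingConfig(
    gracefulDecommissionTimeout: '1800s',
    scaleUpFactor: 1.0,
    scaleDownFactor: 1.0,
    scaleUpMinWorkerFraction: 0.0,
    scaleDownMinWorkerFraction: 0.0,
    removeOnlyIdleWorkers: true,
  );
  print(config.toJson());
}
```

All properties are getter/setter pairs, so the same object can also be built empty and populated field by field.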
Properties
- gracefulDecommissionTimeout ↔ String?
  Timeout for Spark graceful decommissioning of Spark workers.
  getter/setter pair
- hashCode → int
  The hash code for this object.
  no setter, inherited
- removeOnlyIdleWorkers ↔ bool?
  Remove only idle workers when scaling down the cluster.
  getter/setter pair
- runtimeType → Type
  A representation of the runtime type of the object.
  no setter, inherited
- scaleDownFactor ↔ double?
  Fraction of required executors to remove from Spark Standalone clusters.
  getter/setter pair
- scaleDownMinWorkerFraction ↔ double?
  Minimum scale-down threshold as a fraction of total cluster size before
  scaling occurs.
  getter/setter pair
- scaleUpFactor ↔ double?
  Fraction of required workers to add to Spark Standalone clusters.
  getter/setter pair
- scaleUpMinWorkerFraction ↔ double?
  Minimum scale-up threshold as a fraction of total cluster size before
  scaling occurs.
  getter/setter pair
Methods
- noSuchMethod(Invocation invocation) → dynamic
  Invoked when a nonexistent method or property is accessed.
  inherited
- toJson() → Map<String, dynamic>
- toString() → String
  A string representation of this object.
  inherited
Operators
- operator ==(Object other) → bool
  The equality operator.
  inherited
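For round-tripping with `fromJson` and `toJson`, the map keys mirror the property names above. A sketch of the expected JSON shape, assuming the Dataproc REST convention of encoding the decommission timeout as a protobuf Duration string (`"<seconds>s"`); the values are illustrative:

```json
{
  "gracefulDecommissionTimeout": "1800s",
  "removeOnlyIdleWorkers": true,
  "scaleUpFactor": 1.0,
  "scaleDownFactor": 1.0,
  "scaleUpMinWorkerFraction": 0.0,
  "scaleDownMinWorkerFraction": 0.0
}
```

A map of this shape can be passed to `SparkStandaloneAutoscalingConfig.fromJson`, and `toJson` produces a map with the same keys (omitting unset fields).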