Package org.apache.spark.resource
Class ResourceUtils
Object
org.apache.spark.resource.ResourceUtils
Constructor Summary
Constructors
Method Summary
Modifier and Type / Method / Description
static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
static String AMOUNT()
calculateAmountAndPartsForFraction(double doubleAmount)
static String DISCOVERY_SCRIPT()
static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.immutable.Seq<ExecutorResourceRequest> resourceRequest)
static final String FPGA()
static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
  Gets all allocated resource information for the given component from the input resources file and the application-level Spark configs.
static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
  Similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs.
static final String GPU()
static scala.collection.immutable.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)
static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
static scala.collection.immutable.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
static final String RESOURCE_PREFIX()
static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
static String VENDOR()
static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
static <T> scala.collection.immutable.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.immutable.Seq<T>> extract)
Constructor Details
ResourceUtils
public ResourceUtils()
Method Details
DISCOVERY_SCRIPT
public static String DISCOVERY_SCRIPT()
VENDOR
public static String VENDOR()
AMOUNT
public static String AMOUNT()
parseResourceRequest
public static ResourceRequest parseResourceRequest(SparkConf sparkConf, ResourceID resourceId)
listResourceIds
public static scala.collection.immutable.Seq<ResourceID> listResourceIds(SparkConf sparkConf, String componentName)
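The resource-related config keys that a method like this scans follow the pattern spark.<component>.resource.<name>.<attribute>. The sketch below illustrates that key-scanning idea in plain Java with no Spark dependency; the class and method names are hypothetical, not Spark's.

```java
import java.util.LinkedHashSet;
import java.util.Map;
import java.util.Set;

// Sketch only: derive the distinct resource names (e.g. "gpu") present in a
// config map by scanning keys shaped like
// "spark.<component>.resource.<name>.<attribute>".
public class ListResourceIdsSketch {
    static Set<String> listResourceNames(Map<String, String> conf, String componentName) {
        String prefix = componentName + ".resource.";   // e.g. "spark.executor.resource."
        Set<String> names = new LinkedHashSet<>();
        for (String key : conf.keySet()) {
            if (key.startsWith(prefix)) {
                String rest = key.substring(prefix.length());  // e.g. "gpu.amount"
                names.add(rest.split("\\.", 2)[0]);            // keep "gpu"
            }
        }
        return names;
    }

    public static void main(String[] args) {
        Map<String, String> conf = Map.of(
                "spark.executor.resource.gpu.amount", "2",
                "spark.executor.resource.gpu.discoveryScript", "/opt/getGpus.sh",
                "spark.executor.cores", "4");
        System.out.println(listResourceNames(conf, "spark.executor")); // [gpu]
    }
}
```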
parseAllResourceRequests
public static scala.collection.immutable.Seq<ResourceRequest> parseAllResourceRequests(SparkConf sparkConf, String componentName)
calculateAmountAndPartsForFraction
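No signature or description for this entry survives in the extracted page. Judging by the name, it splits a fractional per-task resource amount (e.g. 0.5 GPUs per task) into a whole address count plus the number of tasks sharing each address. The following is a hedged sketch of that arithmetic under assumed semantics, not Spark's actual implementation.

```java
// Assumed behavior: a fractional amount below 1 means one address shared by
// floor(1/amount) tasks; amounts of 1 or more must be whole numbers.
public class FractionalAmountSketch {
    /** Returns {amount, parts}. */
    static long[] calculateAmountAndParts(double doubleAmount) {
        if (doubleAmount <= 0.0) {
            throw new IllegalArgumentException("amount must be > 0: " + doubleAmount);
        }
        if (doubleAmount < 1.0) {
            // e.g. 0.5 -> one address split into 2 parts (two tasks per GPU)
            return new long[] { 1L, (long) Math.floor(1.0 / doubleAmount) };
        }
        if (doubleAmount != Math.floor(doubleAmount)) {
            throw new IllegalArgumentException("amounts above 1 must be whole: " + doubleAmount);
        }
        return new long[] { (long) doubleAmount, 1L };
    }

    public static void main(String[] args) {
        long[] r = calculateAmountAndParts(0.5);
        System.out.println(r[0] + " address, " + r[1] + " parts"); // 1 address, 2 parts
    }
}
```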
addTaskResourceRequests
public static void addTaskResourceRequests(SparkConf sparkConf, TaskResourceRequests treqs)
parseResourceRequirements
public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> parseResourceRequirements(SparkConf sparkConf, String componentName)
executorResourceRequestToRequirement
public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> executorResourceRequestToRequirement(scala.collection.immutable.Seq<ExecutorResourceRequest> resourceRequest)
resourcesMeetRequirements
public static boolean resourcesMeetRequirements(scala.collection.immutable.Map<String,Object> resourcesFree, scala.collection.immutable.Seq<org.apache.spark.resource.ResourceRequirement> resourceRequirements)
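A minimal Spark-free sketch of the check implied by this method's name: every required amount must be covered by what is free. The semantics are assumed for illustration; the real method takes Spark's ResourceRequirement objects rather than a plain map.

```java
import java.util.Map;

// Sketch: each requirement is satisfied only if the free pool holds at least
// the required amount; a resource missing from the pool counts as 0 available.
public class MeetsRequirementsSketch {
    static boolean resourcesMeetRequirements(Map<String, Long> resourcesFree,
                                             Map<String, Long> requirements) {
        for (Map.Entry<String, Long> req : requirements.entrySet()) {
            if (resourcesFree.getOrDefault(req.getKey(), 0L) < req.getValue()) {
                return false;
            }
        }
        return true;
    }

    public static void main(String[] args) {
        Map<String, Long> free = Map.of("gpu", 2L, "fpga", 1L);
        System.out.println(resourcesMeetRequirements(free, Map.of("gpu", 1L))); // true
        System.out.println(resourcesMeetRequirements(free, Map.of("gpu", 4L))); // false
    }
}
```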
withResourcesJson
public static <T> scala.collection.immutable.Seq<T> withResourcesJson(String resourcesFile, scala.Function1<String,scala.collection.immutable.Seq<T>> extract)
parseAllocatedFromJsonFile
public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocatedFromJsonFile(String resourcesFile)
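For context, a resources file like the one parsed here is a JSON array of allocations, each pairing a component/resource id with its assigned addresses. The fragment below is illustrative of that shape rather than authoritative; field names and values are indicative only.

```json
[
  {
    "id": { "componentName": "spark.executor", "resourceName": "gpu" },
    "addresses": ["0", "1"]
  }
]
```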
parseAllocated
public static scala.collection.immutable.Seq<org.apache.spark.resource.ResourceAllocation> parseAllocated(scala.Option<String> resourcesFileOpt, String componentName)
getOrDiscoverAllResources
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResources(SparkConf sparkConf, String componentName, scala.Option<String> resourcesFileOpt)
Gets all allocated resource information for the given component from the input resources file and the application-level Spark configs. It first checks whether resources were explicitly specified in the resources file (this can include explicit address assignments and is only supported by certain cluster managers), and then looks at the Spark configs for any resources not specified in the file. Resources not explicitly set in the file require a discovery script to be run to obtain their addresses. It also verifies that the resource allocation meets the required amount for each resource.
Parameters:
sparkConf - (undocumented)
componentName - (undocumented)
resourcesFileOpt - (undocumented)
Returns:
a map from resource name to resource info
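The application-level configs consulted here follow the spark.<component>.resource.<name>.<attribute> pattern. An illustrative executor-side GPU configuration is shown below; the script path and vendor value are placeholders, not defaults.

```properties
spark.executor.resource.gpu.amount=2
spark.executor.resource.gpu.discoveryScript=/opt/spark/scripts/getGpus.sh
spark.executor.resource.gpu.vendor=nvidia.com
```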
getOrDiscoverAllResourcesForResourceProfile
public static scala.collection.immutable.Map<String,ResourceInformation> getOrDiscoverAllResourcesForResourceProfile(scala.Option<String> resourcesFileOpt, String componentName, ResourceProfile resourceProfile, SparkConf sparkConf)
This function is similar to getOrDiscoverAllResources, except that it uses the ResourceProfile information instead of the application-level configs. It first checks whether resources were explicitly specified in the resources file (this can include explicit address assignments and is only supported by certain cluster managers), and then looks at the ResourceProfile for any resources not specified in the file. Resources not explicitly set in the file require a discovery script to be run to obtain their addresses. It also verifies that the resource allocation meets the required amount for each resource.
Parameters:
resourcesFileOpt - (undocumented)
componentName - (undocumented)
resourceProfile - (undocumented)
sparkConf - (undocumented)
Returns:
a map from resource name to resource info
logResourceInfo
public static void logResourceInfo(String componentName, scala.collection.immutable.Map<String,ResourceInformation> resources)
validateTaskCpusLargeEnough
public static boolean validateTaskCpusLargeEnough(SparkConf sparkConf, int execCores, int taskCpus)
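A hedged sketch of the sanity check suggested by this method's name: a single task's CPU request has to fit inside one executor's core count, or no task could ever be scheduled. This is assumed semantics for illustration, not Spark's exact logic.

```java
// Sketch only: the real method also takes a SparkConf and may log a warning.
public class TaskCpusCheckSketch {
    static boolean taskCpusFit(int execCores, int taskCpus) {
        return taskCpus <= execCores;
    }

    public static void main(String[] args) {
        System.out.println(taskCpusFit(4, 2)); // true
        System.out.println(taskCpusFit(2, 4)); // false
    }
}
```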
warnOnWastedResources
public static void warnOnWastedResources(ResourceProfile rp, SparkConf sparkConf, scala.Option<Object> execCores)
GPU
public static final String GPU()
FPGA
public static final String FPGA()
RESOURCE_PREFIX
public static final String RESOURCE_PREFIX()
org$apache$spark$internal$Logging$$log_
public static org.slf4j.Logger org$apache$spark$internal$Logging$$log_()
org$apache$spark$internal$Logging$$log__$eq
public static void org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger x$1)
LogStringContext
public static org.apache.spark.internal.Logging.LogStringContext LogStringContext(scala.StringContext sc)