Class BigDataPoolResourceInfoInner
- java.lang.Object
-
- com.azure.core.management.ProxyResource
-
- com.azure.core.management.Resource
-
- com.azure.resourcemanager.synapse.fluent.models.BigDataPoolResourceInfoInner
-
public final class BigDataPoolResourceInfoInner extends com.azure.core.management.Resource
A Big Data pool.
-
-
Constructor Summary
Constructors
BigDataPoolResourceInfoInner()
-
Method Summary
All Methods · Instance Methods · Concrete Methods

AutoPauseProperties autoPause()
Get the autoPause property: Spark pool auto-pausing properties.

AutoScaleProperties autoScale()
Get the autoScale property: Spark pool auto-scaling properties.

Integer cacheSize()
Get the cacheSize property: The cache size.

OffsetDateTime creationDate()
Get the creationDate property: The time when the Big Data pool was created.

List<LibraryInfo> customLibraries()
Get the customLibraries property: List of custom libraries/packages associated with the Spark pool.

String defaultSparkLogFolder()
Get the defaultSparkLogFolder property: The default folder where Spark logs will be written.

DynamicExecutorAllocation dynamicExecutorAllocation()
Get the dynamicExecutorAllocation property: Dynamic Executor Allocation.

Boolean isComputeIsolationEnabled()
Get the isComputeIsolationEnabled property: Whether compute isolation is required or not.

OffsetDateTime lastSucceededTimestamp()
Get the lastSucceededTimestamp property: The time when the Big Data pool was updated successfully.

LibraryRequirements libraryRequirements()
Get the libraryRequirements property: Spark pool library version requirements.

Integer nodeCount()
Get the nodeCount property: The number of nodes in the Big Data pool.

NodeSize nodeSize()
Get the nodeSize property: The level of compute power that each node in the Big Data pool has.

NodeSizeFamily nodeSizeFamily()
Get the nodeSizeFamily property: The kind of nodes that the Big Data pool provides.

String provisioningState()
Get the provisioningState property: The state of the Big Data pool.

Boolean sessionLevelPackagesEnabled()
Get the sessionLevelPackagesEnabled property: Whether session-level packages are enabled.

SparkConfigProperties sparkConfigProperties()
Get the sparkConfigProperties property: Spark configuration file to specify additional properties.

String sparkEventsFolder()
Get the sparkEventsFolder property: The Spark events folder.

String sparkVersion()
Get the sparkVersion property: The Apache Spark version.

void validate()
Validates the instance.

BigDataPoolResourceInfoInner withAutoPause(AutoPauseProperties autoPause)
Set the autoPause property: Spark pool auto-pausing properties.

BigDataPoolResourceInfoInner withAutoScale(AutoScaleProperties autoScale)
Set the autoScale property: Spark pool auto-scaling properties.

BigDataPoolResourceInfoInner withCacheSize(Integer cacheSize)
Set the cacheSize property: The cache size.

BigDataPoolResourceInfoInner withCustomLibraries(List<LibraryInfo> customLibraries)
Set the customLibraries property: List of custom libraries/packages associated with the Spark pool.

BigDataPoolResourceInfoInner withDefaultSparkLogFolder(String defaultSparkLogFolder)
Set the defaultSparkLogFolder property: The default folder where Spark logs will be written.

BigDataPoolResourceInfoInner withDynamicExecutorAllocation(DynamicExecutorAllocation dynamicExecutorAllocation)
Set the dynamicExecutorAllocation property: Dynamic Executor Allocation.

BigDataPoolResourceInfoInner withIsComputeIsolationEnabled(Boolean isComputeIsolationEnabled)
Set the isComputeIsolationEnabled property: Whether compute isolation is required or not.

BigDataPoolResourceInfoInner withLibraryRequirements(LibraryRequirements libraryRequirements)
Set the libraryRequirements property: Spark pool library version requirements.

BigDataPoolResourceInfoInner withLocation(String location)
Overrides Resource.withLocation.

BigDataPoolResourceInfoInner withNodeCount(Integer nodeCount)
Set the nodeCount property: The number of nodes in the Big Data pool.

BigDataPoolResourceInfoInner withNodeSize(NodeSize nodeSize)
Set the nodeSize property: The level of compute power that each node in the Big Data pool has.

BigDataPoolResourceInfoInner withNodeSizeFamily(NodeSizeFamily nodeSizeFamily)
Set the nodeSizeFamily property: The kind of nodes that the Big Data pool provides.

BigDataPoolResourceInfoInner withProvisioningState(String provisioningState)
Set the provisioningState property: The state of the Big Data pool.

BigDataPoolResourceInfoInner withSessionLevelPackagesEnabled(Boolean sessionLevelPackagesEnabled)
Set the sessionLevelPackagesEnabled property: Whether session-level packages are enabled.

BigDataPoolResourceInfoInner withSparkConfigProperties(SparkConfigProperties sparkConfigProperties)
Set the sparkConfigProperties property: Spark configuration file to specify additional properties.

BigDataPoolResourceInfoInner withSparkEventsFolder(String sparkEventsFolder)
Set the sparkEventsFolder property: The Spark events folder.

BigDataPoolResourceInfoInner withSparkVersion(String sparkVersion)
Set the sparkVersion property: The Apache Spark version.

BigDataPoolResourceInfoInner withTags(Map<String,String> tags)
Overrides Resource.withTags.
-
-
-
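Every with* setter returns the same BigDataPoolResourceInfoInner instance, so a pool definition can be built as a single fluent chain. A minimal sketch, assuming the azure-resourcemanager-synapse dependency is on the classpath (the version strings and option values here are illustrative, not prescribed by this class):

```java
import com.azure.resourcemanager.synapse.fluent.models.BigDataPoolResourceInfoInner;
import com.azure.resourcemanager.synapse.models.AutoPauseProperties;
import com.azure.resourcemanager.synapse.models.AutoScaleProperties;
import com.azure.resourcemanager.synapse.models.NodeSize;
import com.azure.resourcemanager.synapse.models.NodeSizeFamily;

public class BigDataPoolSketch {
    public static void main(String[] args) {
        // Chain the fluent setters; each returns this, so the chain stays
        // typed as BigDataPoolResourceInfoInner throughout.
        BigDataPoolResourceInfoInner pool = new BigDataPoolResourceInfoInner()
            .withSparkVersion("3.3")                         // example version
            .withNodeSize(NodeSize.MEDIUM)
            .withNodeSizeFamily(NodeSizeFamily.MEMORY_OPTIMIZED)
            .withNodeCount(3)
            .withAutoPause(new AutoPauseProperties()
                .withEnabled(true)
                .withDelayInMinutes(15))
            .withAutoScale(new AutoScaleProperties()
                .withEnabled(true)
                .withMinNodeCount(3)
                .withMaxNodeCount(10));

        // withLocation is inherited from Resource but overridden to return
        // the subtype, so it also fits into the fluent style.
        pool.withLocation("eastus");
    }
}
```

Because withLocation and withTags are overridden to return BigDataPoolResourceInfoInner rather than Resource, they can appear anywhere in the chain without a cast.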
Method Detail
-
withLocation
public BigDataPoolResourceInfoInner withLocation(String location)
- Overrides: withLocation in class com.azure.core.management.Resource
-
withTags
public BigDataPoolResourceInfoInner withTags(Map<String,String> tags)
- Overrides: withTags in class com.azure.core.management.Resource
-
provisioningState
public String provisioningState()
Get the provisioningState property: The state of the Big Data pool.
- Returns: the provisioningState value.
-
withProvisioningState
public BigDataPoolResourceInfoInner withProvisioningState(String provisioningState)
Set the provisioningState property: The state of the Big Data pool.
- Parameters: provisioningState - the provisioningState value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
autoScale
public AutoScaleProperties autoScale()
Get the autoScale property: Spark pool auto-scaling properties.
- Returns: the autoScale value.
-
withAutoScale
public BigDataPoolResourceInfoInner withAutoScale(AutoScaleProperties autoScale)
Set the autoScale property: Spark pool auto-scaling properties.
- Parameters: autoScale - the autoScale value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
creationDate
public OffsetDateTime creationDate()
Get the creationDate property: The time when the Big Data pool was created.
- Returns: the creationDate value.
-
autoPause
public AutoPauseProperties autoPause()
Get the autoPause property: Spark pool auto-pausing properties.
- Returns: the autoPause value.
-
withAutoPause
public BigDataPoolResourceInfoInner withAutoPause(AutoPauseProperties autoPause)
Set the autoPause property: Spark pool auto-pausing properties.
- Parameters: autoPause - the autoPause value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
isComputeIsolationEnabled
public Boolean isComputeIsolationEnabled()
Get the isComputeIsolationEnabled property: Whether compute isolation is required or not.
- Returns: the isComputeIsolationEnabled value.
-
withIsComputeIsolationEnabled
public BigDataPoolResourceInfoInner withIsComputeIsolationEnabled(Boolean isComputeIsolationEnabled)
Set the isComputeIsolationEnabled property: Whether compute isolation is required or not.
- Parameters: isComputeIsolationEnabled - the isComputeIsolationEnabled value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
sessionLevelPackagesEnabled
public Boolean sessionLevelPackagesEnabled()
Get the sessionLevelPackagesEnabled property: Whether session-level packages are enabled.
- Returns: the sessionLevelPackagesEnabled value.
-
withSessionLevelPackagesEnabled
public BigDataPoolResourceInfoInner withSessionLevelPackagesEnabled(Boolean sessionLevelPackagesEnabled)
Set the sessionLevelPackagesEnabled property: Whether session-level packages are enabled.
- Parameters: sessionLevelPackagesEnabled - the sessionLevelPackagesEnabled value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
cacheSize
public Integer cacheSize()
Get the cacheSize property: The cache size.
- Returns: the cacheSize value.
-
withCacheSize
public BigDataPoolResourceInfoInner withCacheSize(Integer cacheSize)
Set the cacheSize property: The cache size.
- Parameters: cacheSize - the cacheSize value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
dynamicExecutorAllocation
public DynamicExecutorAllocation dynamicExecutorAllocation()
Get the dynamicExecutorAllocation property: Dynamic Executor Allocation.
- Returns: the dynamicExecutorAllocation value.
-
withDynamicExecutorAllocation
public BigDataPoolResourceInfoInner withDynamicExecutorAllocation(DynamicExecutorAllocation dynamicExecutorAllocation)
Set the dynamicExecutorAllocation property: Dynamic Executor Allocation.
- Parameters: dynamicExecutorAllocation - the dynamicExecutorAllocation value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
sparkEventsFolder
public String sparkEventsFolder()
Get the sparkEventsFolder property: The Spark events folder.
- Returns: the sparkEventsFolder value.
-
withSparkEventsFolder
public BigDataPoolResourceInfoInner withSparkEventsFolder(String sparkEventsFolder)
Set the sparkEventsFolder property: The Spark events folder.
- Parameters: sparkEventsFolder - the sparkEventsFolder value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
nodeCount
public Integer nodeCount()
Get the nodeCount property: The number of nodes in the Big Data pool.
- Returns: the nodeCount value.
-
withNodeCount
public BigDataPoolResourceInfoInner withNodeCount(Integer nodeCount)
Set the nodeCount property: The number of nodes in the Big Data pool.
- Parameters: nodeCount - the nodeCount value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
libraryRequirements
public LibraryRequirements libraryRequirements()
Get the libraryRequirements property: Spark pool library version requirements.
- Returns: the libraryRequirements value.
-
withLibraryRequirements
public BigDataPoolResourceInfoInner withLibraryRequirements(LibraryRequirements libraryRequirements)
Set the libraryRequirements property: Spark pool library version requirements.
- Parameters: libraryRequirements - the libraryRequirements value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
customLibraries
public List<LibraryInfo> customLibraries()
Get the customLibraries property: List of custom libraries/packages associated with the Spark pool.
- Returns: the customLibraries value.
-
withCustomLibraries
public BigDataPoolResourceInfoInner withCustomLibraries(List<LibraryInfo> customLibraries)
Set the customLibraries property: List of custom libraries/packages associated with the Spark pool.
- Parameters: customLibraries - the customLibraries value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
sparkConfigProperties
public SparkConfigProperties sparkConfigProperties()
Get the sparkConfigProperties property: Spark configuration file to specify additional properties.
- Returns: the sparkConfigProperties value.
-
withSparkConfigProperties
public BigDataPoolResourceInfoInner withSparkConfigProperties(SparkConfigProperties sparkConfigProperties)
Set the sparkConfigProperties property: Spark configuration file to specify additional properties.
- Parameters: sparkConfigProperties - the sparkConfigProperties value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
sparkVersion
public String sparkVersion()
Get the sparkVersion property: The Apache Spark version.
- Returns: the sparkVersion value.
-
withSparkVersion
public BigDataPoolResourceInfoInner withSparkVersion(String sparkVersion)
Set the sparkVersion property: The Apache Spark version.
- Parameters: sparkVersion - the sparkVersion value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
defaultSparkLogFolder
public String defaultSparkLogFolder()
Get the defaultSparkLogFolder property: The default folder where Spark logs will be written.
- Returns: the defaultSparkLogFolder value.
-
withDefaultSparkLogFolder
public BigDataPoolResourceInfoInner withDefaultSparkLogFolder(String defaultSparkLogFolder)
Set the defaultSparkLogFolder property: The default folder where Spark logs will be written.
- Parameters: defaultSparkLogFolder - the defaultSparkLogFolder value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
nodeSize
public NodeSize nodeSize()
Get the nodeSize property: The level of compute power that each node in the Big Data pool has.
- Returns: the nodeSize value.
-
withNodeSize
public BigDataPoolResourceInfoInner withNodeSize(NodeSize nodeSize)
Set the nodeSize property: The level of compute power that each node in the Big Data pool has.
- Parameters: nodeSize - the nodeSize value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
nodeSizeFamily
public NodeSizeFamily nodeSizeFamily()
Get the nodeSizeFamily property: The kind of nodes that the Big Data pool provides.
- Returns: the nodeSizeFamily value.
-
withNodeSizeFamily
public BigDataPoolResourceInfoInner withNodeSizeFamily(NodeSizeFamily nodeSizeFamily)
Set the nodeSizeFamily property: The kind of nodes that the Big Data pool provides.
- Parameters: nodeSizeFamily - the nodeSizeFamily value to set.
- Returns: the BigDataPoolResourceInfoInner object itself.
-
lastSucceededTimestamp
public OffsetDateTime lastSucceededTimestamp()
Get the lastSucceededTimestamp property: The time when the Big Data pool was updated successfully.
- Returns: the lastSucceededTimestamp value.
-
validate
public void validate()
Validates the instance.
- Throws: IllegalArgumentException - thrown if the instance is not valid.
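Since validate() throws rather than returning a result, a caller can run it defensively before handing the model to the service. A hedged sketch, again assuming the azure-resourcemanager-synapse types are available:

```java
import com.azure.resourcemanager.synapse.fluent.models.BigDataPoolResourceInfoInner;
import com.azure.resourcemanager.synapse.models.AutoScaleProperties;

public class ValidateSketch {
    public static void main(String[] args) {
        BigDataPoolResourceInfoInner pool = new BigDataPoolResourceInfoInner()
            .withAutoScale(new AutoScaleProperties()
                .withMinNodeCount(3)
                .withMaxNodeCount(10));

        try {
            // validate() also descends into nested models such as autoScale.
            pool.validate();
        } catch (IllegalArgumentException e) {
            // Reaching here means the model would be rejected; report and stop.
            System.err.println("Invalid pool definition: " + e.getMessage());
        }
    }
}
```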
-
-