Class SynapseSparkJobDefinitionActivity
java.lang.Object
com.azure.resourcemanager.datafactory.models.Activity
com.azure.resourcemanager.datafactory.models.ExecutionActivity
com.azure.resourcemanager.datafactory.models.SynapseSparkJobDefinitionActivity
Execute spark job activity.
-
Constructor Summary
SynapseSparkJobDefinitionActivity() - Creates an instance of SynapseSparkJobDefinitionActivity class.
Method Summary
arguments() - Get the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
className() - Get the className property: The fully-qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide.
conf() - Get the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
configurationType() - Get the configurationType property: The type of the spark config.
driverSize() - Get the driverSize property: Number of cores and memory to be used for the driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide.
executorSize() - Get the executorSize property: Number of cores and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide.
file() - Get the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide.
files() - Get the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
filesV2() - Get the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
numExecutors() - Get the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide.
pythonCodeReference() - Get the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
scanFolder() - Get the scanFolder property: Scanning subfolders from the root folder of the main definition file; these files will be added as reference files.
sparkConfig() - Get the sparkConfig property: Spark configuration property.
sparkJob() - Get the sparkJob property: Synapse spark job reference.
targetBigDataPool() - Get the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
targetSparkConfiguration() - Get the targetSparkConfiguration property: The spark configuration of the spark job.
validate() - Validates the instance.
withArguments(List<Object> arguments) - Set the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
withClassName(Object className) - Set the className property: The fully-qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide.
withConf(Object conf) - Set the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
withConfigurationType(ConfigurationType configurationType) - Set the configurationType property: The type of the spark config.
withDependsOn(List<ActivityDependency> dependsOn) - Set the dependsOn property: Activity depends on condition.
withDescription(String description) - Set the description property: Activity description.
withDriverSize(Object driverSize) - Set the driverSize property: Number of cores and memory to be used for the driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide.
withExecutorSize(Object executorSize) - Set the executorSize property: Number of cores and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide.
withFile(Object file) - Set the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide.
withFiles(List<Object> files) - Set the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
withFilesV2(List<Object> filesV2) - Set the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
withLinkedServiceName(LinkedServiceReference linkedServiceName) - Set the linkedServiceName property: Linked service reference.
withName(String name) - Set the name property: Activity name.
withNumExecutors(Object numExecutors) - Set the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide.
withPolicy(ActivityPolicy policy) - Set the policy property: Activity policy.
withPythonCodeReference(List<Object> pythonCodeReference) - Set the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
withScanFolder(Object scanFolder) - Set the scanFolder property: Scanning subfolders from the root folder of the main definition file; these files will be added as reference files.
withSparkConfig(Map<String, Object> sparkConfig) - Set the sparkConfig property: Spark configuration property.
withSparkJob(SynapseSparkJobReference sparkJob) - Set the sparkJob property: Synapse spark job reference.
withTargetBigDataPool(BigDataPoolParametrizationReference targetBigDataPool) - Set the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
withTargetSparkConfiguration(SparkConfigurationParametrizationReference targetSparkConfiguration) - Set the targetSparkConfiguration property: The spark configuration of the spark job.
withUserProperties(List<UserProperty> userProperties) - Set the userProperties property: Activity user properties.
Methods inherited from class com.azure.resourcemanager.datafactory.models.ExecutionActivity
linkedServiceName, policy
Methods inherited from class com.azure.resourcemanager.datafactory.models.Activity
additionalProperties, dependsOn, description, name, userProperties, withAdditionalProperties
-
Constructor Details
-
SynapseSparkJobDefinitionActivity
public SynapseSparkJobDefinitionActivity()
Creates an instance of SynapseSparkJobDefinitionActivity class.
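The setters inherited from Activity and ExecutionActivity are overridden on this class to return SynapseSparkJobDefinitionActivity, which is what keeps fluent chains type-safe. A minimal self-contained sketch of that covariant-return pattern, using stand-in classes rather than the real SDK types:

```java
// Stand-in sketch, not the real Azure SDK classes: it mimics how
// SynapseSparkJobDefinitionActivity overrides inherited with* setters with a
// covariant return type so a fluent chain keeps the most-derived type.
class Activity {
    private String name;

    public Activity withName(String name) {
        this.name = name;
        return this;
    }

    public String name() {
        return name;
    }
}

class SparkJobActivity extends Activity {
    private Object numExecutors;

    // Covariant return: SparkJobActivity instead of Activity, so chaining
    // can continue with subclass-specific setters.
    @Override
    public SparkJobActivity withName(String name) {
        super.withName(name);
        return this;
    }

    public SparkJobActivity withNumExecutors(Object numExecutors) {
        this.numExecutors = numExecutors;
        return this;
    }

    public Object numExecutors() {
        return numExecutors;
    }
}

class FluentSketch {
    public static void main(String[] args) {
        // Without the override, withName(...) would return Activity and the
        // call to withNumExecutors(...) below would not compile.
        SparkJobActivity a = new SparkJobActivity()
            .withName("mySparkJob")
            .withNumExecutors(2);
        System.out.println(a.name() + " / " + a.numExecutors());
    }
}
```

Without the overrides, a chain such as `withName(...).withSparkJob(...)` would fail to compile, because the inherited setter would return the base type.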
-
-
Method Details
-
withLinkedServiceName
public SynapseSparkJobDefinitionActivity withLinkedServiceName(LinkedServiceReference linkedServiceName)
Set the linkedServiceName property: Linked service reference.
- Overrides: withLinkedServiceName in class ExecutionActivity
- Parameters: linkedServiceName - the linkedServiceName value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
withPolicy
public SynapseSparkJobDefinitionActivity withPolicy(ActivityPolicy policy)
Set the policy property: Activity policy.
- Overrides: withPolicy in class ExecutionActivity
- Parameters: policy - the policy value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
withName
public SynapseSparkJobDefinitionActivity withName(String name)
Set the name property: Activity name.
- Overrides: withName in class ExecutionActivity
- Parameters: name - the name value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
withDescription
public SynapseSparkJobDefinitionActivity withDescription(String description)
Set the description property: Activity description.
- Overrides: withDescription in class ExecutionActivity
- Parameters: description - the description value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
withDependsOn
public SynapseSparkJobDefinitionActivity withDependsOn(List<ActivityDependency> dependsOn)
Set the dependsOn property: Activity depends on condition.
- Overrides: withDependsOn in class ExecutionActivity
- Parameters: dependsOn - the dependsOn value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
withUserProperties
public SynapseSparkJobDefinitionActivity withUserProperties(List<UserProperty> userProperties)
Set the userProperties property: Activity user properties.
- Overrides: withUserProperties in class ExecutionActivity
- Parameters: userProperties - the userProperties value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
sparkJob
public SynapseSparkJobReference sparkJob()
Get the sparkJob property: Synapse spark job reference.
- Returns: the sparkJob value.
-
withSparkJob
public SynapseSparkJobDefinitionActivity withSparkJob(SynapseSparkJobReference sparkJob)
Set the sparkJob property: Synapse spark job reference.
- Parameters: sparkJob - the sparkJob value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
arguments
public List<Object> arguments()
Get the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
- Returns: the arguments value.
-
withArguments
public SynapseSparkJobDefinitionActivity withArguments(List<Object> arguments)
Set the arguments property: User specified arguments to SynapseSparkJobDefinitionActivity.
- Parameters: arguments - the arguments value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
file
public Object file()
Get the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Returns: the file value.
-
withFile
public SynapseSparkJobDefinitionActivity withFile(Object file)
Set the file property: The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Parameters: file - the file value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
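Properties documented as "Type: string (or Expression with resultType string)", like file above, are declared as Object so they can carry either a literal value or a Data Factory expression evaluated at run time. A hedged sketch of both forms; the Map shape mirrors the JSON payload of an expression, and the file path and parameter name are hypothetical:

```java
import java.util.Map;

// Hedged sketch: properties documented as "Type: string (or Expression with
// resultType string)" are declared as Object so either form fits. The Map
// shape mirrors the JSON payload of a Data Factory expression; the file path
// and parameter name below are hypothetical examples.
class FileValueSketch {
    static Object literalFile() {
        // Plain string value, used as-is.
        return "abfss://jobs@mystorage.dfs.core.windows.net/main.jar";
    }

    static Object expressionFile() {
        // Expression resolved at pipeline run time.
        return Map.of("type", "Expression", "value", "@pipeline().parameters.mainFile");
    }

    public static void main(String[] args) {
        System.out.println(literalFile());
        System.out.println(expressionFile());
    }
}
```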
-
scanFolder
public Object scanFolder()
Get the scanFolder property: Scanning subfolders from the root folder of the main definition file; these files will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).
- Returns: the scanFolder value.
-
withScanFolder
public SynapseSparkJobDefinitionActivity withScanFolder(Object scanFolder)
Set the scanFolder property: Scanning subfolders from the root folder of the main definition file; these files will be added as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).
- Parameters: scanFolder - the scanFolder value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
className
public Object className()
Get the className property: The fully-qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Returns: the className value.
-
withClassName
public SynapseSparkJobDefinitionActivity withClassName(Object className)
Set the className property: The fully-qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Parameters: className - the className value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
files
public List<Object> files()
Get the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
- Returns: the files value.
-
withFiles
public SynapseSparkJobDefinitionActivity withFiles(List<Object> files)
Set the files property: (Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.
- Parameters: files - the files value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
pythonCodeReference
public List<Object> pythonCodeReference()
Get the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
- Returns: the pythonCodeReference value.
-
withPythonCodeReference
public SynapseSparkJobDefinitionActivity withPythonCodeReference(List<Object> pythonCodeReference)
Set the pythonCodeReference property: Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.
- Parameters: pythonCodeReference - the pythonCodeReference value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
filesV2
public List<Object> filesV2()
Get the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
- Returns: the filesV2 value.
-
withFilesV2
public SynapseSparkJobDefinitionActivity withFilesV2(List<Object> filesV2)
Set the filesV2 property: Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
- Parameters: filesV2 - the filesV2 value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
targetBigDataPool
public BigDataPoolParametrizationReference targetBigDataPool()
Get the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
- Returns: the targetBigDataPool value.
-
withTargetBigDataPool
public SynapseSparkJobDefinitionActivity withTargetBigDataPool(BigDataPoolParametrizationReference targetBigDataPool)
Set the targetBigDataPool property: The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.
- Parameters: targetBigDataPool - the targetBigDataPool value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
executorSize
public Object executorSize()
Get the executorSize property: Number of cores and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Returns: the executorSize value.
-
withExecutorSize
public SynapseSparkJobDefinitionActivity withExecutorSize(Object executorSize)
Set the executorSize property: Number of cores and memory to be used for executors allocated in the specified Spark pool for the job, which will be used for overriding 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Parameters: executorSize - the executorSize value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
conf
public Object conf()
Get the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
- Returns: the conf value.
-
withConf
public SynapseSparkJobDefinitionActivity withConf(Object conf)
Set the conf property: Spark configuration properties, which will override the 'conf' of the spark job definition you provide.
- Parameters: conf - the conf value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
driverSize
public Object driverSize()
Get the driverSize property: Number of cores and memory to be used for the driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Returns: the driverSize value.
-
withDriverSize
public SynapseSparkJobDefinitionActivity withDriverSize(Object driverSize)
Set the driverSize property: Number of cores and memory to be used for the driver allocated in the specified Spark pool for the job, which will be used for overriding 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).
- Parameters: driverSize - the driverSize value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
numExecutors
public Object numExecutors()
Get the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).
- Returns: the numExecutors value.
-
withNumExecutors
public SynapseSparkJobDefinitionActivity withNumExecutors(Object numExecutors)
Set the numExecutors property: Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).
- Parameters: numExecutors - the numExecutors value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
configurationType
public ConfigurationType configurationType()
Get the configurationType property: The type of the spark config.
- Returns: the configurationType value.
-
withConfigurationType
public SynapseSparkJobDefinitionActivity withConfigurationType(ConfigurationType configurationType)
Set the configurationType property: The type of the spark config.
- Parameters: configurationType - the configurationType value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
targetSparkConfiguration
public SparkConfigurationParametrizationReference targetSparkConfiguration()
Get the targetSparkConfiguration property: The spark configuration of the spark job.
- Returns: the targetSparkConfiguration value.
-
withTargetSparkConfiguration
public SynapseSparkJobDefinitionActivity withTargetSparkConfiguration(SparkConfigurationParametrizationReference targetSparkConfiguration)
Set the targetSparkConfiguration property: The spark configuration of the spark job.
- Parameters: targetSparkConfiguration - the targetSparkConfiguration value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
-
sparkConfig
public Map<String, Object> sparkConfig()
Get the sparkConfig property: Spark configuration property.
- Returns: the sparkConfig value.
-
withSparkConfig
public SynapseSparkJobDefinitionActivity withSparkConfig(Map<String, Object> sparkConfig)
Set the sparkConfig property: Spark configuration property.
- Parameters: sparkConfig - the sparkConfig value to set.
- Returns: the SynapseSparkJobDefinitionActivity object itself.
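A short sketch of building the Map<String, Object> that withSparkConfig accepts. The keys used here are standard Apache Spark settings chosen as examples, not values the SDK requires:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the Map<String, Object> accepted by withSparkConfig. The keys are
// standard Apache Spark settings used purely as examples.
class SparkConfigSketch {
    static Map<String, Object> sampleSparkConfig() {
        Map<String, Object> conf = new LinkedHashMap<>();
        conf.put("spark.executor.memory", "4g");
        conf.put("spark.sql.shuffle.partitions", "200");
        return conf;
    }

    public static void main(String[] args) {
        System.out.println(sampleSparkConfig());
    }
}
```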
-
validate
public void validate()
Validates the instance.
- Overrides: validate in class ExecutionActivity
- Throws: IllegalArgumentException - thrown if the instance is not valid.
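The contract above can be illustrated with a stand-in model whose validate() throws when a required property is missing. This imitates the behavior of the generated models and is not the SDK source; the message text is illustrative:

```java
// Stand-in sketch imitating the validate() contract of the generated models:
// required properties are checked and an IllegalArgumentException is thrown
// when one is missing. Not the SDK source; the message text is illustrative.
class ValidateSketch {
    static class SparkJobActivity {
        Object sparkJob; // required property in the real model

        SparkJobActivity withSparkJob(Object sparkJob) {
            this.sparkJob = sparkJob;
            return this;
        }

        void validate() {
            if (sparkJob == null) {
                throw new IllegalArgumentException(
                    "Missing required property sparkJob in model SynapseSparkJobDefinitionActivity");
            }
        }
    }

    public static void main(String[] args) {
        SparkJobActivity incomplete = new SparkJobActivity();
        try {
            incomplete.validate();
        } catch (IllegalArgumentException e) {
            System.out.println("caught: " + e.getMessage());
        }
        // A fully populated instance validates without throwing.
        new SparkJobActivity().withSparkJob(new Object()).validate();
    }
}
```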
-