Interface SynapseSparkJobDefinitionActivity

Execute spark job activity.

Properties

arguments?: any[]

User-specified arguments to SynapseSparkJobDefinitionActivity.

className?: any

The fully qualified identifier of the main class in the main definition file, which will override the 'className' of the spark job definition you provide. Type: string (or Expression with resultType string).
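Properties documented as "Type: string (or Expression with resultType string)" accept either a literal string or a runtime expression object. A minimal sketch of the two shapes, assuming the common `{ value, type: "Expression" }` expression convention used across Data Factory/Synapse activity payloads (the class and parameter names are illustrative):

```typescript
// Properties typed as "string (or Expression with resultType string)"
// accept either a plain string or an expression object.
type StringOrExpression = string | { value: string; type: "Expression" };

// Literal override of the main class name (value is illustrative).
const literalClassName = "com.contoso.SparkApp";

// Expression override, resolved from a pipeline parameter at run time.
const dynamicClassName = {
  value: "@pipeline().parameters.mainClassName",
  type: "Expression" as const,
};

// Both shapes satisfy the documented union.
const overrides: StringOrExpression[] = [literalClassName, dynamicClassName];
```

The same pattern applies to `driverSize`, `executorSize`, `file`, `scanFolder`, and the other `any`-typed scalar properties on this interface.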

conf?: any

Spark configuration properties, which will override the 'conf' of the spark job definition you provide.

configurationType?: string

The type of the spark config.

dependsOn?: ActivityDependency[]

Activity depends on condition.

description?: string

Activity description.

driverSize?: any

Number of cores and amount of memory to be used for the driver allocated in the specified Spark pool for the job, which will be used to override 'driverCores' and 'driverMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

executorSize?: any

Number of cores and amount of memory to be used for executors allocated in the specified Spark pool for the job, which will be used to override 'executorCores' and 'executorMemory' of the spark job definition you provide. Type: string (or Expression with resultType string).

file?: any

The main file used for the job, which will override the 'file' of the spark job definition you provide. Type: string (or Expression with resultType string).

files?: any[]

(Deprecated. Please use pythonCodeReference and filesV2) Additional files used for reference in the main definition file, which will override the 'files' of the spark job definition you provide.

filesV2?: any[]

Additional files used for reference in the main definition file, which will override the 'jars' and 'files' of the spark job definition you provide.
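Since `files` is deprecated, reference files are split by kind: Python files go in `pythonCodeReference` (overriding 'pyFiles') and jars plus other files go in `filesV2` (overriding 'jars' and 'files'). A hedged sketch of the replacement fields; the storage account and paths are placeholders, not values from this document:

```typescript
// Sketch of the non-deprecated reference-file overrides.
// Storage paths are illustrative placeholders.
const referenceOverrides = {
  // Overrides 'pyFiles' of the spark job definition.
  pythonCodeReference: [
    "abfss://code@contosostorage.dfs.core.windows.net/helpers/utils.py",
  ],
  // Overrides both 'jars' and 'files' of the spark job definition.
  filesV2: [
    "abfss://code@contosostorage.dfs.core.windows.net/libs/dependency.jar",
    "abfss://code@contosostorage.dfs.core.windows.net/conf/lookup.csv",
  ],
};
```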

linkedServiceName?: LinkedServiceReference

Linked service reference.

name: string

Activity name.

numExecutors?: any

Number of executors to launch for this job, which will override the 'numExecutors' of the spark job definition you provide. Type: integer (or Expression with resultType integer).

policy?: ActivityPolicy

Activity policy.

pythonCodeReference?: any[]

Additional python code files used for reference in the main definition file, which will override the 'pyFiles' of the spark job definition you provide.

scanFolder?: any

Scans subfolders from the root folder of the main definition file and adds those files as reference files. The folders named 'jars', 'pyFiles', 'files' or 'archives' will be scanned, and the folder names are case-sensitive. Type: boolean (or Expression with resultType boolean).

sparkConfig?: {
    [propertyName: string]: any;
}

Spark configuration property.

Type declaration

  • [propertyName: string]: any
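Because `sparkConfig` is declared with an index signature, any Spark property can be supplied by key. A small sketch (the settings shown are ordinary Spark configuration keys chosen for illustration, not values mandated by this interface):

```typescript
// sparkConfig accepts arbitrary property names; values are passed
// through to the Spark job as configuration settings.
const sparkConfig: { [propertyName: string]: any } = {
  "spark.executor.memory": "4g",
  "spark.dynamicAllocation.enabled": "true",
};
```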

sparkJob: SynapseSparkJobReference

Synapse spark job reference.

targetBigDataPool?: BigDataPoolParametrizationReference

The name of the big data pool which will be used to execute the spark batch job, which will override the 'targetBigDataPool' of the spark job definition you provide.

targetSparkConfiguration?: SparkConfigurationParametrizationReference

The spark configuration of the spark job.

type: "SparkJob"

Polymorphic discriminator, which specifies the different types this object can be.

userProperties?: UserProperty[]

Activity user properties.
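Putting the properties together, an activity payload might look like the sketch below. It uses a plain object literal rather than the SDK interface so the snippet is self-contained; property names follow the descriptions above, the reference shape and all values are illustrative assumptions, and every optional field simply overrides the corresponding setting of the referenced spark job definition:

```typescript
// Minimal illustrative payload; not an authoritative SDK example.
const activity = {
  name: "RunSparkJob",
  type: "SparkJob" as const, // polymorphic discriminator
  description: "Execute the nightly aggregation spark job.",
  // Reference to the spark job definition being executed
  // (reference shape is an assumption based on the REST API spec).
  sparkJob: {
    referenceName: "NightlyAggregation",
    type: "SparkJobDefinitionReference",
  },
  numExecutors: 2,        // overrides 'numExecutors' of the definition
  executorSize: "Medium", // overrides 'executorCores' / 'executorMemory'
  driverSize: "Small",    // overrides 'driverCores' / 'driverMemory'
  conf: { "spark.sql.shuffle.partitions": "64" }, // overrides 'conf'
};
```

Only `name` and the `type: "SparkJob"` discriminator are required by the interface; omitted optional properties fall back to the values in the spark job definition itself.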

Generated using TypeDoc