In some cases, dynamic values need to be passed to the Spark context so that they can be used during Spark processing. The most convenient way is to pass them as system properties using the -D option.
However, the Spark context is initiated differently in single-node and clustered environments, so the following approaches can be taken to pass the values to the Spark context.
In a single-node setup, provide the system properties when starting the DAS Carbon server.
Ex: bin/wso2server.sh -Dkey1=value1 -Dkey2=value2
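In a single-node setup the Spark context runs inside the Carbon JVM, so any code executed there can read the values directly. The following is a minimal Java sketch assuming the keys key1 and key2 from the example above; the class name SystemPropertyCheck is only a placeholder for illustration.

public class SystemPropertyCheck {
    public static void main(String[] args) {
        // When the JVM is started with -Dkey1=value1 -Dkey2=value2,
        // the values are visible through System.getProperty().
        String key1 = System.getProperty("key1");             // "value1"
        String key2 = System.getProperty("key2", "fallback"); // "value2", or "fallback" if unset
        System.out.println("key1=" + key1 + ", key2=" + key2);
    }
}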
In a distributed environment, Spark is spawned as a separate JVM, so properties provided to the Carbon server will not be passed to the Spark JVM. Hence you need to pass them to Spark explicitly. To do that, open
<DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf and add the following property:
spark.executor.extraJavaOptions -Dkey1=value1 -Dkey2=value2
spark-defaults.conf is used to provide external properties and override the default properties of the Spark context. With the above entry, the system properties are passed to every executor spawned by the parent JVM.
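To verify that the values actually reach the executor JVMs, you can read them from code that runs inside an executor task. The sketch below uses the plain Spark Java API rather than a DAS analytics script, purely for illustration; the class name ExecutorPropertyCheck and the key1 property are assumptions carried over from the example above.

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class ExecutorPropertyCheck {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("ExecutorPropertyCheck");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // The map function runs inside the executor JVMs. Because
        // spark.executor.extraJavaOptions carries -Dkey1=value1,
        // the property is visible there via System.getProperty().
        sc.parallelize(Arrays.asList(1, 2, 3))
          .map(i -> "key1 on executor = " + System.getProperty("key1"))
          .collect()
          .forEach(System.out::println);

        sc.close();
    }
}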
Also note that these configurations work on other DAS-flavored servers such as APIM Analytics, EI Analytics, etc.