Saturday, June 10, 2017

Class.forName in Apache Spark 1.6.1 map

I have a Spark 1.6.1 project in Scala 2.10 with two modules that are not dependent on each other at compile time. The first module initiates the Spark driver application. In the first module, inside one of the rdd.map{} operations, I am trying to load a class by reflection with Class.forName("second.module.function.MapOperation").
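
To make this concrete, here is roughly what the driver code and the map look like. This is a simplified sketch: the process method, the no-arg constructor on MapOperation, and the sample input are illustrative assumptions, not the exact project code.

import org.apache.spark.{SparkConf, SparkContext}

object ReflectiveMapJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ReflectiveMapJob"))

    val result = sc.parallelize(Seq("a", "b", "c")).map { value =>
      // Load the second module's class by reflection inside the map task.
      // Assumes MapOperation has a no-arg constructor and a process(String) method.
      val clazz = Class.forName("second.module.function.MapOperation")
      val instance = clazz.newInstance()
      val method = clazz.getMethod("process", classOf[String])
      method.invoke(instance, value).toString
    }

    result.collect().foreach(println)
    sc.stop()
  }
}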

My spark-submit command includes both jars: the first module's jar as the primary application jar and the second module's jar in the --jars option.
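
For reference, the submit command is along these lines; the jar names, main class, and master URL are placeholders, not the real ones from the project:

spark-submit \
  --class first.module.ReflectiveMapJob \
  --master yarn \
  --jars second-module.jar \
  first-module.jar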

This code runs fine locally from IntelliJ, but on the cluster it fails with ClassNotFoundException for second.module.function.MapOperation. It also fails with ClassNotFoundException in functional test cases when I test the same class.

Is there an issue with classloaders when using Class.forName inside a Spark job/operation?
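
To illustrate what I mean by a classloader issue, the lookup could also go through the thread context classloader instead of the defining classloader that Class.forName uses by default. This is only a sketch of the alternative I am wondering about, not code that is currently in the project:

// Inside the rdd.map{} task: resolve the class via the executor's
// context classloader rather than the caller's defining classloader.
val loader = Thread.currentThread().getContextClassLoader
val clazz = Class.forName("second.module.function.MapOperation", true, loader)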




