1.2.21 can't infer type of Spark's JavaSparkContext



Trying to use Kotlin with Spark, and I can't get the compiler to accept the following:

val sc = JavaSparkContext(sparkContext)

var initialRDD = sc.newAPIHadoopRDD(configuration,
    MyInputFormat::class.java,
    LongWritable::class.java,
    MyWritable::class.java)

The My* classes look like the following (written in Java):

public class MyWritable extends MyAbstractWritable {

and

public class MyInputFormat extends FileInputFormat<LongWritable, MyWritable> {

Here is the error:

e: RawLog.kt: (50, 16): Type inference failed: Cannot infer type parameter V in fun <K : Any!, V : Any!, F : InputFormat<K!, V!>!> newAPIHadoopRDD(p0: Configuration!, p1: Class<F!>!, p2: Class<K!>!, p3: Class<V!>!): JavaPairRDD<K!, V!>!
None of the following substitutions
(Configuration!,Class<InputFormat<K!, V!>!>!,Class<LongWritable!>!,Class<MyWritable<*>!>!)
(Configuration!,Class<InputFormat<K!, V!>!>!,Class<LongWritable!>!,Class<MyWritable<out Message!>!>!)
(Configuration!,Class<InputFormat<K!, V!>!>!,Class<LongWritable!>!,Class<Nothing!>!)
can be applied to
(Configuration!,Class<MyInputFormat<*>>,Class<LongWritable>,Class<MyWritable<*>>)

I was able to trick the compiler into accepting it by casting the class type (MyInputFormat::class.java has type Class&lt;MyInputFormat&lt;*&gt;&gt;, and the star projection seems to be what keeps the compiler from inferring a concrete V):

MyInputFormat::class.java as Class<InputFormat<LongWritable, MyWritable<*>>>,
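For completeness, here is a sketch of the full call with that cast applied, reusing sc and configuration from the snippet above and assuming MyWritable takes a single type parameter (as the error output suggests). The @Suppress annotation is only there to silence the unchecked-cast warning:

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.io.LongWritable
import org.apache.hadoop.mapreduce.InputFormat
import org.apache.spark.api.java.JavaSparkContext

// Workaround: cast fClass up to the InputFormat type the Scala signature
// expects, so K and V can be inferred as LongWritable and MyWritable<*>.
@Suppress("UNCHECKED_CAST")
val initialRDD = sc.newAPIHadoopRDD(
    configuration,
    MyInputFormat::class.java as Class<InputFormat<LongWritable, MyWritable<*>>>,
    LongWritable::class.java,
    MyWritable::class.java)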

Here is the newAPIHadoopRDD signature from JavaSparkContext, which is written in Scala:

def newAPIHadoopRDD[K, V, F <: NewInputFormat[K, V]](
    conf: Configuration,
    fClass: Class[F],
    kClass: Class[K],
    vClass: Class[V]): JavaPairRDD[K, V] = {
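A slightly tidier variant of the same trick (my own sketch, not anything from Spark) is to hide the unchecked cast in a small hypothetical helper so call sites stay readable:

import org.apache.hadoop.mapreduce.InputFormat

// Hypothetical helper: performs the unchecked cast once, so each call site
// states the K/V types it wants instead of repeating the full cast expression.
@Suppress("UNCHECKED_CAST")
fun <K, V> Class<out InputFormat<*, *>>.asInputFormatClass(): Class<InputFormat<K, V>> =
    this as Class<InputFormat<K, V>>

With that, the fClass argument would read:

MyInputFormat::class.java.asInputFormatClass<LongWritable, MyWritable<*>>()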