Type inference for star-projected typealiases

I noticed the following behaviour with typealiases:

typealias ListValuePair<T> = Pair<List<T>, T>
// Error: does not compile, as it should, because true is not an Int
val coso: ListValuePair<Int> = Pair(listOf(1, 2, 3), true)
// No error: this compiles
val coso2: ListValuePair<*> = Pair(listOf(1, 2, 3), true)

Allowing type inference for star-projected typealiases would be particularly useful for functions with vararg parameters:

fun doStuff(vararg states: ListValuePair<*>) { /* ... */ }

fun main() {
    // This should be allowed
    doStuff(Pair(listOf(1, 2, 3), 2), Pair(listOf(true, false), false))
    // This should not be allowed
    doStuff(Pair(listOf(1, 2, 3), false), Pair(listOf(true, false), 2))
}

Why shouldn’t the second one be allowed? It’s a valid vararg of ListValuePair<Any>.
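For example (my own illustration, reusing the typealias from above), both pairs in that second call already fit ListValuePair<Any>, because Pair and List are covariant in their type parameters:

// Both arguments of the "should not be allowed" call typecheck as ListValuePair<Any>:
val a: ListValuePair<Any> = Pair(listOf(1, 2, 3), false)  // Pair<List<Int>, Boolean> <: Pair<List<Any>, Any>
val b: ListValuePair<Any> = Pair(listOf(true, false), 2)  // Pair<List<Boolean>, Int> <: Pair<List<Any>, Any>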

ListValuePair<T> is defined as Pair<List<T>, T>, so the type T should match: the list elements should have the same type as the second member of the pair.

This is in fact its normal behaviour if you specify the type (e.g. ListValuePair<Int>), but it allows the types to differ if you use ListValuePair<*>.

true and 2 are both of type Any.

I wonder how I could write all of that and not realize I had a brain fart :disappointed:

Here’s an example that actually works (to an extent):

interface EmptyInterface<T>
typealias InterfaceValuePair<T> = Pair<EmptyInterface<in T>, T>

In this case InterfaceValuePair does expect the second value to match EmptyInterface's type parameter; however, this check only happens when calling the InterfaceValuePair constructor, not the Pair constructor (even though the function expects an InterfaceValuePair).
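A minimal sketch of that difference (IntInterface and demo are names made up for the example, and the error/compiles behaviour is what I observed rather than anything guaranteed by the spec):

interface EmptyInterface<T>
typealias InterfaceValuePair<T> = Pair<EmptyInterface<in T>, T>

class IntInterface : EmptyInterface<Int>

fun doStuff(vararg states: InterfaceValuePair<*>) { /* ... */ }

fun demo() {
    // Using the typealias as a constructor forces T to be consistent:
    doStuff(InterfaceValuePair(IntInterface(), 1))       // OK, T = Int
    //doStuff(InterfaceValuePair(IntInterface(), true))  // error: no T fits both Int and Boolean
    // Calling the underlying Pair constructor sidesteps the check:
    doStuff(Pair(IntInterface(), true))                  // still compiles
}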

This might be a little too hacky though, as I can still cast it to EmptyInterface<Any>, or produce the interface from a method that uses generics and have inference widen it (like mutableListOf does in the previous example), so whatever, my bad for trying to do hacky stuff.
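Roughly the loopholes I mean, continuing the sketch above (anyInterface is another made-up helper, standing in for "a method that uses generics"):

fun <T> anyInterface(): EmptyInterface<T> = object : EmptyInterface<T> {}

fun loopholes() {
    // 1. An unchecked cast widens the interface, so the mismatched pair typechecks again:
    @Suppress("UNCHECKED_CAST")
    val widened = IntInterface() as EmptyInterface<Any>
    doStuff(InterfaceValuePair(widened, true))          // compiles
    // 2. A generic producer lets inference pick whatever T fits the second value:
    doStuff(InterfaceValuePair(anyInterface(), true))   // also compiles
}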