Type is determined as not-null even if a safe call is present

I was using a third-party Java library. At some point they added @Nullable and @NotNull annotations. Unfortunately, some methods (illustrated as Legacy::nullString in the example below) were marked as @NotNull but could in fact return null. I had safe calls to protect against such cases, but they didn’t work, and I got a NullPointerException. The problem was that I didn’t specify the type of the B::x field explicitly, and its type was automatically inferred as not-null.

Since the Kotlin compiler concluded that the B::x field could not be null, it simply ignored all safe calls that followed (x?.length), which resulted in a crash. I think this is wrong behavior, and that the type of B::x should be String?, not String. This would introduce an additional null check and could affect performance, but as long as warnings are displayed, I think that should be the responsibility of the author of the code.

class Legacy {
    // Property initializers run top to bottom: getNull() executes before
    // `uninitialized` is assigned, so nullString holds null at runtime
    // despite its not-null String type.
    val nullString: String = getNull()
    val uninitialized: String = "unused"
    
    fun getNull(): String {
        return uninitialized
    }
}

class B {
    // The type of x is inferred as not-null, so the safe call is considered unnecessary.
    val x
        get() = Legacy().nullString?.plus("0")
}

fun main(args: Array<String>) {
    print(B().x?.length)  // java.lang.NullPointerException
}

I don’t think this is the right way to solve this. If you have to use a project with wrong @NotNull and @Nullable annotations, you should report it. Weakening Kotlin’s type inference just to handle this case would make a lot of code worse without gaining much. If you have to deal with something like this, I suggest declaring the type explicitly:
val x: String? get() = ...
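Applied to the original example, the workaround looks like this (a minimal sketch: with the explicit String? type, the safe calls are honored and the program prints null instead of crashing):

```kotlin
class Legacy {
    // Same trick as in the question: nullString's initializer runs before
    // `uninitialized` is assigned, so it holds null at runtime.
    val nullString: String = getNull()
    val uninitialized: String = "unused"

    fun getNull(): String {
        return uninitialized
    }
}

class B {
    // Explicit nullable type: the compiler no longer assumes x is not-null.
    val x: String?
        get() = Legacy().nullString?.plus("0")
}

fun main() {
    println(B().x?.length)  // prints "null" — no NullPointerException
}
```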

That’s what I ended up with: declaring the type explicitly. However, it makes me uncomfortable that the safe call operator is simply ignored. Are there any cases in which it would be beneficial for the safe call operator to have a not-null return type?