AI and deep learning define the future of programming: will Kotlin fly or die?

@darksnake - thanks for the reply.

  • If Python can be a wrapper for native code, so can Kotlin. Where Python implements things in pure Python, Kotlin can be as efficient or more so.

  • The challenge with ND4J and deeplearning4j is simply that they aren’t near the top of the market right now, to say the least; TensorFlow and PyTorch are. It sounds like we want to marry the top JVM platform, which Kotlin aims to be, with those two top AI/DL platforms.

  • Thus, for kotlin-jvm: do we really want to depend on deeplearning4j? Unless it can be made a winner, maybe not. What are the alternatives? Spark ML is in wide use, and the Spark ecosystem is certainly strong for distributed computing; it would be worth finding out how good it is as a local compute library.
    It would also be good to check how it compares to TensorFlow and PyTorch, but it won’t show up as a competitor among deep learning frameworks, since Spark DL uses TensorFlow under the hood.

  • kmath and other JVM math libs will improve, evolve, and gain support as needed. Can/should we learn from what Spark ML did for math, and how it did it? Quote from the guide:

    MLlib uses the linear algebra package Breeze, which depends on netlib-java for optimised numerical processing. If native libraries are not available at runtime, you will see a warning message and a pure JVM implementation will be used instead.

  • Slack - yes, joined thanks.

My aim with kmath is currently not to provide performance, but to build a comfortable multiplatform API with basic implementations that could later be supplemented by optimized platform-specific implementations. I am not sure that performance is that important in ML. In my experience, people with a Python and C++ background just accept the statement that the libraries they use are fast (usually they rather *can* be fast under certain conditions) and never check it. What ML people actually want is a convenient ecosystem. Currently we lack visualization tools and notebook scripting, but we are working on it.
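The "basic implementation now, optimized backend later" idea can be sketched roughly like this (illustrative names only, not actual kmath API):

```kotlin
// Sketch (hypothetical API): a common interface with a pure-Kotlin
// fallback implementation that an optimized, platform-specific
// backend could later replace without changing user code.
interface VectorOps {
    fun dot(a: DoubleArray, b: DoubleArray): Double
}

// Plain JVM fallback: correct everywhere, not tuned for speed.
object JvmFallbackOps : VectorOps {
    override fun dot(a: DoubleArray, b: DoubleArray): Double {
        require(a.size == b.size) { "size mismatch" }
        var s = 0.0
        for (i in a.indices) s += a[i] * b[i]
        return s
    }
}

fun main() {
    // Later, a native-backed implementation could be swapped in here.
    val ops: VectorOps = JvmFallbackOps
    println(ops.dot(doubleArrayOf(1.0, 2.0), doubleArrayOf(3.0, 4.0)))  // 11.0
}
```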


The points listed above make sense.
But there is another one: support for unsigned ints (uint8 being the most important case) and float16.
The current Kotlin numerical libs I’m familiar with don’t support them (obviously, because the JVM has no built-ins for them), but it’s common to operate on these types in DL.

UByte is an experimental type.

Oh, I missed that. By the way, why doesn’t it inherit from Number?
Operations on unsigned ints represented as ints (keeping the byte representation) in theory don’t break the operation logic, only the interpretation of the bits, so it’s not a big problem.
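The "same bits, different interpretation" point can be sketched like this (illustrative code, not from any particular library):

```kotlin
// Sketch: storing unsigned 8-bit values in signed JVM bytes.
// The bit pattern is preserved; masking on widening recovers
// the unsigned value, and addition wraps identically for both
// signed and unsigned interpretations.

fun ubyteToInt(b: Byte): Int = b.toInt() and 0xFF

fun main() {
    val raw: Byte = -1           // bit pattern 0xFF
    println(ubyteToInt(raw))     // prints 255

    val a: Byte = 200.toByte()
    val b: Byte = 100.toByte()
    val sum = ((a.toInt() and 0xFF) + (b.toInt() and 0xFF)) and 0xFF
    println(sum)                 // prints 44 (300 mod 256)
}
```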

But what about float16 - what is the workaround?

Inline classes are still an experimental feature, and their current implementation does not allow inheritance from classes, only interfaces. That said, Number is pretty much useless in any case.

There is no simple way to represent float16 on the JVM, since it has no bytecode operations for it. You could of course create a class representing a float16 and either simulate it using a normal float plus some conversion, or implement all its operations directly, but it would most likely have poor performance.
But why would you want float16? I guess for memory optimization, if you know it’s precise enough; but since this doesn’t pay off on the JVM, it’s quite unnecessary. Not sure about Kotlin/Native, but I don’t really see the point.
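The "normal float plus some conversion" workaround mentioned above can be sketched as follows. This is an illustrative decoder for IEEE 754 binary16 bit patterns stored in a `Short`; arithmetic would then be done in `Float` and (with a matching encoder, omitted here) converted back:

```kotlin
// Sketch: decode IEEE 754 binary16 bits (stored in a Short)
// into a Float. Handles normals, zeros/subnormals, infinities
// and NaN. Half-precision: 1 sign bit, 5 exponent bits (bias 15),
// 10 mantissa bits; Float has bias 127, hence the +112 below.
fun halfBitsToFloat(bits: Short): Float {
    val h = bits.toInt() and 0xFFFF
    val sign = (h ushr 15) and 0x1
    val exp = (h ushr 10) and 0x1F
    val mant = h and 0x3FF
    return when (exp) {
        0 -> {
            // Zero or subnormal: value = mant * 2^-24.
            val v = mant * (1.0f / (1 shl 24))
            if (sign == 1) -v else v
        }
        0x1F -> if (mant == 0) {
            if (sign == 1) Float.NEGATIVE_INFINITY else Float.POSITIVE_INFINITY
        } else Float.NaN
        else ->
            // Normal number: rebias exponent, widen mantissa 10 -> 23 bits.
            Float.fromBits((sign shl 31) or ((exp + 112) shl 23) or (mant shl 13))
    }
}

fun main() {
    println(halfBitsToFloat(0x3C00.toShort()))  // prints 1.0
    println(halfBitsToFloat(0xC000.toShort()))  // prints -2.0
}
```

As the post says, every operation going through such a conversion layer is far slower than native float math, which is why this is mostly interesting for storage, not compute.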