AI technologies in general, and deep learning specifically, are becoming mainstream, and many if not most cloud, embedded, and handheld applications are expected to provide some AI-based behavior in the near future. There is an opportunity for Kotlin to become the language of choice for such applications and, in doing so, become a mainstream language alongside the AI technologies.
Kotlin is an amazing programming language, and its target, the JVM, has maintained its place as one of the major execution environments for software. However, programming as we know it is changing with the rising adoption of AI technologies, specifically deep learning based approaches. This is an area where the JVM is not strong and cannot compete well due to a lack of library support. How can Kotlin benefit from AI technologies becoming part of mainstream software development, and how can Kotlin make this transition easier for software teams?
Deep learning, the dominant branch of AI techniques, can be divided into two phases: i) training deep learning based AI models and ii) running the pretrained models in the final execution environment. For training deep learning models, people predominantly use Python because of the extensive library support Python provides for machine learning, combined with the ability to write complex distributed applications in Python (something required for training models in the cloud). Examples of what makes Python stand out are the natively implemented machine learning libraries with Python bindings, including TensorFlow, NumPy, SciPy, Pandas, scikit-learn, and OpenCV. Building and training deep learning models will mostly be a feature of back-end systems or corporate data pipelines. Data scientists experiment with the models on their desktops, but the final models are trained in the cloud. Training the models requires efficient numerical libraries that allow compiling the models to GPUs or other specialized hardware. TensorFlow is the most widely used of these deep learning libraries, and it currently only fully supports Python for training. It's likely that most cloud, mobile, and IoT applications will require a live backend system collecting data and training deep learning models on that data, either continuously or in batches.
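To make the "training" phase concrete, here is a deliberately tiny sketch of what training means: iteratively adjusting model parameters to fit data via gradient descent. This is a toy stand-in written in plain Python; real systems delegate exactly this kind of loop to libraries like TensorFlow, which compile it to GPUs. All names here are illustrative, not from any real framework.

```python
# Toy illustration of the "training" phase: fit y = w*x + b to data
# with gradient descent on mean squared error. Real training delegates
# this loop to libraries like TensorFlow and runs it on GPUs.

def train(data, lr=0.01, epochs=2000):
    w, b = 0.0, 0.0
    n = len(data)
    for _ in range(epochs):
        # gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in data) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data generated from y = 3x + 1; training should recover w ≈ 3, b ≈ 1.
data = [(x, 3 * x + 1) for x in range(10)]
w, b = train(data)
print(round(w, 2), round(b, 2))
```

A production system does the same thing at vastly larger scale: more parameters, more data, and hardware-accelerated gradient computation.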
The second part of deep learning based AI is the actual use of the trained models for predictions and intelligent decision making. This can mean speech recognition on a handheld device or movie recommendations in a cloud app, steering a self-driving car based on a video stream, or running a chat bot in an iOS, Android, or web app. These models will be everywhere. Executing the models is again a numerically intensive task and requires special-purpose libraries that compile the models to special-purpose hardware such as GPUs. Because the models are trained with a specific framework (e.g. TensorFlow), executing them requires that the execution system support the same framework; otherwise the workflow becomes painful for developers. The execution environment does not need full support for the framework, only support for running the pre-trained models. Typically models are trained in Python, but model execution can occur in Python, C/C++, or other languages.
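The key point above, that execution is separable from training, follows from what a trained model actually is: frozen numbers plus arithmetic. The sketch below shows inference for a single-neuron classifier in plain Python; the weights are made-up values standing in for what a training run would produce, and the same computation could be written identically in C/C++ or Kotlin.

```python
# Toy illustration of the "inference" phase: a pretrained model is just
# frozen parameters plus arithmetic, so it can run in any environment
# with numeric support. The weights below are illustrative, not trained.

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Parameters that a training run would have produced (made-up values).
WEIGHTS = [0.8, -1.5, 0.3]
BIAS = 0.1

def predict(features):
    """Single-neuron classifier: weighted sum, bias, sigmoid activation."""
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return sigmoid(z)

score = predict([1.0, 0.5, 2.0])
print(score)  # a probability between 0 and 1
```

Real models chain millions of such operations, which is why execution still needs hardware-accelerated numeric libraries, but the principle is the same: no training machinery is required at prediction time.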
How does this relate to Kotlin? Right now the JVM is not widely used for deep learning, so Kotlin cannot currently become a widely used language for building AI based systems. Because AI based systems will find their way into most everyday software applications, the lack of JVM support for deep learning is a disadvantage for Kotlin as a language and an obstacle to its wider adoption several years from now. At the same time, however, there is an emerging opportunity for Kotlin. Right now Python is the main language people use for developing deep learning models and deep learning based applications. There is no competition. There is no elegant, strongly typed language in the data science space. Granted, data scientists like dynamic languages, but playing around with deep learning models on your desktop is a different thing from developing distributed production systems, where dynamic typing becomes a disadvantage. I would much prefer Kotlin over Python as a language for such systems.
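To illustrate the dynamic-typing disadvantage mentioned above: in Python, a type mismatch hides until the faulty code path actually executes, which in a distributed production system may be long after deployment. The function below is a hypothetical example, not from any real codebase; a statically typed language such as Kotlin would reject the bad call at compile time.

```python
# Illustration of the dynamic-typing risk: this bug surfaces only at
# runtime, when the bad code path executes, not at compile time.

def scale_reward(reward, factor):
    return reward * factor

# Fine with numbers...
ok = scale_reward(2.0, 3)        # 6.0

# ...but a string slips through unchecked until this line runs.
try:
    scale_reward(2.0, "3")       # TypeError raised here, at runtime
    caught = False
except TypeError:
    caught = True
print(ok, caught)
```

In Kotlin, `fun scaleReward(reward: Double, factor: Double)` simply would not compile against a `String` argument, which is exactly the safety argument for strongly typed production systems.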
With this in mind, how could Kotlin become the language that bridges AI and deep learning with modern strongly typed programming? The easiest way would probably be to make Kotlin target Python source code or Python bytecode. That way the machine learning libraries would not need to be ported to the JVM or some other target, something that is likely either not to happen at all or to lag behind. The second step would be having Kotlin compile to native code (via LLVM) that does not use a garbage collector, or that allows disabling the garbage collector for certain real-time parts of the code. This becomes important when executing deep learning models on embedded platforms for real-time tasks. You don't want your drone's balancing model to stop executing to purge memory (a friend of mine tried this, and it didn't end well for the drone). You don't want the next generation of Siri to have pauses in its speech generation, or your self-driving car to pause steering while it clears unused objects. Python is not a suitable target language for these real-time applications, and most people use C/C++ or Objective-C in those cases. For this reason, deep learning frameworks like TensorFlow provide C APIs for executing pretrained models in embedded environments.
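The "no garbage collection inside real-time sections" idea can be sketched even in Python, whose standard `gc` module allows disabling the cyclic collector around a latency-critical region and collecting afterwards. The control loop here is a made-up stand-in for something like a drone's balancing routine; this is an analogy for the proposed Kotlin capability, not a real-time guarantee (Python still reference-counts, and this only defers cycle collection).

```python
# Sketch of "no GC pauses inside the real-time section" using Python's
# own gc module as an analogy: disable cyclic collection around a
# latency-critical loop, then re-enable and collect in one go afterwards.

import gc

def run_control_loop(steps):
    """Stand-in for a real-time loop (e.g. a drone balancing routine)."""
    state = 0.0
    for i in range(steps):
        state = 0.9 * state + 0.1 * i  # stand-in for model execution
    return state

was_enabled = gc.isenabled()
gc.disable()                 # no cyclic-collection pauses inside the loop
try:
    result = run_control_loop(1000)
finally:
    if was_enabled:
        gc.enable()          # restore normal behavior
    gc.collect()             # clean up any deferred garbage at once
print(gc.isenabled() == was_enabled, round(result, 2))
```

A native (LLVM) Kotlin with this kind of control, but with hard guarantees rather than a best-effort analogy, is exactly what the embedded real-time use cases above would need.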
If Kotlin targeted both Python (source code or bytecode) and native code (LLVM), it would have the opportunity to become the strongly typed language that everyone prefers for developing AI based production software, covering both the training of AI models in the cloud and the execution of those models in the cloud or on embedded platforms.