Recently there was a good article about the State of Loom. It explains why this approach is superior to alternatives like C#’s async/await, Rx, or Kotlin’s coroutines. The advantage is that the programming model is still based on threads, but these lightweight threads are decoupled from native OS threads, so you don’t have to worry about the cost of a thread. The API is therefore more or less the same as with threads as we know them today. The advantage is not only the well-known programming model, but also that many existing applications written with ordinary threads can benefit from asynchronous processing. It is a non-invasive programming model, while Rx and coroutines are invasive.
In another topic about Loom in this forum it was argued that coroutines could be bolted on top of Loom, but from my point of view that would only make sense for already existing (legacy) coroutine-based applications. As soon as Java has lightweight threads, there is no benefit in using coroutines.
A possible exception to my thesis could be cross-platform development, but I cannot say much about that.
I don’t want to say that coroutines in Kotlin are bad; I actually like and use them. But I think the future is Loom (whenever that arrives).
Coroutines have two things going for them in Kotlin that Loom doesn’t, and can’t, cover:
- Multiplatform async support: Loom only enables lightweight threading on the JVM, while JS and Native aren’t capable of taking advantage of it.
- Non-async use cases: Loom only enables lightweight threads and futures/async computations on top of them, not all of the other capabilities that coroutines enable like sequence builders, logic-interleaved DSLs, and other context-oriented composable effects.
I agree with the potential usefulness in multi-platform development, as I’ve already written.
But your point about the other use cases is a bit like comparing apples to oranges. Coroutines themselves are pretty low-level, and other libraries, like Flow, can be built on top of them. The same is true for lightweight threads. It should be even easier to build libraries and DSLs on them, because they are not invasive and thus less limiting for API design.
Lightweight threads don’t allow for arbitrary control of the continuation directly, so there are “libraries” that you can build on coroutines that you can’t build on Loom, no matter how hard you try; it’s a fundamental limitation of the fact that they’re bundled as executable tasks rather than as open suspendables. It’s the same kind of issue as internal vs. external iteration, a.k.a. why you can’t implement Sequence.asIterator purely in terms of Sequence.forEach without the overhead of an intermediate data structure.
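To make the internal vs. external iteration point concrete, here is a small Kotlin sketch (the names are mine, not from the thread): a coroutine iterator builder hands out values one pull at a time, while inverting a forEach-style (internal) traversal into an Iterator without suspension forces you to buffer the whole run first.

// External iteration via the coroutine builder: each yield suspends the producer
// until the caller asks for the next value.
fun countdown(from: Int): Iterator<Int> = iterator {
    var i = from
    while (i > 0) yield(i--)
}

// Internal iteration: the producer drives the loop and only exposes a callback.
fun countdownForEach(from: Int, consume: (Int) -> Unit) {
    var i = from
    while (i > 0) consume(i--)
}

fun main() {
    val pulled = countdown(3)
    println(pulled.next())   // 3 -- only one loop step has executed so far

    // Without suspension, the only way to turn countdownForEach into an Iterator is to
    // run it to completion into an intermediate collection first.
    val buffered = buildList { countdownForEach(3) { add(it) } }.iterator()
    println(buffered.next()) // 3 -- but the entire traversal already ran
}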
Thanks for the link – it’s super interesting. The transition from “fibers”, with their own API, to “virtual threads” that are managed with the existing Thread API, is really nice.
I’ve been doing multithreaded programming for a long time, though, and I think a lot of people are forgetting how difficult it is, and failing to appreciate how much more difficult it will be when you have millions of threads.
I still think that Kotlin coroutines built on virtual threads will be a much nicer way to code… unless JetBrains messes it up
All of those things can be implemented using virtual threads with special schedulers – which let you control the continuation directly.
From “State of Loom, Part 1”:
Virtual threads are preemptive, not cooperative — they do not have an explicit await operation at scheduling (task-switching) points. Rather, they are preempted when they block on I/O or synchronization.
This means they can’t be used for things like delimited control or builders. When I said “arbitrary”, I meant “arbitrary”.
Well, it just doesn’t. In what way do you think delimited control or builders require protection from preemption? There are things along those lines that would be required to exactly mimic Kotlin’s behaviour, but like Kotlin’s other structured concurrency mechanisms, those can be implemented with special schedulers, as I said. Using a virtual thread bound to a special scheduler, you can implement Kotlin’s native Continuations.
Also, I noticed that Project Loom has a Continuation class, which they said may become public API at some point. (It’s one-shot, delimited, and multi-prompt - and I know what the first two of those mean.)
Prove to me that you can implement a sequence builder like the one in Kotlin’s standard library in pure Java using the facilities of Loom. Lack of lambdas-with-receiver and trailing lambda syntax aside, you should be able to achieve basically the same API surface. To make it concrete, I should be able to write the following:
final Stream<String> fizzBuzz = SequenceBuilder.<String>build(cxt -> {
    for (int x = 0;; x++) {
        boolean is3 = x % 3 == 0;
        boolean is5 = x % 5 == 0;
        if (is3 && is5) cxt.yield("FizzBuzz");
        else if (is3) cxt.yield("Fizz");
        else if (is5) cxt.yield("Buzz");
        else cxt.yield("" + x);
    }
}).asStream();
and have it produce a fully concurrency-safe lazy infinite stream.
I’m not sure what mtimmerm had in mind, but I think it depends on what API they give us. I’m not sure they know yet what all the API will be. If the Continuation is public, then it’s possible. Also, part 2 of State of Loom had an example with “channels”. I think that could do it. Kotlin-ish pseudocode…
val channel = Channel<Int>()   // hypothetical channel type from Loom
Thread.startVirtualThread {
    var n = 0
    while (true) {
        channel.send(n)        // blocks (parks the virtual thread) until the value is taken
        n += 1
    }
}
val iter = object : Iterator<Int> {
    override fun hasNext() = true
    override fun next(): Int = channel.receive()
}
for (n in iter) { ... }
Okay, and what if I have print statements in the builder closure? Will those be executed lazily, or will the virtual thread forge ahead and use the channel as a buffer, thus defeating the purpose?
With the channels I’ve seen (like Clojure’s core.async) that is configurable. You set the buffer size to zero and it would do what you want.
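As a hedged aside in kotlinx.coroutines terms (not something from the thread, and requiring the kotlinx-coroutines library): a zero-capacity (“rendezvous”) channel has exactly this behaviour. send suspends until a receiver is ready to take the value, so the channel itself never buffers anything.

import kotlinx.coroutines.*
import kotlinx.coroutines.channels.Channel

fun main() = runBlocking {
    val channel = Channel<Int>(Channel.RENDEZVOUS)   // capacity 0: no buffering at all
    val producer = launch {
        var n = 0
        while (true) {
            channel.send(n)      // suspends here until the consumer receives
            n++
        }
    }
    repeat(3) { println(channel.receive()) }   // prints 0, 1, 2
    producer.cancel()                          // stop the infinite producer
}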
I’ve also seen videos on YouTube about Loom where they were talking about generators, which are exactly the API you’re describing. One video was from 2018, so I don’t know if generators are in the early-access API yet, but it sounds like they intend to provide stuff like that.
That’s fair; I didn’t know they intended to provide generators too.
That still doesn’t cover some other things that full exposure of delimited continuations can offer. Even the fact that a Continuation class is exposed isn’t enough, because there’s no way to reify it at an arbitrary point. Basically I’m asking for the ability to use shift and reset (by whatever name), which you can do (within a controlled scope) in Kotlin.
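For what it’s worth, here is a minimal Kotlin sketch (names are mine) of what “within a controlled scope” looks like: suspendCoroutine reifies the rest of the computation at an arbitrary suspension point as a first-class Continuation, which can be stored and resumed later. The controlled scope is the coroutine started with startCoroutine.

import kotlin.coroutines.*

fun main() {
    var stashed: Continuation<Int>? = null

    val block: suspend () -> Unit = {
        println("before capture")
        // Capture the rest of this block as a continuation and suspend.
        val x = suspendCoroutine<Int> { cont -> stashed = cont }
        println("resumed with $x")
    }

    // Start the coroutine; it runs until the capture point and then suspends.
    block.startCoroutine(Continuation(EmptyCoroutineContext) { result ->
        println("coroutine completed: $result")
    })

    println("suspended; resuming now")
    stashed?.resume(42)   // run the captured rest-of-the-computation
}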
Proving things to you doesn’t sound like a good use of my time. But the basic strategy here is that the sequence builder runs the lambda with a special scheduler so that:
- yield stores the output and sets a flag to indicate that it is yielding, and then parks the virtual thread.
- At that point, control returns to the scheduler that is running it, which returns the value out to the stream.
- When the stream pulls another value, the virtual thread is unparked.
- The unpark operation sends the vthread’s VirtualThreadTask to the scheduler for execution.
- The scheduler runs it until it blocks again.
The scheduler that runs the virtual thread can distinguish between a block due to yield and a blocking IO call, because in the latter case, the “yield flag” isn’t set. When another kind of block occurs, the scheduler blocks its own thread until it’s called back by the system to continue the virtual thread execution.
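To make that strategy concrete, here is a hedged sketch (in Kotlin rather than the pure Java asked for above, and with a SynchronousQueue rendezvous standing in for the custom scheduler): put() parks the producing virtual thread until the consumer calls take(), which gives the park/unpark handoff described in the list. It assumes a JDK with virtual threads (Thread.ofVirtual()), omits the Stream wrapping, and unlike the scheduler version the producer runs at most one value ahead between pulls; the class and parameter names are mine.

import java.util.concurrent.SynchronousQueue

class VirtualThreadGenerator<T>(private val body: (yieldValue: (T) -> Unit) -> Unit) : Iterator<T> {
    private val handoff = SynchronousQueue<T>()
    private var started = false            // sketch only: single consumer assumed

    override fun hasNext() = true          // sketch only: assumes an infinite generator

    override fun next(): T {
        if (!started) {                    // start the producer lazily, on the first pull
            started = true
            Thread.ofVirtual().start {
                body { value -> handoff.put(value) }   // parks here until take()
            }
        }
        return handoff.take()              // unparks the producer for exactly one more value
    }
}

fun main() {
    val fizzBuzz = VirtualThreadGenerator<String> { yieldValue ->
        var x = 0
        while (true) {
            val is3 = x % 3 == 0
            val is5 = x % 5 == 0
            yieldValue(
                when {
                    is3 && is5 -> "FizzBuzz"
                    is3 -> "Fizz"
                    is5 -> "Buzz"
                    else -> x.toString()
                }
            )
            x++
        }
    }
    repeat(16) { println(fizzBuzz.next()) }
}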
There is a discussion about this article in a Kotlin Slack. I think the general description is good, but the author confuses the asynchronous programming model with threading, which leads him to some not quite correct conclusions.