I had a need for lots of concurrent HTTP requests and felt the pain of not having an easy-to-use concurrency model for coroutines. So I made a Concurrently utility. The usage looks like this:
The common case of just "run a bunch of side effects together, but I need to await all of them before the next part of the code runs":
await Concurrently.fire_all([some_coroutine, another_coroutine])
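For context, the coroutine names in these examples are just placeholders: any function that awaits something qualifies. A minimal hypothetical some_coroutine, assuming the script lives on a Node that's in the scene tree, might look like:
# Hypothetical side-effect coroutine: waits a second without blocking the frame
func some_coroutine() -> void:
	await get_tree().create_timer(1.0).timeout
	print("some_coroutine finished")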
Another common case: "run these concurrently, but I want all the results back":
var fired := await Concurrently.fire_all([some_coroutine, another_coroutine])
# Order is preserved from the array passed in to .fire_all()
var some_coroutine_result := fired[0].result
var another_coroutine_result := fired[1].result
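And a hypothetical shape for another_coroutine, one that actually returns a value for .result to hold (same Node assumption as above):
# Hypothetical coroutine that returns a value once its await completes
func another_coroutine() -> int:
	await get_tree().create_timer(0.5).timeout
	return 42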
Also common: "I need to run coroutines together whose arguments come from the scope I'm in". Here we can use lambdas:
await Concurrently.fire_all(
	[
		func(): return await some_coroutine(local_var_1, local_var_2),
		func(): return await other_coroutine(local_var_3)
	]
)
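If the arguments are already known when you build the array, Callable.bind() should work as an alternative to wrapping in a lambda, since it pre-fills the arguments on the Callable itself. Roughly equivalent to the above:
await Concurrently.fire_all([
	some_coroutine.bind(local_var_1, local_var_2),
	other_coroutine.bind(local_var_3)
])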
And finally, a rarer but VERY helpful case: "I want to fire them off concurrently, but I want to await their results at different times". Instead of .fire_all(), we can use individual Concurrently instances. This also covers the even rarer but useful case of "I want to fire them off at different times too."
# something that can run on the side and we want to fire as eagerly as possible
var concurrent_1 := Concurrently.new(some_independent_coroutine)

# regular ol' function here to get some data that's needed for the next step
var required_data := some_immediately_needed_func()

var concurrent_2 := Concurrently.new(
	func(): return await some_dependent_coroutine(required_data)
)

# could continue to do other work as necessary in this function, then when needed, do:
await concurrent_2.is_done()
print(concurrent_2.result)

# could also await this other one right away if needed, or continue working first, then:
await concurrent_1.is_done()
print(concurrent_1.result)

# note that the .is_done() method also returns the result, as a convenience
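Because is_done() returns the result, those last two lines can also collapse into one:
print(await concurrent_1.is_done())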
If this is helpful for you and the API surface is to your liking, then to use this in your own projects just create a script file of your choosing (probably concurrently.gd), and here's the (surprisingly lightweight) code:
class_name Concurrently extends RefCounted

signal _done_signal

var _is_complete: bool = false
var coroutine: Callable
var result: Variant


# Runs the coroutine, stores its return value, and signals completion.
func fire() -> void:
	result = await coroutine.call()
	_is_complete = true
	_done_signal.emit()


# Fires eagerly: fire() is deliberately not awaited, so the coroutine starts
# running as soon as the instance is created and _init returns immediately.
func _init(incoming_coroutine: Callable) -> void:
	coroutine = incoming_coroutine
	fire()


# Awaitable; resolves once the coroutine has finished, and returns its result.
func is_done() -> Variant:
	if not _is_complete:
		await _done_signal
	return result


# Starts every Callable immediately, then awaits them all.
# The returned array preserves the order of the input array.
static func fire_all(funcs: Array[Callable]) -> Array:
	var fired: Array[Concurrently]
	for this_func: Callable in funcs:
		var concurrent_func: Concurrently = Concurrently.new(this_func)
		fired.append(concurrent_func)
	for item: Concurrently in fired:
		await item.is_done()
	return fired
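For a concrete picture of the HTTP use case that motivated this, here's a rough sketch (not part of the utility). It assumes the script sits on a Node that's already in the scene tree, and _fetch, _fetch_all, and the URLs are made up for illustration:
# Hypothetical helper: wraps one HTTPRequest node as an awaitable coroutine.
func _fetch(url: String) -> PackedByteArray:
	var http := HTTPRequest.new()
	add_child(http)
	var err := http.request(url)
	if err != OK:
		push_error("Couldn't start request for %s" % url)
		http.queue_free()
		return PackedByteArray()
	# Awaiting a multi-argument signal yields an Array of its arguments:
	# [result, response_code, headers, body]
	var response: Array = await http.request_completed
	http.queue_free()
	return response[3]


# Fire several downloads at once and collect the bodies in order.
func _fetch_all() -> void:
	var fired := await Concurrently.fire_all([
		func(): return await _fetch("https://example.com/a.json"),
		func(): return await _fetch("https://example.com/b.json")
	])
	print(fired[0].result)
	print(fired[1].result)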