The discussion around function color often misses the distinction between the typical await-based approach and the other solutions.
For example, consider a library that implements the C preprocessor. In its first version it exposes a single function that takes a string to be processed, applies the C preprocessing algorithm to it, and returns the preprocessed string:
c_preprocessor_v1(body: string) -> string
The C preprocessor has #include directives, so it might need to (recursively) open additional files. Instead of making assumptions about what the include path is, or even about the existence of a filesystem, the designer of c_preprocessor decided, in v2, to delegate file opening to a separate function [1]:
c_preprocessor_v2(file: path, loader: path -> string) -> string
c_preprocessor_v2 calls loader incrementally as it discovers new #include directives, possibly in the output of loader itself.
_v1 can of course be implemented in terms of _v2, given a default loader definition.
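To make the shape of the two entry points concrete, here is a minimal Python sketch (all names are invented, only the quoted form of #include is handled, and there are no include guards; it is only meant to show the two signatures and the default-loader trick):

    import re
    from typing import Callable

    def c_preprocessor_v2(path: str, loader: Callable[[str], str]) -> str:
        out = []
        for line in loader(path).splitlines():
            m = re.match(r'\s*#include\s+"(.+)"\s*$', line)
            if m:
                # recursively preprocess the included file through the same loader
                out.append(c_preprocessor_v2(m.group(1), loader))
            else:
                out.append(line)
        return "\n".join(out)

    def c_preprocessor_v1(body: str) -> str:
        # _v1 in terms of _v2: one possible default loader serves the initial
        # body under a synthetic name and falls back to the filesystem for includes
        def default_loader(path: str) -> str:
            if path == "<input>":
                return body
            with open(path) as f:
                return f.read()
        return c_preprocessor_v2("<input>", default_loader)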
Now you want to implement a preprocessor-as-a-service: a rich API that lets the user submit an initial file and lets the service ask the user for additional files on demand. And of course you want to use the c_preprocessor library. You expect your service to serve hundreds of thousands of concurrent requests, so you want to make it async; in particular, you want the loading to be async.
If you are using JS, I believe you are screwed: you can't use the library as is. c_preprocessor_v2 and the async loader live in separate worlds: red (async) functions can call blue (sync) functions, but not vice versa, so you need to ask the maintainer for a new async c_preprocessor_v3 that takes an async loader.
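The mismatch is easy to see in code. The snippet below is Python rather than JS, just to stay consistent with the later sketches, but the rule it runs into is the same one JS enforces (with the difference, discussed next, that JS does not even let you block):

    import asyncio

    async def async_loader(path: str) -> str:
        await asyncio.sleep(0)   # stand-in for asking the remote client for the file
        return f"// contents of {path}\n"

    # Inside the sync body of c_preprocessor_v2, loader(path) now evaluates to a
    # coroutine object rather than a str, and a non-async function has no way to
    # await it. The fix has to come from the library: a new entry point that is
    # itself async and awaits its loader, e.g.
    #
    #   async def c_preprocessor_v3(path, loader):
    #       body = await loader(path)
    #       ...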
In some other languages (Rust, C#, Python) you can wrap your async loader in a wrapper that blocks (in a way, closing over the async-ness of the function), but this is hardly ideal: the resulting call to c_preprocessor_v2 would not be async and would prevent you from scaling to hundreds of thousands of requests. You might play around with offloading to thread pools, but as the bulk of each request's work happens inside the c_preprocessor function, every in-flight request still pins a thread and it is never going to work well. In practice your blue functions can call red functions, but the resulting function is blue.
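In Python, for instance, the blocking workaround and the thread-pool variant look roughly like this (a sketch reusing the names from the snippets above; in a real service the loader would more likely hand its coroutine back to the main event loop via asyncio.run_coroutine_threadsafe, but the scaling picture is the same):

    import asyncio

    def blocking_loader(path: str) -> str:
        # "close over the async-ness": drive the async loader to completion,
        # blocking the current thread (must not be called from a thread that
        # is already running an event loop)
        return asyncio.run(async_loader(path))

    async def handle_request(path: str) -> str:
        # offloading to the default thread pool keeps the event loop responsive,
        # but every in-flight request still occupies an OS thread for the whole
        # preprocessing run, most of it spent waiting on the loader
        return await asyncio.to_thread(c_preprocessor_v2, path, blocking_loader)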
There is a third class of languages that allow you to combine blue and red functions, producing red ones (Go, Lua, Scheme, and I believe this new Zig proposal); in these languages the caller can sandwich calls to sync functions between async ones, while still being able to suspend the whole call stack.
One disadvantage of this third class is that, since side effects are usually unrestricted, if c_preprocessor relies on hidden global state it might not handle reentrancy correctly.
There is then a fourth class of languages where not only are side effects always explicit (Haskell, some effectful programming languages), but it is possible, and indeed idiomatic, to abstract over them. So c_preprocessor_v2 might not only be able to call synchronous or asynchronous loaders transparently; the idiomatic implementation might even extract additional concurrency by not imposing dependencies unless necessary. One interpretation is that in these languages functions are always red, but I think that's reductive and not useful.
[1] This example uses higher-order functions, but an OOP formulation would of course be completely equivalent.