I have a program running on low-spec hardware that uses coroutines, all of which write to a single socket. How can the code that owns the socket know when the buffered data should actually be sent, given that more coroutines may still write at any point in time?
This feels similar to the halting problem to me, even though the contexts are different: at any given moment I cannot know in advance whether another coroutine is still going to write.
Does anyone know how one could find an optimal (or at least standard) solution to this problem?
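To make the scenario concrete, here is a minimal sketch of one pattern I have been considering, written with Python's asyncio since no particular language was specified (the host, port, message format, and all names here are placeholders, not a real implementation): the N producer coroutines never touch the socket directly; they put complete, framed messages on a bounded queue, and a single writer coroutine owns the socket and decides when each message is flushed.

```python
import asyncio


async def producer(queue: asyncio.Queue, ident: int) -> None:
    # Producers enqueue whole messages instead of writing to the
    # socket directly, so partial writes can never interleave.
    for i in range(3):
        msg = f"producer {ident}: message {i}\n".encode()
        await queue.put(msg)          # blocks if the queue is full (backpressure)
        await asyncio.sleep(0.01)     # simulate work between writes


async def socket_writer(queue: asyncio.Queue, writer: asyncio.StreamWriter) -> None:
    # The single owner of the socket: only this coroutine sends bytes,
    # so "when should data be sent" has one answer: whenever a complete
    # message arrives on the queue.
    while True:
        msg = await queue.get()
        if msg is None:               # sentinel: all producers are done
            break
        writer.write(msg)
        await writer.drain()          # flush; waits if OS buffers are full


async def main() -> None:
    # Placeholder endpoint for illustration only.
    reader, writer = await asyncio.open_connection("127.0.0.1", 9000)

    # Bounded queue keeps memory use predictable on low-spec hardware.
    queue: asyncio.Queue = asyncio.Queue(maxsize=64)

    producers = [asyncio.create_task(producer(queue, n)) for n in range(4)]
    writer_task = asyncio.create_task(socket_writer(queue, writer))

    await asyncio.gather(*producers)
    await queue.put(None)             # tell the writer no more data is coming
    await writer_task

    writer.close()
    await writer.wait_closed()


asyncio.run(main())
```

The idea behind this sketch is that the socket never needs to "know" anything: completeness is decided at the message level (each queue item is one framed unit), and termination is signaled explicitly with a sentinel rather than guessed at, which is what seems to sidestep the halting-problem flavor of the question. Is this single-writer/queue approach the standard one, or is there something better suited to constrained hardware?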