1

A program running on low-spec hardware uses coroutines that all write to a single socket. How does the socket know when the data should be sent, given that more coroutines could be writing at any given time?

I find this problem similar to the halting problem, even though the contexts are different.

Does anyone know how one could find an optimal solution to this problem?

jeffbRTC
  • 111
  • 2

1 Answer

0

I ended up using a buffer limit and a queue: coroutines push their data onto a queue, and a single writer drains it into a buffer that gets flushed to the socket once it reaches the limit.

I know this may not answer the question fully, but it's the best solution I could find.
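A minimal sketch of that approach in Python's asyncio, assuming a hypothetical `BUFFER_LIMIT` threshold and using a list in place of the real socket. Coroutines never touch the socket directly; they put chunks on a bounded queue, and one writer task owns the socket, flushing whenever the buffer reaches the limit:

```python
import asyncio

BUFFER_LIMIT = 64  # assumed flush threshold, in bytes

sent = []  # stands in for the real socket; each entry is one "send" call


async def writer(queue: asyncio.Queue) -> None:
    """Single task that owns the socket: drains the queue and flushes in batches."""
    buf = bytearray()
    while True:
        chunk = await queue.get()
        if chunk is None:  # sentinel: all producers are done
            break
        buf += chunk
        if len(buf) >= BUFFER_LIMIT:
            sent.append(bytes(buf))  # socket.sendall(buf) in real code
            buf.clear()
    if buf:
        sent.append(bytes(buf))  # flush whatever remains


async def producer(queue: asyncio.Queue, i: int) -> None:
    # put() suspends when the queue is full, so a slow socket
    # back-pressures the writing coroutines instead of growing memory
    await queue.put(f"coroutine-{i} payload;".encode())


async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue(maxsize=8)  # bounded = back-pressure
    w = asyncio.create_task(writer(queue))
    await asyncio.gather(*(producer(queue, i) for i in range(10)))
    await queue.put(None)  # signal completion
    await w


asyncio.run(main())
```

The bounded queue answers the "N coroutines at any given time" concern: writers block when the queue is full, and the buffer limit decides when the socket actually sends, so send timing no longer depends on how many coroutines exist.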

jeffbRTC
  • 111
  • 2