I am trying to solve an issue with uploads to our web infrastructure.
When a user uploads media to our site, it is proxied (via our Web Proxy tier) to a Java backend with a limited number of threads. When a user has a slow connection or a large upload, this holds one of the Java threads open for a long period of time, reducing overall capacity.
To mitigate this I'd like to implement an 'upload proxy' which will accept the entire HTTP POST data of the upload, and only once it has received all of the data will it proxy that POST to the Java backend quickly, pushing the problem of the upload thread being held open onto an HTTP proxy.
Initially I found Apache Traffic Server has a 'buffer_upload' plugin, but it seems a bit bleeding-edge and has no support for regex in URLs, although it would solve most of my issues.
Does anyone know a proxy product that would be able to do what I am suggesting (aside from Apache Traffic Server)?
I see that Nginx has fairly detailed buffer settings for proxying, but it doesn't seem (from docs/explanations) to wait for the whole POST before opening a backend connection/thread. Do I have this right?
Cheers,
Tim
Actually, nginx always buffers the request body before opening a connection to the backend. It is response buffering that can be turned off, either globally with the proxy_buffering directive or per response by having the backend send an X-Accel-Buffering header.
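For illustration, a minimal configuration along these lines would do what you describe; the upstream host/port, the /upload path and the size limits below are placeholders, not values from your setup:

    # Sketch of an nginx upload-buffering proxy (hostnames, paths and
    # sizes are illustrative placeholders).
    http {
        client_max_body_size    100m;  # largest upload nginx will accept
        client_body_buffer_size 1m;    # bodies above this spill to a temp file on disk

        server {
            listen 80;

            location /upload {
                # nginx reads the full request body (into memory or a temp
                # file) before it opens the upstream connection, so a slow
                # client no longer holds a Java thread.
                proxy_pass http://java-backend:8080;

                # Optional: stream responses back without buffering them;
                # request buffering is unaffected by this directive.
                proxy_buffering off;
            }
        }
    }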