For various reasons, I need to act as an intermediary between an HTTP request and a file on disk. My approach has been to populate the headers and then call readfile('/path/to/file.jpg');
Now, everything works fine, except that even a medium-sized image comes back very slowly.
Can anyone provide me with a more efficient way of streaming the file to the client once the headers have been sent?
Note: it's a Linux box in a shared hosting environment, if that matters.
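For illustration, a minimal sketch of the current approach (the path and Content-Type are example values):

$path = '/path/to/file.jpg';

// Send the headers, then hand the whole file to readfile()
header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
readfile($path);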
Several web servers allow an external script to tell them to do exactly this. X-Sendfile on Apache (with mod_xsendfile) is one.
In a nutshell, all you send is headers; the special X-Sendfile header instructs the web server to send the named file as the body of the response.
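A minimal sketch, assuming mod_xsendfile is installed and configured to permit the file's directory:

// With mod_xsendfile enabled, PHP emits only headers; Apache
// streams the file itself, outside the PHP process.
header('Content-Type: image/jpeg');
header('X-Sendfile: /path/to/file.jpg');
// Send no body: the server substitutes the named file.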
You could start by implementing conditional GET support.
Send a "Last-Modified" header with the file and reply with "304 Not Modified" whenever the client requests it with "If-Modified-Since" and you see that the file has not changed. Some sensible freshness information (via "Cache-Control" / "Expires" headers) is also advisable, to prevent repeated requests for an unchanged resource in the first place.
This way at least the perceived performance improves, even if you find there is nothing you can do about the actual transfer speed.
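A minimal sketch of that conditional GET handling (the path and max-age are example values; error handling omitted):

$path  = '/path/to/file.jpg';
$mtime = filemtime($path);

// Advertise the modification time plus some freshness information
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');
header('Cache-Control: max-age=86400');

// If the client's cached copy is still current, answer 304 with no body
if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) &&
    strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $mtime) {
    header('HTTP/1.1 304 Not Modified');
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
readfile($path);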
This should actually be fairly fast; we have done this with large images without a problem. Are you doing anything else before outputting the image that might be slowing the process down, such as calculating some metadata on the image?
Edit: You may need to flush the output and use fread(), i.e.:
$fp = fopen($strFile, "rb");

// Buffered download: stream the file in 16 KB chunks
while (!feof($fp)) {
    print(fread($fp, 1024 * 16));
    ob_flush(); // empty PHP's output buffer first...
    flush();    // ...then push the chunk out to the client
}

fclose($fp);
http://us3.php.net/manual/en/function.fpassthru.php#5500 - Jeff Davis 2009-06-16 17:41
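The linked manual note concerns fpassthru(), which streams everything from the current file position to EOF in one call; a minimal sketch:

// fpassthru() dumps the rest of the open handle straight to
// output, avoiding a hand-written read loop.
$fp = fopen('/path/to/file.jpg', 'rb');
fpassthru($fp);
fclose($fp);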
Basically, you want to build a server... that's not trivial.
There is a very promising PHP-based server project: Nanoweb.
It's free and fully extensible.