Tuesday, November 15, 2016

Upload large files to S3 using Laravel 5

This is the code he used (slightly redacted):

$disk = Storage::disk('s3');
$disk->put($targetFile, file_get_contents($sourceFile));

This is a good way to go about it for small files. Note, however, that file_get_contents will load the entire file into memory before sending it to S3, which can be problematic for large files.
If you want to upload big files you should use streams. Here’s the code to do it:

$disk = Storage::disk('s3');
$disk->put($targetFile, fopen($sourceFile, 'r')); // read-only mode is enough for an upload

PHP will need only a few MB of RAM, even if you upload a file that is several GB in size.
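To see why streaming keeps memory flat, here is a standalone sketch (independent of Laravel and S3; the temp-file setup is invented purely for the demonstration). It writes about 20 MB to a file in chunks, copies it through a stream, and then checks that peak memory stayed far below the file size:

```php
<?php
// Hypothetical standalone demo: stream a ~20 MB file and verify that
// peak memory usage stays small. File names are temporary and made up.

$source = tempnam(sys_get_temp_dir(), 'src');
$target = tempnam(sys_get_temp_dir(), 'dst');

// Write 20 MB to the source file, 1 MB at a time.
$out = fopen($source, 'wb');
for ($i = 0; $i < 20; $i++) {
    fwrite($out, str_repeat('x', 1024 * 1024));
}
fclose($out);

// Stream the copy: only a small buffer is held in memory at any moment.
$in  = fopen($source, 'rb');
$dst = fopen($target, 'wb');
stream_copy_to_stream($in, $dst);
fclose($in);
fclose($dst);

echo filesize($target), "\n";  // 20971520 (20 MB)
echo memory_get_peak_usage(true) < 50 * 1024 * 1024
    ? "low memory\n"
    : "high memory\n";          // low memory
```

This is conceptually similar to what happens when you hand put() a resource instead of a string: the data moves through a fixed-size buffer rather than being materialized in full.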
You can also use streams to download a file from S3 to the local file system:

$disk = Storage::disk('s3');
$stream = $disk->getDriver()
               ->readStream($sourceFileOnS3);
// stream_get_contents() would read the whole file into memory first,
// defeating the point of streaming; copy stream-to-stream instead.
$target = fopen($targetFile, 'ab'); // 'ab' keeps the original FILE_APPEND behavior
stream_copy_to_stream($stream, $target);
fclose($stream);
fclose($target);

You can even use streams to copy a file from one disk to another without ever touching the local filesystem:

$stream = Storage::disk('s3')->getDriver()
                             ->readStream($sourceFile);
Storage::disk('sftp')->put($targetFile, $stream);
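One caveat worth adding, as a hedged sketch in the same Laravel context: readStream() hands back a raw PHP resource, and put() is not guaranteed to close it for you, so it is safest to close it yourself once the copy finishes:

```php
$stream = Storage::disk('s3')->getDriver()
                             ->readStream($sourceFile);

Storage::disk('sftp')->put($targetFile, $stream);

// readStream() can return false on failure in some Flysystem versions,
// so guard before closing.
if (is_resource($stream)) {
    fclose($stream);
}
```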
