Is there a way to upload multiple files in one go, without having to reconnect for each one?
I'm using S3 as storage for my PHP application, which needs to store large numbers (100 at a time) of mostly small (~10 KB) image files. Currently I loop over them and upload each file individually with:
$s3->putObjectFile($uploadFile, $bucketName, ($uploadFile), S3::ACL_PUBLIC_READ)
This takes a very long time: about a minute for 1.5 MB worth of files. Turning off SSL, as suggested in other answers, cuts that to roughly 40 seconds, but it's still slow.
Here is my current code, using the Amazon S3 REST implementation for PHP:
$s3 = new S3($awsAccessKey, $awsSecretKey, false);

function send_to_s3($s3, $bucketName, $uploadFile)
{
    $start = microtime(true);

    // Check if our upload file exists
    if (!file_exists($uploadFile) || !is_file($uploadFile))
        exit("\nERROR: No such file: $uploadFile\n\n");

    // Check for CURL
    if (!extension_loaded('curl') && !@dl(PHP_SHLIB_SUFFIX == 'so' ? 'curl.so' : 'php_curl.dll'))
        exit("\nERROR: CURL extension not loaded\n\n");

    if ($s3->putObjectFile($uploadFile, $bucketName, $uploadFile, S3::ACL_PUBLIC_READ))
    {
        $end = microtime(true);
        $took = $end - $start;
        echo "S3::putObjectFile(): File copied to {$bucketName}/{$uploadFile}" . PHP_EOL . ' - ' . filesize($uploadFile) . ' in ' . $took . ' seconds<br />';
        return $took;
    }
    else
    {
        print 'error';
    }
}
Any help is appreciated.
use Aws\S3\S3Client;
use Guzzle\Service\Exception\CommandTransferException;

$commands = array();
foreach ($objects as $key => $file) {
    $objParams = array(
        'ACL'    => 'bucket-owner-full-control',
        'Bucket' => 'bucket_name',
        'Key'    => 's3_path',
        'Body'   => $file['body'],
    );
    $commands[] = $clientS3->getCommand('PutObject', $objParams);
}

try {
    // Sends all queued commands in parallel (SDK v2)
    $results = $clientS3->execute($commands);
} catch (CommandTransferException $e) {
    $succeeded = $e->getSuccessfulCommands();
    echo "Failed Commands:\n";
    foreach ($e->getFailedCommands() as $failedCommand) {
        echo $e->getExceptionForFailedCommand($failedCommand)->getMessage() . "\n";
    }
}
I think what you need is to execute the commands in parallel, as described in the documentation here: http://docs.aws.amazon.com/aws-sdk-php/v2/guide/feature-commands.html#executing-commands-in-parallel
Update: just noticed Jeremy's comment. My mistake!
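Since SDK v2 is long deprecated, here is a sketch of the same parallel idea with v3's `Aws\CommandPool`. The bucket name, region, and the shape of `$objects` (entries with `key` and `body`) are assumptions carried over from the snippet above; note that in v3, `CommandPool::batch()` does not throw on partial failure but returns an array of results and exceptions in input order:

```php
use Aws\CommandPool;
use Aws\S3\S3Client;
use Aws\Exception\AwsException;

// Sketch only: region and bucket name are assumptions, adjust to your setup.
$clientS3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$commands = array();
foreach ($objects as $file) {
    $commands[] = $clientS3->getCommand('PutObject', array(
        'Bucket' => 'bucket_name',   // assumption
        'Key'    => $file['key'],
        'Body'   => $file['body'],
    ));
}

// Runs the commands concurrently over a shared connection pool;
// each entry in $results is either a Result or an exception.
$results = CommandPool::batch($clientS3, $commands, ['concurrency' => 25]);

foreach ($results as $i => $result) {
    if ($result instanceof AwsException) {
        echo "Upload {$i} failed: " . $result->getMessage() . "\n";
    }
}
```

Because the pool reuses connections, this avoids the per-file reconnect overhead the question describes; `concurrency` caps how many requests are in flight at once.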
I would suggest uploading the images with a multipart upload.
The code below is a basic example using version 3 of the AWS SDK for PHP.
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$uploader = new MultipartUploader($s3Client, '/path/to/large/file.zip', [
    'bucket' => 'your-bucket',
    'key'    => 'my-file.zip',
]);

try {
    $uploader->upload();
    echo "Upload complete.\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
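One caveat: multipart uploads only help for large files, since every part except the last must be at least 5 MB, so they won't speed up the ~10 KB images in the question. For files where they do apply, v3's `MultipartUploader` also accepts `concurrency` and `part_size` options to upload the parts of one file in parallel; a sketch, with the numeric values below as assumed examples:

```php
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

$uploader = new MultipartUploader($s3Client, '/path/to/large/file.zip', [
    'bucket'      => 'your-bucket',
    'key'         => 'my-file.zip',
    'concurrency' => 10,              // up to 10 parts in flight at once (assumed value)
    'part_size'   => 5 * 1024 * 1024, // 5 MB, the S3 minimum part size
]);

try {
    $result = $uploader->upload();
    echo "Uploaded to {$result['ObjectURL']}\n";
} catch (MultipartUploadException $e) {
    echo $e->getMessage() . "\n";
}
```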