I have an ongoing problem: my script is running out of memory.
I need the script to loop through every customer in the database, fetch all of their product data, and generate a text file. Each customer can have anywhere from 1 to 100,000 products.
I fetch the product data in batches of 1,000 and write them to the file, to try to stop the script from timing out. That has improved things a lot, but I still have problems with customers who have a large number of products; customers with more than 5,000 products seem to be affected.
The script appears to stop writing to the file after the 5th batch (5,000 products). The browser just hangs as if it were still generating the file, but the product count in the file never increases.
Can anyone help?
set_time_limit(0);

$db = new msSqlConnect('db');

$select = "SELECT customer FROM feeds ";
$run = mssql_query($select);

while ($row = mssql_fetch_array($run)) {
    $arg = $row['customer'];
    $txt_file = 'shopkeeper/' . $arg . '.txt';
    $generated = generateFeed($db, $arg, $txt_file);
    if ($generated) {
        $update = "UPDATE feeds SET lastGenerated = '$generated' WHERE customer = '$arg' ";
        mssql_query($update);
    }
}

function generateFeed($db, $customer, $file)
{
    // if the file already exists, delete it so a new one can be written
    if (file_exists($file)) {
        unlink($file);
    }

    $datafeed_separator = "|";

    // get product details
    $productsObj = new Products($db, $customer);

    // find out how many products the customer has
    $countProds = $productsObj->countProducts();
    $productBatchLimit = 1000;

    // create new file
    $fh = fopen($file, 'a');

    $counter = 1;
    for ($i = 0; $i < $countProds; $i += $productBatchLimit) {
        $txt = '';
        $limit = $productBatchLimit * $counter;
        $products = $productsObj->getProducts($i, $limit);
        foreach ($products as $product) {
            $txt .=
                $prod_name . $datafeed_separator .
                $prod_brand . $datafeed_separator .
                $prod_desc . $datafeed_separator .
                $prod_price . $datafeed_separator . "\n";
        }
        fwrite($fh, $txt);
        flush();
        $counter++;
    }
    fclose($fh);

    $endTime = date('Y-m-d H:i:s');
    return $endTime;
}
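As a side note on the batching in the code above: `getProducts($i, $limit)` is called with `$limit = $productBatchLimit * $counter`, which grows on every pass (1000, 2000, 3000, …), so each batch fetches more rows than the last. If the second argument is a row count, a fixed-size page is probably what was intended. A minimal, standalone sketch of fixed-size paging (with `array_slice` standing in for the hypothetical database call; `fetchInBatches` is an illustrative name, not part of the asker's `Products` class):

```php
<?php
// Sketch of fixed-size batch paging. array_slice stands in for a
// database fetch that takes an offset and a maximum row count.
function fetchInBatches(array $allProducts, int $batchSize): array
{
    $batches = [];
    $total = count($allProducts);
    for ($offset = 0; $offset < $total; $offset += $batchSize) {
        // Each pass fetches at most $batchSize rows starting at $offset,
        // so memory per batch stays constant regardless of batch number.
        $batches[] = array_slice($allProducts, $offset, $batchSize);
    }
    return $batches;
}

$batches = fetchInBatches(range(1, 2500), 1000);
// 2500 products paged in batches of 1000 -> 3 batches: 1000, 1000, 500
```

The key point is that the page size is the constant `$batchSize`; only the offset advances.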
One thing I can see that might help your memory usage: if you move the `fwrite()` inside the foreach loop, `$txt` also gets released on every pass through the loop. It would look like this:
foreach ($products as $product) {
    $txt =
        $prod_name . $datafeed_separator .
        $prod_brand . $datafeed_separator .
        $prod_desc . $datafeed_separator .
        $prod_price . $datafeed_separator . "\n";
    fwrite($fh, $txt);
}
This keeps `$txt` from growing large when a customer has a lot of products.
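To make the idea concrete, here is a small self-contained sketch of that streaming pattern. It writes to an in-memory `php://temp` stream instead of a real feed file, and the product array with its field names (`name`, `brand`, etc.) is invented for illustration; in the real script these values would come from each fetched batch:

```php
<?php
// Streaming write: build and write one line per product instead of
// accumulating the whole batch in $txt.
$separator = '|';
$fh = fopen('php://temp', 'w+'); // in-memory stream, for illustration only

// Stand-in for one fetched batch; field names are assumed.
$products = [
    ['name' => 'Widget', 'brand' => 'Acme', 'desc' => 'A widget', 'price' => '9.99'],
    ['name' => 'Gadget', 'brand' => 'Acme', 'desc' => 'A gadget', 'price' => '19.99'],
];

foreach ($products as $product) {
    // $txt only ever holds a single line, so peak memory stays flat
    // no matter how many products are in the batch.
    $txt = $product['name'] . $separator .
           $product['brand'] . $separator .
           $product['desc'] . $separator .
           $product['price'] . $separator . "\n";
    fwrite($fh, $txt);
}

rewind($fh);
$contents = stream_get_contents($fh);
fclose($fh);
echo $contents;
```

With this shape, memory use is bounded by the size of one line rather than one batch.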