I am trying to read 738,627 records from a flat file into MySQL. The script appears to run fine, but gives me the memory error above: a PHP "Allowed memory size exhausted" error. Is this just bad code, or should I simply increase the memory limit?
A sample of the file is:
#export_date  genre_id  application_id  is_primary
#primaryKey: genre_id application_id
#dbTypes: BIGINT INTEGER INTEGER BOOLEAN
#exportMode: FULL
127667880285760002817317350
127667880285760002818261461
127667880285760002825372301
127667880285760002827785570
127667880285760002827930241
127667880285760002827987861
127667880285760002828089791
127667880285760002828168361
127667880285760002828192041
127667880285760002829144541
127667880285760002829351511
I tried to increase the allowed memory with

ini_set("memory_limit","80M");

but it still fails. Do I just keep increasing it until the script runs through?
The complete code is:
<?php
ini_set("memory_limit","80M");
$db = mysql_connect("localhost", "uname", "pword");
// test connection
if (!$db) {
    echo "Couldn't make a connection!";
    exit;
}
// select database
if (!mysql_select_db("dbname", $db)) {
    echo "Couldn't select database!";
    exit;
}
mysql_set_charset('utf8', $db);
$delimiter = chr(1);
$eoldelimiter = chr(2) . "\n";
$fp = fopen('genre_application', 'r');
if (!$fp) { echo 'ERROR: Unable to open file.</table></body></html>'; exit; }
$loop = 0;
while (!feof($fp)) {
    $loop++;
    $line = stream_get_line($fp, 128, $eoldelimiter); //use 2048 if very long lines
    if ($line[0] === '#') continue; //Skip lines that start with #
    $field[$loop] = explode($delimiter, $line);
    $fp++;
    $export_date = $field[$loop][0];
    $genre_id = $field[$loop][1];
    $application_id = $field[$loop][2];
    $query = "REPLACE into genre_apps
              (export_date, genre_id, application_id)
              VALUES ('$export_date','$genre_id','$application_id')";
    print "SQL-Query: " . $query . "<br>";
    if (mysql_query($query, $db)) {
        echo " OK !\n";
    } else {
        echo "Error<br><br>";
        echo mysql_errno() . ":" . mysql_error() . "</font></center><br>\n";
    }
}
fclose($fp);
?>
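For what it's worth, the memory growth almost certainly comes from `$field[$loop] = explode($delimiter, $line);`, which keeps the parsed fields of every one of the 738,627 lines in memory for the whole run; the stray `$fp++;` (incrementing a file handle) is also a bug, and printing every query into the output buffer adds further overhead. A minimal reworked loop, reusing one local variable per line, might look like the sketch below (assumptions: same `genre_application` file, same `genre_apps` table and connection setup as in the question, and the old `mysql_*` API kept only for consistency with the original code, even though it is deprecated):

```php
<?php
// Sketch only: assumes $db is an open mysql_connect() handle,
// as in the question's code above.
$delimiter    = chr(1);
$eoldelimiter = chr(2) . "\n";

$fp = fopen('genre_application', 'r');
if (!$fp) { die('ERROR: Unable to open file.'); }

while (!feof($fp)) {
    $line = stream_get_line($fp, 128, $eoldelimiter);
    if ($line === false || $line === '' || $line[0] === '#') {
        continue; // skip read failures, blank lines, and # comment lines
    }
    // One local array per iteration -- nothing accumulates across lines.
    // list() assigns the first three fields and ignores is_primary.
    list($export_date, $genre_id, $application_id) = explode($delimiter, $line);

    $query = sprintf(
        "REPLACE INTO genre_apps (export_date, genre_id, application_id)
         VALUES ('%s','%s','%s')",
        mysql_real_escape_string($export_date, $db),
        mysql_real_escape_string($genre_id, $db),
        mysql_real_escape_string($application_id, $db)
    );
    // Only report failures; echoing all 738,627 queries is itself costly.
    if (!mysql_query($query, $db)) {
        echo mysql_errno() . ': ' . mysql_error() . "\n";
    }
}
fclose($fp);
?>
```

With the per-line array gone, memory use stays flat regardless of file size, so the 80M limit should no longer matter.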
Thank you, that made the script run to completion :) – kitenski 2010-07-16 14:37:32