I am trying to import huge CSV files (> 1 GB) into a MySQL database with PHP:
/**
* @param string $file_path Path to the CSV file to import
*/
private function importFileContents($file_path)
{
$query = sprintf("LOAD DATA LOCAL INFILE '%s'
INTO TABLE file_import_contents
FIELDS TERMINATED BY ';'
LINES TERMINATED BY '\\n'
IGNORE 1 LINES
(@col1, @col2, @col3, @col4, @col5, @col6, @col7, @col8, @col9, @col10, @col11, @col12, @col13, @col14, @col15, @col16, @col17, @col18, @col19, @col20, @col21)
set col1 = @col1, col2 = @col2, col3 = @col3, col4 = @col4, col5 = @col5, col6 = @col6, col7 = @col7, col8 = @col8, col9 = @col9, col10 = @col10, col11 = @col11, col12 = @col12, col13 = @col13, col14 = @col14, col15 = @col15, col16 = @col16, col17 = @col17, col18 = @col18, col19 = @col19, col20 = @col20, col21 = @col21
", addslashes($file_path));
$em = $this->getContainer()->get('doctrine.orm.default_entity_manager');
$connection = $em->getConnection();
$statement = $connection->prepare($query);
$statement->execute();
}
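One thing worth checking right after the import: LOAD DATA does not stop on malformed lines — with LOCAL, rows with the wrong number of fields or values that do not fit are silently skipped or truncated and only recorded as warnings, which would explain a shortfall in the final count. These can be inspected in the same session (standard MySQL statements, shown here as run from a client):

```sql
-- How many rows produced warnings during the last statement
SHOW COUNT(*) WARNINGS;

-- The warnings themselves, e.g. "Row N doesn't contain data for all columns"
SHOW WARNINGS LIMIT 20;
```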
The problem is that after the import, when I run a count in the database:
SELECT COUNT(*) FROM file_import_contents;
it returns a certain number of rows, but when I count the lines of all the CSV files from the terminal:
find ./ -type f -name "*csv*" -exec wc -w {} +
the total that comes back does not match... My PHP settings are:
upload_max_filesize = 32000M
max_execution_time = 300000
max_input_time = 600000
memory_limit = 1024M
...and in the script itself:
ini_set('memory_limit', '-1');
Why does this happen, and how can I get these files imported completely? Thanks for any advice.
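A side note on the comparison itself: `wc -w` counts whitespace-separated words, not lines, so its total is not comparable to the table's row count — `wc -l` is the flag that counts lines. A quick illustration with a throwaway file:

```shell
# Make a small throwaway CSV: a header line plus two data rows.
printf 'id;full name\n1;John Doe\n2;Jane Roe\n' > /tmp/sample.csv

wc -l < /tmp/sample.csv   # 3 lines
wc -w < /tmp/sample.csv   # 6 words, because "John Doe" and "Jane Roe" each split in two
```

Also note that `LOAD DATA ... IGNORE 1 LINES` skips one header line per file, while `wc -l` counts it, so even with `-l` the two totals are expected to differ by one per file.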
It would be possible to do this line by line, using something like SplFileObject and fgetcsv (http://php.net/manual/en/splfileobject.fgetcsv.php) to read each line and then insert it. That way you can deal with duff records one at a time. It will be slow, but it should work. –
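A minimal sketch of that suggestion, assuming a ';'-delimited file with one header row (matching the LOAD DATA options in the question); the function name and the $insertRow callback are illustrative, not from the question:

```php
<?php
// Stream the CSV one row at a time instead of loading the whole file.
function importCsvRowByRow(string $filePath, callable $insertRow): int
{
    $file = new SplFileObject($filePath, 'r');
    $file->setFlags(SplFileObject::READ_CSV | SplFileObject::READ_AHEAD | SplFileObject::SKIP_EMPTY);
    $file->setCsvControl(';');

    $imported = 0;
    foreach ($file as $lineNo => $row) {
        if ($lineNo === 0 || $row === [null] || $row === false) {
            continue; // skip the header line and any blank lines
        }
        $insertRow($row); // e.g. a prepared INSERT on the Doctrine connection
        $imported++;
    }
    return $imported;
}
```

Because only one line is in memory at a time, memory use stays flat regardless of file size, and a bad record can be logged and skipped instead of silently breaking the whole import.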
I tried importing line by line with League\Csv\Reader, and it works fine with files of about 30–50 MB, but with a 1 GB file the import takes several hours –
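One-row-at-a-time INSERTs are usually what makes pure-PHP imports take hours: each row costs a network round trip. Grouping rows into multi-row INSERTs, one statement per batch inside a transaction, cuts that dramatically. A sketch of just the batching logic, with illustrative names; the $flush callback is where an actual "INSERT ... VALUES (...),(...)" against the Doctrine connection would go:

```php
<?php
// Collect rows into fixed-size batches and hand each batch to $flush.
function importInBatches(iterable $rows, int $batchSize, callable $flush): int
{
    $batch = [];
    $total = 0;
    foreach ($rows as $row) {
        $batch[] = $row;
        if (count($batch) >= $batchSize) {
            $flush($batch);       // one multi-row INSERT per batch
            $total += count($batch);
            $batch = [];
        }
    }
    if ($batch !== []) {
        $flush($batch);           // final partial batch
        $total += count($batch);
    }
    return $total;
}
```

With Doctrine DBAL, $flush would typically open a transaction, execute one prepared multi-row INSERT, then commit; batch sizes of 500 to 2000 rows are a common starting point.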