Parsing CSV data into ActiveRecord (MySQL) too slow, takes a long time

2012-02-21

I'm currently developing a custom ticketing system for internal use, for which I've written a Rails application. The ticket data originates in another system, from which I can only get a daily CSV summary. I've written a task that FTPs the CSV file locally and then runs the task below, which imports it into a MySQL database using ActiveRecord.

But it is very slow! Each daily CSV file contains roughly 20,000-40,000 rows, for a total of 8-10 MB of data. Each row holds a ticket that was created or modified during the past day, which is why the code below checks whether a ticket already exists.

Running in production mode, without the debug output, doesn't make much of a difference.

    desc 'Takes Orion csv file and parses into DB.'
    task :importcsv, [:local_file_path] => :environment do |t, args|
      require 'csv'
      @error_count = 0
      @success_count = 0

      csv = CSV.read(args.local_file_path, col_sep: ",", encoding: "ISO8859-1", headers: true)

      csv.each do |row|
        if row[0] =~ /PR/       # Skip PR tickets because they're a waste of space right now
          @error_count += 1
          next
        end

        break if row[0] == " "  # Stop if the ticketid is just whitespace
        break if row[0].empty?  # Stop if there is no ticketid

        ticket = Ticket.find_or_create_by_ticketid(row[0], :severity       => row[7],
                                                           :status         => row[1],
                                                           :causecode      => row[17],
                                                           :title          => row[25],
                                                           :reportergrp    => row[18],
                                                           :resolvergrp    => row[5],
                                                           :resolvername   => row[27],
                                                           :opendate       => row[14],
                                                           :closedate      => row[13],
                                                           :accountname    => row[23],
                                                           :resolutiondesc => row[26])

        @success_count += 1
      end

      Rails.logger.info " #{@success_count} out of #{@error_count + @success_count} tickets were added or updated."
    end

Here is a sample of the debug output in development:

############### START PARSING ORION DATA ############### 
Fetching data for date 2012-02-01... 
    Data already exists locally. Did not download. 
    Adding data to DB... 
    Ticket Load (99.8ms)  SELECT `tickets`.* FROM `tickets` WHERE `tickets`.`ticketid` = '03052019' LIMIT 1
    (0.3ms)  BEGIN
    Ticket Exists (24.2ms)  SELECT 1 FROM `tickets` WHERE `tickets`.`ticketid` = BINARY '03052019' LIMIT 1
    SQL (1.4ms)  INSERT INTO `tickets` (`accountname`, `causecode`, `closedate`, `created_at`, `opendate`, `reportergrp`, `resolutiondesc`, `resolvergrp`, `resolvername`, `severity`, `status`, `ticketid`, `title`, `updated_at`) VALUES ('WESTPAC', 'AP_DATA', '2010-12-30 00:00:00', '2012-02-21 04:55:09', '2010-05-19 00:00:00', 'HDNZ', '-', 'DINZ', 'Sam Gardner', 3, 'CLOSED', '03052019', 'HTML GENERATED REPORTS CONT. OF FAULT: 03042', '2012-02-21 04:55:09')
    (2.3ms)  COMMIT
    Ticket Load (69.1ms)  SELECT `tickets`.* FROM `tickets` WHERE `tickets`.`ticketid` = '03089753' LIMIT 1
    (0.4ms)  BEGIN
    Ticket Exists (19.8ms)  SELECT 1 FROM `tickets` WHERE `tickets`.`ticketid` = BINARY '03089753' LIMIT 1
    SQL (0.9ms)  INSERT INTO `tickets` (`accountname`, `causecode`, `closedate`, `created_at`, `opendate`, `reportergrp`, `resolutiondesc`, `resolvergrp`, `resolvername`, `severity`, `status`, `ticketid`, `title`, `updated_at`) VALUES ('WESTPAC', 'SW_PROGRAMCODE', NULL, '2012-02-21 04:55:09', '2010-07-20 00:00:00', 'HDNZ', '-', 'IANZ', 'Mitch Bell', 3, 'RESTORED', '03089753', 'CEE: EDS ERROR', '2012-02-21 04:55:09')
    (1.7ms)  COMMIT
    Ticket Load (66.2ms)  SELECT `tickets`.* FROM `tickets` WHERE `tickets`.`ticketid` = '03236150' LIMIT 1
    (0.2ms)  BEGIN
    Ticket Exists (21.5ms)  SELECT 1 FROM `tickets` WHERE `tickets`.`ticketid` = BINARY '03236150' LIMIT 1
    SQL (0.4ms)  INSERT INTO `tickets` (`accountname`, `causecode`, `closedate`, `created_at`, `opendate`, `reportergrp`, `resolutiondesc`, `resolvergrp`, `resolvername`, `severity`, `status`, `ticketid`, `title`, `updated_at`) VALUES ('WESTPAC', 'AP_DATA', '2011-12-12 00:00:00', '2012-02-21 04:55:09', '2011-03-04 00:00:00', 'HDNZ', '-', 'DINZ', 'Liam Fitzpatrick', 3, 'CLOSED', '03236150', 'SAMETIME CONNECTION ISSUES', '2012-02-21 04:55:09')
    (1.5ms)  COMMIT
    Ticket Load (64.5ms)  SELECT `tickets`.* FROM `tickets` WHERE `tickets`.`ticketid` = '03261509' LIMIT 1
    (0.2ms)  BEGIN
    Ticket Exists (20.8ms)  SELECT 1 FROM `tickets` WHERE `tickets`.`ticketid` = BINARY '03261509' LIMIT 1
    SQL (0.4ms)  INSERT INTO `tickets` (`accountname`, `causecode`, `closedate`, `created_at`, `opendate`, `reportergrp`, `resolutiondesc`, `resolvergrp`, `resolvername`, `severity`, `status`, `ticketid`, `title`, `updated_at`) VALUES ('WESTPAC', ' ', NULL, '2012-02-21 04:55:09', '2011-05-08 00:00:00', 'OPSNZ', '-', 'ANNZ', 'Anusha Konduti', 3, 'OPEN', '03261509', 'P2PTSM002:-INFOMAN ONLY (TONZ): ANR2578W SCHEDULE WEEKLY_SYS', '2012-02-21 04:55:09')
    (1.4ms)  COMMIT
+1

Have you considered importing the CSV directly into MySQL, doing the clean-up in SQL, and then copying the cleaned-up result to its final location? I love Ruby, but MySQL will be a lot faster at this kind of data wrangling. – 2012-02-21 05:15:11

+0

I don't disagree with what you're saying. However, if possible I'd like to keep this all in one package, that is, keep it within the Rails app. – 2012-02-21 05:29:43

+0

You can run arbitrary SQL from inside Rails. – 2012-02-21 05:54:47
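
For what it's worth, a minimal sketch of that suggestion, assuming a hypothetical raw_tickets staging table that mirrors the CSV columns and a MySQL server with LOAD DATA LOCAL INFILE enabled (neither of which appears in the question); args.local_file_path is the same rake task argument used above:

    # Hypothetical sketch only: bulk-load the raw CSV into a staging table
    # with plain SQL run through ActiveRecord, then do the clean-up server-side.
    conn = ActiveRecord::Base.connection

    # Load every CSV row into the (hypothetical) raw_tickets staging table,
    # skipping the header line.
    conn.execute(<<-SQL)
      LOAD DATA LOCAL INFILE #{conn.quote(args.local_file_path)}
      INTO TABLE raw_tickets
      FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
      IGNORE 1 LINES
    SQL

    # Example clean-up: drop the unwanted PR tickets before copying the rest
    # into the real tickets table.
    conn.execute("DELETE FROM raw_tickets WHERE ticketid LIKE 'PR%'")

Copying the cleaned rows from raw_tickets into tickets could then be a single INSERT ... SELECT, with MySQL's ON DUPLICATE KEY UPDATE refreshing tickets that already exist instead of duplicating them.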

Answers

1

For whatever reason, InnoDB was just taking forever. I switched the table to MyISAM and it is massively faster.
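
The engine switch is a one-line ALTER TABLE; a hypothetical migration (table name assumed to be tickets, as in the question) might look like this:

    # Hypothetical migration: switch the tickets table from InnoDB to MyISAM.
    # Note that MyISAM gives up transactions and foreign keys in exchange for speed.
    class ConvertTicketsToMyisam < ActiveRecord::Migration
      def up
        execute "ALTER TABLE tickets ENGINE = MyISAM"
      end

      def down
        execute "ALTER TABLE tickets ENGINE = InnoDB"
      end
    end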

2

Perhaps you could try inserting the rows into the database in batches rather than one at a time. I've had some luck improving performance with that approach.

There is a really nice gem for this called activerecord-import. Just collect all the new objects into an array and bulk-insert them at the end of the loop.
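
A rough sketch of how the loop from the question could be adapted to the gem's Ticket.import call, keeping the same column mapping (only a few columns shown here):

    # Sketch only: build the Ticket objects in memory and insert them in one
    # batch with activerecord-import, instead of one INSERT per CSV row.
    new_tickets = []

    csv.each do |row|
      next if row[0] =~ /PR/                  # skip PR tickets, as in the original task
      break if row[0] == " " || row[0].empty? # stop on a blank ticketid

      new_tickets << Ticket.new(:ticketid => row[0],
                                :severity => row[7],
                                :status   => row[1],
                                :title    => row[25]) # ...remaining columns as in the question
    end

    # One bulk INSERT for the whole file; validations can be skipped for speed.
    Ticket.import(new_tickets, :validate => false)

Unlike find_or_create_by, this inserts everything as new rows; handling tickets that already exist is discussed in the comments below.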

+0

Oh, nice! I'll definitely give this a go. I'd been after a solution like this, since what I have at the moment is pretty convoluted. – 2012-02-21 05:30:27

+0

One thing about my data is that some rows contain a brand-new ticket, while other rows are tickets I've already added whose data has since been updated. Will a bulk INSERT automatically overwrite the existing data? – 2012-02-22 01:26:36

+0

I believe there is an option to synchronize existing records, but I have no personal experience with it. Check out the source here: https://github.com/zdennis/activerecord-import/blob/master/lib/activerecord-import/synchronize.rb – 2012-02-22 02:24:21
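
For reference, the gem also documents an :on_duplicate_key_update option on MySQL, which turns the bulk insert into an upsert; a hedged sketch, assuming a unique index exists on tickets.ticketid:

    # Sketch: with a unique index on tickets.ticketid, on_duplicate_key_update
    # tells MySQL to update the listed columns when a ticketid already exists,
    # instead of inserting a duplicate row.
    Ticket.import(new_tickets,
                  :validate => false,
                  :on_duplicate_key_update => [:severity, :status, :causecode,
                                               :title, :closedate, :resolutiondesc])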