
Nutch 1.9 crawl command only fetches one level

I am new to Nutch and have been playing with it for a few weeks; I can finally start crawling.

I installed Nutch 1.9 and Solr 4.1. My seed.txt file contains only one URL, and my regex-urlfilter.txt is set to accept everything. I run this command:

bin/crawl urls crawl http://104.131.94.**:8983/solr/ 1 -depth 3 -topN 5 

Here is the output:

Injector: starting at 2014-12-07 18:41:31 
Injector: crawlDb: crawl/crawldb 
Injector: urlDir: urls 
Injector: Converting injected urls to crawl db entries. 
Injector: overwrite: false 
Injector: update: false 
Injector: Total number of urls rejected by filters: 0 
Injector: Total number of urls after normalization: 1 
Injector: Total new urls injected: 1 
Injector: finished at 2014-12-07 18:41:33, elapsed: 00:00:01 
Sun Dec 7 18:41:33 EST 2014 : Iteration 1 of 1 
Generating a new segment 
Generator: starting at 2014-12-07 18:41:34 
Generator: Selecting best-scoring urls due for fetch. 
Generator: filtering: false 
Generator: normalizing: true 
Generator: topN: 50000 
Generator: Partitioning selected urls for politeness. 
Generator: segment: crawl/segments/20141207184137 
Generator: finished at 2014-12-07 18:41:38, elapsed: 00:00:03 
Operating on segment : 20141207184137 
Fetching : 20141207184137 
Fetcher: starting at 2014-12-07 18:41:39 
Fetcher: segment: crawl/segments/20141207184137 
Fetcher Timelimit set for : 1418006499487 
Using queue mode : byHost 
Fetcher: threads: 50 
Fetcher: time-out divisor: 2 
QueueFeeder finished: total 1 records + hit by time limit :0 
Using queue mode : byHost 
Using queue mode : byHost 
fetching http://www.wenxuecity.com/ (queue crawl delay=5000ms) 
Thread FetcherThread has no more work available 
-finishing thread FetcherThread, activeThreads=1 
Using queue mode : byHost 
[... these three lines repeat, interleaved, as each of the remaining 49 fetcher threads starts and exits with no work ...]
Fetcher: throughput threshold: -1 
Fetcher: throughput threshold retries: 5 
fetcher.maxNum.threads can't be < than 50 : using 50 instead 
Thread FetcherThread has no more work available 
-finishing thread FetcherThread, activeThreads=0 
-activeThreads=0, spinWaiting=0, fetchQueues.totalSize=0, fetchQueues.getQueueCount=0 
-activeThreads=0 
Fetcher: finished at 2014-12-07 18:41:42, elapsed: 00:00:02 
Parsing : 20141207184137 
ParseSegment: starting at 2014-12-07 18:41:43 
ParseSegment: segment: crawl/segments/20141207184137 
Parsed (17ms):http://www.wenxuecity.com/ 
ParseSegment: finished at 2014-12-07 18:41:46, elapsed: 00:00:02 
CrawlDB update 
CrawlDb update: starting at 2014-12-07 18:41:48 
CrawlDb update: db: crawl/crawldb 
CrawlDb update: segments: [crawl/segments/20141207184137] 
CrawlDb update: additions allowed: true 
CrawlDb update: URL normalizing: false 
CrawlDb update: URL filtering: false 
CrawlDb update: 404 purging: false 
CrawlDb update: Merging segment data into db. 
CrawlDb update: finished at 2014-12-07 18:41:49, elapsed: 00:00:01 
Link inversion 
LinkDb: starting at 2014-12-07 18:41:51 
LinkDb: linkdb: crawl/linkdb 
LinkDb: URL normalize: true 
LinkDb: URL filter: true 
LinkDb: internal links will be ignored. 
LinkDb: adding segment: crawl/segments/20141207184137 
LinkDb: finished at 2014-12-07 18:41:52, elapsed: 00:00:01 
Dedup on crawldb 
Indexing 20141207184137 on SOLR index -> http://104.131.94.36:8983/solr/ 
Indexer: starting at 2014-12-07 18:41:58 
Indexer: deleting gone documents: false 
Indexer: URL filtering: false 
Indexer: URL normalizing: false 
Active IndexWriters : 
SOLRIndexWriter 
     solr.server.url : URL of the SOLR instance (mandatory) 
     solr.commit.size : buffer size when sending to SOLR (default 1000) 
     solr.mapping.file : name of the mapping file for fields (default solrindex-mapping.xml) 
     solr.auth : use authentication (default false) 
     solr.auth.username : use authentication (default false) 
     solr.auth : username for authentication 
     solr.auth.password : password for authentication 


Indexer: finished at 2014-12-07 18:42:01, elapsed: 00:00:03 
Cleanup on SOLR index -> http://104.131.94.36:8983/solr/ 

A couple of issues:

  1. The crawl did not use my topN of 5; instead topN = 50000. I then looked at the crawl script: it is hard-coded to 50000 and does not actually take the -topN argument. I suppose I can modify the script (see the sketch after this list).

  2. The depth of 3 is also ignored; as far as I can see, there is no parameter in the script that handles depth either.
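If I do end up patching the script, the spot that seems to matter in the 1.9 bin/crawl is the fetch-list size defined near the top, which is later handed to generate as -topN. A sketch of the relevant lines, from my reading of the script and lightly annotated (exact variable names may differ slightly between releases):

    # excerpt from bin/crawl (Nutch 1.9) -- not the full script
    numSlaves=1                               # local mode: a single fetch "slave"
    sizeFetchlist=`expr $numSlaves \* 50000`  # hard-coded; change 50000 to 5 to emulate -topN 5

    # ... later, inside the per-round loop, this is what generate actually receives:
    $bin/nutch generate $commonOptions "$CRAWL_PATH"/crawldb "$CRAWL_PATH"/segments \
      -topN $sizeFetchlist -numFetchers $numSlaves -noFilter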

I can find plenty of examples that run the nutch crawl command, but that command is no longer available in 1.9. I am really stuck here, and any advice would be greatly appreciated.

The Solr indexing works fine; I always get 1 document indexed. I have tried several crawlable sites, and the script always stops at the first level.
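(For reference, "accept everything" means my regex-urlfilter.txt ends with the stock catch-all rule, roughly like the following; rules are applied top to bottom and the first match wins:)

    # skip file:, ftp:, and mailto: urls
    -^(file|ftp|mailto):
    # skip URLs containing certain characters, which are probably queries or session ids
    -[?*!@=]
    # accept anything else
    +.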

Thanks, Pengcheng

Answer 1

Try running the individual crawl commands instead, then check how many pages the second run can fetch. If it is 0 pages, check the include pattern in your regex-urlfilter.txt (it should look like +^http://www.google.com/).

See how to run the individual commands.
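(Roughly, one round done by hand looks like the following, assuming a local 1.x install; s1 is just a shell variable I use for the newest segment, and the exact indexing arguments can differ between versions:)

    bin/nutch inject crawl/crawldb urls
    bin/nutch generate crawl/crawldb crawl/segments -topN 5
    s1=`ls -d crawl/segments/2* | tail -1`     # pick up the newest segment
    bin/nutch fetch $s1
    bin/nutch parse $s1
    bin/nutch updatedb crawl/crawldb $s1
    # repeat generate/fetch/parse/updatedb once per additional level, then index:
    bin/nutch invertlinks crawl/linkdb -dir crawl/segments
    bin/nutch solrindex http://104.131.94.36:8983/solr/ crawl/crawldb -linkdb crawl/linkdb $s1

Watching how many entries the second generate selects tells you whether the first round actually discovered any outlinks.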

Answer 2

It works now. In the first round only 1 page was fetched, and in the second round a large number of pages were fetched. I guess the number of rounds is the same thing as the depth.
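(That matches the 1.9 usage, where the last positional argument is the number of rounds: crawl <seedDir> <crawlDir> <solrURL> <numberOfRounds>. So the depth-3 crawl I wanted would presumably be:)

    bin/crawl urls crawl http://104.131.94.**:8983/solr/ 3

The -depth and -topN flags from the old nutch crawl command are simply not parsed by the new script.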