I'm having trouble parsing and calculating performance Navigation Timing data that I have in a CSV, using Logstash and its CSV filter for the parsing and the calculations.
I'm able to parse the fields, but don't know how to do the calculations correctly (below). A few points to keep in mind:
1. The data set is grouped together by the bolded value (this is the time at which the 21 data points were taken), e.g. ACMEPage-1486643427973,unloadEventEnd,1486643372422. 2. The calculations need to be associated with the timestamp of the group the data belongs to. I assume some tagging and grouping is needed, but I don't have a clear idea of how to implement it. Any help would be greatly appreciated.
Thanks,
--------------- Calculations ---------------
- Total time to first byte = responseStart - navigationStart
- Latency = responseStart - fetchStart
- DNS / Domain Lookup Time = domainLookupEnd - domainLookupStart
- Server Connect Time = connectEnd - connectStart
- Server Response Time = responseStart - requestStart
- Page Load Time = loadEventStart - navigationStart
- Transfer / Page Download Time = responseEnd - responseStart
- DOM Interactive Time = domInteractive - navigationStart
- DOM Content Load Time = domContentLoadedEventEnd - navigationStart
- DOM Processing to Interactive = domInteractive - domLoading
- DOM Interactive to Complete = domComplete - domInteractive
- Onload = loadEventEnd - loadEventStart
------- Data in CSV -----------
ACMEPage-1486643427973,unloadEventEnd,1486643372422
ACMEPage-1486643427973,responseEnd,1486643372533
ACMEPage-1486643427973,responseStart,1486643372416
ACMEPage-1486643427973,domInteractive,1486643373030
ACMEPage-1486643427973,domainLookupEnd,1486643372194
ACMEPage-1486643427973,unloadEventStart,1486643372422
ACMEPage-1486643427973,domComplete,1486643373512
ACMEPage-1486643427973,domContentLoadedEventStart,1486643373030
ACMEPage-1486643427973,domainLookupStart,1486643372194
ACMEPage-1486643427973,redirectEnd,0
ACMEPage-1486643427973,redirectStart,0
ACMEPage-1486643427973,connectEnd,1486643372194
ACMEPage-1486643427973,toJSON,{}
ACMEPage-1486643427973,connectStart,1486643372194
ACMEPage-1486643427973,loadEventStart,1486643373512
ACMEPage-1486643427973,navigationStart,1486643372193
ACMEPage-1486643427973,requestStart,1486643372203
ACMEPage-1486643427973,secureConnectionStart,0
ACMEPage-1486643427973,fetchStart,1486643372194
ACMEPage-1486643427973,domContentLoadedEventEnd,1486643373058
ACMEPage-1486643427973,domLoading,1486643372433
ACMEPage-1486643427973,loadEventEnd,1486643373514
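For reference, here is a minimal standalone Python sketch (not Logstash, just to illustrate the arithmetic I'm after) that buckets rows by their run id and applies the calculations above; it uses a subset of the sample data, and the metric names are my own labels for the list above:

```python
# Group Navigation Timing rows by run id and compute the derived metrics.
from collections import defaultdict

rows = """\
ACMEPage-1486643427973,navigationStart,1486643372193
ACMEPage-1486643427973,fetchStart,1486643372194
ACMEPage-1486643427973,domainLookupStart,1486643372194
ACMEPage-1486643427973,domainLookupEnd,1486643372194
ACMEPage-1486643427973,connectStart,1486643372194
ACMEPage-1486643427973,connectEnd,1486643372194
ACMEPage-1486643427973,requestStart,1486643372203
ACMEPage-1486643427973,responseStart,1486643372416
ACMEPage-1486643427973,responseEnd,1486643372533
ACMEPage-1486643427973,domLoading,1486643372433
ACMEPage-1486643427973,domInteractive,1486643373030
ACMEPage-1486643427973,domContentLoadedEventEnd,1486643373058
ACMEPage-1486643427973,loadEventStart,1486643373512
ACMEPage-1486643427973,loadEventEnd,1486643373514
""".splitlines()

# Bucket timers by the "page-timestamp" run id.
runs = defaultdict(dict)
for line in rows:
    run_id, timer, value = line.split(",")
    runs[run_id][timer] = int(value)

def metrics(t):
    """Apply the calculations from the list above to one run's timers."""
    return {
        "time_to_first_byte": t["responseStart"] - t["navigationStart"],
        "latency":            t["responseStart"] - t["fetchStart"],
        "dns_lookup":         t["domainLookupEnd"] - t["domainLookupStart"],
        "server_connect":     t["connectEnd"] - t["connectStart"],
        "server_response":    t["responseStart"] - t["requestStart"],
        "page_load":          t["loadEventStart"] - t["navigationStart"],
        "transfer":           t["responseEnd"] - t["responseStart"],
        "dom_interactive":    t["domInteractive"] - t["navigationStart"],
        "dom_content_loaded": t["domContentLoadedEventEnd"] - t["navigationStart"],
        "onload":             t["loadEventEnd"] - t["loadEventStart"],
    }

for run_id, timers in runs.items():
    print(run_id, metrics(timers))
```

Each run id yields one record of derived durations in milliseconds; this is the shape of event I'd like Logstash to produce.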
---------- Output --------------
{
"path" => "/Users/philipp/Downloads/build2/logDataPoints_com.concur.automation.cge.ui.admin.ADCLookup_1486643340910.csv",
"@timestamp" => 2017-02-09T12:29:57.763Z,
"navigationTimer" => "connectStart",
"@version" => "1",
"host" => "15mbp-09796.local",
"elapsed_time" => "1486643372194",
"pid" => "1486643397763",
"page" => "ADCLookupDataPage",
"message" => "ADCLookupDataPage-1486643397763,connectStart,1486643372194",
"type" => "csv"
}
-------------- logstash.conf ----------------
input {
  file {
    type => "csv"
    path => "/Users/path/logDataPoints_com.concur.automation.acme.ui.admin.acme_1486643340910.csv"
    # "beginning" must be quoted; reads the file from the start
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    columns => ["page_id", "navigationTimer", "elapsed_time"]
  }
  # field references in conditionals are written [field], not (["field"])
  if [elapsed_time] == "{}" {
    drop { }
  }
  else {
    grok {
      match => { "page_id" => "%{WORD:page}-%{INT:pid}" }
      remove_field => [ "page_id" ]
    }
  }
  date {
    match => [ "pid", "UNIX_MS" ]
    target => "@timestamp"
  }
}
output {
elasticsearch { hosts => ["localhost:9200"] }
stdout { codec => rubydebug }
}
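One possible way to do the per-group calculations inside Logstash itself (my own guess, not something tested) is the logstash-filter-aggregate plugin: collect the 21 timers into a map keyed by pid, then emit a single event with the derived fields once the group times out. A rough sketch, assuming the plugin is installed and the fields produced by the filter above (pid, navigationTimer, elapsed_time); only a few of the calculations are shown:

```
filter {
  # ... csv / grok / date from above ...
  aggregate {
    task_id => "%{pid}"
    # accumulate each timer into the map for this run
    code => "map[event.get('navigationTimer')] = event.get('elapsed_time').to_i"
    push_map_as_event_on_timeout => true
    timeout => 30
    timeout_task_id_field => "pid"
    # the map's keys become fields on the pushed event
    timeout_code => "
      event.set('time_to_first_byte', event.get('responseStart') - event.get('navigationStart'))
      event.set('latency', event.get('responseStart') - event.get('fetchStart'))
      event.set('page_load_time', event.get('loadEventStart') - event.get('navigationStart'))
      # ... remaining calculations follow the same pattern ...
    "
  }
}
```

Note the aggregate filter requires a single worker (`-w 1`) to keep events of one task on the same thread, so I'm not sure this scales; corrections welcome.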