I am trying to scrape data from a website using `submit_form` in rvest. Below is a simplified version of the script I am using. In this example, `disease.table`, retrieved via `html_table` in the last line, should include all years from 2014 through 2017. Currently, it only contains 2017.
library(rvest)

# Start a website session for wahis.session.
url <- "http://www.oie.int/wahis_2/public/wahid.php/Diseaseinformation/statusdetail"
wahis.session <- html_session(url)
# Get forms with searchable fields in current session.
form <- html_form(wahis.session)
#------------------------------------------------------------------------------
# Fill out and submit forms
# First, copy the forms.
filled.form <- form
# Set values in form #3
filled.form[[3]] <- set_values(filled.form[[3]],
                               selected_start_year = "2014") # start year
# Set the form url to an empty character string to prevent an error message
# when submitting.
filled.form[[3]]$url <- ""
# Submit form #3
submit_form(session = wahis.session,
            form = filled.form[[3]],
            submit = 'disease_id_terrestrial')
#------------------------------------------------------------------------------
# Retrieve data
# Find data table
disease.table <- wahis.session %>%
html_node("div.OverflowAutoScroll table.TableFoyers") %>%
html_table(fill = TRUE)