I want to import web-scraped data directly into PostgreSQL instead of exporting it to a .csv first — that is, insert the scraped data into PostgreSQL straight from the website.
Below is the code I am currently using, which exports the data to a .csv file that I then import manually. Any help would be appreciated.
from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'
uClient = uReq(my_url)
page1_html = uClient.read()
uClient.close()
# html parsing
page1_soup = soup(page1_html,"html.parser")
filename = "TollDetail12.csv"
f = open(filename,"w")
headers = "ID, tollname, location, highwayNumber\n"
f.write(headers)
# grabbing data
containers = page1_soup.findAll("div", {"class": "PA15"})
for container in containers:
    toll_name = container.p.b.text
    search1 = container.findAll('b')
    highway_number = search1[1].text
    location = list(container.p.descendants)[10]
    ID = my_url[my_url.find("?"):]
    mystr = ID.strip("?")
    print("ID: " + mystr)
    print("toll_name: " + toll_name)
    print("location: " + location)
    print("highway_number: " + highway_number)
    f.write(mystr + "," + toll_name + "," + location + "," + highway_number.replace(",", "|") + "\n")
f.close()
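One way to skip the CSV step is to collect each scraped row as a tuple and insert the tuples with psycopg2. The sketch below is a minimal example, not a drop-in solution: the table name `toll_detail`, its column names, and the connection parameters (`dbname`, `user`, `password`) are assumptions you would adapt to your own database, and it presumes a matching table already exists.

```python
# SQL with %s placeholders: psycopg2 substitutes the values safely,
# so commas in the data no longer need to be replaced with "|".
INSERT_SQL = (
    "INSERT INTO toll_detail (id, toll_name, location, highway_number) "
    "VALUES (%s, %s, %s, %s)"
)

def extract_id(url):
    # Everything after the "?" in the URL, e.g. "TollPlazaID=236".
    return url[url.find("?") + 1:]

def insert_rows(rows):
    # psycopg2 is assumed as the driver (pip install psycopg2-binary);
    # the dbname/user/password values below are placeholders.
    import psycopg2
    conn = psycopg2.connect(dbname="tolls", user="postgres", password="secret")
    try:
        # `with conn` commits on success and rolls back on error.
        with conn, conn.cursor() as cur:
            cur.executemany(INSERT_SQL, rows)
    finally:
        conn.close()

# In the scraping loop above, instead of f.write(...), collect tuples:
#     rows.append((extract_id(my_url), toll_name, location, highway_number))
# then call insert_rows(rows) once after the loop.
```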
[Comment] Read up on inserting data into PostgreSQL from Python (http://www.postgresqltutorial.com/postgresql-python/insert/). It will help you solve your problem. –