I'm fairly new to Python. I'm trying to:
a) get a list of URLs from https://aviation-safety.net/database/, one page per year starting from 1919 (e.g. https://aviation-safety.net/database/dblist.php?Year=1919);
b) get the data from 1919 up to the current year (date, type, registration, operator, fat., location, cat).
However, I've run into some problems and am still stuck on a).
Any kind of help is appreciated, thank you very much!
#import packages
import numpy as np
import pandas as pd
from bs4 import BeautifulSoup
#start of code
mainurl = "https://aviation-safety.net/database/"
def getAndParseURL(mainurl):
    result = requests.get(mainurl)
    soup = BeautifulSoup(result.content, 'html.parser')
    datatable = soup.find('a', href = True)

#try clause to go through the content and grab the URLs
try:
    for row in datatable:
        cols = row.find_all("|")
        if len(cols) > 1:
            links.append(x, cols = cols)
except: pass
#place links into numpy array
links_array = np.asarray(links)
len(links_array)
#check if links are in dataframe
df = pd.DataFrame(links_array)
df.columns = ['url']
df.head(10)
I can't seem to get the URLs.
It would be great if I could get something like the following:

S/N  URL
1    https://aviation-safety.net/database/dblist.php?Year=1919
2    https://aviation-safety.net/database/dblist.php?Year=1920
3    https://aviation-safety.net/database/dblist.php?Year=1921
You aren't extracting the href attribute from the tags. What you want to do is find all <a> tags with links (which you did, but you need to use find_all, as find only returns the first tag it finds), then iterate through those tags. I chose to just have it look for the substring 'Year' and, if it's present, put the link into a list.
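The find vs. find_all point can be checked on a small inline snippet (the URLs here are made-up stand-ins for the real page):

```python
from bs4 import BeautifulSoup

html = """
<a href="dblist.php?Year=1919">1919</a>
<a href="dblist.php?Year=1920">1920</a>
<a href="contact.php">contact</a>
"""
soup = BeautifulSoup(html, "html.parser")

# find returns only the first matching tag
first = soup.find("a", href=True)
print(first["href"])  # dblist.php?Year=1919

# find_all returns every matching tag, which can then be filtered
years = [a["href"] for a in soup.find_all("a", href=True) if "Year" in a["href"]]
print(years)  # ['dblist.php?Year=1919', 'dblist.php?Year=1920']
```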
#import packages
import numpy as np
import pandas as pd
from bs4 import BeautifulSoup
import requests
#start of code
mainurl = "https://aviation-safety.net/database/"
def getAndParseURL(mainurl):
    result = requests.get(mainurl)
    soup = BeautifulSoup(result.content, 'html.parser')
    datatable = soup.find_all('a', href = True)
    return datatable

datatable = getAndParseURL(mainurl)

#go through the content and grab the URLs
links = []
for link in datatable:
    if 'Year' in link['href']:
        url = link['href']
        links.append(mainurl + url)
#check if links are in dataframe
df = pd.DataFrame(links, columns=['url'])
df.head(10)
Output:
df.head(10)
Out[24]:
url
0 https://aviation-safety.net/database/dblist.ph...
1 https://aviation-safety.net/database/dblist.ph...
2 https://aviation-safety.net/database/dblist.ph...
3 https://aviation-safety.net/database/dblist.ph...
4 https://aviation-safety.net/database/dblist.ph...
5 https://aviation-safety.net/database/dblist.ph...
6 https://aviation-safety.net/database/dblist.ph...
7 https://aviation-safety.net/database/dblist.ph...
8 https://aviation-safety.net/database/dblist.ph...
9 https://aviation-safety.net/database/dblist.ph...
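For part b), each year page contains an HTML table of accidents, which pandas.read_html can parse once the page has been fetched. A minimal sketch on an inline stand-in table using the column names from the question (the real layout on aviation-safety.net may differ, and the row shown is only illustrative):

```python
from io import StringIO
import pandas as pd

# Stand-in for the HTML returned by one dblist.php?Year=... page;
# the actual table markup on the site may differ.
html = StringIO("""
<table>
  <tr><th>date</th><th>type</th><th>registration</th><th>operator</th>
      <th>fat.</th><th>location</th><th>cat</th></tr>
  <tr><td>02-AUG-1919</td><td>Caproni Ca.48</td><td></td><td>Caproni</td>
      <td>14</td><td>Verona</td><td>A1</td></tr>
</table>
""")

# read_html returns a list of DataFrames, one per <table> in the document
df = pd.read_html(html)[0]
print(df.columns.tolist())
```

Looping over the links DataFrame from part a), fetching each URL with requests, and concatenating the resulting frames with pd.concat would then give the full 1919-to-present dataset.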