The English word "cookie" originally means a small snack. A cookie is information that the server stores on the client's hard disk when the client visits a web server, rather like a "snack" the server hands to the client. The server can use cookies to track client state, which is especially useful wherever clients need to be told apart, such as in e-commerce.
When the client requests the server for the first time, the server stores a cookie containing information about that client on the client side. From then on, every request the client sends to the server includes the cookie in the HTTP request data; by parsing the cookie in the HTTP request, the server can recover the information about that client.
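To make this round trip concrete, here is a minimal sketch (not part of the original article) that uses the public test service httpbin.org, assumed to be reachable: the first response carries a Set-Cookie header, and we send the resulting cookie back by hand on the second request.

# coding:utf-8
# Minimal sketch of the cookie round trip described above (illustrative only;
# httpbin.org is a public HTTP test service assumed to be reachable).
import requests

# First request: the server answers with a Set-Cookie header.
resp = requests.get('https://httpbin.org/cookies/set/cisession/demo',
                    allow_redirects=False, timeout=10)
print(resp.cookies.get_dict())   # {'cisession': 'demo'}

# Later requests include the cookie, so the server can recognise the client.
resp2 = requests.get('https://httpbin.org/cookies', cookies=resp.cookies, timeout=10)
print(resp2.json())              # {'cookies': {'cisession': 'demo'}}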
Let's now look at how to have a Python 3 crawler carry cookies:
1. Write the Cookie directly into the request header
# coding:utf-8
import requests
from bs4 import BeautifulSoup

# The raw cookie string copied from the browser, placed directly in the headers.
cookie = '''cisession=19dfd70a27ec0eecf1fe3fc2e48b7f91c7c83c60;CNZZDATA1000201968=1815846425-1478580135-https%253A%252F%252Fwww.baidu.com%252F%7C1483922031;Hm_lvt_f805f7762a9a237a0deac37015e9f6d9=1482722012,1483926313;Hm_lpvt_f805f7762a9a237a0deac37015e9f6d9=1483926368'''
header = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36',
    'Connection': 'keep-alive',
    'accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8',
    'Cookie': cookie
}
url = 'https://www.jb51.net/article/191947.htm'
wbdata = requests.get(url, headers=header).text
soup = BeautifulSoup(wbdata, 'lxml')
print(soup)
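Since the browser hands you the cookie as one long string, it can be handy to split it into the dict form that the second method below expects. The helper here is a hypothetical sketch, not something from the original article:

# Hypothetical helper (not in the original article): split a raw
# 'name1=value1; name2=value2' Cookie string into a dict for requests.
def cookie_str_to_dict(raw_cookie):
    return dict(pair.strip().split('=', 1)
                for pair in raw_cookie.split(';') if '=' in pair)

print(cookie_str_to_dict('cisession=19dfd70a27ec0eecf1fe3fc2e48b7f91c7c83c60; foo=bar'))
# {'cisession': '19dfd70a27ec0eecf1fe3fc2e48b7f91c7c83c60', 'foo': 'bar'}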
2. Insert the Cookie using requests
# coding:utf-8
import requests
from bs4 import BeautifulSoup

cookie = {
    "cisession": "19dfd70a27ec0eecf1fe3fc2e48b7f91c7c83c60",
    "CNZZDATA100020196": "1815846425-1478580135-https%253A%252F%252Fwww.baidu.com%252F%7C1483922031",
    "Hm_lvt_f805f7762a9a237a0deac37015e9f6d9": "1482722012,1483926313",
    "Hm_lpvt_f805f7762a9a237a0deac37015e9f6d9": "1483926368"
}
url = 'https://www.jb51.net/article/191947.htm'
wbdata = requests.get(url, cookies=cookie).text
soup = BeautifulSoup(wbdata, 'lxml')
print(soup)
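Besides pasting cookies by hand, requests can also manage them for you: a requests.Session stores any cookie the server sets and re-sends it on later requests. This is a minimal sketch of that approach (an addition to the original article, reusing the same example URL):

# coding:utf-8
# Minimal sketch (not in the original article): let requests.Session keep and
# re-send cookies automatically instead of building the Cookie header by hand.
import requests

session = requests.Session()
session.headers.update({'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; WOW64)'})

# Any Set-Cookie returned here is stored in session.cookies ...
session.get('https://www.jb51.net/article/191947.htm')

# ... and sent back automatically on every later request from this session.
print(session.cookies.get_dict())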
Extended example:
Use cookies to log in to the HIT (Harbin Institute of Technology) ACM site
Get the site's login URL
http://acm.hit.edu.cn/hoj/system/login
Inspect the POST data to be sent
user and password
Code:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
__author__ = 'pi'
__email__ = 'pipisorry@126.com'
"""
import urllib.request, urllib.parse, urllib.error
import http.cookiejar

LOGIN_URL = 'http://acm.hit.edu.cn/hoj/system/login'
values = {'user': '******', 'password': '******'}  # , 'submit' : 'Login'
postdata = urllib.parse.urlencode(values).encode()
user_agent = r'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/44.0.2403.157 Safari/537.36'
headers = {'User-Agent': user_agent, 'Connection': 'keep-alive'}

# Store the cookies issued at login in a Mozilla-format cookie file.
cookie_filename = 'cookie.txt'
cookie = http.cookiejar.MozillaCookieJar(cookie_filename)
handler = urllib.request.HTTPCookieProcessor(cookie)
opener = urllib.request.build_opener(handler)

request = urllib.request.Request(LOGIN_URL, postdata, headers)
try:
    response = opener.open(request)
    page = response.read().decode()
    # print(page)
except urllib.error.URLError as e:
    # Only HTTPError carries a .code attribute, so guard the access.
    print(getattr(e, 'code', ''), ':', e.reason)

cookie.save(ignore_discard=True, ignore_expires=True)  # save the cookies to cookie.txt
print(cookie)
for item in cookie:
    print('Name = ' + item.name)
    print('Value = ' + item.value)

# Use the saved cookies to request another page that requires the login.
get_url = 'http://acm.hit.edu.cn/hoj/problem/solution/?problem=1'
get_request = urllib.request.Request(get_url, headers=headers)
get_response = opener.open(get_request)
print(get_response.read().decode())
# print('You have not solved this problem' in get_response.read().decode())
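Because the cookies were saved to cookie.txt, a later run can reload them and skip the login step. The following is a minimal sketch of that idea (an assumption, not part of the original code):

# Minimal sketch (not in the original article): reuse the cookies saved above
# in a later run instead of logging in again.
import urllib.request
import http.cookiejar

cookie = http.cookiejar.MozillaCookieJar()
cookie.load('cookie.txt', ignore_discard=True, ignore_expires=True)
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(cookie))

get_url = 'http://acm.hit.edu.cn/hoj/problem/solution/?problem=1'
request = urllib.request.Request(get_url, headers={'User-Agent': 'Mozilla/5.0'})
print(opener.open(request).read().decode())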