I. Grab the video list
First, click into the Dance section and select the Otaku Dance list.
Then open the browser's developer tools (F12) and in the Network panel you can find a request to

https://api.bilibili.com/x/web-interface/newlist?rid=20&type=0&pn=1&ps=20&jsonp=jsonp&callback=jsonCallback_bili_57905715749828263

where rid is the Bilibili subcategory ID and pn is the page number.
Opening this address directly in the browser actually returned a 404, even though in the developer tools the response for this address was clearly the video list. Removing the callback parameter gave the desired result.
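To double-check that the endpoint really returns plain JSON once the callback parameter is dropped, a quick probe like the following can be used (a minimal sketch, assuming the api.bilibili.com host and the requests library):

```python
import requests

# Probe page 1 of the Otaku Dance list (rid=20) without the callback parameter.
resp = requests.get(
    'https://api.bilibili.com/x/web-interface/newlist',
    params={'rid': 20, 'type': 0, 'pn': 1, 'ps': 20, 'jsonp': 'jsonp'},
)
data = resp.json()
print(data['code'])                    # 0 means success
print(len(data['data']['archives']))   # number of videos on this page
```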
As we all know, the bid (BV number) is the unique ID of a video on Bilibili. To get it, we can extract the aid from the return value of the URL above and then convert the aid to a bid.
```python
import requests

Str = 'fZodR9XQDSUm21yCkr6zBqiveYah8bt4xsWpHnJE7jL5VG3guMTKNPAwcF'  # a prepared string of specified characters
Dict = {}
# Put each character of the string into the dictionary one by one,
# e.g. f maps to 0, Z maps to 1, and so on.
for i in range(58):
    Dict[Str[i]] = i

s = [11, 10, 3, 8, 4, 6, 2, 9, 5, 7]  # the necessary decryption list
xor = 177451812
add = 100618342136696320  # the number to be subtracted or added at the end


def algorithm_enc(av):
    av = int(av)
    av = (av ^ xor) + add
    # Build the BV number format (BV + 10 characters) as a list for later manipulation
    r = list('BV' + ' ' * 10)
    for i in range(10):
        r[s[i]] = Str[av // 58 ** i % 58]
    return ''.join(r)


def find_bid(p):
    bids = []
    url = ('https://api.bilibili.com/x/web-interface/newlist'
           '?rid=20&type=0&pn={}&ps=50&jsonp=jsonp'.format(p))
    data = requests.get(url).json()
    archives = data['data']['archives']
    for item in archives:
        aid = item['aid']
        bid = algorithm_enc(aid)
        bids.append(bid)
    return bids
```
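As a sanity check, here is a sketch of the matching decoder that maps a BV number back to an aid; it simply inverts the steps above using the same Str, Dict, s, xor, and add, so the round trip should always return the original aid:

```python
def algorithm_dec(bv):
    # Invert algorithm_enc: read the base-58 digits back out of their
    # shuffled positions, then undo the add and xor steps.
    r = 0
    for i in range(10):
        r += Dict[bv[s[i]]] * 58 ** i
    return (r - add) ^ xor


# Round-trip check: should print True for any aid.
print(algorithm_dec(algorithm_enc(170001)) == 170001)
```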
II. Get the CID of the video
To download a 1080P video, the bid alone is not enough: you also need the SESSDATA value from the login cookie, and the cid.
First log in to Bilibili and copy the SESSDATA value from the cookie into the request headers. With those headers set, requesting

https://api.bilibili.com/x/player/pagelist?bvid=<bid>

returns the cid.
```python
import requests


def get_cid(bid):
    url = 'https://api.bilibili.com/x/player/pagelist?bvid=' + bid
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                      'AppleWebKit/537.36 (KHTML, like Gecko) '
                      'Chrome/90.0.4430.212 Safari/537.36',
        'Cookie': 'SESSDATA=182cd036%2C1636985829%2C3b393%2A51',
        'Host': 'api.bilibili.com'
    }
    html = requests.get(url, headers=headers).json()
    infos = []
    cid_list = html['data']
    for item in cid_list:
        cid = item['cid']
        title = item['part']
        infos.append({'bid': bid, 'cid': cid, 'title': title})
    return infos
```
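As a quick illustration, calling it for a single video returns one entry per part (the bid below is a hypothetical stand-in for a value returned by find_bid):

```python
# Hypothetical usage; 'BV1xx411c7mD' stands in for a real bid from find_bid().
for info in get_cid('BV1xx411c7mD'):
    print(info['bid'], info['cid'], info['title'])
```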
III. Download the video
The video's download address comes from the /x/player/playurl interface on api.bilibili.com; the request can be found in the request list after each video starts playing.
Finally, use the following functions to download the video.
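The qn parameter of this interface selects the resolution. The values below are the ones commonly reported for it; treat them as assumptions rather than a verified list:

```python
# Commonly reported qn values for the playurl interface (assumed, not verified):
QUALITY = {
    116: '1080P 60fps',
    80: '1080P',
    64: '720P',
    32: '480P',
    16: '360P',
}
```

Values of 80 and above generally require the SESSDATA cookie from a logged-in session, which is why it was collected in step II.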
```python
import os
import sys
import urllib.request

import requests


def get_video_list(bid, cid, quality):
    url_api = ('https://api.bilibili.com/x/player/playurl'
               '?cid={}&bvid={}&qn={}'.format(cid, bid, quality))
    headers = {
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                      'AppleWebKit/537.36 (KHTML, like Gecko) '
                      'Chrome/90.0.4430.212 Safari/537.36',
        'Cookie': 'SESSDATA=182cd036%2C1636985829%2C3b393%2A51',
        'Host': 'api.bilibili.com'
    }
    html = requests.get(url_api, headers=headers).json()
    video_list = []
    for i in html['data']['durl']:
        video_list.append(i['url'])
    return video_list


def schedule_cmd(blocknum, blocksize, totalsize):
    # Progress hook for urlretrieve: draw a simple text progress bar.
    percent = 100.0 * blocknum * blocksize / totalsize
    percent = min(percent, 100)  # the last block can overshoot the total
    s = ('#' * round(percent)).ljust(100, '-')
    sys.stdout.write('%.2f%%' % percent + '[' + s + ']' + '\r')
    sys.stdout.flush()


def download(video_list, title, bid):
    for i in video_list:
        opener = urllib.request.build_opener()
        opener.addheaders = [
            ('User-Agent', 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
                           'AppleWebKit/537.36 (KHTML, like Gecko) '
                           'Chrome/90.0.4430.212 Safari/537.36'),
            ('Accept', '*/*'),
            ('Accept-Language', 'en-US,en;q=0.5'),
            ('Accept-Encoding', 'gzip, deflate, br'),
            ('Range', 'bytes=0-'),
            ('Referer', 'https://www.bilibili.com/video/' + bid),
            ('Origin', 'https://www.bilibili.com'),
            ('Connection', 'keep-alive'),
        ]
        filename = os.path.join('D:\\video', '{}_{}.mp4'.format(bid, title))
        try:
            urllib.request.install_opener(opener)
            urllib.request.urlretrieve(url=i, filename=filename,
                                       reporthook=schedule_cmd)
        except Exception:
            print(bid + ' download exception, file: ' + filename)
```
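Finally, a sketch of a driver that ties the three steps together: page through the list, resolve each bid to its cids, fetch the direct URLs at 1080P (qn=80), and download them. The page range and output directory are arbitrary choices for illustration:

```python
if __name__ == '__main__':
    for page in range(1, 3):  # first two pages, as an example
        for bid in find_bid(page):
            for info in get_cid(bid):
                urls = get_video_list(info['bid'], info['cid'], 80)  # 80 = 1080P
                download(urls, info['title'], info['bid'])
```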
This concludes this article on writing a Python script to download all the videos in the Bilibili dance section. For more on downloading Bilibili videos with Python, please search my previous articles or continue to browse the related articles below. I hope you will support me more in the future!