import requests
from bs4 import BeautifulSoup

res = requests.get('https://music.163.com/#/artist?id=5771')
print(res.status_code)

html = res.text
soup = BeautifulSoup(html, 'html.parser')
info = soup.find_all(class_='n-plunged')
print(info)
CodePudding user response:
That's because the data is loaded dynamically; check the page source and you'll see it isn't there. If you want to download NetEase Cloud Music, you can read my blog post:
https://blog.csdn.net/qq_45404396/article/details/105886268
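A quick way to verify that claim (a minimal sketch; the class name n-plunged comes from the question above, and the browser User-Agent header is an assumption added as a common precaution): fetch the page and check whether the class appears anywhere in the raw response.

import requests
from bs4 import BeautifulSoup

# Check whether the data the question looks for is in the static HTML at all.
headers = {'User-Agent': 'Mozilla/5.0'}   # assumption: a plain browser UA is enough to get the page
res = requests.get('https://music.163.com/#/artist?id=5771', headers=headers)
html = res.text

print('n-plunged' in html)   # expected False if the list is injected by JavaScript
soup = BeautifulSoup(html, 'html.parser')
print(len(soup.find_all(class_='n-plunged')))   # expected 0 for the same reason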
CodePudding user response:
Reply to floor 1 (perserve_liu): For this kind of dynamically loaded page, how do you capture the data?
CodePudding user response:
It should be under the XHR or JS tab of the browser's Network panel. But to get that information from NetEase Cloud Music, it seems to be a POST request, and the request data is encrypted.
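In practice that means finding the request in DevTools and replaying it. Below is a minimal sketch, not the actual API: the /weapi/ path pattern and the params/encSecKey form fields are what the encrypted NetEase requests typically look like in the Network panel, but treat the exact URL and field values as assumptions and copy the real ones from a captured request.

import requests

# Sketch: replay an encrypted XHR POST captured in the browser's Network panel.
url = 'https://music.163.com/weapi/...'   # placeholder: copy the real request URL from DevTools

# The encrypted payload generated by the page's JavaScript; copy both values
# verbatim from the captured request's form data.
data = {
    'params': '<copied from DevTools>',
    'encSecKey': '<copied from DevTools>',
}

headers = {
    'User-Agent': 'Mozilla/5.0',
    'Referer': 'https://music.163.com/',
}

res = requests.post(url, data=data, headers=headers)
print(res.status_code)
print(res.text[:200])   # should be JSON once real params/encSecKey are supplied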
CodePudding user response:
Reply to floor 2 (weixin_45903952): In any case, it's fairly complicated.
CodePudding user response:
Reply to floor 4 (perserve_liu): Is there a book that covers this, or could one of the experts just post a solution I can learn from directly?
CodePudding user response:
Reply to floor 5 (weixin_45903952): I'm no expert, just a beginner who started learning this over the winter vacation; before that I basically knew nothing.
CodePudding user response:
You can read this expert's blog post; I believe it will give you a deeper understanding of crawlers: https://blog.csdn.net/liaoningxinmin/article/details/80794774
CodePudding user response:
Reply to floor 7 (perserve_liu): In that article, how do you extract the XHR requests and access the network addresses automatically?
CodePudding user response:
Reply to floor 8 (weixin_45903952): Take a look at the articles I wrote!
https://blog.csdn.net/qq_45404396/article/details/105875749
https://blog.csdn.net/qq_45404396/article/details/105471434
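One more point about the code in the question: everything after the # in the URL is a fragment, which requests never sends to the server, so the original snippet was effectively fetching the homepage. The artist page fills its song list from an iframe whose address is the same URL without the #/. The sketch below relies on that, and on the hidden ul.f-hide list that has historically held the hot-song links; both are assumptions about the site's markup and may have changed.

import requests
from bs4 import BeautifulSoup

# Assumptions: the iframe content is served at the URL without '#/', and the
# hot-song titles sit inside <ul class="f-hide">.
headers = {'User-Agent': 'Mozilla/5.0'}
res = requests.get('https://music.163.com/artist?id=5771', headers=headers)
print(res.status_code)

soup = BeautifulSoup(res.text, 'html.parser')
song_list = soup.find('ul', class_='f-hide')
if song_list:
    for a in song_list.find_all('a'):
        print(a.get_text(), a.get('href'))
else:
    print('f-hide list not found; the markup may have changed')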
CodePudding user response:
Reply to floor 9 (perserve_liu): Isn't the second one finished? I can't see its content.