I'm trying to read the page and create a CSV with the corresponding columns, but I'm unable to use the find function on the parsed data: the soup doesn't contain the data shown on the webpage.
import requests
import pandas as pd
from bs4 import BeautifulSoup
url = "https://www.fancraze.com/marketplace/sales/mornemorkel1?tab=latest-sales"
r = requests.get(url)
soup = BeautifulSoup(r.content, "html.parser")
CodePudding user response:
The data is loaded dynamically by JavaScript, which fetches it from an API endpoint as a JSON response (a GET request). BeautifulSoup can't render JavaScript, so the data never appears in the HTML you parse. You can find the underlying endpoint in your browser's DevTools Network tab (filter by XHR/Fetch) and query it directly.
Example:
import requests

api_url = "https://api.faze.app/v1/latestSalesInAGroup/mornemorkel1"
r = requests.get(api_url).json()

# each entry in the "data" list describes one sale
for item in r['data']:
    print(item['price'])
Output:
8
7
10
6
9
9
9
6
9
9
6
9
37
37
34
35
4
8
10
30
16
10
4
6
9
7
30
20
7
21
20
4
6
6
6
20
19
5
20
25
3
17
22
5
20
22
19
19
17
15
7
4
12
10
3
14
5
4
36
5
10
3
15
3
3
3
3
52
4
3
3
3
13
3
3
5
8
3
5
5
8
9
4
5
4
4
6
8
15
15
22
11
8
8
4
7
5
5
6
5
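Since your actual goal is a CSV, you can write those prices straight out with the standard library. A minimal sketch, assuming the same endpoint and response shape as above (the column name and the filename prices.csv are arbitrary choices):

import csv
import requests

api_url = "https://api.faze.app/v1/latestSalesInAGroup/mornemorkel1"
sales = requests.get(api_url).json()['data']

# one row per sale; 'prices.csv' is just an example filename
with open('prices.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['price'])
    for item in sales:
        writer.writerow([item['price']])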
CodePudding user response:
The site uses an API to fetch its data, so you can call that API directly:
import pandas as pd
import requests

url = 'https://api.faze.app/v1/latestSalesInAGroup/mornemorkel1'
result = []
response = requests.get(url=url)

# pull the fields of interest out of each sale record
for sale in response.json()['data']:
    result.append({
        'id': sale['momentId']['id'],
        'seller': sale['sellerAddress']['userName'],
        'buyer': sale['buyerAddress']['userName'],
        'price': sale['price'],
        'created': sale['createdAt'],
    })

df = pd.DataFrame(result)
print(df)
Output:
id seller ... price created
0 1882 singal22 ... 8 2022-06-22T14:34:39.403Z
1 1737 olive_creepy2343 ... 7 2022-06-22T14:09:32.070Z
2 1256 tomato_wicked3294 ... 10 2022-06-22T13:49:20.895Z
3 1931 aquamarine_productive9244 ... 6 2022-06-22T13:41:49.153Z
4 1603 aquamarine_productive9244 ... 9 2022-06-22T13:28:01.624Z
.. ... ... ... ... ...
95 1026 olive_creepy2343 ... 7 2022-04-16T18:00:00.662Z
96 1719 Hhassan136 ... 5 2022-04-14T23:14:12.037Z
97 2054 Cricket101 ... 5 2022-04-14T21:30:13.185Z
98 1961 emzeden_9 ... 6 2022-04-14T18:02:05.194Z
99 1194 amaranth_curious1871 ... 5 2022-04-14T17:45:25.266Z
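From that DataFrame, the CSV the question asks for is one more line (sales.csv is an arbitrary filename):

# write the collected sales to disk without the index column
df.to_csv('sales.csv', index=False)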