In items.py there are two classes (the specific field definitions are omitted):
class FangtxItem(scrapy.Item):
    ...

class FangtxesfItem(scrapy.Item):
    ...
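The field definitions were omitted in the post; judging from the keyword arguments used in the spider code further down, they presumably look roughly like this (a sketch inferred from those arguments, not the poster's actual code):

import scrapy

class FangtxItem(scrapy.Item):
    # fields inferred from the FangtxItem(...) call in the spider
    name = scrapy.Field()
    price = scrapy.Field()
    province = scrapy.Field()
    city = scrapy.Field()

class FangtxesfItem(scrapy.Item):
    # fields inferred from the FangtxesfItem(...) call in the spider
    room = scrapy.Field()
    area = scrapy.Field()
    detail_url = scrapy.Field()
    info = scrapy.Field()
    place = scrapy.Field()
    province = scrapy.Field()
    city = scrapy.Field()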
The spider file imports both classes:
from fangtx.items import FangtxItem, FangtxesfItem
Returning the two kinds of item from different callback functions works fine:
item = FangtxItem(name=name, price=price, province=province, city=city)

item = FangtxesfItem(room=room, area=area, detail_url=detail_url, info=info, place=place, province=province, city=city)
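For context, a minimal sketch of how a single spider might yield both item types from two different callbacks (the spider name, callback names, and selectors here are illustrative assumptions, not taken from the post):

import scrapy
from fangtx.items import FangtxItem, FangtxesfItem

class FangSpider(scrapy.Spider):
    # hypothetical spider name and callbacks, just to show the shape
    name = 'fang'

    def parse_newhouse(self, response):
        # placeholder selectors; the real extraction logic isn't shown in the post
        name = response.css('div.name::text').get()
        price = response.css('div.price::text').get()
        yield FangtxItem(name=name, price=price,
                         province=response.meta.get('province'),
                         city=response.meta.get('city'))

    def parse_esf(self, response):
        # placeholder selectors for the second item type
        yield FangtxesfItem(room=response.css('p.room::text').get(),
                            area=response.css('p.area::text').get(),
                            detail_url=response.url,
                            info=response.css('p.info::text').get(),
                            place=response.css('p.place::text').get(),
                            province=response.meta.get('province'),
                            city=response.meta.get('city'))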
The code in pipelines.py:
from scrapy.exporters import JsonLinesItemExporter

class FangtxPipeline(object):
    def __init__(self):
        self.newhouse_fp = open('newhouse.json', 'wb')
        self.esf_fp = open('esf.json', 'wb')
        self.esf_exporter = JsonLinesItemExporter(self.esf_fp, ensure_ascii=False)
        self.newhouse_exporter = JsonLinesItemExporter(self.newhouse_fp, ensure_ascii=False)
        self.newhouse_exporter.start_exporting()
        self.esf_exporter.start_exporting()

    def process_item(self, item, spiders):
        # every item is handed to both exporters
        self.newhouse_exporter.export_item(item)
        self.esf_exporter.export_item(item)
        return item

    def close_spider(self, spiders):
        self.newhouse_exporter.finish_exporting()
        self.esf_exporter.finish_exporting()
The pipeline is enabled in settings.py:
ITEM_PIPELINES = {
    'fangtx.pipelines.FangtxPipeline': 300,
}
But then the two json files end up with the same mixed content, which is maddening...
CodePudding user response:
A different name.

CodePudding user response:
def process_item(self, item, spiders):
    if spiders.name == "":
        pass
    elif spiders.name == "":
        pass
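As posted, process_item hands every item to both exporters, which is why both json files end up containing all the items. A fuller sketch of the branching idea from the reply above, here checking the item class instead of the spider name (an assumption on my part, since it looks like one spider yields both item types from different callbacks), could look like this:

from scrapy.exporters import JsonLinesItemExporter
from fangtx.items import FangtxItem, FangtxesfItem

class FangtxPipeline(object):
    def __init__(self):
        self.newhouse_fp = open('newhouse.json', 'wb')
        self.esf_fp = open('esf.json', 'wb')
        self.newhouse_exporter = JsonLinesItemExporter(self.newhouse_fp, ensure_ascii=False)
        self.esf_exporter = JsonLinesItemExporter(self.esf_fp, ensure_ascii=False)
        self.newhouse_exporter.start_exporting()
        self.esf_exporter.start_exporting()

    def process_item(self, item, spider):
        # send each item only to the exporter that matches its class
        if isinstance(item, FangtxItem):
            self.newhouse_exporter.export_item(item)
        elif isinstance(item, FangtxesfItem):
            self.esf_exporter.export_item(item)
        return item

    def close_spider(self, spider):
        self.newhouse_exporter.finish_exporting()
        self.esf_exporter.finish_exporting()
        self.newhouse_fp.close()
        self.esf_fp.close()

The isinstance check works because each yielded item is an instance of exactly one of the two classes; branching on spider.name, as suggested above, would only help if the two item types came from two different spiders.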
CodePudding user response: