I have an on-demand crawler: every incoming request triggers crawling several items from a site, so I used gevent.spawn. The code is as follows:
spawns = []
for param in param_list:
    spawns.append(gevent.spawn(s_search, param))
gevent.joinall(spawns)
s_search holds the actual crawling code. The first request works fine, the second one raises an exception, the third works again, and so on, alternating between success and failure. The exception is:
File "C:\Users\duanbingjie\PycharmProjects\IpProxy\IpProxy\IPrice.py", line 107, in get_price
gevent.joinall(spawns)
File "C:\python27\lib\site-packages\gevent\greenlet.py", line 649, in joinall
return wait(greenlets, timeout=timeout, count=count)
File "C:\python27\lib\site-packages\gevent\hub.py", line 1038, in wait
return list(iwait(objects, timeout, count))
File "C:\python27\lib\site-packages\gevent\hub.py", line 985, in iwait
item = waiter.get()
File "C:\python27\lib\site-packages\gevent\hub.py", line 939, in get
Waiter.get(self)
File "C:\python27\lib\site-packages\gevent\hub.py", line 899, in get
return self.hub.switch()
File "C:\python27\lib\site-packages\gevent\hub.py", line 630, in switch
return RawGreenlet.switch(self)
LoopExit: ('This operation would block forever', <Hub at 0x987f030 select pending=0 ref=0>)
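A common cause of this LoopExit is running blocking socket I/O (e.g. requests or urllib) inside greenlets without monkey-patching first: when every greenlet blocks on a real OS call, the hub has nothing left to switch to. A minimal sketch of the same spawn/joinall pattern with patching applied up front; `s_search` here is a stand-in for the poster's real crawling function, with gevent.sleep simulating the network yield:

```python
from gevent import monkey
monkey.patch_all()  # must run before any socket/ssl-using imports

import gevent

def s_search(param):
    # stand-in for the real crawling logic; a patched requests.get()
    # would yield to the hub the same way this sleep does
    gevent.sleep(0)
    return param * 2

param_list = [1, 2, 3]
spawns = [gevent.spawn(s_search, p) for p in param_list]
gevent.joinall(spawns)
results = [g.value for g in spawns]
print(results)  # [2, 4, 6]
```

This is only a sketch of the usual fix, not a confirmed diagnosis of the poster's alternating failures; inside a Django/Flask worker the patching generally has to happen at process startup (or via a gevent-aware worker), not inside the view.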
1
vicalloy 2017-07-11 10:48:19 +08:00
Don't use threads (or things like gevent) directly inside an HTTP request handler; you should hand asynchronous tasks off to celery.
3
zhengxiaowai 2017-07-11 10:52:29 +08:00
celery supports blocking calls too, so that works fine.
4
vicalloy 2017-07-11 10:57:52 +08:00
5
mansur 2017-07-11 11:02:20 +08:00
Start Django under gunicorn with the gevent worker mode.
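That suggestion would look roughly like the following; `myproject.wsgi` is a hypothetical module path standing in for the real Django project. The gevent worker class monkey-patches the process at startup, so gevent.spawn inside a view cooperates with the rest of the request handling:

```shell
pip install gunicorn gevent
# --worker-class gevent makes each worker a gevent-patched event loop
gunicorn --worker-class gevent --workers 4 myproject.wsgi:application
```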
6
creatorYC 2017-08-29 13:42:12 +08:00
I ran into this problem too. May I ask how you solved it?