Python multithreading logger problem

I wrote a program that uses both multiprocessing and multithreading, but it never seems to finish all its work. The code is below and can be run directly. Instead of printing, I write everything to a log file, but the log always comes up a few dozen lines short. I can't find the cause: with 50 processes each running 50 threads there should be 2500 log lines, yet every time I only get 2400-odd. Isn't logger thread-safe? Any guidance appreciated, thanks.
#!/usr/bin/env python
#coding=UTF-8
import multiprocessing
from multiprocessing import Process
from threading import Thread
from time import sleep
import string
import time
import sys
import logging
import random
import os

reload(sys)
sys.setdefaultencoding('utf8')

logging.basicConfig(level=logging.DEBUG,
                    format='%(asctime)s %(filename)s[line:%(lineno)d] %(funcName)s %(levelname)s %(message)s',
                    datefmt='%a, %d %b %Y %H:%M:%S',
                    filename='log.log',
                    filemode='a')

# total number of concurrent processes and threads per process
processNum = 50
threadNum = 50

def processWorking(userlist):
    threads = []
    for i in range(threadNum):
        t = Thread(target=theadWorking, args=(i,))
        t.setDaemon(True)
        t.start()  # queue up the job
        threads.append(t)
    for t in threads:
        t.join()
    #time.sleep(10)

def theadWorking(num):
    #print num
    logging.error(num)

if __name__ == '__main__':
    processs = []
    for num in range(processNum):
        p = Process(target=processWorking, args=('2',))
        processs.append(p)
        p.start()
    for p in processs:
        p.join()
    #print u'finished'
Answered by mayadong7349 (recommended answer, 2018-05-06):

Because logging is thread-safe, but it is not process-safe (that's probably not a real term, just for ease of understanding). In this code, multiple processes are all writing to the same log file, and under those conditions logging's behavior is anyone's guess.


I ran a test myself and ended up with only a few hundred lines in the log. You can also see lines getting interleaved out of order:

Fri, 08 Aug 2014 01:19:38 logging_in_multithread.py[line:40] theadWorking ERROR 2
FFri, 08 Aug 2014 01:19:36 logging_in_multithread.py[line:40] theadWorking ERROR 11  (note the FFri here)


Change the code like this, so that each process is joined before the next one starts and only one process ever writes to the file at a time:

    for num in range(processNum):
        p = Process(target=processWorking, args=('2',))
        processs.append(p) 
        p.start()
        p.join()


There are other approaches as well, for example implementing a FileHandler for logging so that it also works correctly in a multi-process environment. That's something I read about online and haven't tried myself.
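A minimal sketch of that idea, assuming a fork-based platform: the class-level multiprocessing.Lock is created at import time, so forked children share the same underlying semaphore (under Windows's spawn model each process would get its own copy, and you would need e.g. a Manager-owned lock instead). The class name MultiProcessFileHandler is made up for illustration:

    import logging
    import multiprocessing

    class MultiProcessFileHandler(logging.FileHandler):
        # One lock shared by all worker processes (inherited on fork).
        _lock = multiprocessing.Lock()

        def emit(self, record):
            # Serialize the whole write across processes and flush while
            # still holding the lock, so records cannot interleave.
            with self._lock:
                logging.FileHandler.emit(self, record)
                self.flush()

Each process would then attach this handler (opened with mode='a') instead of letting basicConfig create a plain FileHandler.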


The logging Cookbook in the Python manual has this passage on the subject:

Logging to a single file from multiple processes

Although logging is thread-safe, and logging to a single file from multiple threads in a single process is supported, logging to a single file from multiple processes is not supported, because there is no standard way to serialize access to a single file across multiple processes in Python. If you need to log to a single file from multiple processes, one way of doing this is to have all the processes log to a SocketHandler, and have a separate process which implements a socket server which reads from the socket and logs to file. (If you prefer, you can dedicate one thread in one of the existing processes to perform this function.)

That passage suggests yet another solution.
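For reference, Python 3.2+ ships a ready-made building block for the "dedicate one thread to do the writing" variant mentioned there: workers push records onto a multiprocessing.Queue through logging.handlers.QueueHandler, and a QueueListener thread in the main process is the only thing that ever touches the file. A minimal sketch (Python 3 only, so not a drop-in fix for the Python 2 code above):

    import logging
    import logging.handlers
    import multiprocessing
    from multiprocessing import Process

    def worker(log_queue, num):
        # Workers never open the log file; they only enqueue records.
        logger = logging.getLogger()
        logger.setLevel(logging.DEBUG)
        logger.addHandler(logging.handlers.QueueHandler(log_queue))
        logger.error(num)

    if __name__ == '__main__':
        log_queue = multiprocessing.Queue()
        handler = logging.FileHandler('log.log')
        handler.setFormatter(logging.Formatter(
            '%(asctime)s %(processName)s %(levelname)s %(message)s'))
        # The listener thread is the single writer, so nothing interleaves.
        listener = logging.handlers.QueueListener(log_queue, handler)
        listener.start()
        procs = [Process(target=worker, args=(log_queue, n)) for n in range(50)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        listener.stop()  # process remaining queued records, stop the thread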
