43. Python Process Mutex Lock (multiprocessing.Lock)

Last updated: 2020-03-21 11:53:43

Much like the thread mutex threading.Lock covered earlier, when several Process instances read and write the same file at the same time, we need to put a mutex lock around the operation to avoid corrupting the data. The principle of a mutex is the same whether it guards threads (threading) or processes (Process).

I. Thread mutex vs. process mutex: things to note

1. Creating a thread mutex

# Import the threading module
import threading

# Create a thread mutex
mutex = threading.Lock()

 

2. Creating a process mutex

# Import the multiprocessing module
from multiprocessing import Process, Lock

# Create a process mutex
mutex = Lock()

Note which module each lock is imported from; do not mix them up!

 

II. Process mutex Lock methods

acquire(): acquire the lock; by default this blocks until the lock is available;

release(): release the lock so that another waiting process can acquire it;
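These two calls can be tried in isolation. Below is a minimal sketch; the behavior shown is standard for a non-recursive multiprocessing.Lock:

```python
# Minimal demonstration of acquire()/release() on a process mutex.
# multiprocessing.Lock is non-recursive: once held, a second non-blocking
# acquire (even from the same process) fails until release() is called.
from multiprocessing import Lock

mutex = Lock()

print(mutex.acquire())             # True: the lock is now held
print(mutex.acquire(block=False))  # False: already held, so the attempt fails
mutex.release()                    # free the lock for the next acquire()
```

acquire() also takes a timeout argument (e.g. mutex.acquire(timeout=1)), and the lock can be used as a context manager with `with mutex:`, which releases it automatically even if the body raises.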

 

III. Using the process mutex Lock

Case 1: multiple processes without a mutex

from multiprocessing import Lock, Process
import time
import random
import os


def foo(i, mutex):
    print('%s: %s is running' % (i, os.getpid()))
    time.sleep(random.random())
    print('%s:%s is done' % (i, os.getpid()))


if __name__ == '__main__':
    mutex = Lock()
    for i in range(10):
        process = Process(target=foo, args=(i, mutex))
        process.start()

Output:

0: 17008 is running
1: 5288 is running
2: 1228 is running
3: 9724 is running
4: 7520 is running
5: 10236 is running
3:9724 is done
6: 16452 is running
7: 13328 is running
0:17008 is done
8: 9356 is running
9: 16432 is running
8:9356 is done
2:1228 is done
5:10236 is done
9:16432 is done
7:13328 is done
4:7520 is done
6:16452 is done
1:5288 is done

From the output we can see that multiple processes run interleaved. If they were all reading and writing the same file, the data would clearly be scrambled, and that is not what we want. To keep the data safe when multiple processes read and write the same file, a mutex is required, as in the following demo.

 

Case 2: using the process mutex

from multiprocessing import Lock, Process
import time
import random
import os


def foo(i, mutex):
    mutex.acquire()
    print('%s: %s is running' % (i, os.getpid()))
    time.sleep(random.random())
    print('%s:%s is done' % (i, os.getpid()))
    mutex.release()


if __name__ == '__main__':
    mutex = Lock()
    for i in range(10):
        process = Process(target=foo, args=(i, mutex))
        process.start()

Output:

0: 6908 is running
0:6908 is done
1: 7976 is running
1:7976 is done
3: 7824 is running
3:7824 is done
2: 17328 is running
2:17328 is done
4: 7844 is running
4:7844 is done
5: 15900 is running
5:15900 is done
6: 12648 is running
6:12648 is done
7: 16516 is running
7:16516 is done
8: 17348 is running
8:17348 is done
9: 13180 is running
9:13180 is done

Even when reading and writing the same file, the processes no longer scramble the data once the mutex Lock is in place, which solves the problem from Case 1. Note that the lock achieves this by serializing the critical section, so it buys correctness at the cost of some concurrency; it does not make the program faster.
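As a side note, the acquire()/release() pair in Case 2 can be written more idiomatically with a with statement, which releases the lock even if the body raises. A minimal rewrite sketch (it also joins the children so the parent waits for them, which the original loop does not):

```python
from multiprocessing import Lock, Process
import os
import random
import time


def foo(i, mutex):
    # "with mutex:" acquires on entry and releases on exit,
    # even when the body raises an exception
    with mutex:
        print('%s: %s is running' % (i, os.getpid()))
        time.sleep(random.random())
        print('%s:%s is done' % (i, os.getpid()))


if __name__ == '__main__':
    mutex = Lock()
    workers = [Process(target=foo, args=(i, mutex)) for i in range(10)]
    for p in workers:
        p.start()
    for p in workers:
        p.join()  # wait for every child before the parent exits
```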

 

Case 3: accumulating a global variable and checking the result

#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
@Author: 何以解忧
@Blog: shuopython.com
@File: python_process_lock.py
@Time: 2019/12/31 21:25
"""

# Import the multiprocessing module
from multiprocessing import Process,Lock

num = 0


def get_sum1():
    global num  # declare that we modify the module-level variable
    for i in range(10000):
        num = num + 1
    print("get_sum1:", num)


def get_sum2():
    global num  # declare that we modify the module-level variable
    for i in range(10000):
        num = num + 1
    print("get_sum2:", num)


def main():
    global num  # declare the global variable
    p1 = Process(target=get_sum1)
    p1.start()

    p2 = Process(target=get_sum2)
    p2.start()

    p1.join()
    p2.join()
    print("main:", num)


if __name__ == "__main__":

    main()
    print("main exit")

Output:

get_sum1: 10000
get_sum2: 10000
main: 0
main exit

Some readers may wonder why num in main() is 0: the main process and both child processes all declared num with the global keyword, so even without a mutex, shouldn't the result be some random number below 20000? As explained in detail in the article on the difference between Process and threading, all threads of one process share that process's resources, while processes keep their resources independent of one another, much like a deep copy.

The program above involves three processes, which means there are effectively three separate copies of num. The two child processes each increment their own copy 10000 times, so each prints 10000; the main process never touches its copy, so its num stays 0.
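If the goal is one counter that all the processes genuinely share, the standard tool is multiprocessing.Value, which stores the number in shared memory; the += still has to be guarded by a lock because it is a read-modify-write. A minimal sketch (the names counter and add are illustrative, not from the original article):

```python
from multiprocessing import Lock, Process, Value


def add(counter, mutex):
    for _ in range(10000):
        with mutex:
            counter.value += 1  # read-modify-write, so it must be locked


if __name__ == '__main__':
    mutex = Lock()
    counter = Value('i', 0)  # 'i' = C int, allocated in shared memory
    p1 = Process(target=add, args=(counter, mutex))
    p2 = Process(target=add, args=(counter, mutex))
    p1.start()
    p2.start()
    p1.join()
    p2.join()
    print("main:", counter.value)  # 20000: both children updated the same memory
```

Value also carries its own internal lock, so `with counter.get_lock():` works as well; a separate Lock is used here only to mirror the rest of the article.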

You may also like:

1. Creating threads with threading.Thread

2. Python thread mutex lock (threading.Lock)

3. Python process (multiprocessing.Process)

4. The difference between Process and threading

 

Please credit 猿说Python when reprinting » Python Process Mutex Lock

 
