Concurrent Programming (Part 4)

GIL Global Interpreter Lock

Background knowledge

1. Python interpreters are themselves written in programming languages
	CPython  written in C
	Jython   written in Java
	PyPy     written in Python

PS: The most common interpreter is CPython (the default)
Explanation of the GIL in the official documentation:

In CPython, the global interpreter lock, or GIL, is a mutex that prevents multiple native threads from executing Python bytecodes at
once. This lock is necessary mainly because CPython's memory
management is not thread-safe. (However, since the GIL exists, other
features have grown to depend on the guarantees that it enforces.)

"""

1. The GIL is a feature of the CPython interpreter, not of the Python language itself

2. The GIL is essentially a mutex

3. Because of the GIL, multiple threads within the same process can never execute Python bytecode at the same time (key implication): multithreading in a single process cannot take advantage of multiple CPU cores!!!

4. The GIL exists mainly because the memory management (garbage collection) mechanism in the CPython interpreter is not thread-safe

"""
1. Misconception: "Python multithreading is useless." It is true that multithreading cannot exploit multiple cores, but it is still very useful for IO-intensive tasks
2. Misconception: "Since the GIL exists, we never need to lock our own code again." The GIL only keeps interpreter-level data (the garbage collection bookkeeping) consistent; you still have to lock the shared data in your own program (see the sketch after this block)
3. Almost all interpreted programming languages are unable to have multiple threads of the same process executing bytecode at the same time
"""

Verify the existence of the GIL

from threading import Thread

money = 888


def task():
    global money
    money -= 1


for i in range(888):
    t = Thread(target=task)
    t.start()
# The main thread does not join() here, but each decrement is so fast that all
# threads usually finish before this print runs
print(money)  # 0


money = 888


def task():
    global money
    money -= 1

t_list = []
for i in range(888):
    t = Thread(target=task)
    t.start()
    t_list.append(t)

for t in t_list:
    t.join()
# Wait for all threads to finish, then check how much money is left
print(money)  # 0

Verify the characteristics of the GIL

from threading import Thread
import time


money = 888


def task():
    global money
    tmp = money
    time.sleep(0.2)
    money = tmp-1

t_list = []
for i in range(888):
    t = Thread(target=task)
    t.start()
    t_list.append(t)

for t in t_list:
    t.join()
# Wait for all threads to finish, then check how much money is left
print(money)  # 887: every thread read money as 888, then slept (releasing the GIL), so they all wrote back 887

The GIL does not protect program-level data and does not guarantee that your own modifications are safe; you have to add your own lock

from threading import Thread, Lock
import time


money = 888
mutex = Lock()

def task():
    mutex.acquire()
    global money
    tmp = money
    time.sleep(0.01)
    money = tmp-1
    mutex.release()

t_list = []
for i in range(888):
    t = Thread(target=task)
    t.start()
    t_list.append(t)

for t in t_list:
    t.join()
# Wait for all threads to finish, then check how much money is left
print(money)  # 0
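
As a side note (not in the original post), a Lock also works as a context manager, which releases the lock automatically even if an exception occurs inside the critical section; the example above can equivalently be written as:

from threading import Thread, Lock
import time

money = 888
mutex = Lock()


def task():
    global money
    with mutex:  # acquire() on entry, release() on exit, even if an error occurs
        tmp = money
        time.sleep(0.01)
        money = tmp - 1


t_list = [Thread(target=task) for _ in range(888)]
for t in t_list:
    t.start()
for t in t_list:
    t.join()
print(money)  # 0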

Verify the usefulness of Python multithreading

Situation 1:

    single CPU
    multiple CPUs

Situation 2:

    IO-intensive (the code contains IO operations)
    compute-intensive (the code contains no IO operations)

Single CPU & IO-intensive:

    Multiprocessing
        requests extra memory space and consumes more resources
    Multithreading
        consumes relatively few resources (multiplexing within one process)
    Conclusion: multithreading has the advantage

Single CPU & compute-intensive:

    Multiprocessing
        requests extra memory space and consumes more resources (total time = computation + requesting space + copying code + switching)
    Multithreading
        consumes relatively few resources (total time = computation + switching)
    Conclusion: multithreading has the advantage

Multiple CPUs & IO-intensive:

    Multiprocessing
        total time = a single task + IO + requesting space + copying code
    Multithreading
        total time = a single task + IO
    Conclusion: multithreading has the advantage

Multiple CPUs & compute-intensive:

    Multiprocessing
        total time = roughly a single task (the tasks really run in parallel on different cores)
    Multithreading
        total time = the sum of all tasks (the GIL keeps them on one core, so they run one after another)
    Conclusion: multiprocessing has the advantage

from threading import Thread
from multiprocessing import Process
import os
import time


def work():
    # Computing intensive
    res = 1
    for i in range(1, 100000):
        res *= i


if __name__ == '__main__':
    # print(os.cpu_count())  # 12  check how many CPU cores this machine has
    start_time = time.time()
    # p_list = []
    # for i in range(12):  # Create 12 processes at once
    #     p = Process(target=work)
    #     p.start()
    #     p_list.append(p)
    # for p in p_list:  # Ensure that all processes are running
    #     p.join()
    t_list = []
    for i in range(12):
        t = Thread(target=work)
        t.start()
        t_list.append(t)
    for t in t_list:
        t.join()
    print('Total time:%s' % (time.time() - start_time))  # total elapsed time

"""
Computing intensive
    Multiprocess:5.665567398071289
    Multithreaded:30.233906745910645
"""

def work():
    time.sleep(2)   # Simulate pure IO operation


if __name__ == '__main__':
    start_time = time.time()
    # t_list = []
    # for i in range(100):
    #     t = Thread(target=work)
    #     t.start()
    #     t_list.append(t)
    # for t in t_list:
    #     t.join()
    p_list = []
    for i in range(100):
        p = Process(target=work)
        p.start()
        p_list.append(p)
    for p in p_list:
        p.join()
    print('Total time:%s' % (time.time() - start_time))

"""
IO Dense
    Multithreaded:0.0149583816528320
    Multiprocess:0.6402878761291504
"""

Deadlock Phenomena

"""
Although we already know how to use mutexes
Preempt lock before release lock
"""

from threading import Thread, Lock
import time

mutexA = Lock()  # each call to the class (class name + parentheses) produces a new lock object
mutexB = Lock()  # each call to the class (class name + parentheses) produces a new lock object


class MyThread(Thread):
    def run(self):
        self.func1()
        self.func2()

    def func1(self):
        mutexA.acquire()
        print(f'{self.name} grabbed lock A')
        mutexB.acquire()
        print(f'{self.name} grabbed lock B')
        mutexB.release()
        print(f'{self.name} released lock B')
        mutexA.release()
        print(f'{self.name} released lock A')

    def func2(self):
        mutexB.acquire()
        print(f'{self.name} grabbed lock B')
        time.sleep(1)
        # While this thread sleeps holding lock B, the next thread enters func1 and grabs lock A,
        # then waits for B; this thread then waits for A -- neither can proceed: deadlock
        mutexA.acquire()
        print(f'{self.name} grabbed lock A')
        mutexA.release()
        print(f'{self.name} released lock A')
        mutexB.release()
        print(f'{self.name} released lock B')


for i in range(10):
    t = MyThread()
    t.start()
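
A common way to avoid this kind of deadlock (my own note, not from the original post) is to make every function acquire the locks in one fixed order, so that a thread never holds B while waiting for A. A minimal sketch:

from threading import Thread, Lock
import time

mutexA = Lock()
mutexB = Lock()


class SafeThread(Thread):
    def run(self):
        # both phases take A first, then B: no thread ever holds B while waiting for A
        for _ in range(2):
            mutexA.acquire()
            mutexB.acquire()
            print(f'{self.name} is working while holding both locks')
            time.sleep(0.1)
            mutexB.release()
            mutexA.release()


for i in range(10):
    SafeThread().start()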

Semaphore

A semaphore is essentially also a mutex, except that it manages several locks at once
"""
Emphasis:
The word "semaphore" can mean different things in different contexts
In concurrent programming, a semaphore is essentially a set of mutexes
In Django, a "signal" refers to something that is triggered automatically when a certain condition is reached (similar to middleware)
...
"""
'''
We used Lock to create a single lock
    similar to a single-stall toilet
A semaphore is equivalent to creating several toilets (locks) at once
    similar to a public toilet with multiple stalls
'''

from threading import Thread, Lock, Semaphore
import time
import random


sp = Semaphore(5)  # Five locks at a time


class MyThread(Thread):
    def run(self):
        sp.acquire()
        print(self.name)
        time.sleep(random.randint(1, 3))
        sp.release()


for i in range(20):
    t = MyThread()
    t.start()

Event

Child processes/child threads can wait for one another
e.g.:
when child A runs to a certain point in the code, it sends a signal telling child B to start running

from threading import Thread, Event
import time

event = Event()  # It's like making a traffic light


def light():
    print('Red light is on: nobody can move')
    time.sleep(3)
    print('Green light is on: floor the throttle and go go go!!!')
    event.set()


def car(name):
    print('%s is waiting at the red light' % name)
    event.wait()
    print('%s floors the throttle and speeds off' % name)


t = Thread(target=light)
t.start()
for i in range(20):
    t = Thread(target=car, args=('Panda PRO%s' % i,))
    t.start()

Controlling traffic lights with an Event

The event's initial value is False, so the simulation starts with a red light. To mimic real traffic-light rules the light changes every two seconds, and vehicle traffic is simulated alongside it so the two can be combined.
While the event is False the light is red and the vehicles wait; once the event is set to True the light turns green, the block is lifted and the vehicles pass. The program should end only when every vehicle has finished,
so the car processes are join()ed, while the traffic-light process should end together with the main process and is therefore set as a daemon.

import time
import random

from multiprocessing import Event, Process

def traffic(e):
    print("\033[31;1mRed light\033[0m")
    while 1:
        if e.is_set():
            time.sleep(2)
            print("33[31;1m Red light 33[0m")
            e.clear()

        else:
            time.sleep(2)
            print("33[32;1m Green light 33[0m")
            e.set()

def car(e, i):
    if not e.is_set():
        print("%s is waiting at the light..." % i)
        e.wait()
    print("%s passes through" % i)

if __name__ == '__main__':
    e = Event()
    c_l = []
    p = Process(target=traffic, args=(e,))
    p.daemon = True
    p.start()
    for i in range(20):
        time.sleep(random.randint(0, 2))  # simulate vehicles arriving at random intervals
        c = Process(target=car,args=(e,"car%s"%i))
        c.start()
        c_l.append(c)
    for c in c_l:
        c.join()

Process and Thread Pools

Multiprocessing and multithreading
    The machine's hardware capacity must also be considered when spawning many processes or threads
Pool
    Sacrifices some program execution efficiency in exchange for keeping the computer hardware safe
Process pool
    Creates a fixed number of processes ahead of time for the program to reuse, so no new ones are created later
Thread pool
    Creates a fixed number of threads ahead of time for the program to reuse, so no new ones are created later

Operation of process pools and thread pools

from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
add_done_callback

from concurrent.futures import ProcessPoolExecutor, ThreadPoolExecutor
from threading import current_thread
import time

pool = ThreadPoolExecutor(3)  # a fixed pool of 3 threads


def task(n):
    print(current_thread().name)
    # print(n)
    time.sleep(1)


def func(*args, **kwargs):
    # the callback receives the finished Future object as its argument
    print('func', args, kwargs)

for i in range(10):
    pool.submit(task, 123).add_done_callback(func)
    """Callback mechanism: func is triggered automatically as soon as the asynchronous task finishes and has a result"""

Coroutines

Process: the unit of resource allocation
Thread: the unit of execution
Coroutine: concurrency within a single thread
    Deceive the CPU at the code level so that it believes our code has no IO operations at all
    There are two conditions under which the CPU leaves a program:
        1. the program encounters IO
        2. the program has been running for too long
    (switching + saving state at the code level is a technique, and a name, invented entirely by programmers themselves)
Core idea: write your own code to perform the switching + save the state, as sketched below
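
To make "switch + save state" concrete, here is a minimal sketch (my own illustration, not from the original post) that uses plain generators: each yield pauses a function, saves where it was, and lets us resume it later.

def task1():
    print('task1 step 1')
    yield  # pause here; task1's local state is saved
    print('task1 step 2')
    yield


def task2():
    print('task2 step 1')
    yield
    print('task2 step 2')
    yield


# switch back and forth between the two tasks by hand
g1, g2 = task1(), task2()
for _ in range(2):
    next(g1)  # run task1 up to its next yield
    next(g2)  # then switch to task2

gevent, used below, automates exactly this kind of switching and, with the monkey patch applied, performs a switch whenever it detects an IO operation.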

import time
from gevent import monkey;monkey.patch_all()  # fixed incantation: patch all blocking calls so gevent can detect every IO operation (monkey patch)
from gevent import spawn


def func1():
    print('func1 running')
    time.sleep(3)
    print('func1 over')


def func2():
    print('func2 running')
    time.sleep(5)
    print('func2 over')


if __name__ == '__main__':
    start_time = time.time()
    # func1()
    # func2()
    s1 = spawn(func1)  # gevent detects the IO in our code and switches automatically (instead of waiting for the IO to finish, it runs other code that has no IO)
    s2 = spawn(func2)
    s1.join()
    s2.join()
    print(time.time() - start_time)  # run serially: 8.01237154006958   with coroutines: 5.015487432479858

Using coroutines to achieve TCP server-side concurrency

import socket
from gevent import monkey;monkey.patch_all()  # fixed incantation: patch all blocking calls so gevent can detect every IO operation (monkey patch)
from gevent import spawn


def communication(sock):
    while True:
        data = sock.recv(1024)  # IO operation
        if not data:  # empty bytes means the client closed the connection
            break
        print(data.decode('utf8'))
        sock.send(data.upper())
    sock.close()


def get_server():
    server = socket.socket()
    server.bind(('127.0.0.1', 8080))
    server.listen(5)
    while True:
        sock, addr = server.accept()  # IO Operation
        spawn(communication, sock)

s1 = spawn(get_server)
s1.join()
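
To test the server, a minimal client sketch like the following can be used (my own addition; it assumes the server above is already running on 127.0.0.1:8080). Start several clients at the same time to see the single-threaded server serve them all concurrently:

import socket

client = socket.socket()
client.connect(('127.0.0.1', 8080))

for i in range(3):
    client.send(f'hello {i}'.encode('utf8'))
    data = client.recv(1024)
    print(data.decode('utf8'))  # the server echoes the message back in upper case

client.close()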
