python-multithreading

When to call thread.join in a GUI application

好久不见 · Submitted on 2019-12-02 06:08:13
    import wx
    import json
    import queue
    from collections import namedtuple
    import threading

    class MyDialog(wx.Frame):
        def __init__(self, parent, title):
            self.no_resize = wx.DEFAULT_FRAME_STYLE & ~(wx.RESIZE_BORDER | wx.MAXIMIZE_BOX)
            wx.Frame.__init__(self, parent, title=title, size=(500, 450), style=self.no_resize)
            self.panel = wx.Panel(self, size=(250, 270))
            self.emp_selection = wx.ComboBox(self.panel, -1, pos=(40, 50), size=(200, 100))
            self.start_read_thread()
            # code to load other GUI components
            self.Centre()
            self.Show(True)

        def read_employees(self, read_file):
            list_of_emails = queue.Queue()
            with
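The usual answer is to avoid calling join() from the GUI thread at all: let the worker push results onto a queue.Queue and signal completion with a sentinel, so the GUI only ever does non-blocking polls and join() happens after the sentinel arrives. A minimal sketch of that pattern without wx, with illustrative data (in the real app, a wx.Timer handler would poll the queue with get_nowait()):

```python
import threading
import queue

def read_employees(lines, out_queue):
    # Worker: parses input and hands each result to the GUI thread
    # through a thread-safe queue.
    for line in lines:
        out_queue.put(line.strip())
    out_queue.put(None)  # sentinel: tells the GUI the worker is done

def start_read_thread(lines, out_queue):
    t = threading.Thread(target=read_employees, args=(lines, out_queue), daemon=True)
    t.start()
    return t

emails = queue.Queue()
worker = start_read_thread(["alice@example.com\n", "bob@example.com\n"], emails)

# Here we can block on get(); a real wx GUI would instead call
# get_nowait() from a wx.Timer so the window stays responsive.
collected = []
while True:
    item = emails.get()
    if item is None:
        break
    collected.append(item)
worker.join()  # safe now: the sentinel guarantees the worker has finished
```

The sentinel is what makes the final join() harmless: by the time it runs, the worker has already returned, so the GUI thread never stalls.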

Verify continuous condition with time

情到浓时终转凉″ · Submitted on 2019-12-02 05:55:20
Question: I would like to develop a Python program that, starting from a given moment in time, waits 60 seconds before performing an action. The program must also have another feature: if I update the initial time, it must restart checking the condition from the new start time. I thought about doing it with threads, but I do not know how to stop the thread and start it again with the new start time.

    import thread
    import time

    # Define a function for the thread
    def check_message(last, timer):
        oldtime = time.time()
        print
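One way to "restart the countdown" without killing and re-creating a long-lived thread is threading.Timer plus cancel(): each update simply discards the pending countdown and schedules a fresh one. A sketch with an illustrative class name (not from the original question):

```python
import threading

class ResettableTimer:
    """Run `action` once, `delay` seconds after the most recent start()."""
    def __init__(self, delay, action):
        self.delay = delay
        self.action = action
        self._timer = None
        self._lock = threading.Lock()

    def start(self):
        # Restarting cancels the pending countdown and begins a new one;
        # no thread ever needs to be forcibly stopped.
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()
            self._timer = threading.Timer(self.delay, self.action)
            self._timer.daemon = True
            self._timer.start()

    def cancel(self):
        with self._lock:
            if self._timer is not None:
                self._timer.cancel()

# Every call to start() pushes the action 60 seconds into the future.
timer = ResettableTimer(60, lambda: print("60s passed with no update"))
timer.start()
# ...later, when the initial time is updated:
timer.start()  # countdown restarts from zero
```

cancel() on a threading.Timer is harmless if the timer already fired, which keeps the restart logic race-free under the lock.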

call function of main thread from secondary thread

自古美人都是妖i · Submitted on 2019-12-02 05:47:34
I am making a GUI in PyQt for the user to create backups of huge amounts of data. The GUI (main thread) takes inputs from the user. The rsync command (for the backup) is also being called in the main thread, hence the window freezes. The aim is to use QThread so the app runs without freezing. My search material: 1: https://www.youtube.com/watch?v=o81Q3oyz6rg. This video shows how to keep the GUI from freezing by running the other task in a secondary thread. I've tried it and it works, but it does not help in running the command in a worker thread: despite calling rsync in the secondary thread, the GUI still freezes. What am I doing wrong?
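A frequent cause of this exact symptom is that the blocking call never actually moves: if the worker's method is invoked directly (instead of being connected to the thread's started signal, or run from a started thread), it executes in the caller's thread, i.e. the main one. Below is a sketch of the correct shape using plain threading and an `echo` stand-in for rsync so it stays self-contained; the comments note where the PyQt equivalents (QThread, signals) would go:

```python
import subprocess
import threading

results = []
done = threading.Event()

def run_backup(cmd):
    # The blocking subprocess call runs HERE, in the worker thread.
    # In PyQt this body would be the worker's slot, connected to
    # QThread.started -- never called directly from the main thread.
    proc = subprocess.run(cmd, capture_output=True, text=True)
    results.append(proc.returncode)
    done.set()  # in PyQt: emit a 'finished' signal instead

# stand-in for something like ["rsync", "-a", src, dst]
worker = threading.Thread(target=run_backup, args=(["echo", "synced"],), daemon=True)
worker.start()
done.wait(timeout=10)  # a real GUI would not wait; it reacts to the signal
```

The test for whether the move worked: print threading.current_thread() (or QThread.currentThread()) inside the slot; if it reports the main thread, the call was invoked directly rather than through the thread's event loop.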

Should I use Events, Semaphores, Locks, Conditions, or a combination thereof to manage safely exiting my multithreaded Python program?

穿精又带淫゛_ · Submitted on 2019-12-02 04:28:48
Question: I am writing a multithreaded Python program in which the main thread and the other threads it spawns run as daemons (but not with Thread.daemon=True) that look for certain files in certain directories and perform actions with them when they exist. It is possible that an error occurs in one (or any) of the threads which would require the whole program to exit. However, I need the other threads to finish their current job before exiting. From what I understand, if I set myThread.daemon=True for my
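A single threading.Event is usually enough for this shutdown contract: whichever thread hits a fatal error sets it, and every worker checks it only between jobs, never mid-job, so in-flight work always completes before the thread returns and the main thread's join() unblocks. A sketch with illustrative job data (not from the original program):

```python
import threading

stop_event = threading.Event()
log = []

def worker(name, jobs):
    for job in jobs:
        if stop_event.is_set():
            return               # exit only between jobs
        log.append((name, job))  # "process the file" -- never interrupted

threads = [threading.Thread(target=worker, args=(f"w{i}", ["a", "b", "c"]))
           for i in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()   # main thread blocks until every worker has drained

# On a fatal error, any thread would call stop_event.set(); the others
# then finish their current file and return at the next check.
```

With non-daemon threads plus this event, the interpreter cannot exit mid-job: exit only happens after every join() returns, which is exactly the "finish current work first" requirement. Semaphores and Conditions are only needed if threads must also hand work to each other.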

running a script multiple times simultaneously in python 2.7

纵饮孤独 · Submitted on 2019-12-02 03:53:57
Hello, I am trying to run a script multiple times, but I would like this to take place at the same time. From what I understood, I was to use subprocess and threading together; however, when I run it, it still looks like it is being executed sequentially. Can someone help me get it to run the same script over and over, but at the same time? Or is it in fact working and just really slow? Edit: forgot the last piece of code, now at the bottom. Here is what I have so far:

    import os
    import datetime
    import threading
    from subprocess import Popen

    today = datetime.date.today()
    os.makedirs("C:/newscript
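Popen itself does not block, so launching all the copies back-to-back and only waiting afterwards runs them concurrently; apparent sequential behaviour usually comes from calling wait() (or subprocess.call) inside the launch loop. A sketch using `sys.executable -c` as a stand-in for the real script path:

```python
import subprocess
import sys

# Start every copy first -- Popen returns immediately, so all three
# processes exist before any of them is waited on.
procs = [
    subprocess.Popen([sys.executable, "-c", f"print('copy {i} running')"],
                     stdout=subprocess.PIPE)
    for i in range(3)
]

# Only now wait: the copies were already running side by side.
exit_codes = [p.wait() for p in procs]
```

Since each copy is a separate process, no threading is actually needed for this; threads only become useful if you want to read each child's output while it runs.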

How to call MessageLoopWork in cefpython?

泄露秘密 · Submitted on 2019-12-02 01:12:43
I made a simple off-screen renderer with cefpython. I used cefpython.MessageLoop(), but I would like to execute a JavaScript function with browser.GetFocusedFrame().ExecuteFunction, which must be called from the main UI thread. Is there a way to set a callback on cefpython's message loop? Alternatively I could use MessageLoopWork, but I don't know how. I tried to call it in a separate thread, but it does not work:

    import threading

    def main_loop():
        cefpython.MessageLoopWork()
        threading.Timer(0.01, main_loop).start()

    threading.Timer(0.01, main_loop).start()

I get the following error: [0324/174806
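The likely problem with the snippet above is that threading.Timer runs its callback in a brand-new thread on every tick, so MessageLoopWork ends up off the main thread, which CEF requires. The usual fix is to pump it from a timer owned by the UI toolkit's own event loop (for example wx.Timer or Qt's QTimer firing every ~10 ms), since those callbacks run on the main thread. A toolkit-free sketch of the same idea, with a stub standing in for cefpython.MessageLoopWork:

```python
import time

pump_count = 0

def message_loop_work():
    # Stand-in for cefpython.MessageLoopWork(); the real call must run
    # on the main UI thread, once per timer tick.
    global pump_count
    pump_count += 1

# A GUI timer (wx.Timer / QTimer) would fire this every ~10 ms on the
# main thread; here a plain main-thread loop plays that role.
deadline = time.monotonic() + 0.2
while time.monotonic() < deadline:
    message_loop_work()
    time.sleep(0.01)
```

Once the pump lives on the main thread, ExecuteFunction can be called from any main-thread callback between ticks.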

Increasing throughput in a python script

余生颓废 · Submitted on 2019-12-02 01:09:47
I'm processing a list of thousands of domain names from a DNSBL through dig, creating a CSV of URLs and IPs. This is a very time-consuming process that can take several hours. My server's DNSBL updates every fifteen minutes. Is there a way I can increase throughput in my Python script to keep pace with the server's updates? Edit: the script, as requested.

    import re
    import subprocess as sp

    text = open("domainslist", 'r')
    text = text.read()
    text = re.split("\n+", text)
    file = open('final.csv', 'w')
    for element in text:
        try:
            ip = sp.Popen(["dig", "+short", element], stdout=sp.PIPE)
            ip = re.split("
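Each dig call is network-bound (the CPU idles while waiting on the resolver), so a thread pool gives near-linear speedup despite the GIL. A sketch of that restructuring, with a stub resolver and a hypothetical placeholder IP standing in for the `sp.Popen(["dig", "+short", ...])` call so it runs anywhere:

```python
from concurrent.futures import ThreadPoolExecutor

def resolve(domain):
    # Real version: sp.run(["dig", "+short", domain], capture_output=True,
    # text=True) and parse stdout; the stub keeps the sketch self-contained.
    return domain, "192.0.2.1"

domains = ["a.example", "b.example", "c.example"]

# 20 lookups in flight at once; map() preserves input order,
# which keeps the CSV rows aligned with the input list.
with ThreadPoolExecutor(max_workers=20) as pool:
    rows = list(pool.map(resolve, domains))
```

With thousands of domains, tuning max_workers (20 to 50 is a common range for DNS) trades resolver load against wall-clock time; the hours-long sequential run divides roughly by the worker count.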

Multiprocessing inside a child thread

半城伤御伤魂 · Submitted on 2019-12-02 00:16:40
I was learning about multiprocessing and multithreading. From what I understand, threads run on the same core, so I was wondering: if I create multiple processes inside a child thread, will they be limited to that single core too? I'm using Python, so this is a question about that specific language, but I would like to know whether it is the same in other languages.

I'm not a Python expert, but I expect this works like in other languages, because it is an OS feature in general. Process: a process is executed by the OS and owns one thread which will be executed. This is in general your program.

Share a variable between workers with Python multiprocessing [duplicate]

一笑奈何 · Submitted on 2019-12-01 22:52:41
Question: This question already has answers here: Python multiprocessing and a shared counter (5 answers). Closed 5 years ago.

How can I read and update a variable shared between multiple workers in Python? For example, I'm scanning through a list of files using multiple processes in Python, and would like to check whether the parent directory has already been scanned.

    def readFile(filename):
        """ Add the parent folder to the database and process the file """
        path_parts = os.path.split(filename)
        dirname = os.path.basename(path_parts[0])
        if dirname not in shared_variable:
            # Insert into the database
        # Other file functions

    def main():
