Is iterating over a Python file object thread safe?
While searching for a neat solution to this problem, I started wondering whether
iterating over a Python file object is thread-safe or not.
    from concurrent.futures import ThreadPoolExecutor
    import sys, time

    f = open("big_file")

    def worker():
        while True:
            try:
                line = next(f)
            except StopIteration:
                return
            # process line
            time.sleep(3)
            sys.stdout.write(line)  # line already ends with "\n"

    no_workers = 4
    with ThreadPoolExecutor(max_workers=no_workers) as e:
        for _ in range(no_workers):
            e.submit(worker)
    f.close()
My question is: is the example above safe? And if not, what is an easy way to
get a thread-safe file object (for reading a file line by line; writing is not
required)?
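One straightforward approach, sketched below, is to serialize access to the shared file iterator with a `threading.Lock`, so that only one thread at a time calls `next()` on the file. The helper name `read_lines_concurrently` and the way results are collected are illustrative assumptions, not part of the original question:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def read_lines_concurrently(path, n_workers=4):
    """Read a file line by line from several threads,
    serializing iterator access with a lock."""
    iter_lock = threading.Lock()
    results = []
    results_lock = threading.Lock()

    with open(path) as f:
        def worker():
            while True:
                # Only one thread may advance the file iterator at a time.
                with iter_lock:
                    line = next(f, None)
                if line is None:  # end of file reached
                    return
                # "Process" the line outside the lock so threads
                # can work in parallel on separate lines.
                with results_lock:
                    results.append(line.rstrip("\n"))

        with ThreadPoolExecutor(max_workers=n_workers) as e:
            for _ in range(n_workers):
                e.submit(worker)
    return results
```

Note that the lock only protects fetching the next line; the per-line processing still runs concurrently, which is where any speedup would come from.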