Python: When Writing To A Large File, Keep The File Open Or Open It And Append To It As Needed?
I am wondering how best to handle writing to a big file in Python. My Python code loops many times, running an external program (ancient Fortran with a weird input file format), reading its output, and writing the results to a big file. Should I keep the file open for the whole run, or open it and append to it as needed?
Solution 1:
Opening and closing the file definitely has a cost. However, if your legacy program takes a second or more per run, you probably won't notice it.
def func1():
    # opens and closes the file on every iteration
    for x in range(1000):
        x = str(x)
        with open("test1.txt", "a") as k:
            k.write(x)

1 loops, best of 3: 2.47 s per loop
def func2():
    # opens the file once and keeps it open for the whole loop
    with open("test2.txt", "a") as k:
        for x in range(1000):
            x = str(x)
            k.write(x)

100 loops, best of 3: 6.66 ms per loop
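The timings above look like IPython %timeit output; a minimal sketch of reproducing the comparison with the standard-library timeit module, assuming func1 and func2 are defined as above:

import timeit

# time one run of func1 (open/close per write) vs. the average of
# 100 runs of func2 (file held open for the whole loop)
print(timeit.timeit(func1, number=1))          # roughly seconds per run
print(timeit.timeit(func2, number=100) / 100)  # roughly seconds per run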
However, if the file gets really big (800+ MB), repeatedly opening it for append becomes slower:
def func3(file):
    # appends a few values, opening and closing the file each time
    for x in range(10):
        x = str(x)
        with open(file, "a") as k:
            k.write(x)
12 KB file:
10 loops, best of 3: 33.4 ms per loop

800 MB+ file:
1 loops, best of 3: 24.5 s per loop
Keeping the file open will mainly cost you memory.
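Applied to the original workflow, that points to one with block wrapped around the whole loop. A minimal sketch, where ./fortran_prog and results.txt are hypothetical placeholders for the legacy executable and the output file:

import subprocess

# ./fortran_prog and results.txt are hypothetical placeholders
with open("results.txt", "a") as out:
    for i in range(1000):
        # run the legacy program once per iteration and append its stdout
        result = subprocess.run(
            ["./fortran_prog", str(i)],
            capture_output=True, text=True, check=True,
        )
        out.write(result.stdout)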
Alternatively, I would suggest using SQLite to store your data.
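A minimal sketch of that approach with the standard-library sqlite3 module; the database name, table, and columns are made up for illustration:

import sqlite3

conn = sqlite3.connect("results.db")  # hypothetical database file
conn.execute("CREATE TABLE IF NOT EXISTS results (run INTEGER, output TEXT)")
for i in range(1000):
    output = str(i)  # stand-in for the external program's output
    conn.execute("INSERT INTO results VALUES (?, ?)", (i, output))
conn.commit()  # a single commit at the end keeps the inserts fast
conn.close()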