Let’s say I have a survey application where, after each action, a form is serialized and saved as a JSON file:
```python
json_path.write_text(json.dumps({"answers": answers}))
```
Today I got a report where user data was lost (a 0-byte file) because the disk was full. Should I report it as a Python issue? How can it be fixed? Any thoughts are appreciated.
Would a partial write be preferable to an empty file?
```python
with json_path.open("w") as f:
    json.dump({"answers": answers}, f)
```
If json_path is being regularly rewritten and appended to, you might like GitHub - untitaker/python-atomicwrites: Powerful Python library for atomic file writes, or sqlite instead.
I would expect it to raise an exception. If that’s what it’s doing, then it’s not a Python issue.
As for a fix, using a bigger disk is one solution…
I use `json.dumps` to avoid partial writes. Thanks for your suggestion; python-atomicwrites suggests using `os.replace`, which is what I ended up doing. I hope there will be a one-liner for writing a file atomically.
The call that removes all the file content, `self.open(mode='w', ...)`, doesn’t raise an exception, but the one that writes, `f.write(data)`, does.
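A minimal sketch of that behavior (the `survey.json` filename is just for illustration): opening a file in `"w"` mode truncates it immediately, before a single byte is written, so any failure between the open and the write leaves a 0-byte file behind.

```python
import os

# Create a file with existing content.
with open("survey.json", "w") as f:
    f.write('{"answers": [1, 2, 3]}')

# Opening in "w" mode truncates the file at open time,
# before anything is written to it.
f = open("survey.json", "w")
print(os.path.getsize("survey.json"))  # 0 -- the old content is already gone
f.close()
```

This is why a full disk (or a crash) during the write can destroy the previous data even though no new data was saved.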
The usual solution is to write a new file and then replace the existing file with the new file.
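A minimal sketch of that pattern, assuming a helper name `atomic_write_json` chosen for illustration: write to a temporary file in the same directory (so the rename stays on one filesystem), flush and fsync it, then swap it in with `os.replace`, which is an atomic rename.

```python
import json
import os
import tempfile

def atomic_write_json(path, data):
    # Write to a temporary file in the same directory as the target,
    # then atomically replace the target with it.
    dir_name = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dir_name, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            json.dump(data, f)
            f.flush()
            os.fsync(f.fileno())  # make sure the bytes reach the disk first
        # Atomic: readers see either the old file or the new one,
        # never an empty or partially written file.
        os.replace(tmp_path, path)
    except BaseException:
        os.remove(tmp_path)  # on failure, the original file is untouched
        raise

atomic_write_json("answers.json", {"answers": [1, 2, 3]})
```

If the disk fills up, the exception is raised while writing the temporary file, and the existing file is never modified.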