
I've made a map editor in Python 2.7.9 for a small project and I'm looking for ways to preserve the data I edit in the event of some unhandled exception. My editor already has a method for saving out data, and my current solution is to have the main loop wrapped in a try..finally block, similar to this example:

import os, datetime #..and others.
if __name__ == '__main__':
    DataMgr = DataManager() # initializes the editor.
    save_note = None
    try:
        MainLoop()  # unsurprisingly, this calls the main loop.
    except Exception as e: # I am of the impression this will catch every type of exception.
        save_note = "Exception dump: %s : %s." % (type(e).__name__, e) # A memo appended to the comments in the save file.
    finally:
        exception_fp = os.path.join(DataMgr.cwd, "dump_%s.kmap" % datetime.datetime.now())
        DataMgr.saveFile(exception_fp, memo = save_note) # saves out to a dump file using a familiar method with a note outlining what happened.

This seems like the best way to make sure that, no matter what happens, an attempt is made to preserve the editor's current state (to the extent that saveFile() is equipped to do so) in the event that it should crash. But I wonder if encapsulating my entire main loop in a try block is actually safe and efficient and good form. Is it? Are there risks or problems? Is there a better or more conventional way?

Augusta
  • Ideally, you would thoroughly test your code and fix any crashing bugs, and then you wouldn't need a `try` around your code. However, since complete testing is sometimes unrealistic, it may be a decent "just in case" measure, as long as it doesn't become a laziness parachute (e.g., "well, I guess this function could fail sometimes, but the main `try` will catch it, so I'll fix it in the next release, maybe"). – TigerhawkT3 Apr 25 '15 at 02:12
  • @TigerhawkT3 That's pretty much my attitude on it, but in view of the fact that I'm still pretty bad at this stuff, I'm just looking for a way to preserve my time and effort in between smoothing out life's little wrinkles, so to speak. I'm asking mainly to make sure that `try..finally` or `try..except..finally` blocks don't have any special behaviour (read: harmful or slowing effects) that crops up if you go too long without concluding them. – Augusta Apr 25 '15 at 02:17

2 Answers


Wrapping the main loop in a try...finally block is the accepted pattern when you need something to happen no matter what. In some cases it's logging and continuing, in others it's saving everything possible and quitting.
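One nuance worth knowing: `except Exception` does not catch `KeyboardInterrupt` or `SystemExit`, because those derive from `BaseException` rather than `Exception` — but the `finally` clause still runs before they propagate. A minimal sketch illustrating both cases (the nested outer `try` exists only so the demo can return instead of exiting):

```python
def demo(exc):
    events = []
    try:
        try:
            raise exc
        except Exception as e:
            # Matches ValueError, RuntimeError, etc., but not
            # SystemExit or KeyboardInterrupt.
            events.append("caught %s" % type(e).__name__)
        finally:
            # Runs in every case, caught or not -- this is where the
            # emergency save would go.
            events.append("saved")
    except BaseException:
        pass  # SystemExit escaped the inner except, but finally already ran
    return events

print(demo(ValueError("boom")))  # ['caught ValueError', 'saved']
print(demo(SystemExit()))        # ['saved']
```

So even if the editor is closed with Ctrl-C, a save placed in `finally` still gets its chance to run.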

So your code is fine.

Ethan Furman

If your file isn't that big, I would suggest reading the entire input file into memory, closing the file, and then doing your data processing on the copy in memory. This avoids any risk of corrupting the original data, at the cost of potentially slowing down your runtime.
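The idea can be sketched like this (the file name and contents here are made up for the example):

```python
import os
import tempfile

# Create a hypothetical map file just so the example is self-contained.
path = os.path.join(tempfile.mkdtemp(), "level.kmap")
with open(path, "wb") as f:
    f.write(b"tile data")

# Read everything into memory and let the with-block close the file
# immediately; all editing then happens on the in-memory copy.
with open(path, "rb") as f:
    data = f.read()

# Work on the copy -- a crash here cannot touch the file on disk.
edited = data + b" (edited)"
```

The original file stays untouched until you deliberately write the edited copy back out.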

Alternatively, take a look at the `atexit` Python module. It lets you register one or more functions to be called automatically when the program exits.
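A minimal sketch of that approach — `emergency_save` here is a hypothetical stand-in for something like `DataMgr.saveFile`:

```python
import atexit

def emergency_save():
    # Hypothetical stand-in for the editor's real save routine.
    # atexit handlers run on normal interpreter exit and after an
    # unhandled exception, but NOT if the process dies via os._exit()
    # or an uncatchable signal such as SIGKILL.
    print("state saved")

atexit.register(emergency_save)
```

Note that `atexit` complements rather than replaces the `try..finally` approach: only the `except` clause gives you access to the exception object itself, which you need for a memo like the one in the question.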

That being said, what you have should work reasonably well.

Dan Hogan
  • `atexit` seems really useful for exactly this sort of situation. I'll take a closer look at it. Seems useful for non-emergency conditions as well. – Augusta Apr 25 '15 at 02:22
  • I have used it extensively for both error handling and control flow. It is actually a pretty clever module! – Dan Hogan Apr 25 '15 at 02:24