
I have been working on a project where I need to program a binary file onto an AT28C256 chip. The specifics don't matter beyond the fact that the file must be exactly 32,768 bytes in size.

Here is my minimal reproduction of the problem:

o = open("images.bin", "wb")
c = 0
for i in range(256):
    for j in range(128):
        c += 1
        o.write(chr(0).encode('utf-8'))
print(c)

To me, this should write 32,768 bytes to the file, since 256 * 128 = 32,768 (the split into i and j is needed because I am writing an image to the device). And indeed, the printed value of c is 32768!

But the file it creates is only 28,672 bytes long! The fact that this is 0x7000 in hex has not passed me by, but I'm not sure why it's happening. Any ideas?


1 Answer


You should call o.close() to flush the write buffer and close the file properly.
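
For illustration, here is a minimal sketch of the loop from the question with the explicit close added (same filename and loop bounds as in your snippet):

o = open("images.bin", "wb")
c = 0
for i in range(256):
    for j in range(128):
        c += 1
        o.write(chr(0).encode('utf-8'))  # writes the single byte b'\x00'
print(c)
o.close()  # flushes the remaining buffered bytes to disk, so the file ends up at 32,768 bytes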

    That did it! Do you have any insight as to what happens when you don't do it? – Isky Mathews Mar 31 '20 at 18:29
  • When you write to a file opened in binary mode, the actual disk writes happen in chunks, whenever the write buffer fills up. If you don't close the file handle, the last partial chunk of data stays in the in-memory buffer. Usually the interpreter closes any open file handles right before the program exits, but that behavior is not guaranteed. It's best to always close the file handle once you're done writing to it, which is usually done more Pythonically with a `with` statement instead (see the sketch after these comments). – blhsing Mar 31 '20 at 18:34
  • To be specific, `__del__` methods are not guaranteed to be called on objects that are still around when the interpreter exits. – juanpa.arrivillaga Mar 31 '20 at 19:19
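
For reference, a sketch of the `with` form blhsing mentions (same filename and byte pattern as in the question); the file is flushed and closed automatically when the block exits, even if an exception is raised:

with open("images.bin", "wb") as o:
    c = 0
    for i in range(256):
        for j in range(128):
            c += 1
            o.write(b'\x00')  # same single zero byte as chr(0).encode('utf-8')
print(c)
# by this point the with block has closed the file, so all 32,768 bytes are on disk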