21

I tar a directory full of JPEG images:

tar cvfz myarchive.tar.gz mydirectory

When I untar the archive:

tar xvfz myarchive.tar.gz

I get an error:

tar: Unexpected EOF in archive

Looking at the output, it fails in the middle of one particular JPEG image.

What am I doing wrong?

icedwater

5 Answers

13

Interesting. I have a few questions which may point out the problem.

1/ Are you untarring on the same platform as you're tarring on? They may be different versions of tar (e.g., GNU tar and old-style UNIX tar). If they're different, can you untar on the same box you tarred on?

2/ What happens when you simply gunzip myarchive.tar.gz? Does that work? Maybe your file is being corrupted/truncated. I'm assuming you would notice if the compression generated errors, yes?

Based on the GNU tar source, it will only print that message if find_next_block() returns 0 prematurely, which is usually caused by a truncated archive.
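A quick way to act on both questions is to test each layer separately. This is a sketch; the first two lines just build a throwaway archive so the snippet is self-contained — point the checks at the real myarchive.tar.gz instead:

```shell
# Build a small stand-in archive (replace with the real myarchive.tar.gz).
mkdir -p mydirectory && echo fake-jpeg > mydirectory/img.jpg
tar czf myarchive.tar.gz mydirectory

# 1. Test only the gzip wrapper; -t verifies integrity without extracting.
gzip -t myarchive.tar.gz && echo "gzip layer OK"

# 2. List the tar contents without extracting; a truncated archive
#    fails partway through with "Unexpected EOF in archive".
tar tzf myarchive.tar.gz > /dev/null && echo "tar layer OK"
```

If the first check passes but the second fails, the truncation happened before compression, which matches the symptoms described above.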

paxdiablo
  • Thanks for your quick answer 1/ same machine (for testing purposes): Ubuntu. Eventually, I want to untar on Mac OS X (the error is the same there...) 2/ gunzip works fine No error when the archive is created... –  Aug 25 '09 at 04:10
  • Okay, remove the offending JPEG temporarily and see what happens. This will let you know if it's that specific JPEG or tar itself. Also try without the zip option. – paxdiablo Aug 25 '09 at 04:26
  • Ah, you put me on the right track here: the tar archive is truncated. I didn't see any errors in my syslog (the archive is created by a cron job), but something must go wrong. There is plenty of space on the disk (when I create the archive "by hand", it is not truncated)... I'll dig deeper now, thanks for your replies. –  Aug 25 '09 at 04:39
  • How big is the tar file? You may be hitting your ulimit file size limit. In addition, your cron jobs should always send stdout/stderr to /tmp/some-file-or-other (cleaned up with yet another cron job) to aid debugging. Perhaps change your script to run ulimit as its first command, then change the cron job to capture output (e.g., in your crontab, change "/mypath/myprog" to "/mypath/myprog >/tmp/myprog.out 2>&1"). – paxdiablo Aug 25 '09 at 07:45
  • Hi there, the tar file is about 8M. I added logging to the cron job and... it solved the problem! My guess is that adding logging slows down the archiving and makes it work... For the record, I have only 243 files in the archive and ulimit -n gives me 1024, so it shouldn't be a problem. Anyway, thank you very much for your help, very appreciated. –  Aug 25 '09 at 21:36
  • 1
    @Cyrille, ulimit -n (#files) shouldn't matter since it's unlikely tar would keep them all open at once anyway. I was thinking more of ulimit (ulimit -f is the default) which shows the maximum allowable file *size*. I'd also be uncomfortable with a Heisenbug so I'm willing to work further to get it sorted out (it's hard to imagine that adding logging to the script would affect the tar command at all!) but, if you're happy to let it go, that's fine too. Let me know. Cheers. – paxdiablo Aug 26 '09 at 01:30
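To check the limit paxdiablo mentions, a minimal sketch (on most shells, ulimit -f reports the cap in 512-byte blocks):

```shell
# Show the per-process maximum file size for this shell.
# "unlimited", or any figure above ~16384 blocks (8 MB),
# rules out truncation by this particular limit.
ulimit -f
```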
8

Maybe you have FTPed the file in ASCII mode instead of binary mode? If not, this might help:

$ gunzip myarchive.tar.gz

And then untar the resulting tar file using

$ tar xvf myarchive.tar

Hope this helps.

anoopknr
Saradhi
  • tar xf myarchive.tar was giving me this unexpected EOF error. Your suggestion of using tar xvf was helpful - the truncated file still hit the error and stopped, but this command unarchived everything else up to that point. – squarecandy Jan 07 '13 at 21:49
7

I had a similar problem with truncated tar files being produced by a cron job; redirecting standard out to a file fixed the issue.

From talking to a colleague: cron creates a pipe and limits the amount of output that can be sent to standard out. I fixed mine by removing -v from my tar command, making it much less verbose and keeping the error output in the same place as the rest of my cron jobs. If you need the verbose tar output, you'll need to redirect to a file, though.
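A minimal sketch of that fix (the crontab entry and script path are illustrative, not from the original post):

```shell
# Illustrative crontab entry: capture stdout/stderr in a log file so
# tar errors stay visible instead of going through cron's pipe:
#   0 2 * * * /mypath/backup.sh >/tmp/backup.out 2>&1

# Inside the script: no -v, so a successful tar writes nothing to stdout.
mkdir -p mydirectory                  # stand-in for the real data directory
tar czf myarchive.tar.gz mydirectory
```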

Jerome
3

In my case, I had started untar before the uploading of the tar file was complete.

KawaiKx
0

I had a similar error, but in my case the cause was file renaming. I was creating a gzipped file, file1.tar.gz, and repeatedly updating it inside another tar file with tar -uvf ./combined.tar ./file1.tar.gz. I got the unexpected EOF error after untarring combined.tar and then trying to untar file1.tar.gz.

I noticed there was a difference in the output of file before and after tarring:

$ file file1.tar.gz
file1.tar.gz: gzip compressed data, was "file1.tar", last modified: Mon Jul 29 12:00:00 2019, from Unix
$ tar xvf combined.tar
$ file file1.tar.gz
file1.tar.gz: gzip compressed data, was "file_old.tar", last modified: Mon Jul 29 12:00:00 2019, from Unix

So, it appears that the file had a different name when I originally created combined.tar, and tar's update function doesn't overwrite the stored gzip filename metadata. The solution was to recreate combined.tar from scratch instead of updating it.

I still don't know exactly what happened, since changing the name of a gzipped file doesn't normally break it.
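The stored-name behaviour itself is easy to reproduce: gzip records the original file name in the member header at compression time, and renaming the .gz file afterwards doesn't touch it. A sketch, assuming GNU gzip (whose -N flag makes -l report the stored name):

```shell
echo data > file_old.tar            # stand-in payload, not a real tar
gzip file_old.tar                   # creates file_old.tar.gz, storing the name
mv file_old.tar.gz file1.tar.gz     # rename the compressed file
gzip -lN file1.tar.gz               # listing still shows "file_old.tar"
```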