I tried to estimate how much of a file my program had processed, and the obvious solution for me was to use lsof -o. But surprisingly, the OFFSET column in lsof's output was always equal to SIZE (as reported by lsof -s --), so I decided to write some simple programs to test that behaviour, and...
C:
#include <unistd.h>
#include <fcntl.h>
#include <stdio.h>

int main(void) {
    /* open the file read-only, print the descriptor number,
       and spin so the process stays alive for lsof to inspect */
    int filedesc = open("path/to/file", O_RDONLY);
    printf("%i\n", filedesc);
    while (1) {}
}
Scala:
io.Source.fromFile(path)
Python:
open(path)
OFFSET was always at the end of the file under OS X:
$ lsof -o /path/to/file
COMMAND PID USER FD TYPE DEVICE OFFSET NODE NAME
a.out 5390 folex 3r REG 1,4 631302648 48453926 /path/to/file
$ lsof -s -- /path/to/file
COMMAND PID USER FD TYPE DEVICE SIZE NODE NAME
a.out 5390 folex 3r REG 1,4 631302648 48453926 /path/to/file
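For comparison, the offset the kernel itself maintains can be printed from inside the process with lseek(fd, 0, SEEK_CUR); right after open() it should be 0, whatever lsof says. A minimal sketch of that cross-check, using the same placeholder path as the C program above:

#include <unistd.h>
#include <fcntl.h>
#include <stdio.h>

int main(void) {
    int filedesc = open("path/to/file", O_RDONLY);  /* placeholder path */
    if (filedesc < 0) {
        perror("open");
        return 1;
    }
    /* ask the kernel for the current offset without moving it */
    off_t pos = lseek(filedesc, 0, SEEK_CUR);
    printf("fd %i is at offset %lld\n", filedesc, (long long)pos);
    while (1) {}  /* keep the fd open so lsof can look at it */
}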
An explanation for each of these languages would be much appreciated.
UPDATE: It works as expected under Ubuntu; the offset is wrong only under OS X.
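For reference, under Linux the offset can also be read directly from /proc/<PID>/fdinfo/<FD> (its pos: field), so lsof's output can be cross-checked without lsof itself. A minimal Linux-only sketch; the PID 5390 and fd 3 are placeholders taken from the output above:

#include <stdio.h>

int main(void) {
    /* fdinfo lists "pos:" (the current offset), "flags:" and more */
    FILE *f = fopen("/proc/5390/fdinfo/3", "r");
    if (!f) {
        perror("fopen");
        return 1;
    }
    char line[256];
    while (fgets(line, sizeof line, f))
        fputs(line, stdout);
    fclose(f);
    return 0;
}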