
I’ve run into a frustrating brick wall when using:

projectItem.setInPoint(seconds)
projectItem.setOutPoint(seconds)

…about 50% of the time the I/O points (in the source window) are set with a 1-frame error (sometimes 2 frames out). I feel like I’ve tried everything to discover a pattern, but it seems entirely random.
 
I thought it might be something to do with drop frame, variable frame rates, the clip being different from the sequence, or other oddities, but the error occurs at simple constant frame rates like 25 fps. There just seems to be no rhyme or reason to the errors (although the same error occurs consistently on certain frames).
 
There’s an even bigger problem with subclips: the scripting environment thinks that every subclip starts at frame 0.
 
I’ve tried everything, including working in ticks, seconds or frames, and converting between them. Nothing made a difference.
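
For reference, the kind of conversion helpers involved look like this (a sketch; Premiere uses a fixed 254016000000 ticks per second, as used in the answer below):

// Illustrative conversion helpers between ticks, seconds and frames
// (254016000000 is Premiere's fixed ticks-per-second value).
var TICKS_PER_SECOND = 254016000000;

function secondsToTicks (sec)          { return sec * TICKS_PER_SECOND; }
function ticksToSeconds (ticks)        { return ticks / TICKS_PER_SECOND; }
function framesToTicks  (frames, fps)  { return (frames / fps) * TICKS_PER_SECOND; }
function ticksToFrames  (ticks, fps)   { return (ticks / TICKS_PER_SECOND) * fps; }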
 
What I’m trying to accomplish is to set in/out points on a set of clips, run a script that makes smaller cuts from those source clips, and then restore the clips to their original I/O points. I’ve got most of this working, except that this bug stops me from restoring all the clips to their original I/O points.
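
The store/restore part on its own is straightforward; a minimal sketch of that idea (variable names here are just for illustration, and it assumes every child item is a clip rather than a bin):

// Sketch of the store/restore idea: remember each clip's I/O points in
// seconds, do other work, then write the same values back.
var projItems = app.project.rootItem.children;
var stored = [];

for (var i = 0; i < projItems.numItems; i++) {
    var item = projItems[i];                 // assumes this is a clip, not a bin
    stored[i] = {
        inSec:  item.getInPoint().seconds,
        outSec: item.getOutPoint().seconds
    };
}

// ... make the smaller cuts here ...

for (var j = 0; j < projItems.numItems; j++) {
    projItems[j].setInPoint( stored[j].inSec );
    projItems[j].setOutPoint( stored[j].outSec );   // often lands 1 frame off
}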
 
Below is a test script I wrote. It gets the current I/O positions, stores them, and then sets the same values back on the same clip. Half the time the values are not the same! Argh! This makes it impossible to set a clip’s I/O points accurately.

function framesToSeconds (frames, fps)
{
    return frames / fps;
}

function secondsToFrames (sec, fps)
{
    return sec * fps;
}

/*---------------------------------------------------*/

var projItems = app.project.rootItem.children;
var clip = projItems[2];                              // third item in the project panel
var fps = clip.getFootageInterpretation().frameRate;

// read and store the clip's current in/out points, in seconds
var setIn = clip.getInPoint().seconds;
var setOut = clip.getOutPoint().seconds;

// convert to frames and straight back to seconds (should be a no-op)
var inFrame = secondsToFrames (setIn, fps);
var outFrame = secondsToFrames (setOut, fps);

var secIn = framesToSeconds (inFrame, fps);
var secOut = framesToSeconds (outFrame, fps);

// write the same values back to the clip
clip.setInPoint( secIn );
clip.setOutPoint( secOut );

// read them back again — these should equal setIn/setOut, but often don't
var setInAfter = clip.getInPoint().seconds;
var setOutAfter = clip.getOutPoint().seconds;
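
A comparison like this (the delta names and the logging are just for illustration) shows the mismatch; on the affected frames the restored point comes back 1 frame lower than requested:

// difference between the requested and the restored points, in frames
var inDelta  = secondsToFrames (setInAfter,  fps) - inFrame;   // expected 0
var outDelta = secondsToFrames (setOutAfter, fps) - outFrame;  // expected 0
$.writeln ("in delta (frames): " + inDelta + "   out delta (frames): " + outDelta);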

2 Answers


I’ve done some more testing. Although I don’t quite understand the source of the error, I believe I have figured out a fix for it.

I tested two slugs, each 10 seconds long, at different frame rates. I ran them through a loop that set the I/O points on every frame and checked which frames came back wrong (a sketch of that loop follows the results). What I found was:

test_25fps_1280x720.mov : error on frames 211,209,207,205,203,201

test_29fps_1024x576.mov : error on frames 251,244,242,122,121,61

These errors weren’t random. Whenever I tried to set an in or out point on one of these frames, it would ALWAYS round down by 1 frame. (I was wrong earlier about 50% of the frames being off; it’s actually less than 3%.)
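
A sweep of that kind looks roughly like this (a sketch for illustration; it reuses the framesToSeconds/secondsToFrames helpers from the question and the same exact comparison as in the fix below):

// Sketch of the per-frame sweep: set every frame of the 10-second slug as the
// in point, read it back, and record any frame that comes back different.
var clip = app.project.rootItem.children[2];           // the test slug
var fps  = clip.getFootageInterpretation().frameRate;  // 25 for the first test file
var totalFrames = Math.round (10 * fps);               // the slugs are 10 seconds long
var badFrames = [];

for (var f = 0; f < totalFrames; f++) {
    clip.setInPoint( framesToSeconds (f, fps) );
    var back = secondsToFrames (clip.getInPoint().seconds, fps);
    if ( back != f ) {                                 // exact comparison, as in fixAnyFrameErrors
        badFrames.push (f);                            // e.g. 201, 203, 205 ... on the 25 fps slug
    }
}

$.writeln ("frames that came back wrong: " + badFrames.join (", "));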

My best guess is that there’s a precision error somewhere, caused by a calculation involving large floating-point numbers. I can’t confirm that, and I don’t really know how to fix it at the source. What I did figure out is that I can test the in and out points after setting them, and if a point doesn’t match expectations, set it again with half a frame’s duration (in seconds) added. Adding a full frame would just repeat the error, but half a frame gets Premiere to round up to the correct frame.

This is the main part of my code:

/*---------------------------------------------------*/
function fixAnyFrameErrors (clip, inFrame, outFrame, fps, halfFrame)
/*---------------------------------------------------*/
{
    // read back what Premiere actually set, converted to frames
    var inSecSet    = clip.getInPoint().seconds;
    var inFrameSet  = secondsToFrames (inSecSet, fps);
    var outSecSet   = clip.getOutPoint().seconds;
    var outFrameSet = secondsToFrames (outSecSet, fps);

    // if a point landed on the wrong frame, set it again with half a frame
    // added, which makes Premiere round up to the intended frame
    if ( parseFloat(inFrame) != parseFloat(inFrameSet) ) {
        clip.setInPoint( framesToSeconds (inFrame, fps) + halfFrame );
    }

    if ( parseFloat(outFrame) != parseFloat(outFrameSet) ) {
        clip.setOutPoint( framesToSeconds (outFrame, fps) + halfFrame );
    }
}

/*---------------------------------------------------*/

var tps = 254016000000; // 2.54016e11 ticks per second (Premiere Pro constant)

var projItems = app.project.rootItem.children;
var clip = projItems[2];
clip.addMetadata();                                    // metadata step (full parsing code omitted — see comments below)
var fps = clip.getFootageInterpretation().frameRate;
var tpf = clip.videoInPoint.frame_rate;                // ticks per frame, from the parsed metadata XML (see comments below)
var frameDuration = tpf / tps;                         // one frame, in seconds
var halfFrame = (frameDuration * 0.5);                 // the nudge used to force rounding up

var inFrame = 201;    // one of the frames that consistently came back wrong at 25 fps
var outFrame = 211;
  
var secIn  = framesToSeconds (inFrame,  fps);
var secOut = framesToSeconds (outFrame, fps);

clip.setInPoint ( secIn  );
clip.setOutPoint( secOut );

fixAnyFrameErrors (clip, inFrame, outFrame, fps, halfFrame);
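
If the ticks-per-frame value can’t be read (see the comments below about videoInPoint.frame_rate), the half-frame nudge can also be derived from the interpreted frame rate alone. This is a minimal sketch that assumes a constant frame rate:

// Alternative: compute the half-frame offset directly from fps, with no
// ticks-per-frame lookup (assumes a constant frame rate).
var fps = clip.getFootageInterpretation().frameRate;
var halfFrame = 0.5 / fps;                 // half of one frame, in seconds

clip.setInPoint ( framesToSeconds (inFrame,  fps) );
clip.setOutPoint( framesToSeconds (outFrame, fps) );
fixAnyFrameErrors (clip, inFrame, outFrame, fps, halfFrame);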
  • What is ``clip.videoInPoint.frame_rate``? A clip object does not have a direct property ``videoInPoint``, and I could not find any property called ``frame_rate``... – Guntram Oct 21 '20 at 13:00
  • I must not have included my full code (not wanting it to be too long). If I recall correctly, that value came from parsing the XML string returned by the 'addMetadata()' method. – urbanspaceman Oct 21 '20 at 19:29
  • Thanks for the info! I filed a bug for that in their UserVoice, and we have also passed it on to enterprise support. This is still present in PPro v14.5. – Guntram Oct 26 '20 at 14:41

My observations are:

(We encounter the problem when setting markers.)

Our calculation is correct, but Premiere sets the marker’s seconds value to e.g. 8.35999999999606 instead of 8.36 (at 25 fps): the marker sits at 8 seconds 9 frames, but the displayed value is 8 seconds 8 frames.

From Adobe Forums: https://community.adobe.com/t5/premiere-pro/clip-marker-different-start-end-time-in-seconds/m-p/9309128?page=1

They suggest using an epsilon to check whether a value is very close to the calculated value. In the example, the difference is 0.00000000000394, which is tiny and should simply be corrected to 0.
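
A sketch of that idea (the epsilon value and the helper name are mine, not from the forum thread): snap a seconds value to the nearest whole frame whenever it is within a tiny tolerance.

// Snap a seconds value to the nearest whole frame if it is within epsilon,
// so 8.35999999999606 at 25 fps is treated as frame 209, i.e. 8.36 s.
function snapToFrame (seconds, fps, epsilon)
{
    var frames  = seconds * fps;           // e.g. 208.9999999999...
    var nearest = Math.round (frames);     // e.g. 209

    if ( Math.abs (frames - nearest) < epsilon ) {
        return nearest / fps;              // clean seconds value, e.g. 8.36
    }
    return seconds;                        // genuinely between frames: leave it alone
}

var clean = snapToFrame (8.35999999999606, 25, 0.001);   // -> 8.36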

I do not want to know what happens if we use drop-frame or non-integer frame rates (23.976, for example)...

EDIT: In my case I would have to correct the time inside the existing marker, so this does not work for me. I will try adding half a frame instead...

Guntram