I have a DCM in my design clocked at 100MHz:
COMPONENT DCM_100
PORT(
   CLKIN_IN        : IN  std_logic;  -- 100 MHz
   RST_IN          : IN  std_logic;
   CLKIN_IBUFG_OUT : OUT std_logic;
   CLKOUT0_OUT     : OUT std_logic;  -- divided by 1
   CLKOUT1_OUT     : OUT std_logic;  -- divided by 2
   CLKOUT2_OUT     : OUT std_logic;  -- divided by 4
   CLKOUT3_OUT     : OUT std_logic;  -- divided by 8
   CLKOUT4_OUT     : OUT std_logic;  -- divided by 16
   CLKOUT5_OUT     : OUT std_logic;  -- divided by 32
   LOCKED_OUT      : OUT std_logic
);
END COMPONENT;
The design uses these different clock outputs, selected by input switches (a rough sketch of the selection is shown below). For example, for switch position 0, CLKOUT0_OUT is used, which is the input clock divided by 1. I have applied a timing constraint only on the input clock, as follows:
TIMESPEC "TS_clk" = PERIOD "clk_in" 100 MHz HIGH 50 %;
The DCM outputs are automatically constrained by the tool. Timing analysis with all of these constraints shows that one constraint is not met:
================================================================================
Timing constraint: TS_Inst_DCM_100_CLKOUT0_BUF = PERIOD TIMEGRP "Inst_DCM_100_CLKOUT0_BUF" TS_Inst_DCM_100_CLK0_BUF HIGH 50%;
43782956 paths analyzed, 19293 endpoints analyzed, 145 failing endpoints
145 timing errors detected. (145 setup errors, 0 hold errors, 0 component switching limit errors)
Minimum period is 11.280ns.
--------------------------------------------------------------------------------
and the maximum frequency is:
Design statistics:
   Minimum period:   11.280ns{1}   (Maximum frequency:  88.652MHz)
   Maximum path delay from/to any node:   2.771ns
However, when I apply only the constraint on the input clock, which runs at the same frequency as CLKOUT0_OUT, all constraints are met:
Design statistics:
   Minimum period:   8.332ns{1}   (Maximum frequency: 120.019MHz)
Can anyone please explain this behavior? Should I consider the auto-generated constraints on the DCM outputs, or is it sufficient to consider only the constraint on the input clock? Regards