Syntax first:
double iADX( string symbol,        // BEST AS: _Symbol
             int    timeframe,     // BEST AS: one of {}-ENUMs ~ PERIOD_CURRENT
             int    period,        // averaging period ( must be > 0 )
             int    applied_price, // BEST AS: one of {}-ENUMs ~ PRICE_CLOSE
             int    mode,          // BEST AS: one of {}-ENUMs ~ MODE_PLUSDI
             int    shift          // shift ( 0 == the current bar )
             );
Why 0.0?

Once we read the calling interface, the requirement to average a selected sequence of PRICE_CLOSE records, kept for the current Symbol() ( NULL ), seems fair. But notice that requesting zero consecutive bars instructs the function to average nothing at all, instead of taking some reasonable calculus of SUM( Close[i..j] ) / period, so there is no meaningful value to return and you get 0.0 back. Experiment with non-zero periods and you are back on the rails, aiming towards your goals.
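A minimal side-by-side sketch of the effect (a Script skeleton; the period value 14 is just an illustrative choice, and the actual numbers depend on the loaded chart history):

```mql4
// Script sketch: compare a zero averaging period against a sane one
void OnStart() {
   double zero_period = iADX( _Symbol, PERIOD_CURRENT,  0, PRICE_CLOSE, MODE_PLUSDI, 0 );
   double sane_period = iADX( _Symbol, PERIOD_CURRENT, 14, PRICE_CLOSE, MODE_PLUSDI, 0 );
   PrintFormat( "period == 0 -> %f | period == 14 -> %f", // the former stays at 0.0,
                zero_period,                              // the latter yields a real +DI
                sane_period
                );
}
```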
double DI_plus,
       DI_minus;
int    ADX_PERIOD = 8;

int OnInit() {
   ObjectCreate( ChartID(), "GUI-SHOW+DI", OBJ_LABEL, 0, 0, 0 ); // LABEL for +DI
   ObjectCreate( ChartID(), "GUI-SHOW-DI", OBJ_LABEL, 0, 0, 0 ); // LABEL for -DI
   return( INIT_SUCCEEDED );
}
void OnTick() {
   DI_plus  = iADX( _Symbol,
                    PERIOD_CURRENT,
                    ADX_PERIOD,
                    PRICE_CLOSE,
                    MODE_PLUSDI,
                    0
                    );
   DI_minus = iADX( _Symbol,
                    PERIOD_CURRENT,
                    ADX_PERIOD,
                    PRICE_CLOSE,
                    MODE_MINUSDI,
                    0
                    );
   ObjectSetString( ChartID(),
                    "GUI-SHOW+DI",
                    OBJPROP_TEXT,
                    StringFormat( "+DI %12.6f", DI_plus )
                    );
   ObjectSetString( ChartID(),
                    "GUI-SHOW-DI",
                    OBJPROP_TEXT,
                    StringFormat( "-DI %12.6f", DI_minus )
                    );
}