I am working on a version of a product that will be used by users in India, and I am working from India (GMT+5:30). Our servers are in the US. A scheme with a start date and an end date has to be applied (e.g. it must be active from Aug 01, 00:00:00 to Aug 31, 11:59:59). I am using http://www.ruddwire.com/handy-code/date-to-millisecond-calculators/ to check the millisecond values. I used this code assuming getTime() would return a consistent UTC-based value:
// dateStr is what the user enters on the UI, e.g. 1/08/2013; format = dd/MM/yyyy
public static long getUTCDateWithFormat(String dateStr, String format) throws ParseException {
    SimpleDateFormat sdf = new SimpleDateFormat(format);
    Date date = sdf.parse(dateStr);
    return date.getTime();
}
Considering the start time: when I run this on localhost for August 1, I get 1375295400000, which corresponds to Thu Aug 01 2013 00:00:00 GMT+0530 (India Standard Time). (So I understand that 5 hours 30 minutes, in milliseconds, has to be subtracted from the UTC value, since at runtime I have to compare the system time against it.)
I expected a similar UTC value to be saved when the code runs on the server (US), but it saved 1375336800000, which corresponds to Thu Aug 01 2013 11:30:00 GMT+0530 (India Standard Time). So the UTC value is different, and there is an 11:30-hour discrepancy on top of the 5:30-hour offset.
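For what it's worth, my understanding is that SimpleDateFormat parses in the JVM's default time zone, which would explain the discrepancy. The sketch below (a hypothetical variant of my method, with an added zoneId parameter) simulates both environments by switching the default zone, and shows that pinning the formatter to a fixed zone produces the same epoch value either way:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class UtcParse {
    // Pin the formatter to an explicit zone so parsing does not depend on
    // the JVM's default time zone (which differs between localhost and the
    // US server). zoneId is a parameter I added for illustration.
    public static long parseInZone(String dateStr, String format, String zoneId)
            throws ParseException {
        SimpleDateFormat sdf = new SimpleDateFormat(format);
        sdf.setTimeZone(TimeZone.getTimeZone(zoneId));
        return sdf.parse(dateStr).getTime();
    }

    public static void main(String[] args) throws ParseException {
        // Simulate the US server by changing the JVM default zone.
        TimeZone.setDefault(TimeZone.getTimeZone("America/Chicago"));
        long onUsServer = parseInZone("01/08/2013", "dd/MM/yyyy", "Asia/Kolkata");

        // Simulate localhost in India.
        TimeZone.setDefault(TimeZone.getTimeZone("Asia/Kolkata"));
        long onLocalhost = parseInZone("01/08/2013", "dd/MM/yyyy", "Asia/Kolkata");

        // Same instant either way: midnight Aug 1 IST = 1375295400000.
        System.out.println(onUsServer == onLocalhost); // true
        System.out.println(onUsServer);                // 1375295400000
    }
}
```

Without the setTimeZone call, the two simulated runs give exactly the two different values I observed.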
I did not run the following code on the server, but I expect similar results:
Calendar calendar = Calendar.getInstance();
String[] arr = dateStr.split("/");           // dd/MM/yyyy
calendar.set(Integer.parseInt(arr[2]),       // year
        Integer.parseInt(arr[1]) - 1,        // month (Calendar months are 0-based)
        Integer.parseInt(arr[0]));           // day of month
return calendar.getTime();
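As I understand it, the Calendar route has the same dependence on the JVM default zone, plus two pitfalls of its own: set(year, month, day) keeps the current wall-clock time unless the calendar is cleared first, and months are 0-based. A sketch with an explicit zone passed in (the zoneId parameter is my addition):

```java
import java.util.Calendar;
import java.util.TimeZone;

public class CalendarParse {
    // Convert "dd/MM/yyyy" to epoch millis at midnight in an explicit zone,
    // so the result does not depend on the server's default time zone.
    public static long toEpochMillis(String dateStr, String zoneId) {
        String[] arr = dateStr.split("/");    // dd/MM/yyyy
        Calendar calendar = Calendar.getInstance(TimeZone.getTimeZone(zoneId));
        calendar.clear();                     // zero out h:m:s.SSS first
        calendar.set(Integer.parseInt(arr[2]),       // year
                     Integer.parseInt(arr[1]) - 1,   // month (0-based)
                     Integer.parseInt(arr[0]));      // day of month
        return calendar.getTimeInMillis();
    }

    public static void main(String[] args) {
        // Midnight Aug 1, 2013 in IST.
        System.out.println(toEpochMillis("01/08/2013", "Asia/Kolkata")); // 1375295400000
    }
}
```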
Kindly help me resolve this problem. Also, for the comparison at runtime I have to use new Date(); how should that be handled?
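My current thinking on the runtime check, which I would appreciate confirmation of: epoch milliseconds are zone-independent, so new Date().getTime() (equivalently System.currentTimeMillis()) should be comparable directly against the stored start/end values without any offset arithmetic, as long as those values were parsed in a fixed zone. A sketch, assuming the scheme window is midnight Aug 1 to 23:59:59 Aug 31 in IST (the method and field names are mine for illustration):

```java
public class SchemeWindow {
    // Epoch millis identify an instant regardless of zone, so the current
    // time can be compared directly against the stored window bounds.
    public static boolean isActive(long startMillis, long endMillis) {
        long now = System.currentTimeMillis(); // same value in any time zone
        return now >= startMillis && now <= endMillis;
    }

    public static void main(String[] args) {
        long start = 1375295400000L; // Aug 01 2013 00:00:00 IST
        long end   = 1377973799000L; // Aug 31 2013 23:59:59 IST
        System.out.println(isActive(start, end)); // false: the window has passed
    }
}
```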