I have a JavaScript issue that I can't seem to solve:
I convert a Date to a timestamp, and when I convert it back, I get the correct DateTime:
DateTime to Timestamp:
var ts = Math.floor(Date.now() / 1000);
and back - Timestamp to Date:
var a = new Date(ts * 1000);
var months = ['Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec'];
var year = a.getFullYear();
var month = months[a.getMonth()];
var date = a.getDate();
var hour = a.getHours();
var min = a.getMinutes();
var sec = a.getSeconds();
var formattedTime = date + ' ' + month + ' ' + year + ' ' + hour + ':' + min + ':' + sec;
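To be explicit about what "correct" means here: as far as I can tell this round trip only drops the sub-second part, and a quick sanity check (just a sketch) agrees:
// sanity check for the round trip above - only the milliseconds are lost
var now = new Date();
var back = new Date(Math.floor(now.getTime() / 1000) * 1000);
console.log(now.getHours() === back.getHours()); // true - same local hour
console.log(now.getTime() - back.getTime());     // between 0 and 999 ms, nothing more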
The problem is that when I do the same thing with input data instead of Date.now(), the conversion to a timestamp works, but when I convert it back to a DateTime, the hour is exactly 4 hours earlier.
var destinationDateTimeStr_ =
    document.getElementById("dateyear").value + "-" +
    document.getElementById("datemonth").value + "-" +
    document.getElementById("dateday").value + "T" +
    document.getElementById("datehour").value + ":" +
    document.getElementById("dateminute").value + ":00";
// parse the date/time string into a Date object
var date2 = new Date(destinationDateTimeStr_); // e.g. "2018-06-16T15:35:00"
var ts_ = Math.floor(date2.getTime() / 1000);
and back - Timestamp to Date:
var a = new Date(ts_ * 1000);
var months = ['Jan','Feb','Mar','Apr','May','Jun','Jul','Aug','Sep','Oct','Nov','Dec'];
var year = a.getFullYear();
var month = months[a.getMonth()];
var date = a.getDate();
var hour = a.getHours();
var min = a.getMinutes();
var sec = a.getSeconds();
var formattedTime = date + ' ' + month + ' ' + year + ' ' + hour + ':' + min + ':' + sec;
//returns 2018-06-16T11:35:00 - 4 hours earlier than the input
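For reference, here is a minimal, self-contained version of that path, assuming the form fields held 2018, 06, 16, 15 and 35 (hypothetical values matching the comment above):
var parsed = new Date("2018-06-16T15:35:00");    // note: the string has no timezone designator
var roundTripped = new Date(Math.floor(parsed.getTime() / 1000) * 1000);
console.log(parsed.toString());        // the parsed value, printed in local time
console.log(roundTripped.toString());  // the value after the timestamp round trip
console.log(parsed.toISOString());     // the same instant expressed in UTC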
How can that be if all the conversions are done on the same client (i.e. in the same timezone)?
Then I tried:
var offset = new Date().getTimezoneOffset();
but it returns -180, which is only a 3-hour difference, not 4.
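As far as I understand, getTimezoneOffset() returns UTC minus local time in minutes, so -180 should mean my local time is UTC+3:
var offsetMinutes = new Date().getTimezoneOffset(); // -180 on my machine
var offsetHours = -offsetMinutes / 60;              // 3, i.e. local time is UTC+3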
Then I tried:
var tz = Intl.DateTimeFormat().resolvedOptions().timeZone;
but it gives me a text identifier (Asia/Jerusalem in my case), not a numeric offset.
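That identifier is an IANA time-zone name rather than a numeric offset; it can be passed to a formatter, e.g. (a sketch, reusing the Date a from above):
// tz is the string from above, e.g. "Asia/Jerusalem"
console.log(a.toLocaleString('en-US', { timeZone: tz })); // renders the same instant in that zone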
What am I doing wrong?