This question is purely to satisfy my curiosity.
In the JavaScript Date object, when you call getMonth()
it returns the month, but the count starts from 0 (January is 0, December is 11).
Link1
Date.prototype.getDate()
Returns the day of the month (1-31) for the specified date according to local time.
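A quick sketch of the inconsistency being asked about, using an arbitrary date of my own choosing: getMonth() counts from 0 while getDate() counts from 1.

```javascript
// getMonth() is zero-based, getDate() is one-based.
var d = new Date(1990, 0, 15); // month index 0 = January, so this is Jan 15, 1990
console.log(d.getMonth()); // 0  (January)
console.log(d.getDate());  // 15 (day of the month, counted from 1)
```

Note that the Date constructor itself also takes a zero-based month index, consistent with getMonth().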
Link2
A Date object contains a number representing a particular instant in time to within a millisecond. For example, if you specify 150 seconds, JavaScript represents that number as two minutes and 30 seconds.
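To illustrate that remark: a Date is internally just a millisecond count, and the component getters derive minutes and seconds from it. A minimal sketch (using UTC getters to avoid time-zone effects):

```javascript
// 150 seconds, stored as a single millisecond count since the epoch;
// the getters re-derive it as 2 minutes and 30 seconds.
var d = new Date(150 * 1000);   // 150,000 ms
console.log(d.getUTCMinutes()); // 2
console.log(d.getUTCSeconds()); // 30
```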
When you implement methods in JavaScript to find the difference between two times specified in milliseconds, you would need to return a duration, which must be greater than 0 for obvious reasons.
var startTime = new Date('1/1/1990');
// Note: getMilliseconds() returns only the millisecond component (0-999),
// not the full timestamp; for midnight on 1/1/1990 it is 0.
var startMsec = startTime.getMilliseconds();
startTime.setTime(5000000); // set the date to 5,000,000 ms after the epoch
var elapsed = (startTime.getTime() - startMsec) / 1000;
document.write(elapsed);
// Output: 5000
As explained by "SomeShinyObject", an array like
var months = ["January", "February", "March", "April", "May", "June", "July",
"August", "September", "October", "November", "December"];
helps in referencing month names through an array index. Hence getDay(), getHours(), and getMonth() start from 0.
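The point above can be shown in a couple of lines: because getMonth() is zero-based, its result doubles directly as an index into that array, with no off-by-one adjustment.

```javascript
// A zero-based getMonth() maps straight onto an array of month names.
var months = ["January", "February", "March", "April", "May", "June", "July",
              "August", "September", "October", "November", "December"];
var d = new Date(1990, 0, 1);      // January 1, 1990
console.log(months[d.getMonth()]); // "January"
```

If getMonth() counted from 1, every lookup would need a `- 1`, which is presumably the convenience being described.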