I would like help with this query, which determines the average hourly sales for a user for a given month and year. For most months everything is fine, but in this one case it produces 13884476534.2960. I know I have some math wrong here, but I can't figure out where. I thought it might be a divide-by-zero error, but wouldn't that return NULL?
The table has columns for the start and end of a sales day, stored as Unix timestamps in milliseconds. For ease of use, I also record the month/day/year created. Entering a clocked-in/clocked-out time is not mandatory, so for some days there is no real start/end time; in that case I set both columns to the same value (basically the current timestamp at the moment the shift was entered), so start and end are equal.
As I mentioned, I want to determine the hourly sales average for the month, so I run this:
select SUM(SalesTotal) /
SUM(
((SalesDayEndTime - SalesDayStartTime) / (1000.0 * 60.0 * 60.0))
)
from SalesDays s
where s.Month = 12
AND s.Year = 2016
AND s.user_id = 1
GROUP BY s.Month
For this particular month, I don't believe the user entered their hours, so the clock-in/clock-out times defaulted to the entry timestamp for each day and are equal. SUM(SalesTotal) produces 3846, so they made 3846 for that month. SUM((SalesDayEndTime - SalesDayStartTime) / (1000.0 * 60.0 * 60.0)) produces: 0.0000
That's interesting: I would have expected a divide-by-zero error, but that's not what happens; it returns this huge number instead.
When I just run SUM(SalesDayEndTime - SalesDayStartTime), it produces 1.
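To sanity-check the arithmetic outside the database, here is a small Python sketch. It assumes the two sums are exactly the values shown above (total sales of 3846 and a millisecond sum of 1); the real sums presumably differ slightly, which would explain why the result isn't exactly the number I'm seeing, but the magnitude comes out the same:

```python
# Reproduce the query's arithmetic with the observed sums.
total_sales = 3846  # SUM(SalesTotal) for the month
total_ms = 1        # SUM(SalesDayEndTime - SalesDayStartTime), in milliseconds

# Convert milliseconds to hours: a tiny non-zero value, not zero.
hours = total_ms / (1000.0 * 60.0 * 60.0)
print(f"{hours:.4f}")   # displays as 0.0000 at four decimal places

# Dividing by that tiny value blows up to roughly 1.38e10.
average = total_sales / hours
print(average)
```

So the displayed 0.0000 may just be a rounded rendering of a very small positive number rather than a true zero, which would mean there is no division by zero at all.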
Any ideas why the output is 13884476534.2960?
Thank you!