Refining Productivity & Efficiency Query


I'm trying to write a query that is, in effect, a Productivity and Efficiency Report for measuring the statistics of those of us (such as myself) who type up the paper work orders that our mechanics fill out when repairing our buses.



Here's a drawing of the report's layout:



+-------------+------------------------------------------++-------------------------------------------------------+
|             |                  DAILY                   ||                        MONTHLY                        |
+-------------+-------+-------+----------+-------+-------++-------+------------+-------+----------+-------+-------+
| Creator     | Count | Time  | Average  | Min.  | Max.  || Count | Daily Avg. | Time  | Average  | Min.  | Max.  |
+-------------+-------+-------+----------+-------+-------++-------+------------+-------+----------+-------+-------+
| williamsonb | 42    | 3:56  | 00:08:34 | 00:02 | 00:34 || ...   | ...        | ...   | ...      | ...   | ...   |
| petersonl   | ...   | ...   | ...      | ...   | ...   || ...   | ...        | ...   | ...      | ...   | ...   |
| ...         | ...   | ...   | ...      | ...   | ...   || ...   | ...        | ...   | ...      | ...   | ...   |
+-------------+-------+-------+----------+-------+-------++-------+------------+-------+----------+-------+-------+


Here's how our software "works" (if you could call it that): say I open a window to start typing up a work order that one of our mechanics has just completed, click 'Save' before I have ALL of the information typed in, and then keep typing the rest. If 'petersonl' starts typing up another work order after my initial 'Save' and finishes typing hers before I'm done, the system prevents her from clicking 'Save' until I'm finished with my work order. It's incredibly annoying.



If I created a Work Order at 9:00 AM, petersonl created one at 9:30 AM, I started another at 10:00 AM, and then she started another at 10:15 AM, the duration of my first Work Order would be 1:00:00, whereas the duration of her first work order would be 00:45:00. The query would look at the time between the work orders that each distinct user has created/updated, i.e. f(n+1) - f(n).



We finally got some laptops, so the mechanics will VERY soon be able to start entering the parts used on a work order (if any) themselves, which means our inventory will be more accurate from one moment to the next. This query will then need to shift from analyzing CreatedTime to UpdatedTime for those of us in data entry who STILL have to type in the mechanics' notes. And if/when the mechanics start typing ALL of their work directly into the database, the report will need to be modified to conditionally analyze CreatedTime and UpdatedTime (if not null). However, that's likely not going to happen for at least another year or two...
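For what it's worth, when that day comes, the conditional part could be as simple as coalescing the two timestamps. A minimal sketch, assuming the real table's UpdatedTime field described in the notes below (it is not part of the sample table that follows):

-- Sketch only: use UpdatedTime when it has been set, otherwise fall back to CreatedTime.
-- UpdatedTime is not included in the sample WorkOrders table below.
SELECT CreatedBy
      ,COALESCE(UpdatedTime, CreatedTime) AS EffectiveTime
FROM WorkOrders;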



Below is the SQL code for what I have so far:



CREATE TABLE WorkOrders(CreatedBy CHAR(25), CreatedTime DATETIME)

INSERT INTO WorkOrders values
('williamsonb', '2017/03/26 12:05:00.000 PM'),
('williamsonb', '2017/03/26 12:12:00.000 PM'),
('williamsonb', '2017/03/26 01:31:00.000 PM'),
('williamsonb', '2017/03/02 09:45:00.000 AM'),
('williamsonb', '2017/03/02 09:49:00.000 AM'),
('williamsonb', '2017/03/02 09:51:00.000 AM'),
('williamsonb', '2017/02/04 02:14:00.000 PM'),
('petersonl', '2017/03/26 07:42:00.000 AM'),
('petersonl', '2017/03/26 07:45:00.000 AM'),
('petersonl', '2017/03/26 07:46:00.000 AM'),
('petersonl', '2017/03/05 08:47:00.000 AM'),
('petersonl', '2017/03/05 08:49:00.000 AM'),
('petersonl', '2017/03/05 08:50:00.000 AM'),
('petersonl', '2017/01/17 06:54:00.000 AM');





DECLARE 
@DayCount INT = COUNT(SELECT CreatedBy
FROM WorkOrders
WHERE CreatedTime = DATEADD(DAY, -1, GETDATE()),

@DayTimeSum DECIMAL = SUM(DATEDIFF(HOUR, SELECT (HOUR(CreatedTime)
FROM WorkOrders
WHERE DATEADD(DAY,-1,GETDATE()))
+ DATEDIFF(MINUTE, SELECT (MINUTE(CreationDate)/60
FROM WorkOrders
WHERE DATEADD(DAY,-1,GETDATE()))
+ DATEDIFF(SECOND, SELECT (SECOND(CreationDate)/3600
FROM WorkOrders
WHERE DATEADD(DAY,-1,GETDATE()))),

@DaySumHours INT = @DayTimeSum - MOD(@DayTimeSum % 1),

@DaySumMinutes INT = (MOD(@DayTimeSum % 1) - MOD(MOD(@DayTimeSum % 1) % 1/60) ) * 60,

@DayAvgTime DECIMAL = @DayCount / @DayTimeSum,

@DayAvgHours INT = @DayAvgTime - MOD(@DayAvgTime % 1),

@DayAvgMinutes INT = (MOD(@DayAvgTime % 1) - MOD(MOD(@DayAvgTime % 1) % 1/60)) * 60,

@DayAvgSeconds INT = ((MOD(@DayAvgTime % 1)
- MOD(MOD(@DayAvgTime % 1) % 1/60))
- MOD((MOD(@DayAvgTime % 1)
- MOD(MOD(@DayAvgTime % 1) % 1/60) ) % 1/3600) ) * 3600,

@MonthTimeSum DECIMAL = SUM(DATEDIFF(HOUR, SELECT (HOUR(CreatedTime)
FROM WorkOrders
WHERE DATEADD(MONTH,-1,GETDATE()))
+ DATEDIFF(MINUTE, SELECT (MINUTE(CreationDate)/60
FROM WorkOrders
WHERE DATEADD(MONTH,-1,GETDATE()))
+ DATEDIFF(SECOND, SELECT (SECOND(CreationDate)/3600
FROM WorkOrders
WHERE DATEADD(MONTH,-1,GETDATE())));

SELECT DISTINCT CreatedBy
,@DayCount AS [Count]
,@DaySumHours + ':' + @DaySumMinutes + ':' + @DaySumSeconds AS [Total]
,@DayAvgHours + ':' + @DayAvgMinutes + ':' + @DayAvgSeconds AS [Average]
,LAST(SELECT CAST(SELECT CAST(CreatedTime AS TIME) AS CHAR(5))) AS [Min.]
,TOP 1(CAST(SELECT CAST(CreatedTime AS TIME) AS CHAR(5))) AS [Max.]
FROM WorkOrders
WHERE CreatedTime BETWEEN DATEADD(MONTH,-1,GETDATE()) AND GETDATE()
GROUP BY CreatedBy
ORDER BY CreatedTime DESC;


At the very least I'm having trouble working out the 'Min.' and 'Max.' columns.




Notes:




  1. If you're scratching your head wondering why I chose to use so many variables: I'm still working this query out, so it wouldn't surprise me if there are ways to optimize this report computation-wise.


  2. I'm not sure how to go about obtaining a reasonable list of the time duration between consecutive CreatedTime values, and I know there is no way to get the duration for the very last Work Order entered each day, but that's tolerable.


  3. The ReportBuilder interface built into our shop's software uses... Delphi? However, I'm pretty confident that our Technology Dept. uses Microsoft SQL Server for upkeep on the database, but I don't have permission to get software that powerful installed on my computer.


  4. The reason the 'Average' columns have the time format HH:MM:SS and yet the 'Time', 'Min.', and 'Max.' columns are of the format HH:MM is because when CreatedTime is recorded in our software it always comes out in the format HH:MM:00.000.


  5. I went ahead and let the 'Average' columns have seconds so as to make the read-out more differentiated from everything else.


  6. CreatedTime is the time that progress on the work order was first saved to the system. There is another field called UpdatedTime, which holds the datetime when the work order was modified after its creation; it stays null until a work order that has already been saved once is edited and saved again.











Tags: sql-server, count, time, max






asked Mar 27 '17 at 18:46 by Ben Williamson (last edited Mar 28 '17 at 14:53)





  • I would seriously question the value of the report you are writing. As I understand it, it doesn't reflect how long you actually spend entering a WO. It doesn't reflect taking breaks, or the complexity of one WO vs. the next. You might as well take their first and last times and divide by the number. Assuming UpdatedTime is set when you click save, you would be better off working out the duration for each WO by subtracting CreatedTime from UpdatedTime. Then work out count, max, min, avg duration per user.

    – Sir Swears-a-lot
    Mar 28 '17 at 8:05













  • Actually, I think @Peter is making the same point I brought up. However, this may be a matter of misunderstanding what you're actually measuring. I think many readers thought you were measuring the time it took to fulfill a work order. Now, I think I understand better. Your mechanics complete a work order in full on paper, and then you have to type it into the system. It's the time it takes to enter it into the system that you're trying to measure, which makes more sense.

    – RDFozz
    Mar 28 '17 at 14:18











    Do you want to ignore the last work order of the day, or assume it was completed at EOD (let's say, 17:00)?

    – RDFozz
    Mar 28 '17 at 15:06











  • Can you confirm that the SQL you've shown above is pencil and paper stuff? It seems to involve some functions that don't exist, and assumes that you can DECLARE and set a variable, and then reference it when setting another variable in the same DECLARE, which doesn't work.

    – RDFozz
    Mar 28 '17 at 15:32













  • @RDFozz - It was more pen-and-paper kind of stuff. I'm guessing that if the resulting list of work order processing times were sorted by their duration, the TOP 1 and LAST expressions could probably be used to get the 'Min.' and 'Max.' times, but I couldn't get that to work.

    – Ben Williamson
    Mar 28 '17 at 15:40



















2 Answers




















All right - I think this will get you most of the way there. NOTE: I'm using some of the same things as Paparazzi, but put this together from scratch myself. I've tried to make some of the steps obvious, so it's entirely possible his solution would be faster.



-- Test Data
CREATE TABLE #WorkOrders(CreatedBy CHAR(25), CreatedTime DATETIME)

INSERT INTO #WorkOrders values
('williamsonb', '2017/03/27 12:05:00.000 PM'),
('williamsonb', '2017/03/27 12:12:00.000 PM'),
('williamsonb', '2017/03/27 01:31:00.000 PM'),
('williamsonb', '2017/03/26 09:04:00.000 PM'),
('williamsonb', '2017/03/26 09:12:00.000 PM'),
('williamsonb', '2017/03/26 10:32:00.000 PM'),
('williamsonb', '2017/03/02 09:45:00.000 AM'),
('williamsonb', '2017/03/02 09:49:00.000 AM'),
('williamsonb', '2017/03/02 09:51:00.000 AM'),
('williamsonb', '2017/02/04 02:14:00.000 PM'),
('petersonl', '2017/03/27 07:42:00.000 AM'),
('petersonl', '2017/03/27 07:45:00.000 AM'),
('petersonl', '2017/03/27 07:46:00.000 AM'),
('petersonl', '2017/03/26 08:41:00.000 AM'),
('petersonl', '2017/03/26 08:46:00.000 AM'),
('petersonl', '2017/03/26 08:48:00.000 AM'),
('petersonl', '2017/03/05 08:47:00.000 AM'),
('petersonl', '2017/03/05 08:49:00.000 AM'),
('petersonl', '2017/03/05 08:50:00.000 AM'),
('petersonl', '2017/01/17 06:54:00.000 AM');



WITH BaseData AS
(
    SELECT CreatedBy
          ,CreatedTime
          ,LEAD(CreatedTime, 1) OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime)
                                      ORDER BY CreatedTime) AS EndTime
    FROM #WorkOrders
)
--SELECT * FROM BaseData
,DataWDuration AS
(
    SELECT CreatedBy
          ,CreatedTime
          ,EndTime
          ,CASE WHEN DATEDIFF(SECOND, CreatedTime, EndTime) < 60 THEN 60
                ELSE DATEDIFF(SECOND, CreatedTime, EndTime) END AS Duration
    FROM BaseData
)
,DayTotal AS
(
    SELECT CreatedBy
          ,COUNT(*)      AS dailyCount
          ,SUM(Duration) AS dailyTimeSec
          ,AVG(Duration) AS DailyAvgTimeSec
          ,MIN(Duration) AS DailyMinTimeSec
          ,MAX(Duration) AS DailyMaxTimeSec
    FROM DataWDuration
    WHERE DATEDIFF(DAY, 0, CreatedTime) = DATEDIFF(DAY, 1, GETDATE()) -- compares date part of CreatedTime with yesterday's date
    GROUP BY CreatedBy
)
,MTDTotal AS
(
    SELECT CreatedBy
          ,COUNT(*) AS mtdCount
          ,COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) AS ActiveDays
          ,COUNT(*) * 1.0 / DATEPART(DAY, GETDATE()) AS mtdAvgDailyCount
          ,COUNT(*) * 1.0 / COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) AS mtdActiveDayAvgCount
          ,SUM(Duration) AS mtdTimeSec
          ,AVG(Duration) AS mtdAvgTimeSec
          ,MIN(Duration) AS mtdMinTimeSec
          ,MAX(Duration) AS mtdMaxTimeSec
    FROM DataWDuration
    WHERE CreatedTime >= DATEADD(DAY, 1 - DATEPART(DAY, DATEADD(DAY, -1, GETDATE())),
                                 CAST(DATEDIFF(DAY, 1, GETDATE()) AS DATETIME)) -- picks up all activity for current calendar month to date
      AND CreatedTime < DATEDIFF(DAY, 0, GETDATE()) -- ignore anything entered today
    GROUP BY CreatedBy
)
SELECT *
FROM DayTotal dt
FULL JOIN MTDTotal mt ON (dt.CreatedBy = mt.CreatedBy);


So, the first part is just creating some data to work with. I took what you had provided, and added data for 3/27 (so there would be something from yesterday to look at).



Next, you see the query. I broke down the process using CTEs (Common Table Expressions), which are basically sub-queries you can reference in multiple places in your query (with a few bonus features, like recursive CTEs, but that's not necessary for this). If you want to see exactly what any of the CTEs returns, comment out everything after that CTE, put in SELECT * FROM and the CTE name; see the example right below the BaseData CTE.
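Spelled out, that debugging pattern might look like this (keeping only the first CTE and ending the statement early; illustrative only):

-- Sketch: stop after the first CTE and inspect its rows, then restore the later CTEs.
WITH BaseData AS
(
    SELECT CreatedBy
          ,CreatedTime
          ,LEAD(CreatedTime, 1) OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime)
                                      ORDER BY CreatedTime) AS EndTime
    FROM #WorkOrders
)
SELECT * FROM BaseData;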



The BaseData CTE adds an end time for each of the WorkOrder rows. Note that the end time for the last work order on any given day is NULL. As Paparazzi suggested, the LEAD function helps us out here; it gives us the CreatedTime of the next record to use as the EndTime of the current record. The OVER clause is what makes this work; it tells SQL to just use the records for a specific CreatedBy, for a specific date, ordered by CreatedTime.



We need to find all the records for a specific date, regardless of the time, so we need a way to strip the time out of CreatedTime. There are a number of ways to do this. The fastest way is CAST(CreatedTime as DATE); however, this only works with SQL Server 2008 and up, and I wasn't certain what version you had. The second fastest way is supposed to be CAST(DATEDIFF(DAY, 0, CreatedTime) as datetime). This uses the fact that datetime values can be treated as floating-point numbers, with the part left of the decimal point representing the date and the part right of the decimal point representing the time. So, we get the number of whole days between the date represented by 0 and the CreatedTime value, then we have the server cast that back to a datetime value. We get the same date, with the time set to midnight. If you are using SQL 2008 or later, I'd replace this with casting the datetime value to datatype DATE.
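As a quick side-by-side of the two approaches (the DATE cast needs SQL Server 2008 or later; the column aliases are just for illustration):

-- Two ways to strip the time portion from CreatedTime, plus the bare day number,
-- which is all you need when you only compare dates.
SELECT CreatedTime
      ,CAST(CreatedTime AS DATE)                       AS DateOnly2008Plus
      ,CAST(DATEDIFF(DAY, 0, CreatedTime) AS DATETIME) AS DateOnlyAtMidnight
      ,DATEDIFF(DAY, 0, CreatedTime)                   AS DayNumber
FROM #WorkOrders;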



Note that when we just need to compare two dates, the number of days value works as well as the actual date, so we save some processing by not doing the final CAST to datetime.



The LEAD function accepts three arguments - the field to look at; how many rows ahead to look; and the default value to use if there is no row there. The default for the third argument is NULL; that works with what we're doing, so we just used that.
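For what it's worth, if you ever decided to treat the last order of the day as running until a fixed end of day (the 17:00 idea raised in the comments) rather than ignoring it, that third argument is where it would go. A sketch only, assuming a hard-coded 17:00 close:

-- Sketch: default the missing "next" time to 17:00 of the same day instead of NULL.
SELECT CreatedBy
      ,CreatedTime
      ,LEAD(CreatedTime, 1,
            DATEADD(HOUR, 17, CAST(DATEDIFF(DAY, 0, CreatedTime) AS DATETIME)))
       OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime)
             ORDER BY CreatedTime) AS EndTime
FROM #WorkOrders;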



A final note on BaseData: this would be a good point to limit the rows you're bringing back. I didn't, because the test data is small, and because I wasn't sure whether you'll ultimately want to use this to pull back quarterly or annual results. We do put limits in later, for the specific subtotals you want. However, I would recommend limiting the rows you look at in practice, as prepping a year's worth of data to get monthly totals would take much longer than necessary.
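As an illustration of what such a limit could look like (this one anchors on today's date; the month-to-date filter further down anchors on yesterday's, so adjust to taste):

-- Sketch: compute the first day of the current month once, then filter on it;
-- the same predicate could sit inside BaseData's WHERE clause.
DECLARE @FirstOfMonth DATETIME =
    DATEADD(DAY, 1 - DATEPART(DAY, GETDATE()),
            CAST(DATEDIFF(DAY, 0, GETDATE()) AS DATETIME));

SELECT CreatedBy, CreatedTime
FROM #WorkOrders
WHERE CreatedTime >= @FirstOfMonth;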



The DataWDuration CTE simply adds the amount of time (in seconds) that each work order took to process. You could combine this with the previous step, but the SQL gets really hard to follow, and I wanted to avoid that. Again, the last work order of each day has a NULL EndTime, and that means it has a NULL Duration.



The DayTotal CTE gives us the requested values for everyone who worked yesterday. Since we have an entry for each work order, getting MIN and MAX is straightforward, as are the COUNT and AVG.



Two notes here: COUNT(*) gives you the total number of rows in your selection (for each group from GROUP BY). COUNT(Duration), on the other hand, would only count the rows where Duration was not NULL. I'm assuming that, even though we have to ignore the last work order of the day for the purposes of MIN, MAX, and AVG, we want it included in the count.
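A tiny standalone illustration of that difference, using a throwaway row set rather than the report data:

-- COUNT(*) counts every row; COUNT(Duration) skips the NULL one.
SELECT COUNT(*)        AS AllRows          -- 3
      ,COUNT(Duration) AS NonNullDurations -- 2
FROM (VALUES (60), (180), (NULL)) AS t(Duration);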



Also, we're using that "number of days" trick again to find the data for yesterday. Just like DATEDIFF(DAY, 0, CreatedTime) gives us the date part of CreatedTime, DATEDIFF(DAY, 0, GETDATE()) gives us the date part of today's date. However, we actually want the date part of yesterday's date. So, we want the number of days between the zero date and today's date - 1; which, it turns out, is the same as the number of days between the date that corresponds to day 1, and today's date. Keep this in mind if you're looking to replace this method with the "CAST as DATE" method.
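For reference, the same "yesterday" filter rewritten with the DATE cast (SQL Server 2008 and up) would be:

-- Equivalent to: DATEDIFF(DAY, 0, CreatedTime) = DATEDIFF(DAY, 1, GETDATE())
SELECT CreatedBy, CreatedTime
FROM #WorkOrders
WHERE CAST(CreatedTime AS DATE) = CAST(DATEADD(DAY, -1, GETDATE()) AS DATE);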



Next, we have the MTDTotal (for Month-To-Date) CTE. I wasn't sure how you were defining a month, so I went with this. If you want to use DATEADD(MONTH, -1, GETDATE()) (SQL's definition of "one month ago"), or DATEADD(DAY, -30, GETDATE()), you should be able to modify this appropriately.
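For reference, those two alternative definitions would drop in as filters roughly like this (sketches only; pick whichever matches how your shop defines a month):

-- "One month ago" per SQL Server's calendar arithmetic:
SELECT COUNT(*) AS RollingMonthCount
FROM #WorkOrders
WHERE CreatedTime >= DATEADD(MONTH, -1, GETDATE());

-- A rolling 30 days:
SELECT COUNT(*) AS Rolling30DayCount
FROM #WorkOrders
WHERE CreatedTime >= DATEADD(DAY, -30, GETDATE());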



How you'd want to calculate the daily average number of work orders wasn't 100% clear, so I gave you two options. The first calculates the average based on the number of work orders by the person, divided by the number of days in the month so far (again, you should be able to adjust this for a different definition of month, if necessary). The second gives the average number of work orders per day that this particular person worked. So, if Ms. Peterson did 21 work orders on the 7 days she worked, and you did 30 on the 10 days you worked, you'd both have an average of 3. You should be able to modify this to even figure out working days (based on dates where someone had a work order), by pulling COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) for the same set of records we're looking at now (without a GROUP BY), by adding another CTE.
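A hedged sketch of that extra step, written here as a standalone query against the test table (it could just as easily become another CTE on DataWDuration and be used as the divisor):

-- Distinct days so far this month on which anyone entered a work order,
-- using the same month boundary as the MTDTotal CTE above.
SELECT COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) AS ShopWorkDays
FROM #WorkOrders
WHERE CreatedTime >= DATEADD(DAY, 1 - DATEPART(DAY, DATEADD(DAY, -1, GETDATE())),
                             CAST(DATEDIFF(DAY, 1, GETDATE()) AS DATETIME))
  AND CreatedTime < DATEDIFF(DAY, 0, GETDATE());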



Again, just to cover it, the WHERE clause is restricting the query to rows where CreatedTime is midnight on the first day of the current month or later, but is before midnight today. We're using the "number of days" trick again, but actually casting the value to datetime. We move back to the first of the month by getting the "day" part of yesterday's date, and subtracting (day - 1) from the date (for instance, if today is April 1, then yesterday was March 31, so we subtract 30 days to get March 1; if today is April 2, then yesterday was April 1, so we subtract 0 days to get April 1!).
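If it helps to see that arithmetic with a fixed date instead of GETDATE(), here is the same expression evaluated for a pretend "today" of April 1 (purely illustrative):

-- Pretend today is 2017-04-01; yesterday is 2017-03-31, its day number is 31,
-- so we add (1 - 31) = -30 days to 2017-03-31 and land on 2017-03-01.
DECLARE @Today DATETIME = '20170401';
SELECT DATEADD(DAY,
               1 - DATEPART(DAY, DATEADD(DAY, -1, @Today)),
               CAST(DATEDIFF(DAY, 1, @Today) AS DATETIME)) AS FirstOfReportMonth;  -- 2017-03-01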



As I alluded to above, you could add CTEs to give you quarterly or annual totals, if you wanted.



I used a FULL join to display the values, so that you would still see the monthly totals for someone who was out yesterday. I left formatting the seconds as time values up to you. I do have a recommendation; if you're translating the seconds to "HH:MM" (not "HH:MM:SS"), then add 30 seconds when you're doing your calculations; that will round the result up to the next minute (329 seconds = "00:05:29" would become "00:05"; 330 seconds = "00:05:30" would become "00:06").
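One possible way to do that final formatting, as a sketch (the added 30 seconds gives the rounding behaviour described above; this only holds for totals under 24 hours):

-- Seconds -> 'HH:MM', rounded to the nearest minute.
DECLARE @Seconds INT = 329;   -- i.e. 00:05:29
SELECT CONVERT(VARCHAR(5), DATEADD(SECOND, @Seconds + 30, 0), 108) AS HHMM;  -- '00:05'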



Results shown below (I manually edited out the second CreatedBy column, and of course using the column names rather than * would have prevented it altogether):



CreatedBy    dailyCount  dailyTimeSec  DailyAvgTimeSec  DailyMinTimeSec  DailyMaxTimeSec  mtdCount  ActiveDays  mtdAvgDailyCount  mtdActiveDayAvgCount  mtdTimeSec  mtdAvgTimeSec  mtdMinTimeSec  mtdMaxTimeSec
-----------  ----------  ------------  ---------------  ---------------  ---------------  --------  ----------  ----------------  --------------------  ----------  -------------  -------------  -------------
petersonl    3           240           120              60               180              9         3           0.321428571428    3.000000000000        840         140            60             300
williamsonb  3           5160          2580             420              4740             9         3           0.321428571428    3.000000000000        10800       1800           120            4800


Hope this helps.



[EDIT: updated the calculation of Duration to ensure that any values less than 60 seconds are bumped up to 60 seconds; NULL values remain NULL].







































Here is part of it:



select cast(CreatedTime as Date) as CreatedDate, CreatedBy      
, CONVERT(VARCHAR(8), min(CreatedTime), 108) as [start]
, CONVERT(VARCHAR(8), max(CreatedTime), 108) as [stop]
, datediff(MINUTE, min(CreatedTime), MAX(CreatedTime)) as [start-stop]
, count(*) as count
, cast(datediff(MINUTE, min(CreatedTime), MAX(CreatedTime)) / cast((count(*) - 1) as decimal) as decimal(6,2)) as [avg]
from WorkOrders
group by cast(CreatedTime as Date), CreatedBy
having count(*) > 1
union
select cast(CreatedTime as Date) as CreatedDate, CreatedBy
, CONVERT(VARCHAR(8), max(CreatedTime), 108)
, null
, null
, count(*) as count
, null
from WorkOrders
group by cast(CreatedTime as Date), CreatedBy
having count(*) = 1
order by cast(CreatedTime as Date) desc, CreatedBy


This has everything you need:



select [CreatedDate], CreatedBy
, CONVERT(VARCHAR(8), CreatedTime, 108) as [time]
, CONVERT(VARCHAR(8), [min], 108) as [min]
, CONVERT(VARCHAR(8), [max], 108) as [max]
, datediff(MINUTE, [min], [max]) as [min-max]
, CONVERT(VARCHAR(8), [next], 108) as [next]
, datediff(MINUTE, [next], [CreatedTime]) as [time-next]
, [count]
, [first], [last]
from
( select CreatedBy, cast(CreatedTime as date) as [CreatedDate], CreatedTime
, LEAD(CreatedTime) OVER (PARTITION BY CreatedBy, cast(CreatedTime as date) ORDER BY CreatedTime desc) AS [next]
, min(CreatedTime) OVER (PARTITION BY CreatedBy, cast(CreatedTime as date) ORDER BY CreatedTime) AS [min]
, max(CreatedTime) OVER (PARTITION BY CreatedBy, cast(CreatedTime as date) ORDER BY CreatedTime) AS [max]
, COUNT(*) OVER (PARTITION BY CreatedBy, cast(CreatedTime as date)) AS [count]
, ROW_NUMBER() OVER (PARTITION BY CreatedBy, cast(CreatedTime as date) ORDER BY CreatedTime) AS [last]
, ROW_NUMBER() OVER (PARTITION BY CreatedBy, cast(CreatedTime as date) ORDER BY CreatedTime desc) AS [first]
from WorkOrders
) tt
-- where tt.first = 1 or tt.last = 1
order by tt.CreatedDate desc, CreatedBy





– paparazzo, answered Mar 28 '17 at 15:30 (edited Mar 29 '17 at 16:58)


    All right - I think this will get you most of the way there. NOTE: I'm using some of the same techniques as paparazzo, but I put this together from scratch myself, and I've tried to keep the individual steps obvious rather than optimized, so it's entirely possible his solution would be faster.



    -- Test Data
    CREATE TABLE #WorkOrders(CreatedBy CHAR(25), CreatedTime DATETIME)

    INSERT INTO #WorkOrders values
    ('williamsonb', '2017/03/27 12:05:00.000 PM'),
    ('williamsonb', '2017/03/27 12:12:00.000 PM'),
    ('williamsonb', '2017/03/27 01:31:00.000 PM'),
    ('williamsonb', '2017/03/26 09:04:00.000 PM'),
    ('williamsonb', '2017/03/26 09:12:00.000 PM'),
    ('williamsonb', '2017/03/26 10:32:00.000 PM'),
    ('williamsonb', '2017/03/02 09:45:00.000 AM'),
    ('williamsonb', '2017/03/02 09:49:00.000 AM'),
    ('williamsonb', '2017/03/02 09:51:00.000 AM'),
    ('williamsonb', '2017/02/04 02:14:00.000 PM'),
    ('petersonl', '2017/03/27 07:42:00.000 AM'),
    ('petersonl', '2017/03/27 07:45:00.000 AM'),
    ('petersonl', '2017/03/27 07:46:00.000 AM'),
    ('petersonl', '2017/03/26 08:41:00.000 AM'),
    ('petersonl', '2017/03/26 08:46:00.000 AM'),
    ('petersonl', '2017/03/26 08:48:00.000 AM'),
    ('petersonl', '2017/03/05 08:47:00.000 AM'),
    ('petersonl', '2017/03/05 08:49:00.000 AM'),
    ('petersonl', '2017/03/05 08:50:00.000 AM'),
    ('petersonl', '2017/01/17 06:54:00.000 AM');



    WITH BaseData AS
    (SELECT CreatedBy
    ,CreatedTime
    ,LEAD(CreatedTime,1) OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime) ORDER BY CreatedTime) as EndTime
    FROM #WorkOrders
    )
    --SELECT * FROM BaseData
    ,DataWDuration AS
    (SELECT CreatedBy
    ,CreatedTime
    ,EndTime
    ,CASE WHEN DATEDIFF(SECOND, CreatedTime, EndTime) < 60 THEN 60 ELSE DATEDIFF(SECOND, CreatedTime, EndTime) END as Duration
    FROM BaseData
    )
    ,DayTotal AS
    (SELECT CreatedBy
    ,COUNT(*) as dailyCount
    ,SUM(Duration) as dailyTimeSec
    ,AVG(Duration) as DailyAvgTimeSec
    ,MIN(Duration) as DailyMinTimeSec
    ,MAX(Duration) as DailyMaxTimeSec
    FROM DataWDuration
    WHERE DATEDIFF(DAY, 0, CreatedTime) = DATEDIFF(DAY, 1, GETDATE()) -- compares date part of CreatedTime with yesterday's date
    GROUP BY CreatedBy
    )
    ,MTDTotal AS
    (SELECT CreatedBy
    ,COUNT(*) as mtdCount
    ,COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) as ActiveDays
    ,COUNT(*) * 1.0 / DATEPART(DAY, GETDATE()) as mtdAvgDailyCount
    ,COUNT(*) * 1.0 / COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) as mtdActiveDayAvgCount
    ,SUM(Duration) as mtdTimeSec
    ,AVG(Duration) as mtdAvgTimeSec
    ,MIN(Duration) as mtdMinTimeSec
    ,MAX(Duration) as mtdMaxTimeSec
    FROM DataWDuration
    WHERE CreatedTime >= DATEADD(DAY, 1 - DATEPART(DAY, DATEADD(DAY, -1, GETDATE())), CAST(DATEDIFF(DAY, 1, GETDATE()) as datetime)) -- picks up all activity for current calendar month to date
    AND CreatedTime < DATEDIFF(DAY, 0, GETDATE()) -- ignore anything entered today.
    GROUP BY CreatedBy
    )
    SELECT *
    FROM DayTotal dt
    FULL JOIN MTDTotal mt ON (dt.CreatedBy = mt.CreatedBy)
    ;


    So, the first part is just creating some data to work with. I took what you had provided, and added data for 3/27 (so there would be something from yesterday to look at).



    Next, you see the query. I broke down the process using CTEs (Common Table Expressions), which are basically sub-queries you can reference in multiple places in your query (with a few bonus features, like recursive CTEs, but those aren't needed here). If you want to see exactly what any of the CTEs returns, comment out everything after that CTE and put in SELECT * FROM followed by the CTE name; see the commented-out example right below the BaseData CTE.



    The BaseData CTE adds an end time for each of the WorkOrder rows. Note that the end time for the last work order on any given day is NULL. As paparazzo suggested, the LEAD function helps us out here; it gives us the CreatedTime of the next record to use as the EndTime of the current record. The OVER clause is what makes this work; it tells SQL to only use the records for a specific CreatedBy, for a specific date, ordered by CreatedTime.



    We need to find all the records for a specific date, regardless of the time, so we need a way to strip the time out of CreatedTime. There are a number of ways to do this. The fastest way is CAST(CreatedTime as DATE); however, this only works with SQL Server 2008 and up, and I wasn't certain what version you had. The second fastest way is supposed to be CAST(DATEDIFF(DAY, 0, CreatedTime) as datetime). This exploits the fact that datetime values behave, in effect, like floating-point numbers, with the part left of the decimal point representing the date and the part right of the decimal point representing the time. So, we get the number of days between the date represented by 0 and the date in question, then have the server cast that back to a datetime value; the result is that same date, with the time set to midnight. If you are using SQL 2008 or later, I'd replace this with casting the datetime value to datatype DATE.
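

    For example, here is a minimal sketch of the two approaches side by side (the first form works on older versions; the second needs SQL Server 2008 or later):


    -- Pre-2008: day count since the zero date, cast back to datetime -> today at midnight
    SELECT CAST(DATEDIFF(DAY, 0, GETDATE()) as datetime) as MidnightToday;

    -- SQL Server 2008+: just cast to DATE
    SELECT CAST(GETDATE() as DATE) as TodayAsDate;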



    Note that when we just need to compare two dates, the number of days value works as well as the actual date, so we save some processing by not doing the final CAST to datetime.



    The LEAD function accepts three arguments - the field to look at; how many rows ahead to look; and the default value to use when there is no row there. The default for that third argument is NULL, which is exactly what we want here, so we simply left it off.
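

    Spelled out against the same #WorkOrders table, the call in BaseData is equivalent to this sketch with all three arguments written explicitly:


    -- The trailing NULL is what the two-argument form defaults to anyway
    SELECT CreatedBy
    ,CreatedTime
    ,LEAD(CreatedTime, 1, NULL) OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime) ORDER BY CreatedTime) as EndTime
    FROM #WorkOrders;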



    A final note on BaseData: this would be a good point to limit the rows you're bringing back. I didn't, because the test data is small, and because you may ultimately want to use this to pull back quarterly or annual results. We do put limits in later, for the specific subtotals you want. However, I would recommend limiting the rows you look at in practice, as prepping a year's worth of data just to get daily and month-to-date totals would take much longer than necessary.
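

    As a rough sketch of what that pre-filtering could look like (assuming the calendar-month-plus-yesterday window used below is all you need), the WHERE clause can go straight into the CTE; because the partitions are per day and the cut-offs fall on day boundaries, the LEAD results within the days you keep shouldn't change:


    -- Hypothetical pre-filter: only read the month that contains yesterday, and nothing from today
    WITH BaseData AS
    (SELECT CreatedBy
    ,CreatedTime
    ,LEAD(CreatedTime,1) OVER (PARTITION BY CreatedBy, DATEDIFF(DAY, 0, CreatedTime) ORDER BY CreatedTime) as EndTime
    FROM #WorkOrders
    WHERE CreatedTime >= DATEADD(MONTH, DATEDIFF(MONTH, 0, DATEADD(DAY, -1, GETDATE())), 0)
    AND CreatedTime < CAST(DATEDIFF(DAY, 0, GETDATE()) as datetime)
    )
    SELECT * FROM BaseData;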



    The DataWDuration simply adds the amount of time (in seconds) that each work order took to process. You could combine this with the previous step, but the SQL gets really hard to follow, and I wanted to avoid that. Again, the last work order of each day has a NULL EndTime, and that means it has a NULL Duration.



    The DayTotal CTE gives us the requested values for everyone who worked yesterday. Since we have an entry for each work order, getting MIN and MAX is straightforward, as are COUNT and AVG.



    Two notes here: COUNT(*) gives you the total number of rows in your selection (for each group from GROUP BY). COUNT(Duration), on the other hand, would only count the rows where Duration was not NULL. I'm assuming that, even though we have to ignore the last work order of the day for the purposes of MIN, MAX, and AVG, we want it included in the count.
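

    A quick illustration of the difference (the VALUES row constructor needs SQL Server 2008+, but the behavior it shows is the same on older versions):


    -- COUNT(*) counts every row; COUNT(x) skips the NULL
    SELECT COUNT(*) as AllRows, COUNT(x) as NonNullRows
    FROM (VALUES (60), (120), (NULL)) as t(x);   -- returns 3 and 2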



    Also, we're using that "number of days" trick again to find the data for yesterday. Just like DATEDIFF(DAY, 0, CreatedTime) gives us the date part of CreatedTime, DATEDIFF(DAY, 0, GETDATE()) gives us the date part of today's date. However, we actually want the date part of yesterday's date. So, we want the number of days between the zero date and today's date - 1; which, it turns out, is the same as the number of days between the date that corresponds to day 1, and today's date. Keep this in mind if you're looking to replace this method with the "CAST as DATE" method.
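

    In other words, something like this sketch (the second form assumes SQL Server 2008+):


    -- Yesterday as a day number, the way the WHERE clause above does it
    SELECT DATEDIFF(DAY, 1, GETDATE()) as YesterdayDayNumber;

    -- The CAST-as-DATE equivalent: step back one day first, then strip the time
    SELECT CAST(DATEADD(DAY, -1, GETDATE()) as DATE) as Yesterday;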



    Next, we have the MTDTotal (for Month-To-Date) CTE. I wasn't sure how you were defining a month, so I went with this. If you want to use DATEADD(MONTH, -1, GETDATE()) (SQL's definition of "one month ago"), or DATEADD(DAY, -30, GETDATE()), you should be able to modify this appropriately.
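

    For what it's worth, the cut-offs for those two alternative definitions could be computed like this (a sketch; swap whichever you prefer into the MTDTotal WHERE clause):


    -- "One calendar month back" and "trailing 30 days", both anchored to midnight
    SELECT DATEADD(MONTH, -1, CAST(DATEDIFF(DAY, 0, GETDATE()) as datetime)) as OneMonthBack
    , DATEADD(DAY, -30, CAST(DATEDIFF(DAY, 0, GETDATE()) as datetime)) as ThirtyDaysBack;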



    How you'd want to calculate the daily average number of work orders wasn't 100% clear, so I gave you two options. The first calculates the average based on the number of work orders by the person, divided by the number of days in the month so far (again, you should be able to adjust this for a different definition of month, if necessary). The second gives the average number of work orders per day that this particular person worked. So, if Ms. Peterson did 21 work orders on the 7 days she worked, and you did 30 on the 10 days you worked, you'd both have an average of 3. You should even be able to base this on overall working days (dates where anyone had a work order) by pulling COUNT(DISTINCT DATEDIFF(DAY, 0, CreatedTime)) for the same set of records we're looking at now (without the GROUP BY), in another CTE.



    Again, just to cover it, the WHERE clause is restricting the query to rows where CreatedTime is midnight on the first day of the current month or later, but is before midnight today. We're using the "number of days" trick again, but actually casting the value to datetime. We move back to the first of the month by getting the "day" part of yesterday's date, and subtracting (day - 1) days from it (for instance, if today is April 1, then yesterday was March 31, so we subtract 30 days to get March 1; if today is April 2, then yesterday was April 1, so we subtract 0 days and stay on April 1).
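

    If you find it easier to read, this sketch should give the same month start (the "months since the zero date" version of the same trick):


    -- First day of the month that contains yesterday
    SELECT DATEADD(MONTH, DATEDIFF(MONTH, 0, DATEADD(DAY, -1, GETDATE())), 0) as MonthStart;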



    As I alluded to above, you could add CTEs to give you quarterly or annual totals, if you wanted.



    I used a FULL join to display the values, so that you would still see the monthly totals for someone who was out yesterday. I left formatting the seconds as time values up to you. I do have a recommendation; if you're translating the seconds to "HH:MM" (not "HH:MM:SS"), then add 30 seconds when you're doing your calculations; that will round the result up to the next minute (329 seconds = "00:05:29" would become "00:05"; 330 seconds = "00:05:30" would become "00:06").
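

    A minimal sketch of that formatting, with the +30 rounding applied (the variable is just a stand-in for whichever seconds column you're formatting):


    -- Turn a seconds total into "HH:MM", rounding up to the next whole minute
    DECLARE @sec int = 329;   -- e.g. a DailyAvgTimeSec value
    SELECT RIGHT('0' + CAST((@sec + 30) / 3600 as varchar(10)), 2) + ':'
    + RIGHT('0' + CAST(((@sec + 30) % 3600) / 60 as varchar(10)), 2) as [HH:MM];   -- 329 -> 00:05, 330 -> 00:06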



    Results shown below (I manually edited out the second CreatedBy column, and of course using the column names rather than * would have prevented it altogether):



    CreatedBy    dailyCount  dailyTimeSec  DailyAvgTimeSec  DailyMinTimeSec  DailyMaxTimeSec  mtdCount  ActiveDays  mtdAvgDailyCount  mtdActiveDayAvgCount  mtdTimeSec  mtdAvgTimeSec  mtdMinTimeSec  mtdMaxTimeSec
    -----------  ----------  ------------  ---------------  ---------------  ---------------  --------  ----------  ----------------  --------------------  ----------  -------------  -------------  -------------
    petersonl    3           240           120              60               180              9         3           0.321428571428    3.000000000000        840         140            60             300
    williamsonb  3           5160          2580             420              4740             9         3           0.321428571428    3.000000000000        10800       1800           120            4800


    Hope this helps.



    [EDIT: updated the calculation of Duration to ensure that any values less than 60 seconds are bumped up to 60 seconds; NULL values remain NULL].






    – RDFozz, answered Mar 28 '17 at 23:08 (edited Mar 29 '17 at 14:19)


    Comments are not for extended discussion; this conversation has been moved to chat. – Paul White, Apr 16 '17 at 8:53















