Converting Monthly Rolling SUM (YTD) back to Monthly numbers




I'm having trouble converting a monthly rolling SUM (year-to-date by month) back into single-month numbers.



For example, see the table below. Periods are in YYMM format, and the fiscal year runs from month 07 through month 06:



Name    Period    Amount
AAA     1611      10
BBB     1611      15
CCC     1611      20
AAA     1612      12
BBB     1612      18
CCC     1612      24
AAA     1701      13
BBB     1701      20
CCC     1701      27


The result we are after is as follows. Period 1611 is the lowest in this example, but the lowest period could be any month of the fiscal year (07, 08, 09, 10, 11, 12, 01, 02, 03, 04, 05, 06):



Name    Period    Amount
AAA     1611      10
BBB     1611      15
CCC     1611      20
AAA     1612      2
BBB     1612      3
CCC     1612      4
AAA     1701      1
BBB     1701      2
CCC     1701      3


Basically, each period's value should have the value of the next lower period (within the same Group By combination) subtracted from it. For example, AAA's 1612 amount becomes 12 - 10 = 2. If no lower period exists, the value stays as it is.



Currently I Group By the data in the source table, because there are multiple lines for each combination and we only need one aggregated line per combination:



Select
    a.[Actuality], a.[Period], a.[(C) Company Code], a.[(C) Account Code], a.[(C) D1 Code],
    a.[(C) D2 Code], a.[(C) D3 Code], a.[(C) D4 Code], a.[(C) Intercompany (To)],
    a.[(C) Intercompany (From)], a.[(C) Type],
    Sum(a.[Amount]) as [Amount]
From Table1 as a
Group By
    a.[Actuality], a.[Period], a.[(C) Company Code], a.[(C) Account Code], a.[(C) D1 Code],
    a.[(C) D2 Code], a.[(C) D3 Code], a.[(C) D4 Code], a.[(C) Intercompany (To)],
    a.[(C) Intercompany (From)], a.[(C) Type]

I was thinking about adding a Where clause requiring the difference between periods to be 1 (e.g. 1608 - 1607), 89 (e.g. 1701 - 1612), or 0, combined with a condition that a.[Period] is the smallest value for that Group By combination.
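As an aside, a minimal sketch (column names are illustrative only): mapping each YYMM period to a contiguous month index removes the 1-or-89 special case, because consecutive months then always differ by exactly 1.

Select  [Period],
        ([Period] / 100) * 12 + ([Period] % 100) as MonthIndex
From    (Values (1611), (1612), (1701)) as p([Period]);
-- 1611 -> 203, 1612 -> 204, 1701 -> 205: consecutive values,
-- so a self-join or LAG can simply compare against MonthIndex - 1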



Could I please ask for help in formulating this in SQL?



Additional information




  • The table stores only one year of data. Next year the data is moved to another table and the periods start over again, from YY07 to YY06.

  • Around 2.5M rows are created in the table during the year.

  • There are no duplicates, as a Group By is used to eliminate them.

  • The variance between two lines should also be shown when a line exists in one month but not the next: if ABC has a YTD amount of 20 in Period 1611 and nothing in Period 1612, the line doesn't appear in the YTD table, but the month-to-date (MTD) output would still need to show Period 1612 ABC as -20 (0 - 20). See the sketch after this list for one way this case could be covered.
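A minimal sketch of how that last case could be handled, assuming a simplified table dbo.Table1(Name, Period, Amount) that already contains one YTD row per Name and Period (the table and column names here are illustrative, not the real schema). A FULL JOIN between each period and the one before it keeps rows that exist on only one side, so a line that disappears in the later month still yields a negative MTD value:

;WITH ytd AS
(
    SELECT  [Name], [Period], [Amount],
            -- the period that follows this one in YYMM terms
            -- (December is followed by January of the next calendar year)
            NextPeriod = CASE WHEN [Period] % 100 = 12
                              THEN [Period] + 89   -- e.g. 1612 -> 1701
                              ELSE [Period] + 1    -- e.g. 1611 -> 1612
                         END
    FROM    dbo.Table1
)
SELECT  [Name]   = COALESCE(cur.[Name],   prv.[Name]),
        [Period] = COALESCE(cur.[Period], prv.NextPeriod),
        MTD      = COALESCE(cur.[Amount], 0) - COALESCE(prv.[Amount], 0)
FROM       ytd AS cur
FULL JOIN  ytd AS prv
       ON  prv.[Name]     = cur.[Name]
       AND prv.NextPeriod = cur.[Period]
-- drop the phantom "month after the latest period" rows that unmatched
-- prv rows would otherwise generate
WHERE   cur.[Period] IS NOT NULL
   OR   prv.NextPeriod <= (SELECT MAX([Period]) FROM dbo.Table1);

The earliest period has no previous row, so its YTD value passes through unchanged, which matches the requirement that the lowest period keeps its original amount.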










Tags: sql-server, sql-server-2008

asked Jan 16 '17 at 7:08 by Lukas; edited Jan 20 '17 at 6:05 by Paul White

1 Answer

-- LAG(amount, 1, 0) returns the previous row's amount within each name
-- (ordered by period), or 0 when there is no previous row, so the first
-- period keeps its original YTD value (requires SQL Server 2012 or later)
select name,period
,amount-lag(amount,1,0) over (partition by name order by period) as amount
from table1


-- row_number + self-join alternative that also works on SQL Server 2008,
-- where LAG is not available
with cte as 
(
select *
,row_number() over (partition by name order by period) as rn
from table1
)
select t1.name,t1.period
,t1.amount-coalesce(t2.amount,0) as amount
from cte as t1
left join cte as t2
on t2.rn = t1.rn-1




answered Jan 16 '17 at 7:23 by David דודו Markovitz; edited Jan 16 '17 at 8:22

• Some explanation of what LAG does would help the OP... – joanolo, Jan 16 '17 at 7:53

• This seems to be intended for use with running totals, but not with rolling totals. If the source has more than one year's worth of data, your code may not work, because the totals are YTD (year-to-date), not running totals that include all the months since the beginning. (So at some point you can no longer just subtract the previous total from the current total to get the current month; you need to account for "losing" the month that was a year ago.) – Andriy M, Jan 16 '17 at 10:26

• Thanks guys! Regarding running totals, that shouldn't be a problem, because every year the data is moved to a separate table and only that year's data is kept. However, the problem I'm having with this code is that it overloads the _log file and my server runs out of memory before it finishes (it grows by an extra 20 GB). It is running on a table with 1.7M rows, so some usage is expected, but this is a bit too much. Is there a way to adjust it so that the transactions aren't all posted to the _log, and to just create a table from this query? – Lukas, Jan 16 '17 at 12:56

• @Lukas It looks like you need to join on more than just the rn column. I suspect that you also want to join on the name column to prevent duplicate rows from showing up. It sounds like your query is generating more data than it should, and that's why you run out of t-log space and use a lot of memory. You can also try running the query as a SELECT COUNT(*) to verify that you're getting back the right number of rows. – Joe Obbish, Jan 17 '17 at 0:29

• Hi Joe, that might be it. It does return more rows, but I thought that was because it shows the variance between two lines where one didn't exist before (i.e. if ABC period 1611 has 20 and there's nothing in period 1612, then a new line of -20 is created for MTD). I tried to add t2.rn = t1.rn-1 and t2.name = t2.name -1, but it gives me conversion errors from nvarchar to int; I assume I need to put something in the cte as well, I'm just not sure what? – Lukas, Jan 17 '17 at 5:40
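
Following up on the last two comments, a minimal sketch of the corrected self-join, again using the simplified table1(name, period, amount) from the answer (the real query would carry the additional grouping columns through the CTE and the join). The name comparison belongs in the join predicate as an equality; subtracting 1 from name is what causes the nvarchar-to-int conversion error:

with cte as
(
    select name, period, amount,
           -- number the periods separately for every name
           row_number() over (partition by name order by period) as rn
    from table1
)
select t1.name,
       t1.period,
       t1.amount - coalesce(t2.amount, 0) as amount
from cte as t1
left join cte as t2
       on  t2.name = t1.name     -- only match rows belonging to the same name
       and t2.rn   = t1.rn - 1;  -- ...and take the immediately preceding period

Joining on both name and rn keeps each name's rows from being matched against every other name's rows, which should also stop the result set (and the log usage) from blowing up.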













