Wednesday, March 28, 2012

Grouping on time

Hello,
I have a table with 2 columns, time and amount. I want to be able to group
by an interval and sum the amount; see below for a sample of the data.
Time Amount
2005-02-16 05:41:00.000 100
2005-02-16 05:41:01.000 100
2005-02-16 05:41:02.000 100
2005-02-16 05:41:03.000 100
2005-02-16 05:41:04.000 100
2005-02-16 05:41:05.000 100
2005-02-16 05:41:06.000 100
2005-02-16 05:41:07.000 100
2005-02-16 05:41:08.000 100
2005-02-16 05:41:09.000 100
2005-02-16 05:41:10.000 100
2005-02-16 05:41:11.000 100
2005-02-16 05:41:12.000 100
2005-02-16 05:41:13.000 100
2005-02-16 05:41:14.000 100
so the result of the above with an interval of 5 seconds would be
Time Amount
2005-02-16 05:41:04.000 500
2005-02-16 05:41:09.000 500
2005-02-16 05:41:14.000 500
any ideas?
Thanks|||Try this:
use northwind
go
create table t (
[Time] datetime,
Amount int
)
go
insert into t values('2005-02-16 05:41:00.000', 100)
insert into t values('2005-02-16 05:41:01.000', 100)
insert into t values('2005-02-16 05:41:02.000', 100)
insert into t values('2005-02-16 05:41:03.000', 100)
insert into t values('2005-02-16 05:41:04.000', 100)
insert into t values('2005-02-16 05:41:05.000', 100)
insert into t values('2005-02-16 05:41:06.000', 100)
insert into t values('2005-02-16 05:41:07.000', 100)
insert into t values('2005-02-16 05:41:08.000', 100)
insert into t values('2005-02-16 05:41:09.000', 100)
insert into t values('2005-02-16 05:41:10.000', 100)
insert into t values('2005-02-16 05:41:11.000', 100)
insert into t values('2005-02-16 05:41:12.000', 100)
insert into t values('2005-02-16 05:41:13.000', 100)
insert into t values('2005-02-16 05:41:14.000', 100)
go
select
max([time]) as max_time,
sum(amount) as sum_amount
from
t
group by
datediff(second, convert(char(8), [time], 112), [time]) / 5
go
drop table t
go
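For reference, here is a rough re-creation of what the query above computes, sketched in Python rather than T-SQL (the names are illustrative): `datediff(second, convert(char(8), [time], 112), [time])` is the number of seconds since midnight, integer division by 5 assigns a bucket number, and `max([time])` / `sum(amount)` summarize each bucket.

```python
from datetime import datetime

def bucket(ts: datetime, width: int = 5) -> int:
    """Seconds since midnight, integer-divided by the bucket width --
    mirrors the T-SQL GROUP BY expression above."""
    midnight = ts.replace(hour=0, minute=0, second=0, microsecond=0)
    return int((ts - midnight).total_seconds()) // width

# The sample data: one row per second, amount 100
rows = [(datetime(2005, 2, 16, 5, 41, s), 100) for s in range(15)]

groups = {}
for ts, amount in rows:
    groups.setdefault(bucket(ts), []).append((ts, amount))

# max(time) and sum(amount) per bucket, like the SELECT list
result = sorted((max(t for t, _ in g), sum(a for _, a in g))
                for g in groups.values())
```

This reproduces the expected output (05:41:04 / 500, 05:41:09 / 500, 05:41:14 / 500). Note that only the time of day enters the bucket, so rows from different days at the same time of day share a bucket, as a later reply in the thread points out.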
AMB
"Fab" wrote:

> [original question snipped]
>|||This was answered yesterday (the assumption is that there exists one row for
every monotonically increasing second):
[url]http://groups.google.ca/groups?selm=%238%23HOtMMFHA.3832%40TK2MSFTNGP12.phx.gbl[/url]
Anith|||Use something like this:
select dateadd(ss,-datepart(ss,time)%5,time),sum(amount) from @t group by
dateadd(ss,-datepart(ss,time)%5,time)
"Fab" wrote:

> [original question snipped]
>|||CREATE TABLE ReportPeriods
(period_id CHAR(10) NOT NULL,
start_time DATETIME NOT NULL,
end_time DATETIME NOT NULL,
CHECK (start_time < end_time),
PRIMARY KEY (start_time, end_time));
Load your times into the table, then:
SELECT period_id, COUNT(*)
FROM ReportPeriods AS P1, Foobar AS F1
WHERE F1.event_time BETWEEN start_time AND end_time
GROUP BY period_id;|||sorry, i made a mistake; the script should be
select max(time),sum(amount) from @t group by
dateadd(ss,-datepart(ss,time)%5,time)
The problem with Alejandro Mesa's response is that if you have the same time
of day on different days, the rows will be grouped together.
"Fab" wrote:

> [original question snipped]
>|||Can you explain this part, please?
-datepart(ss,time)%5
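For what it's worth, the arithmetic can be checked outside SQL Server; this Python illustration (not part of the thread's T-SQL) shows that subtracting `datepart(ss, time) % 5` snaps a timestamp down to the start of its 5-second bucket:

```python
from datetime import datetime, timedelta

def snap_to_bucket_start(ts: datetime, width: int = 5) -> datetime:
    # dateadd(ss, -datepart(ss, time) % width, time), in Python terms:
    # ts.second % width is how far past the bucket start this row is.
    return ts - timedelta(seconds=ts.second % width)

print(snap_to_bucket_start(datetime(2005, 2, 16, 5, 41, 7)))
# -> 2005-02-16 05:41:05
```

All timestamps in the same 5-second window map to the same value, which is why grouping by the expression works.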
"sergiu" <sergiu@.discussions.microsoft.com> wrote in message
news:C3A9AA65-1277-4AEF-A517-60E4E03CED9B@.microsoft.com...
> sorry i made a mistake the script should be
> select max(time),sum(amount) from @t group by
> dateadd(ss,-datepart(ss,time)%5,time)
> the problem with the response of alejandro mesa is that if you have the
> same
> time in different days the two rows will be grouped together
>
> "Fab" wrote:
>|||Your assumption is wrong; my data may skip a second or two...
any ideas?
"Anith Sen" <anith@.bizdatasolutions.com> wrote in message
news:%23uL11yVMFHA.568@.TK2MSFTNGP09.phx.gbl...
> This was responded yesterday ( assumption is that there exists one row for
> every monotonically increasing second ):
> [url]http://groups.google.ca/groups?selm=%238%23HOtMMFHA.3832%40TK2MSFTNGP12.phx.gbl[/url]
> --
> Anith
>|||On Fri, 25 Mar 2005 14:18:16 -0500, Fab wrote:

>Your assumption is wrong; my data may skip a second or two...
>any ideas?
Hi Fab,
So why didn't you indicate that the assumption was wrong in the original
thread? Half an hour ago, I saw the original thread with only Anith's
answer; I took the time to try a solution, write a message and send it.
And now, I find that you reposted the question in a new thread and
already got some replies.
If you had posted a follow-up to your original question instead of
starting a new thread, then I'd have seen the answers and moved on to
the next question, instead of wasting my time and cluttering the group
with yet another answer that isn't really any different from Alejandro's
suggestion.
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)|||So now that you know your assumption was wrong, are you still willing to help
me with my issue?
I need to group based on 5-second intervals... the results should roll up
based on time, not on the values in the table... so the buckets should start
at second 00 and end at second 04... anything that falls in that first group
will be rolled up... and so on for each interval, all the way up to 60.
Let me know if you have any questions before you provide a solution.
"Hugo Kornelis" <hugo@.pe_NO_rFact.in_SPAM_fo> wrote in message
news:g45941liog7iufggqi8eqvp8mngammggir@.
4ax.com...
> [Hugo's message quoted above snipped]
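For completeness, the fixed 00-04 / 05-09 / ... windows described above behave correctly even when seconds are skipped, because each row is assigned to a window by arithmetic rather than by counting rows. A Python sketch of that behavior (illustrative names, not the thread's T-SQL):

```python
from datetime import datetime, timedelta

def window_start(ts: datetime, width: int = 5) -> datetime:
    # Fixed windows anchored at second 00: 00-04, 05-09, ..., 55-59.
    return ts.replace(microsecond=0) - timedelta(seconds=ts.second % width)

# Sample with a gap: second 1 is skipped entirely.
rows = [
    (datetime(2005, 2, 16, 5, 41, 0), 100),
    (datetime(2005, 2, 16, 5, 41, 2), 100),
    (datetime(2005, 2, 16, 5, 41, 4), 100),
    (datetime(2005, 2, 16, 5, 41, 7), 100),
]

sums = {}
for ts, amount in rows:
    start = window_start(ts)
    sums[start] = sums.get(start, 0) + amount
```

The 05:41:00 window sums to 300 despite the missing second, and 05:41:07 lands in the 05:41:05 window, which matches the requirement that windows are defined by the clock, not by the data.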
