MySQL: how to group data by hour and get the latest entry per hour

Posted 2024-11-09 16:52:25


I'm trying to do a query that fetches data per hour, but instead of the normal group-by-hour I want to narrow it down and only get the latest entry per hour, meaning the newest data within that hour. In the picture shown below, what I want are the rows with red boxes. If you notice, the first red row is 10:59:51, which means it's the only row between 10:00:00 and 10:59:59. For the rest of the rows at 12:00 and above, I want 12:37:14 because it's the latest (newest) within that hour range.

[screenshot: hourly rows, with the 10:59:51 and 12:37:14 entries boxed in red]

I have a simple query that groups the data by hour using HOUR() like:

SELECT userid, username, date_created
FROM user_accounts 
WHERE date_created >= '2009-10-27 00:00:00' AND date_created < '2009-10-27 23:59:59'
GROUP BY HOUR(date_created)

The query, however, just groups by hours 10 and 12 and returns ids 24 and 25 - but what I need is ids 24 and 28. Any ideas?
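For context on why the original query misbehaves: with MySQL's legacy GROUP BY handling, the non-aggregated columns (userid, username, date_created) come from an arbitrary row of each hour group, and under ONLY_FULL_GROUP_BY the query is rejected outright. The standard fix is the "greatest row per group" pattern: compute MAX(date_created) per hour in a subquery, then join back to recover the full row. A minimal sketch, run against SQLite through Python's sqlite3 so it is executable as-is; strftime('%H', ...) stands in for MySQL's HOUR(), and the sample rows are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_accounts (userid INTEGER, username TEXT, date_created TEXT);
INSERT INTO user_accounts VALUES
  (24, 'alice', '2009-10-27 10:59:51'),
  (25, 'bob',   '2009-10-27 12:05:00'),
  (28, 'carol', '2009-10-27 12:37:14');
""")

# Greatest row per hour: find the max timestamp in each hour,
# then join back to the table to recover the full row.
rows = conn.execute("""
SELECT u.userid, u.username, u.date_created
FROM user_accounts u
JOIN (
    SELECT strftime('%H', date_created) AS hr,
           MAX(date_created) AS last_in_hour
    FROM user_accounts
    WHERE date_created >= '2009-10-27 00:00:00'
      AND date_created <  '2009-10-28 00:00:00'
    GROUP BY hr
) m ON u.date_created = m.last_in_hour
ORDER BY u.date_created
""").fetchall()

print(rows)  # ids 24 and 28, as the question asks for
```

The same shape translates directly to MySQL by swapping strftime('%H', ...) for HOUR(...).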


Comments (5)

娇妻 2024-11-16 16:52:25

Try

SELECT  f.*
FROM    (
        SELECT  MAX(UNIX_TIMESTAMP(date_created)) AS mts
        FROM    user_accounts
        WHERE   DATE(date_created) = '2009-10-27'  -- filter inside, so each hour's MAX comes from this day
        GROUP BY HOUR(date_created)
        ) s
JOIN    user_accounts f
ON      UNIX_TIMESTAMP(f.date_created) = s.mts
WHERE   DATE(f.date_created) = '2009-10-27'
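On MySQL 8.0+ the same "latest row per hour" result can also be written with a window function instead of a derived-table join: number the rows within each hour, newest first, and keep rank 1. A minimal sketch, run against SQLite (3.25+) through Python's sqlite3 so it is executable anywhere; the rows are hypothetical stand-ins for user_accounts, and strftime('%H', ...) stands in for MySQL's HOUR():

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_accounts (userid INTEGER, username TEXT, date_created TEXT);
INSERT INTO user_accounts VALUES
  (24, 'alice', '2009-10-27 10:59:51'),
  (25, 'bob',   '2009-10-27 12:05:00'),
  (28, 'carol', '2009-10-27 12:37:14');
""")

# ROW_NUMBER() restarts at 1 inside each hour partition; ordering the
# partition by date_created DESC makes rank 1 the newest row of the hour.
rows = conn.execute("""
SELECT userid, username, date_created
FROM (
    SELECT userid, username, date_created,
           ROW_NUMBER() OVER (
               PARTITION BY strftime('%H', date_created)
               ORDER BY date_created DESC
           ) AS rn
    FROM user_accounts
    WHERE date_created >= '2009-10-27 00:00:00'
      AND date_created <  '2009-10-28 00:00:00'
) t
WHERE rn = 1
ORDER BY date_created
""").fetchall()

print(rows)  # ids 24 and 28
```

MySQL 8.0 accepts the same query with HOUR(date_created) in the PARTITION BY clause.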
人│生佛魔见 2024-11-16 16:52:25


Maybe this will work?

SELECT userid, username, date_created
FROM user_accounts 
WHERE userid IN (
  SELECT MAX(userid)
  FROM user_accounts 
  WHERE date_created >= '2009-10-27 00:00:00' AND date_created < '2009-10-27 23:59:59'
  GROUP BY HOUR(date_created)
)
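A caveat on the query above: MAX(userid) only identifies the newest row per hour if userid is assigned in the same order as date_created (e.g. an AUTO_INCREMENT key with no backdated inserts). When that assumption breaks, the two criteria diverge, as this small SQLite/Python check with hypothetical rows shows:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_accounts (userid INTEGER, username TEXT, date_created TEXT);
-- id 30 was inserted last but carries an earlier (backdated) timestamp
INSERT INTO user_accounts VALUES
  (25, 'bob',   '2009-10-27 12:05:00'),
  (28, 'carol', '2009-10-27 12:37:14'),
  (30, 'dave',  '2009-10-27 12:10:00');
""")

# Highest id in the hour vs. the row with the latest timestamp.
by_id = conn.execute("SELECT MAX(userid) FROM user_accounts").fetchone()[0]
by_time = conn.execute(
    "SELECT userid FROM user_accounts ORDER BY date_created DESC LIMIT 1"
).fetchone()[0]

print(by_id, by_time)  # 30 vs 28: MAX(userid) picks the wrong row here
```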
幸福丶如此 2024-11-16 16:52:25


I would have to assume you would also want it by day if spanning multiple days; otherwise a MAX() per hour could give you one hour from a week ago, another from three days ago, and yet another from the current day. That is, if you ever widened the WHERE clause that currently limits you to a single day. It's not a specific user you get back, but whoever had the last activity in that hour; it could be the same person every time, or completely different people. I'm tacking the date onto my grouping in case you ever want to span a date range, but you can take it out too...

select STRAIGHT_JOIN
       ui.userid,
       ui.username,
       ui.date_created
   from
       ( select
               date( date_created ) as created_day,
               hour( date_created ) as created_hour,
               max( date_created ) as LastPerHour
            from
               user_accounts
            where
               date( date_created ) = '2009-10-27'
            group by
               date( date_created ),
               hour( date_created ) ) PreQuery
      join user_accounts ui
         on PreQuery.LastPerHour = ui.date_created

Again, I've included date as a grouping too if you wanted to span multiple days, just make sure your table has an index on date_created by itself... or at least in the first position of the index.

画骨成沙 2024-11-16 16:52:25


Do you mean one hour from NOW or latest full hour?
If it's latest full hour something like this might work?

SELECT userid, username, date_created
FROM user_accounts 
WHERE HOUR(date_created) = (SELECT HOUR(date_created) FROM user_accounts ORDER BY date_created DESC LIMIT 1);

EDIT:
Ahhh, now I think I get what you want... The last added entry on every given hour between your date range?

If so then Codler's query is what you want.

终止放荡 2024-11-16 16:52:25


I use this solution

SELECT DATE(PROCESS_DATE), HOUR(PROCESS_DATE), EVENT, COUNT(*)
FROM statistics
GROUP BY EVENT, DATE(PROCESS_DATE), HOUR(PROCESS_DATE)
ORDER BY 1 DESC, 2 DESC
LIMIT 1;

Hope this assists someone.
