Is there a faster way to do a Pandas groupby cumulative mean?

Posted 2025-01-12 07:39:35


I am trying to create a lookup reference table in Python that calculates the cumulative mean of a player's previous (by datetime) game scores, grouped by Venue. However, for my specific need, a player should have previously played a minimum of 2 times at the relevant Venue for a 'Venue Preference' cumulative mean to be calculated.

df format looks like the following:

DateTime            | Player | Venue     | Score
2021-09-25 17:15:00 | Tim    | Stadium A | 20
2021-09-27 10:00:00 | Blake  | Stadium B | 30

My existing code works perfectly but is unfortunately very slow:

import numpy as np
import pandas as pd

# Per-(DateTime, Player, Venue) score totals, then a running total per (Player, Venue).
VenueSum = df.groupby(['DateTime', 'Player', 'Venue'])['Score'].sum().reset_index(name='Sum')
VenueSum['Cumulative Sum'] = VenueSum.sort_values('DateTime').groupby(['Player', 'Venue'])['Sum'].cumsum()

# The same again for game counts.
VenueCount = df.groupby(['DateTime', 'Player', 'Venue'])['Score'].count().reset_index(name='Count')
VenueCount['Cumulative Count'] = VenueCount.sort_values('DateTime').groupby(['Player', 'Venue'])['Count'].cumsum()

# Combine, and only compute the mean once a player has at least 2 games at the venue.
VenueLookup = VenueSum.merge(VenueCount, how='outer', on=['DateTime', 'Player', 'Venue'])
VenueLookup['Venue Preference'] = np.where(VenueLookup['Cumulative Count'] >= 2,
                                           VenueLookup['Cumulative Sum'] / VenueLookup['Cumulative Count'],
                                           np.nan)
VenueLookup = VenueLookup.drop(['Sum', 'Cumulative Sum', 'Count', 'Cumulative Count'], axis=1)

I am sure there is a way to calculate the cumulative mean in one step, without first calculating the cumulative sum and cumulative count, but unfortunately I couldn't get it to work.
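For the record, pandas can express a one-step cumulative mean through an expanding window: `expanding(min_periods=2).mean()` stays NaN until a group has at least 2 rows. The sketch below uses hypothetical sample data (the extra Tim game and all scores are invented), and unlike the question's code it treats each row as one game rather than first aggregating rows that share a DateTime:

```python
import pandas as pd

# Hypothetical sample data in the same shape as the question's df.
df = pd.DataFrame({
    "DateTime": pd.to_datetime([
        "2021-09-25 17:15:00", "2021-09-26 12:00:00", "2021-09-27 10:00:00",
    ]),
    "Player": ["Tim", "Tim", "Blake"],
    "Venue": ["Stadium A", "Stadium A", "Stadium B"],
    "Score": [20, 40, 30],
})

# One-step cumulative mean per (Player, Venue): NaN until min_periods=2 rows seen.
df = df.sort_values("DateTime")
df["Venue Preference"] = (
    df.groupby(["Player", "Venue"])["Score"]
      .transform(lambda s: s.expanding(min_periods=2).mean())
)
print(df)
```

With this data, Tim's second Stadium A game gets a Venue Preference of (20 + 40) / 2 = 30.0 and the other rows stay NaN.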


Comments (1)

醉生梦死 2025-01-19 07:39:35


IIUC, you can drop two of the groupby calls by aggregating with sum and count first, and then taking the cumulative sum over both columns at once:

import numpy as np

# Aggregate sum and count in a single pass, then one cumulative sum over both columns.
df1 = df.groupby(['DateTime', 'Player', 'Venue'])['Score'].agg(['sum', 'count'])
df1 = df1.groupby(['Player', 'Venue'])[['sum', 'count']].cumsum().reset_index()
df1['Venue Preference'] = np.where(df1['count'] >= 2, df1['sum'] / df1['count'], np.nan)
df1 = df1.drop(['sum', 'count'], axis=1)
print(df1)
              DateTime Player      Venue  Venue Preference
0  2021-09-25 17:15:00    Tim  Stadium A               NaN
1  2021-09-27 10:00:00  Blake  Stadium B               NaN
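With the question's two sample rows every count is 1, so every Venue Preference is NaN. To see the threshold actually trigger, here is a self-contained run of the same approach on hypothetical data (the extra Tim game and all scores are invented) where Tim plays Stadium A twice:

```python
import numpy as np
import pandas as pd

# Hypothetical sample: Tim plays Stadium A twice, so the >= 2 threshold is met.
df = pd.DataFrame({
    "DateTime": pd.to_datetime([
        "2021-09-25 17:15:00", "2021-09-26 12:00:00", "2021-09-27 10:00:00",
    ]),
    "Player": ["Tim", "Tim", "Blake"],
    "Venue": ["Stadium A", "Stadium A", "Stadium B"],
    "Score": [20, 40, 30],
})

# Single agg pass, then one cumulative sum over both columns per (Player, Venue).
df1 = df.groupby(["DateTime", "Player", "Venue"])["Score"].agg(["sum", "count"])
df1 = df1.groupby(["Player", "Venue"])[["sum", "count"]].cumsum().reset_index()
df1["Venue Preference"] = np.where(df1["count"] >= 2, df1["sum"] / df1["count"], np.nan)
df1 = df1.drop(["sum", "count"], axis=1)
print(df1)
```

Tim's second game at Stadium A now shows a Venue Preference of (20 + 40) / 2 = 30.0, while the first-time rows remain NaN.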