How to find the maximum value for each ID in Microsoft Excel Power Query?

Posted on 2025-02-09 02:13:59


JOB_ID  DAY_ID
430     120
430     720
530     120
530     620
630     120
630     120
630     620
630     620

Should become:

JOB_ID  DAY_ID
430     720
530     620
630     620
630     620

Only the rows where DAY_ID is the maximum value for its JOB_ID are kept.

I'm currently using Excel's Power Query, so Python and R scripts are not available.

3 Answers

挖个坑埋了你 2025-02-16 02:13:59

To keep the duplicate maximum rows in the result, you have to fiddle around with the M. Here you go.

let
    // Sample data embedded as a compressed JSON blob; note the "JOB_ID " column name carries a trailing space
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45WMjE2UFDSUTI0MlCK1YFzzaFcU1RZKNcMyjVDlcXONcPCjQUA", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [#"JOB_ID " = _t, DAY_ID = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source, {{"JOB_ID ", Int64.Type}, {"DAY_ID", Int64.Type}}),

    // Group by JOB_ID and, within each group, keep every DAY_ID equal to the group's maximum,
    // so rows that tie for the maximum all survive
    #"Grouped Rows" = Table.Group(#"Changed Type", {"JOB_ID "}, {{"All", each
        let
            a = List.Max(_[DAY_ID]),
            b = List.Select(_[DAY_ID], each _ = a)
        in
            b, type list}}),

    // Expand the list back into one row per retained DAY_ID (the expanded column is named "All")
    #"Expanded All" = Table.ExpandListColumn(#"Grouped Rows", "All")
in
    #"Expanded All"
ゝ杯具 2025-02-16 02:13:59

  • Group by JOB_ID
  • Aggregate by filtering each subtable by the maximum DAY_ID
let

//Change next line to reflect actual data source
    Source = Excel.CurrentWorkbook(){[Name="Table20"]}[Content],
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"JOB_ID", Int64.Type}, {"DAY_ID", Int64.Type}}),

//Group by JOB_ID
//  Filter each subtable by maximum DAY_ID
    #"Grouped Rows" = Table.Group(#"Changed Type", 
        {"JOB_ID"}, {
            {"DAY_ID", (t)=>Table.SelectRows(t, each [DAY_ID]=List.Max(t[DAY_ID])), 
                    type table [JOB_ID=Int64.Type, DAY_ID=Int64.Type]}}),
    #"Removed Columns" = Table.RemoveColumns(#"Grouped Rows",{"JOB_ID"}),
    
    #"Expanded DAY_ID" = Table.ExpandTableColumn(#"Removed Columns", "DAY_ID", {"JOB_ID", "DAY_ID"})
in
    #"Expanded DAY_ID"

情未る 2025-02-16 02:13:59

You may want to group by Job_ID and aggregate with Max of Day_ID.

Edit (see the comment below): matching the aggregated result back against the original list lets you keep duplicates of Job_ID that share the same max Day_ID.
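
As a rough sketch of the approach this answer describes (the table name JobTable and the step names are assumptions, not from the answer): aggregate the maximum DAY_ID per JOB_ID, then match that result back against the original rows with an inner join, so every original row that ties for its group's maximum is kept.

let
    // "JobTable" is a placeholder; point this at the actual source table
    Source = Excel.CurrentWorkbook(){[Name="JobTable"]}[Content],
    Typed = Table.TransformColumnTypes(Source, {{"JOB_ID", Int64.Type}, {"DAY_ID", Int64.Type}}),

    // Aggregate: one row per JOB_ID with its maximum DAY_ID
    MaxPerJob = Table.Group(Typed, {"JOB_ID"}, {{"DAY_ID", each List.Max([DAY_ID]), Int64.Type}}),

    // Match the aggregate back against the original rows (inner join on both columns),
    // so a JOB_ID with several rows at the same maximum DAY_ID keeps all of them
    Matched = Table.NestedJoin(Typed, {"JOB_ID", "DAY_ID"}, MaxPerJob, {"JOB_ID", "DAY_ID"}, "Max", JoinKind.Inner),
    Result = Table.RemoveColumns(Matched, {"Max"})
in
    Result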
