Push objects from a given array into a MongoDB document's array only if a matching entry does not already exist

Let's say I have a document in MongoDB like this:

{
    name: "abc",
    manufacturer: "xyz",
    ...
    history: [
        {"condition": "new", "price": 150, "currency": "USD", "timestamp": "aaaaaaaaaaaa"}
    ]
}

Now, with PyMongo, I want to update the document that matches the filter (otherwise insert it) and push each dictionary from the history list only if that dictionary does not already exist in the document's history array, using the following query:

history = [
    {"condition": "new", "price": 150, "currency": "USD", "timestamp": "bbbbbbbbbbbb"},
    {"condition": "used", "price": 90, "currency": "USD", "timestamp": "cccccccccccc"},
]
item = {
    "name": "abcd",
    "manufacturer": "xyz"
}
db.collection.update_one(filter, {"$set": item, "$addToSet": {"history": {"$each": history}}}, upsert=True)

The problem with the above query is that I want to exclude the timestamp field from the $addToSet comparison (the timestamp will always be unique, so every dictionary ends up being pushed anyway), which, as far as my research goes, is not possible.
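To make the limitation concrete, here is a minimal PyMongo sketch (client, database and collection names are assumptions) showing that $addToSet compares the whole embedded document, including the timestamp:

from pymongo import MongoClient

db = MongoClient()["test"]  # assumed connection and database name

# The stored history already contains {"condition": "new", "price": 150,
# "currency": "USD", "timestamp": "aaaaaaaaaaaa"}.  Because $addToSet compares
# the entire sub-document, this entry, identical except for its timestamp,
# is still appended as a "new" element.
db.collection.update_one(
    {"name": "abc", "manufacturer": "xyz"},
    {"$addToSet": {"history": {
        "condition": "new", "price": 150,
        "currency": "USD", "timestamp": "bbbbbbbbbbbb",
    }}},
)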

So I am trying to avoid doing two separate queries (for performance reasons, since my collection contains millions of documents) by using an aggregation pipeline, which I am fairly new to, and I would appreciate any help.
The updated document in MongoDB should look like this:

{
    name: "abcd",
    manufacturer: "xyz",
    ...
    history: [
        {"condition": "new", "price": 150, "currency": "USD", "timestamp": "aaaaaaaaaaaa"},
        {"condition": "used", "price": 90, "currency": "USD", "timestamp": "cccccccccccc"}
    ]
}


Answer from 疧_╮線 (2025-02-09 16:50:14)


One option is to use an update with an aggregation pipeline, like this:

  1. Add the new history entries and create a "clean" copy of the existing history that contains only a key built from its properties without the timestamp, so the entries can be compared.
  2. Do the same for the new history: create a clean copy holding only the keys, and also attach the key to each entry of the new array.
  3. Use $setDifference to remove the keys of existing history entries from the new clean copy.
  4. Use $mergeObjects to combine each key that is left after the removal with the matching new-history entry that carries all the data.
  5. Clean up the helper data and concatenate the existing history with the new entries.
db.collection.update(
{name: "abcd", manufacturer: "xyz"},
[
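  // Step 1: add the incoming entries (newHistory) and build a key-only copy of the existing history (oldHistoryClean).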
  {
    $addFields: {
      newHistory: [
        {condition: "new", price: 150, currency: "USD", timestamp: "bbbbbbbbbbbb"},
        {condition: "used", price: 90, currency: "USD", timestamp: "cccccccccccc"}
      ],
      oldHistoryClean: {
        $map: {
          input: "$history",
          as: "item",
          in: {
            key: 
            {$concat: ["$item.condition", {$toString: "$item.price"}, "$item.currency"]}
          }
        }
      }
    }
  },
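  // Step 2: build a key-only copy of the incoming entries (newHistoryClean) and attach the same key to each full entry in newHistory.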
  {
    $addFields: {
      newHistoryClean: {
        $map: {
          input: "$newHistory",
          as: "item",
          in: {
            key: 
            {$concat: ["$item.condition", {$toString: "$item.price"}, "$item.currency"]}
          }
        }
      },
      newHistory: {
        $map: {
          input: "$newHistory",
          as: "item",
          in: {
            condition: "$item.condition",
            price: "$item.price",
            currency: "$item.currency",
            timestamp: "$item.timestamp",
            key: 
            {$concat: ["$item.condition", {$toString: "$item.price"}, "$item.currency"]}
          }
        }
      }
    }
  },
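  // Step 3: keep only the keys that are not already present in the existing history.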
  {
    $set: {
      newHistoryClean: {$setDifference: ["$newHistoryClean", "$oldHistoryClean"]}
    }
  },
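  // Step 4: re-attach the full data (including the timestamp) to each surviving key.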
  {
    $set: {
      newHistoryClean: {
        $map: {
          input: "$newHistoryClean",
          in: {
            $mergeObjects: [
              "$this",
              {
                $arrayElemAt: [
                  "$newHistory",
                  {$indexOfArray: ["$newHistory.key", "$this.key"]}
                ]
              }
            ]
          }
        }
      }
    }
  },
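  // Step 5: drop the helper key field from the entries that will be appended.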
  {
    $set: {
      newHistoryClean: {
        $map: {
          input: "$newHistoryClean",
          as: "item",
          in: {
            condition: "$item.condition",
            price: "$item.price",
            currency: "$item.currency",
            timestamp: "$item.timestamp"
          }
        }
      }
    }
  },
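  // Append the surviving new entries to the existing history; keep only name and manufacturer (plus _id).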
  {
    $project: {
      history: {$concatArrays: ["$history", "$newHistoryClean"]},
      name: 1,
      manufacturer: 1
    }
  }
],
{
  upsert: true
})

Playground example
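
Since the question uses PyMongo, here is a sketch of a condensed variant of the same key-based idea, run as a pipeline update through update_one with upsert=True. This is not taken from the answer above; it assumes MongoDB 4.2+ (required for pipeline updates), and the connection details are placeholders.

from pymongo import MongoClient

collection = MongoClient()["test"]["collection"]  # assumed connection for the example

history = [
    {"condition": "new", "price": 150, "currency": "USD", "timestamp": "bbbbbbbbbbbb"},
    {"condition": "used", "price": 90, "currency": "USD", "timestamp": "cccccccccccc"},
]

def key_of(var):
    # Aggregation expression building the comparison key (condition + price + currency).
    return {"$concat": [f"{var}.condition",
                        {"$toString": f"{var}.price"},
                        f"{var}.currency"]}

pipeline = [
    # Keys of the entries already stored ($history is missing on an upsert, hence $ifNull).
    {"$set": {"existingKeys": {"$map": {
        "input": {"$ifNull": ["$history", []]},
        "as": "h",
        "in": key_of("$$h"),
    }}}},
    # Append only the incoming entries whose key is not present yet.
    {"$set": {"history": {"$concatArrays": [
        {"$ifNull": ["$history", []]},
        {"$filter": {
            "input": history,  # literal array of incoming entries
            "as": "n",
            "cond": {"$not": [{"$in": [key_of("$$n"), "$existingKeys"]}]},
        }},
    ]}}},
    # Drop the helper field.
    {"$unset": "existingKeys"},
]

collection.update_one({"name": "abcd", "manufacturer": "xyz"}, pipeline, upsert=True)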
