Getting around the Salesforce trigger governor limits

Posted 2024-08-28 08:31:17


I'm trying to write an "after update" trigger that does a batch update on all child records of the record that has just been updated. This needs to be able to handle 15k+ child records at a time. Unfortunately, the limit appears to be 100, which is so far below my needs it's not even close to acceptable. I haven't tried splitting the records into batches of 100 each, since this will still put me at a cap of 10k updates per trigger execution. (Maybe I could just daisy-chain triggers together? ugh.)

Does anyone know what series of hoops I can jump through to overcome this limitation?

Edit: I tried calling the following @future function in my trigger, but it never updates the child records:

global class ParentChildBulkUpdater
{
    @future 
    public static void UpdateChildDistributors(String parentId) {
        Account[] children = [SELECT Id FROM Account WHERE ParentId = :parentId];

        for(Account child : children)
            child.Site = 'Bulk Updater Fired';
        update children;

    }
}
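
The trigger itself isn't shown above; presumably it hands each updated parent's Id to the @future method, roughly along the lines of this sketch (the trigger name and per-record loop are assumptions, not part of the original post):

// Hypothetical "after update" trigger that calls the @future method above.
// Note that each iteration consumes one @future invocation for the transaction.
trigger ParentAfterUpdate on Account (after update) {
    for (Account parent : Trigger.new) {
        ParentChildBulkUpdater.UpdateChildDistributors(parent.Id);
    }
}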


Comments (6)

丢了幸福的猪 2024-09-04 08:31:17


The best (and easiest) route to take with this problem is to use Batch Apex: you can create a batch class and fire it from the trigger. Like @future, it runs in a separate thread, but it can process up to 50,000,000 records!

You'll need to pass some information to your batch class before calling Database.executeBatch so that it has the list of parent IDs to work with, or you could of course just grab all of the accounts ;)

I've only just noticed how old this question is, but hopefully this answer will help others.
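
A minimal sketch of that approach, assuming the Account parent/child setup from the question (the class and method names here are illustrative, not from the original answer):

// Illustrative Batch Apex class: updates all child Accounts of the given parents.
global class ChildAccountBatchUpdater implements Database.Batchable<SObject> {
    private Set<Id> parentIds;

    global ChildAccountBatchUpdater(Set<Id> parentIds) {
        this.parentIds = parentIds;
    }

    // Define the full record set once; the platform chunks it for you.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator(
            [SELECT Id, Site FROM Account WHERE ParentId IN :parentIds]);
    }

    // Each execute() call receives a manageable chunk (200 records by default).
    global void execute(Database.BatchableContext bc, List<Account> scope) {
        for (Account child : scope) {
            child.Site = 'Bulk Updater Fired';
        }
        update scope;
    }

    global void finish(Database.BatchableContext bc) {}
}

Fired from the "after update" trigger on the parent, that would be a single call:

Database.executeBatch(new ChildAccountBatchUpdater(Trigger.newMap.keySet()));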

天涯沦落人 2024-09-04 08:31:17


It's worse than that: you're not even going to be able to get those 15k records in the first place, because there is a 1,000 row query limit within a trigger (this scales with the number of rows the trigger is being called for, but that probably doesn't help).

I guess your only way to do it is with the @future tag - read up on that in the docs. It gives you much higher limits. Although you can only call so many of those in a day, so you may need to somehow keep track of which parent objects have children waiting to be updated, and then process that offline.

A final option may be to use the API via some external tool. But you'll still have to make sure everything in your code is batched up.

I thought these limits were draconian at first, but actually you can do a hell of a lot within them if you batch things correctly; we regularly update thousands of rows from triggers. And from an architectural point of view, much more than that and you're really talking about batch processing anyway, which isn't normally activated by a trigger. One thing's for sure - they make you jump through hoops to do it.
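
As a side note on the @future route: it generally only holds up if the method is bulkified, taking the whole set of parent IDs in one call instead of one call per record. A sketch of what that could look like (an assumption, not code from this answer):

// Hypothetical bulkified variant of the asker's class: one @future call per
// transaction. @future parameters must be primitives or collections of
// primitives, so the parent IDs are passed as a List<Id>.
global class ParentChildBulkUpdater {
    @future
    public static void updateChildDistributors(List<Id> parentIds) {
        List<Account> children =
            [SELECT Id, Site FROM Account WHERE ParentId IN :parentIds];
        for (Account child : children) {
            child.Site = 'Bulk Updater Fired';
        }
        update children;
    }
}

The trigger would then make a single call such as ParentChildBulkUpdater.updateChildDistributors(new List<Id>(Trigger.newMap.keySet())), rather than calling the method once per record.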

浪推晚风 2024-09-04 08:31:17


I think Codek is right: going the API / external tool route is a good way to go. The governor limits still apply, but they are much less strict for API calls. Salesforce recently revamped their Data Loader tool, so that might be something to look into.

Another thing you could try is using a Workflow rule with an Outbound Message to call a web service on your end. Just send over the parent object and let a process on your end handle the child record updates via the API. One thing to be aware of with outbound messages: it is best to queue up the process on your end somehow and respond to Salesforce immediately, otherwise Salesforce will resend the message.

夏雨凉 2024-09-04 08:31:17


@future doesn't work (does not update the records at all)? Weird. Did you try exercising your function in an automated test? It should work, and the annotation should be ignored (during a test it will be executed instantly; test methods have higher limits). I suggest you investigate this a bit more, it seems like the best solution for what you want to accomplish.

Also - maybe try calling it from a class rather than directly from the trigger?

Daisy-chaining triggers together will not work; I've tried it in the past.

Your last option might be Batch Apex (available since the Winter '10 release, so all organisations should have it by now). It's meant for mass data update/validation jobs, the kind of thing you would typically run overnight in a normal database (it can be scheduled). See http://www.salesforce.com/community/winter10/custom-cloud/program-cloud-logic/batch-code.jsp and the release notes PDF.
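
For the automated-test suggestion, a minimal sketch could look like the following (the parent/child Account setup mirrors the question; inside Test.startTest()/Test.stopTest() the queued @future call is forced to run, so the assertion can see its effect):

@isTest
private class ParentChildBulkUpdaterTest {
    @isTest
    static void updatesChildSites() {
        // Minimal parent/child data, mirroring the question's Account hierarchy.
        Account parent = new Account(Name = 'Parent');
        insert parent;
        Account child = new Account(Name = 'Child', ParentId = parent.Id);
        insert child;

        Test.startTest();
        ParentChildBulkUpdater.UpdateChildDistributors(parent.Id);
        Test.stopTest(); // the queued @future call completes here

        child = [SELECT Site FROM Account WHERE Id = :child.Id];
        System.assertEquals('Bulk Updater Fired', child.Site);
    }
}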

盛夏已如深秋| 2024-09-04 08:31:17


I believe that in version 18 of the API the 1,000 row limit has been removed (so the documentation says, but in some cases I still hit a limit).

So you may be able to use Batch Apex with a single Apex update statement, something like:

List<ChildObject__c> children = new List<ChildObject__c>();

for (ChildObject__c c : [SELECT ....]) {
    c.foo__c = 'bar';
    children.add(c);
}
update children;

Be sure you bulkify your trigger as well; see http://sfdc.arrowpointe.com/2008/09/13/bulkifying-a-trigger-an-example/
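
A bulkified "after update" trigger along those lines might look like the sketch below (using the Account parent/child relationship from the question; this direct-DML approach only works while the row counts stay within the per-transaction limits):

// Hypothetical bulkified trigger: one query and one DML statement for the
// whole set of updated parents, no per-record queries or updates.
trigger UpdateChildrenOnParentUpdate on Account (after update) {
    List<Account> children = [
        SELECT Id, Site
        FROM Account
        WHERE ParentId IN :Trigger.newMap.keySet()
    ];
    for (Account child : children) {
        child.Site = 'Bulk Updater Fired';
    }
    update children;
}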

枕梦 2024-09-04 08:31:17


Maybe a change to your data model is the better option here. Think about creating a formula field on the child object that accesses the data from the parent. That would probably be far more efficient.
