How to add noise to client weights in federated learning (differential privacy)?



I want to add noise to the gradient on the client side. I modified tf.keras.optimizers.Adam() to DPKerasAdamOptimizer(), but it doesn't work.

    iterative_process = tff.learning.build_federated_averaging_process(
        model_fn=Create_tff_model,
        client_optimizer_fn=lambda: DPKerasAdamOptimizer(1,1.85))

The error is

AssertionError: Neither _compute_gradients() or get_gradients() on the differentially private optimizer was called. This means the training is not differentially private. It may be the case that you need to upgrade to TF 2.4 or higher to use this particular optimizer.

I can add noise on the server side using tff.learning.model_update_aggregator.dp_aggregator(noise_multiplier, clients_per_round), but how do I add noise on the client side?
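For reference, a minimal sketch of that server-side setup, assuming a TFF release where build_federated_averaging_process accepts a model_update_aggregation_factory argument; the noise_multiplier, clients_per_round, and Adam client optimizer below are placeholder choices, and Create_tff_model is the model builder from the snippet above:

    import tensorflow as tf
    import tensorflow_federated as tff

    # Server-side DP: the aggregator clips and noises the aggregated client
    # updates before they are applied to the global model.
    dp_factory = tff.learning.model_update_aggregator.dp_aggregator(
        noise_multiplier=1.85,   # placeholder value
        clients_per_round=10)    # placeholder value

    iterative_process = tff.learning.build_federated_averaging_process(
        model_fn=Create_tff_model,  # model builder from the question
        client_optimizer_fn=lambda: tf.keras.optimizers.Adam(),
        model_update_aggregation_factory=dp_factory)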


Comments (1)

待"谢繁草 2025-02-05 01:24:29


First, have a look at the tutorial Differential Privacy in TFF, which shows the simple Gaussian mechanism using tff.learning.dp_aggregator.

If you would like to customize the details of the mechanism, you can either look at how the dp_aggregator is implemented, in particular tff.aggregators.DifferentiallyPrivateFactory being parameterized by a TensorFlow Privacy object, or write a custom aggregator from scratch.
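As a rough illustration of that second option, the sketch below builds a tff.aggregators.DifferentiallyPrivateFactory from a TensorFlow Privacy DPQuery and plugs it into federated averaging; the clipping norm, noise multiplier, clients-per-round, and SGD learning rate are assumed values, and Create_tff_model is the model builder from the question:

    import tensorflow as tf
    import tensorflow_federated as tff
    import tensorflow_privacy as tfp

    clients_per_round = 10   # assumed cohort size
    l2_norm_clip = 1.0       # assumed clipping norm
    noise_multiplier = 1.85  # assumed noise multiplier

    # A TF Privacy DPQuery: clip each client update to l2_norm_clip, add
    # Gaussian noise with stddev = l2_norm_clip * noise_multiplier to the sum,
    # and divide the noised sum by the number of clients per round.
    query = tfp.NormalizedQuery(
        tfp.GaussianSumQuery(l2_norm_clip=l2_norm_clip,
                             stddev=l2_norm_clip * noise_multiplier),
        denominator=clients_per_round)

    # Wrap the query in a TFF aggregation factory and plug it into FedAvg.
    dp_factory = tff.aggregators.DifferentiallyPrivateFactory(query)

    iterative_process = tff.learning.build_federated_averaging_process(
        model_fn=Create_tff_model,  # model builder from the question
        client_optimizer_fn=lambda: tf.keras.optimizers.SGD(0.1),
        model_update_aggregation_factory=dp_factory)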

Note that using DPKerasAdamOptimizer as the client optimizer might not be the right path, as usually the interesting part is to privatize whatever data leaves the client, but the intermediate steps at a client are not important.
