How do I add noise (differential privacy) to client weights in federated learning?
I want to add noise to the gradients on the clients. I replaced tf.keras.optimizers.Adam() with DPKerasAdamOptimizer(), but it does not work:
# DPKerasAdamOptimizer comes from TensorFlow Privacy (args: l2_norm_clip, noise_multiplier)
from tensorflow_privacy import DPKerasAdamOptimizer

iterative_process = tff.learning.build_federated_averaging_process(
    model_fn=Create_tff_model,
    client_optimizer_fn=lambda: DPKerasAdamOptimizer(1, 1.85))
The error is:
AssertionError: Neither _compute_gradients() or get_gradients() on the differentially private optimizer was called. This means the training is not differentially private. It may be the case that you need to upgrade to TF 2.4 or higher to use this particular optimizer.
I can add noise with tff.learning.model_update_aggregator.dp_aggregator(noise_multiplier, clients_per_round), but how do I add noise on the clients?
Comments (1)
First, have a look at the tutorial Differential Privacy in TFF, which shows the simple Gaussian mechanism using tff.learning.dp_aggregator.
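As a rough sketch of that route (the model_update_aggregation_factory argument name matches the TFF version the question uses and may differ in newer releases, so treat the exact wiring as an assumption; the numbers are illustrative placeholders, not a privacy-budget recommendation):

import tensorflow as tf
import tensorflow_federated as tff

# Server-side DP: clip each client's model update and add Gaussian
# noise to the aggregate before it touches the server model.
dp_aggregator = tff.learning.dp_aggregator(
    noise_multiplier=1.85, clients_per_round=10)

iterative_process = tff.learning.build_federated_averaging_process(
    model_fn=Create_tff_model,
    client_optimizer_fn=lambda: tf.keras.optimizers.Adam(),
    model_update_aggregation_factory=dp_aggregator)

With this setup the clients train with a plain Keras optimizer, and the noise is added where the updates leave the clients.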
If you would like to customize the details of the mechanism, you can either look at how dp_aggregator is implemented, in particular tff.aggregators.DifferentiallyPrivateFactory being parameterized by a TensorFlow Privacy object, or write a custom aggregator from scratch.
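For instance, here is a minimal sketch of that parameterization, assuming the tensorflow_privacy package and illustrative numbers; the fixed Gaussian mechanism built by hand below is essentially what the convenience constructor DifferentiallyPrivateFactory.gaussian_fixed produces:

import tensorflow_federated as tff
import tensorflow_privacy as tfp

l2_norm_clip = 1.0        # clip each client update to this L2 norm
noise_multiplier = 1.85   # noise stddev = noise_multiplier * l2_norm_clip
clients_per_round = 10    # used to average the noisy sum

# Clip per-client updates, sum them with Gaussian noise, then normalize.
query = tfp.NormalizedQuery(
    tfp.GaussianSumQuery(
        l2_norm_clip=l2_norm_clip,
        stddev=noise_multiplier * l2_norm_clip),
    denominator=clients_per_round)

# Wrap the TensorFlow Privacy query in a TFF aggregation factory; the
# result can be passed to the training process like dp_aggregator above.
dp_factory = tff.aggregators.DifferentiallyPrivateFactory(query)

Swapping in a different tfp.DPQuery is how you would change the mechanism itself.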
Note that using DPKerasAdamOptimizer as the client optimizer might not be the right path: usually the interesting part is to privatize whatever data leaves the client, while the intermediate steps at a client are not important.