PKCS#7 SignedData and multiple digest algorithms

Published on 2024-11-29 21:27:21


I'm investigating upgrading an application from SHA1 as the default PKCS#7 SignedData digest algorithm to stronger digests such as SHA256, in ways that preserve backwards compatibility for signature verifiers which do not support digest algorithms other than SHA1. I want to check my understanding of the PKCS#7 format and available options.

What I want to do is digest the message content with both SHA1 and SHA256 (or more generally, a set of digest algorithms), such that older applications can continue to verify via the SHA1 digest, and upgraded applications can begin verifying via the SHA256 digest (more generally, the strongest digest provided), ignoring the weaker algorithm(s). [If there is a better approach, please let me know.]

It appears that within the PKCS#7 standard, the only way to provide multiple digests is to provide multiple SignerInfos, one for each digest algorithm. Unfortunately, this would seem to lead to a net decrease in security, as an attacker is able to strip all but the SignerInfo with the weakest digest algorithm, which alone will still form a valid signature. Is this understanding correct?
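The stripping attack can be sketched with a toy model: plain Python dicts stand in for the real ASN.1 SignerInfo structures, and `verify_any` is a hypothetical verifier that accepts a message if any one SignerInfo checks out (only the digest comparison is modeled; the per-signer public-key signature is elided).

```python
import hashlib

# Toy model (not real ASN.1): a SignedData carrying one SignerInfo per
# digest algorithm, as PKCS#7 permits.
def make_signer_infos(content: bytes, algorithms):
    return [
        {"digest_alg": alg, "message_digest": hashlib.new(alg, content).hexdigest()}
        for alg in algorithms
    ]

def verify_any(content: bytes, signer_infos):
    """A permissive verifier: accepts if ANY SignerInfo's digest matches."""
    return any(
        hashlib.new(si["digest_alg"], content).hexdigest() == si["message_digest"]
        for si in signer_infos
    )

content = b"important document"
infos = make_signer_infos(content, ["sha1", "sha256"])

# An attacker can simply drop the SHA-256 SignerInfo from the structure;
# the remaining SHA-1 SignerInfo alone still verifies, so the effective
# security degrades to that of the weakest accepted algorithm.
stripped = [si for si in infos if si["digest_alg"] == "sha1"]
assert verify_any(content, stripped)  # still accepted
```

The point of the sketch is that nothing binds the two SignerInfos together, so a verifier that accepts either one is only as strong as the weaker of the two.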

If so, my idea was to use custom attributes within the authenticatedAttributes field of SignerInfo to provide additional message-digests for the additional digest algorithms (leaving SHA1 as the "default" algorithm for backwards compatibility). Since this field is authenticated as a single block, this would prevent the above attack. Does this seem like a viable approach? Is there a way to accomplish this or something similar without going outside of the PKCS standard?
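A minimal sketch of the proposed custom-attribute scheme, with JSON and an HMAC standing in for DER-encoded attributes and the real public-key signature (all names here are illustrative, not actual OIDs or PKCS#7 field names):

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # stand-in for the signer's real RSA/ECDSA key

def sign_attrs(content: bytes):
    # SHA-1 stays in the standard message-digest attribute; a custom
    # attribute carries the SHA-256 digest.  The signature is computed
    # over the whole attribute set as one blob.
    attrs = {
        "message-digest": hashlib.sha1(content).hexdigest(),
        "custom-sha256-digest": hashlib.sha256(content).hexdigest(),
    }
    blob = json.dumps(attrs, sort_keys=True).encode()
    sig = hmac.new(SIGNING_KEY, blob, hashlib.sha1).hexdigest()
    return attrs, sig

def verify(content, attrs, sig, want_strong=True):
    blob = json.dumps(attrs, sort_keys=True).encode()
    if hmac.new(SIGNING_KEY, blob, hashlib.sha1).hexdigest() != sig:
        return False  # attribute set tampered with (e.g. SHA-256 stripped)
    if want_strong:
        return attrs.get("custom-sha256-digest") == hashlib.sha256(content).hexdigest()
    return attrs.get("message-digest") == hashlib.sha1(content).hexdigest()

content = b"important document"
attrs, sig = sign_attrs(content)
assert verify(content, attrs, sig, want_strong=False)  # legacy SHA-1 path
assert verify(content, attrs, sig, want_strong=True)   # upgraded SHA-256 path
tampered = dict(attrs)
del tampered["custom-sha256-digest"]
assert not verify(content, tampered, sig)  # stripping now breaks the signature
```

Note that the sketch deliberately computes the outer signature with SHA-1; as discussed in the answer below, that outer digest choice is exactly where the scheme's residual weakness lies.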


白云悠悠 answered on 2024-12-06 21:27:21


Yes, you are right. The current CMS RFC says this about the message-digest attribute:

The SignedAttributes in a signerInfo
MUST include only one instance of the message-digest attribute.
Similarly, the AuthAttributes in an AuthenticatedData MUST include
only one instance of the message-digest attribute.

So it is true that the only way to provide multiple message-digest values using the standard signed attributes is to provide several SignerInfos.

And yes, any security system is as strong as its weakest link, so theoretically you will not gain anything by adding a SignerInfo with SHA-256 if you also still accept SHA-1 - as you said, the stronger signatures can always be stripped.

Your scheme with custom attributes is a bit harder to break - but there is still a SHA-1 hash floating around that can be attacked. It's no longer as easy as just stripping the attribute - as it's covered by the signature. But:

There is also the digest algorithm that is used to digest the signed attributes, which serves as the basis of the final signature value. What do you intend to use there? SHA-256 or SHA-1? If it's SHA-1, then you will be in the same situation as before:

If I can produce collisions for SHA-1, then I can strip off your custom SHA-256 attribute and forge the SHA-1 attribute in such a way that the final SHA-1 digest for the signature matches again. This shows that there will only be a gain in security if the signature digest algorithm is SHA-256, too, but I'm guessing that is not an option since you want to stay backwards-compatible.

What I would suggest in your situation is to keep using SHA-1 throughout but apply an RFC 3161-compliant timestamp to your signature as an unsigned attribute. Those timestamps are in fact signatures of their own. The good thing is you can use SHA-256 for the message imprint there and often the timestamp server applies its signature using the same digest algorithm you provided. Then reject any signature that either does not contain such a timestamp or contains only timestamps with message imprint/signature digest algorithms weaker than SHA-256.
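The rejection policy for upgraded verifiers might look like this sketch. The timestamp fields are assumed to have already been parsed out of the RFC 3161 token in the unsigned attributes; the field names are illustrative:

```python
# Policy: reject any signature that lacks a timestamp whose message imprint
# AND whose timestamp signature both use a digest at least as strong as
# SHA-256.  The digest-name strings here are illustrative.
STRONG_DIGESTS = {"sha256", "sha384", "sha512"}

def timestamp_policy_ok(timestamps):
    """timestamps: list of dicts with 'imprint_alg' and 'signature_alg' keys."""
    return any(
        ts["imprint_alg"] in STRONG_DIGESTS
        and ts["signature_alg"] in STRONG_DIGESTS
        for ts in timestamps
    )

assert not timestamp_policy_ok([])  # no timestamp at all -> reject
assert not timestamp_policy_ok([{"imprint_alg": "sha1", "signature_alg": "sha1"}])
assert timestamp_policy_ok([{"imprint_alg": "sha256", "signature_alg": "sha256"}])
```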

What's the benefit of this solution? Your legacy applications should check for the presence of an unsigned timestamp attribute and whether a strong digest was used for it, but otherwise ignore the timestamps and keep on verifying the signatures the same way they did before. New applications, on the other hand, will verify the signature but additionally verify the timestamp, too. As the timestamp signature "covers" the signature value, there's no longer a way for an attacker to forge the signature. Although the signature uses SHA-1 for its digest values, an attacker would also have to be able to break the stronger digest of the timestamp.

An additional benefit of a timestamp is that you can associate a date of production with your signature - you can safely claim that the signature was produced before the time of the timestamp. So even if a signature certificate were to be revoked, with the help of the timestamp you could still precisely decide whether to reject or accept a signature based on the time that the certificate was revoked. If the certificate was revoked after the timestamp, you can accept the signature, adding a safety margin (aka "grace period") because it takes some time until the revocation information gets published; if it was revoked prior to the time of the timestamp, you want to reject the signature.
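The revocation decision described above can be sketched as follows (the one-day grace period is an illustrative choice; pick a margin appropriate for your CA's publication latency):

```python
from datetime import datetime, timedelta

# Illustrative margin: revocation information takes time to get published.
GRACE_PERIOD = timedelta(days=1)

def accept_despite_revocation(timestamp_time, revocation_time):
    """Accept the signature only if the certificate was revoked clearly
    after the timestamped signing time (plus the grace period)."""
    return revocation_time > timestamp_time + GRACE_PERIOD

ts_time = datetime(2024, 1, 1)
# Revoked a month after the timestamp: the signature predates revocation.
assert accept_despite_revocation(ts_time, ts_time + timedelta(days=30))
# Revoked before the timestamp: reject.
assert not accept_despite_revocation(ts_time, ts_time - timedelta(days=30))
# Revoked within the grace period: play it safe and reject.
assert not accept_despite_revocation(ts_time, ts_time + timedelta(hours=12))
```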

A last benefit of timestamps is that you can renew them over time if certain algorithms get weak. You could, for example, apply a new timestamp every 5-10 years using up-to-date algorithms and have the new timestamps cover all of the older signatures (including older timestamps). This way, weak algorithms are then covered by the newer, stronger timestamp signature. Have a look at CAdES (there is also an RFC, but it is outdated by now), which is based on CMS and makes an attempt at applying these strategies to provide for long-term archiving of CMS signatures.
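The renewal idea can be sketched as a chain in which each new timestamp's message imprint covers everything before it (purely illustrative structure; a real renewal involves a TSA and ASN.1 encoding):

```python
import hashlib

def renew(covered_blob: bytes, alg: str):
    # A new "timestamp" whose message imprint covers the given blob,
    # computed with whatever algorithm is current at renewal time.
    return {"alg": alg, "imprint": hashlib.new(alg, covered_blob).hexdigest()}

signature = b"original SHA-1 based signature blob"
ts1 = renew(signature, "sha256")                      # timestamp applied today
blob_with_ts1 = signature + ts1["imprint"].encode()
ts2 = renew(blob_with_ts1, "sha512")                  # renewal years later
# The newest imprint covers the signature AND the earlier timestamp,
# so the old material inherits the strength of the newest algorithm.
assert ts2["imprint"] == hashlib.sha512(blob_with_ts1).hexdigest()
```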
