Getting the progress of a SHA1 computation with SHA1CryptoServiceProvider
Currently I am implementing in my C++/CLI code a function that returns the SHA1 value of a file. It is a Windows Forms application in Visual Studio.
I chose the .NET Framework class SHA1CryptoServiceProvider because it is really fast (believe me). I have tested several algorithms, but none of them was as fast as SHA1CryptoServiceProvider.
The problem is that my application has a progressBar showing the progress of the SHA1 computation, and the SHA1CryptoServiceProvider class doesn't have any method that reports how far the computation has gotten.
Here is the code:
using namespace System::Security::Cryptography;
using namespace System::IO;
StreamReader^ Reader = gcnew StreamReader("C:\\abc.exe");
SHA1CryptoServiceProvider^ SHA1 = gcnew SHA1CryptoServiceProvider();
// ComputeHash consumes the whole stream in one blocking call,
// so there is no point at which progress can be reported.
String^ Hash = BitConverter::ToString(SHA1->ComputeHash(Reader->BaseStream));
return Hash;
1 Answer
Finally I have done it. I post the code, maybe someone will find it useful. I know the code is not clean, I am still learning. It can compute the SHA1 of files larger than 2^31 bytes. Tested it on a 22GB file. Works fine in a backgroundWorker :)
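The answer's original code listing did not survive in this copy. As a rough sketch of the usual technique (not the author's exact code): read the file in chunks, feed each chunk to the hash with TransformBlock, report progress after each chunk, and finish with TransformFinalBlock. The 1 MB chunk size and the BackgroundWorker parameter below are illustrative assumptions:

SHA1CryptoServiceProvider derives from HashAlgorithm, whose TransformBlock/TransformFinalBlock methods accumulate hash state across calls:

// Sketch only: chunked SHA1 hashing with per-chunk progress reporting.
// Chunk size and the BackgroundWorker wiring are assumptions, not the
// original answer's code.
using namespace System;
using namespace System::IO;
using namespace System::Security::Cryptography;
using namespace System::ComponentModel;

String^ ComputeSha1WithProgress(String^ path, BackgroundWorker^ worker)
{
    FileStream^ stream = gcnew FileStream(path, FileMode::Open, FileAccess::Read);
    SHA1CryptoServiceProvider^ sha1 = gcnew SHA1CryptoServiceProvider();
    try
    {
        array<Byte>^ buffer = gcnew array<Byte>(1024 * 1024); // 1 MB chunks (assumption)
        long long total = stream->Length;                     // Int64, so files > 2^31 bytes work
        long long done = 0;
        int read;
        while ((read = stream->Read(buffer, 0, buffer->Length)) > 0)
        {
            // Feed one chunk into the hash; state accumulates across calls.
            sha1->TransformBlock(buffer, 0, read, nullptr, 0);
            done += read;
            if (worker != nullptr)
                worker->ReportProgress((int)(done * 100 / total));
        }
        // Finalize with an empty block; sha1->Hash is valid afterwards.
        sha1->TransformFinalBlock(gcnew array<Byte>(0), 0, 0);
        return BitConverter::ToString(sha1->Hash);
    }
    finally
    {
        stream->Close();
    }
}

Calling this from a backgroundWorker's DoWork handler and updating the progressBar in the ProgressChanged handler keeps the UI responsive while the file is hashed.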