How do I find the size of a deep learning model?
I am working with different quantized implementations of the same model, the main difference being the precision of the weights, biases, and activations. I'd like to know how to find the difference in size (in MB) between a model in, say, 32-bit floating point and one in int8. I have the models saved in .pth format.
5 Answers
You can count the number of parameters and buffers, multiply each count by its element size, and sum the results to get the total size of all parameters and buffers; see the sketch below.
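A minimal sketch of that approach, using torchvision's resnet18 purely as a stand-in for whatever model you load from your .pth checkpoint:

```python
import torchvision.models as models

# Stand-in model; replace with the model you load from your .pth file
model = models.resnet18()

# Bytes used by all parameter tensors (weights and biases)
param_size = sum(p.nelement() * p.element_size() for p in model.parameters())
# Buffers (e.g. BatchNorm running stats) also count toward the saved size
buffer_size = sum(b.nelement() * b.element_size() for b in model.buffers())

size_all_mb = (param_size + buffer_size) / 1024**2
print(f"model size: {size_all_mb:.3f}MB")
```

For an fp32 ResNet-18, this prints roughly:

model size: 44.629MB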
"To calculate the model size in bytes, one multiplies the number of parameters by the size of the chosen precision in bytes. For example, if we use the bfloat16 version of the BLOOM-176B model, we have 176*10**9 x 2 bytes = 352GB!"
This blog post on HF is worth a read: https://huggingface.co/blog/hf-bitsandbytes-integration
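The quoted arithmetic is easy to check in a couple of lines:

```python
# BLOOM-176B in bfloat16: 2 bytes per parameter
num_params = 176 * 10**9
size_gb = num_params * 2 / 10**9
print(size_gb)  # 352.0
```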
Following @Prajot's code, one can arrive at a precision multiplier and a set of param multipliers, leading to a rule of thumb.

TL;DR rule of thumb: model size in bytes ≈ parameter count × param multiplier × precision multiplier. The precision multiplier is the number of bytes per element (fp32 → 4, fp16/bfloat16 → 2, int8 → 1), and the param multiplier scales a count quoted in millions or billions (M → 10**6, B → 10**9). For the model size calculation, a 176B-parameter model in bfloat16 therefore comes to 176 × 10**9 × 2 bytes ≈ 352 GB, matching the quote above.
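A minimal sketch of that rule of thumb in code (the function name and lookup tables are illustrative, not @Prajot's original code):

```python
# Bytes per element for each precision
PRECISION_MULTIPLIER = {"fp32": 4, "fp16": 2, "bfloat16": 2, "int8": 1}
# Scale factors for parameter counts quoted in millions/billions
PARAM_MULTIPLIER = {"M": 10**6, "B": 10**9}

def rule_of_thumb_gb(count: float, scale: str, dtype: str) -> float:
    """Model size in GB ≈ params × bytes per param."""
    return count * PARAM_MULTIPLIER[scale] * PRECISION_MULTIPLIER[dtype] / 10**9

print(rule_of_thumb_gb(176, "B", "bfloat16"))  # 352.0
```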
I have written a small piece of code to calculate the size of your model from the number of params it has and its dtype. It currently supports fp32, fp16, bfloat16 and int8; see the sketch below.
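A minimal sketch of such a helper (the function name is illustrative, not the author's original code):

```python
def estimate_model_size(num_params: int, dtype: str = "fp32") -> str:
    """Estimate model size from parameter count and precision."""
    bytes_per_param = {"fp32": 4, "fp16": 2, "bfloat16": 2, "int8": 1}
    if dtype not in bytes_per_param:
        raise ValueError(f"unsupported dtype: {dtype}")
    size_mb = num_params * bytes_per_param[dtype] / 1024**2
    return f"{size_mb:.2f} MB"

# e.g. an 11.7M-parameter model (roughly ResNet-18 sized):
print(estimate_model_size(11_689_512, "fp32"))  # 44.59 MB
print(estimate_model_size(11_689_512, "int8"))  # 11.15 MB
```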
An alternative is to just check the size of the folder where the weights were downloaded on your machine (if you're using something like Hugging Face).
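A quick way to do that check from Python (the cache path below is the Hugging Face default and may differ on your machine):

```python
import os

def folder_size_mb(path: str) -> float:
    """Sum the sizes of all files under a directory, in MB."""
    total = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, files in os.walk(path)
        for name in files
    )
    return total / 1024**2

# Default Hugging Face cache location; adjust for your setup
cache_dir = os.path.expanduser("~/.cache/huggingface/hub")
print(f"{folder_size_mb(cache_dir):.1f} MB")
```

For a single .pth checkpoint, os.path.getsize("model.pth") is enough.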