Delete downloaded TensorFlow and PyTorch (Hugging Face) models

Posted on 2025-01-28 03:46:35

I would like to remove the TensorFlow and Hugging Face models from my laptop.
I did find one link https://github.com/huggingface/transformers/issues/861
but is there no command that can remove them? As mentioned in the link, deleting them manually can cause problems: we don't know which other files are linked to those models or expect a model to be present in that location, so it may simply cause some error.

Comments (5)

允世 2025-02-04 03:46:36

The transformers library stores downloaded files in your cache. As far as I know, there is no built-in method to remove specific models from the cache, but you can code something yourself. The files are stored under a cryptic name alongside two additional files that have .json (.h5.json in the case of TensorFlow models) and .lock appended to the cryptic name. The JSON file contains some metadata that can be used to identify the file. The following is an example of such a file:

{"url": "https://cdn.huggingface.co/roberta-base-pytorch_model.bin", "etag": "\"8a60a65d5096de71f572516af7f5a0c4-30\""}

We can now use this information to create a list of your cached files as shown below:

import glob
import json
import re
from collections import OrderedDict
from transformers import TRANSFORMERS_CACHE

# Every cached file has a companion .json metadata file in the cache directory.
metaFiles = glob.glob(TRANSFORMERS_CACHE + '/*.json')
# Model weights end in pytorch_model.bin or tf_model.h5; use a raw string
# so the backslash escapes reach the regex engine intact.
modelRegex = r"huggingface\.co/(.*)(pytorch_model\.bin$|resolve/main/tf_model\.h5$)"

cachedModels = {}
cachedTokenizers = {}
for file in metaFiles:
    with open(file) as j:
        data = json.load(j)
        isM = re.search(modelRegex, data['url'])
        if isM:
            # Strip the trailing separator from the captured model name.
            cachedModels[isM.group(1)[:-1]] = file
        else:
            # Everything else (vocab, merges, config, ...) counts as tokenizer data.
            cachedTokenizers[data['url'].partition('huggingface.co/')[2]] = file

# Sort tokenizer entries alphabetically by URL path for readability.
cachedTokenizers = OrderedDict(sorted(cachedTokenizers.items(), key=lambda k: k[0]))

Now all you have to do is check the keys of cachedModels and cachedTokenizers and decide whether you want to keep them. If you want to delete an entry, look up its value in the dictionary and delete that file from the cache. Don't forget to also delete the corresponding *.json and *.lock files.
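A minimal deletion sketch building on the dictionaries above, assuming the flat cache layout this answer describes (older transformers versions); delete_cached_entry is a hypothetical helper name:

import os

def delete_cached_entry(metaFile):
    # The weights file has the same name as its metadata file minus '.json';
    # the lock file is the weights file name plus '.lock'.
    base = metaFile[:-len('.json')]
    for path in (base, metaFile, base + '.lock'):
        if os.path.exists(path):
            os.remove(path)

# Example: remove every cached tokenizer file collected above.
for metaFile in cachedTokenizers.values():
    delete_cached_entry(metaFile)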

绮筵 2025-02-04 03:46:36

You can run this code to delete all models:

import shutil
from transformers import TRANSFORMERS_CACHE

print(TRANSFORMERS_CACHE)           # show where the cache lives
shutil.rmtree(TRANSFORMERS_CACHE)   # remove the whole cache directory
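Note that this removes the entire cache directory, not just a single model; transformers will recreate the directory and re-download anything you load afterwards.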
神爱温柔 2025-02-04 03:46:36

pip uninstall tensorflow 
pip uninstall tensorflow-gpu
pip uninstall transformers

and find where you saved GPT-2:

model.save_pretrained("./english-gpt2")

where english-gpt2 is the name of your downloaded model.

You can then delete the files from that path manually.
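If the model was saved with save_pretrained as above, removing it programmatically is just removing that directory; a minimal sketch, assuming the ./english-gpt2 path from this answer:

import shutil

# Path used in the save_pretrained call above; adjust to your own location.
shutil.rmtree("./english-gpt2")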

挽手叙旧 2025-02-04 03:46:35

Use

pip install "huggingface_hub[cli]"

Then

huggingface-cli delete-cache

You should now see a list of revisions that you can select/deselect.

See this link for details.
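If you prefer to stay in Python, recent versions of huggingface_hub also expose the cache scan programmatically; a minimal sketch, where "<revision_hash>" is a placeholder you would copy from the scan output:

from huggingface_hub import scan_cache_dir

# List every cached repo and its size on disk.
cache_info = scan_cache_dir()
for repo in cache_info.repos:
    print(repo.repo_id, repo.size_on_disk_str)

# Delete specific revisions by hash, taken from the listing above.
strategy = cache_info.delete_revisions("<revision_hash>")
print("Will free", strategy.expected_freed_size_str)
strategy.execute()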

铃予 2025-02-04 03:46:35

From a comment in a transformers GitHub issue: you can find the cache directory as follows, so that you can clean it:

from transformers import file_utils
print(file_utils.default_cache_path)
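Note: the answers above import the same location as TRANSFORMERS_CACHE from transformers; in recent versions you can also relocate the whole cache with the HF_HOME environment variable.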