Installing Jupyter Notebook on a machine with no internet - Getting error: Could not find a version that satisfies the requirement

Published 2025-02-06 15:39:48


This answer (and the one above it) explains a way to install pip requirements on an offline machine by first involving a machine with internet:


If you want to install Python libs and their dependencies offline, follow these steps on a machine with the same OS, a network connection, and Python installed:

  1. Create a requirements.txt file with similar content (Note - these are the libraries you wish to download):

    Flask==0.12
    requests>=2.7.0
    scikit-learn==0.19.1
    numpy==1.14.3
    pandas==0.22.0

One option for creating the requirements file is to use pip freeze > requirements.txt. This will list all libraries in your environment. Then you can go into requirements.txt and remove the unneeded ones.

  2. Execute mkdir wheelhouse && pip download -r requirements.txt -d wheelhouse to download the libs and their dependencies into the wheelhouse directory

  3. Copy requirements.txt into the wheelhouse directory

  4. Archive wheelhouse into wheelhouse.tar.gz with tar -zcf wheelhouse.tar.gz wheelhouse

Then upload wheelhouse.tar.gz to your target machine:

  1. Execute tar -zxf wheelhouse.tar.gz to extract the files

  2. Execute pip install -r wheelhouse/requirements.txt --no-index --find-links wheelhouse to install the libs and their dependencies
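The archive/extract steps of the workflow above can be sketched end-to-end. The following is a minimal Python sketch that stands in for steps 4 and the target-machine extraction; it fakes the wheelhouse with a dummy wheel so it runs without network, and shows the real pip commands as comments:

```python
import pathlib
import tarfile
import tempfile

# On the internet-connected machine, steps 1-2 would be:
#   pip download -r requirements.txt -d wheelhouse
# Here we fake a wheelhouse with a dummy wheel so this sketch runs offline.
work = pathlib.Path(tempfile.mkdtemp())
wheelhouse = work / "wheelhouse"
wheelhouse.mkdir()
(wheelhouse / "requirements.txt").write_text("notebook==7.0.0a4\n")
(wheelhouse / "dummy-1.0-py3-none-any.whl").write_bytes(b"")

# Step 4: archive the wheelhouse (tar -zcf wheelhouse.tar.gz wheelhouse)
archive = work / "wheelhouse.tar.gz"
with tarfile.open(archive, "w:gz") as tar:
    tar.add(wheelhouse, arcname="wheelhouse")

# On the target machine: extract (tar -zxf wheelhouse.tar.gz), then install with
#   pip install -r wheelhouse/requirements.txt --no-index --find-links wheelhouse
dest = work / "target"
dest.mkdir()
with tarfile.open(archive) as tar:
    tar.extractall(dest)

print(sorted(p.name for p in (dest / "wheelhouse").iterdir()))
# -> ['dummy-1.0-py3-none-any.whl', 'requirements.txt']
```

The key point is that --no-index --find-links wheelhouse makes pip resolve everything from the extracted directory, so every transitive dependency must already be present there.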


This is exactly what I'm doing, except my requirements.txt, for now, is just:

notebook==7.0.0a4

which is Jupyter Notebook.

But oddly, I'm getting the error:

ERROR: Could not find a version that satisfies the requirement pyzmq>=17 (from jupyter-server) (from versions: none)                                      
ERROR: No matching distribution found for pyzmq>=17

I figured out how to make progress on this error:

  • Adding pyzmq==17 to my requirements.txt

But then the same error appears for another, similar package. It seems I could keep explicitly adding these packages to requirements.txt, but that seems far from optimal, especially if there are a lot of packages to add. Is there something I could add to the wheel-building command to get all these dependencies included without doing so manually?


Answered by 等数载，海棠开 on 2025-02-13 15:39:48


Sounds like you've followed step 1 and manually created the requirements.txt with the contents notebook==7.0.0a4, but missed the part where it says:

One option for creating the requirements file is to use pip freeze > requirements.txt. This will list all libraries in your environment. Then you can go in to requirements.txt and remove un-needed ones.

If I manually install notebook on the "internet connected" computer (probably best to use a fresh virtual env to prevent other projects' deps leaking in) with:

pip install notebook==7.0.0a4

Then export the requirements with:

pip freeze > requirements.txt

This gives me a requirements.txt file containing:

aiofiles==0.8.0
aiosqlite==0.17.0
anyio==3.6.1
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
asttokens==2.0.5
attrs==21.4.0
Babel==2.10.1
backcall==0.2.0
beautifulsoup4==4.11.1
bleach==5.0.0
certifi==2022.5.18.1
cffi==1.15.0
charset-normalizer==2.0.12
debugpy==1.6.0
decorator==5.1.1
defusedxml==0.7.1
entrypoints==0.4
executing==0.8.3
fastjsonschema==2.15.3
idna==3.3
importlib-metadata==4.11.4
importlib-resources==5.7.1
ipykernel==6.13.1
ipython==8.4.0
jedi==0.18.1
Jinja2==3.1.2
json5==0.9.8
jsonschema==4.6.0
jupyter-client==7.3.4
jupyter-core==4.10.0
jupyter-server==1.17.1
jupyter-ydoc==0.1.10
jupyterlab==4.0.0a26
jupyterlab-pygments==0.2.2
jupyterlab-server==2.14.0
MarkupSafe==2.1.1
matplotlib-inline==0.1.3
mistune==0.8.4
nbclient==0.6.4
nbconvert==6.5.0
nbformat==5.4.0
nest-asyncio==1.5.5
notebook==7.0.0a4
notebook-shim==0.1.0
packaging==21.3
pandocfilters==1.5.0
parso==0.8.3
pexpect==4.8.0
pickleshare==0.7.5
prometheus-client==0.14.1
prompt-toolkit==3.0.29
psutil==5.9.1
ptyprocess==0.7.0
pure-eval==0.2.2
pycparser==2.21
Pygments==2.12.0
pyparsing==3.0.9
pyrsistent==0.18.1
python-dateutil==2.8.2
pytz==2022.1
pyzmq==23.1.0
requests==2.28.0
Send2Trash==1.8.0
six==1.16.0
sniffio==1.2.0
soupsieve==2.3.2.post1
stack-data==0.2.0
terminado==0.15.0
tinycss2==1.1.1
tornado==6.1
traitlets==5.2.2.post1
typing-extensions==4.2.0
urllib3==1.26.9
wcwidth==0.2.5
webencodings==0.5.1
websocket-client==1.3.2
y-py==0.5.0
ypy-websocket==0.1.13
zipp==3.8.0

I think this is how to make the requirements file, which you then feed into step 2, so that all the required deps go into the wheelhouse for export to the "offline computer".
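Since the freeze output above is a flat list of exact name==version pins, it can be sanity-checked before shipping. Here is a small hypothetical helper (not part of pip, just a sketch) that rejects any line that is not an exact pin, which would have caught a hand-written entry like pyzmq>=17:

```python
import re

# Matches an exact pin such as "pyzmq==23.1.0"; range specifiers
# like "pyzmq>=17" deliberately do not match.
PIN = re.compile(r"^([A-Za-z0-9][A-Za-z0-9._-]*)==(\S+)$")

def pinned_packages(text):
    """Return {name: version} for every pinned line; raise on unpinned lines."""
    pins = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        m = PIN.match(line)
        if not m:
            raise ValueError(f"not an exact pin: {line!r}")
        pins[m.group(1)] = m.group(2)
    return pins

# A few lines excerpted from the pip freeze output above:
freeze_excerpt = """\
notebook==7.0.0a4
pyzmq==23.1.0
tornado==6.1
"""
pins = pinned_packages(freeze_excerpt)
print(pins["pyzmq"])
# -> 23.1.0
```

A fully pinned file like this is what makes the wheelhouse download reproducible: pip download fetches exactly those versions, so the offline install never has to resolve a range like pyzmq>=17 against an index it cannot reach.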
