How to deploy Django apps from the same project that share models and a database?

Published 2025-02-09 01:49:46


I have an architectural question:
We have a Django project made of multiple apps. There is a core app that holds the main models used by the other sets of apps.
Then, we have a couple of apps for user-facing APIs. Lastly, we have some internal apps and tools used by developers only, accessible in the Admin UI as extended features.

Our deployment process is very monolithic. We use Kubernetes and deploy the whole project as one unit. This means that if we only have changes in an internal app and need them in production, we build a new Docker image and deploy a new release with an incremented version tag.

I'm not a big fan of this, because a change in the internal tools shouldn't create a new release of the user-facing applications.

I have been wondering if there is a way to split those deployments (maybe turn them into a microservice architecture?), so we could deploy the user-facing applications separately from the internal tools. I know I could build separate images, tags and everything for parts of the project, but I'm not sure how they could communicate with each other if internal_app_1 depends on the models of the core app, and potentially on the settings.py and manage.py files as well.

Also, in Kubernetes, separating the applications would mean two deployments with two servers running, i.e. two separate Django projects isolated from each other but using the same database.
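Concretely, the "same database" part would just be both projects' settings resolving to the same connection. A minimal sketch, where the `DB_*` environment variable names and defaults are assumptions for illustration:

```python
# settings_shared.py (sketch): both Django projects would import this module
# so they point at the same database. Env var names are hypothetical.
from os import environ

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': environ.get('DB_NAME', 'coredb'),
        'USER': environ.get('DB_USER', 'app'),
        'HOST': environ.get('DB_HOST', 'localhost'),
        'PORT': environ.get('DB_PORT', '5432'),
    }
}
```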

Has anyone worked with something similar or would like to suggest an alternative, if there's any?

Below is a tree example of how our project is structured at the moment:

├── core
|   ├── models.py
|   ├── views.py
|   └── urls.py
├── userapi_1
|   ├── views.py
|   └── urls.py
├── userapi_2
|   ├── views.py
|   └── urls.py
├── internal_app_1
|   ├── templates
|   |   └── ...
|   ├── models.py
|   ├── views.py
|   └── urls.py
├── manage.py
├── settings.py
└── Dockerfiles
    ├── Dockerfile.core
    └── Dockerfile.internal_app_1


Comments (1)

无名指的心愿 2025-02-16 01:49:47


Django and microservices? Yeah, maybe somewhere in a parallel universe.

The one thing I can recommend is to build two identical services, e.g. django_container_internal and django_container_production. That way you will be able to release the internal tools without stopping production.

If you want to prevent access to production functionality through the internal endpoints, you can deactivate the production URLs using environment variables. Usually a Django project has a common config/urls.py that aggregates all URL endpoints and looks like:

from django.urls import include, path

urlpatterns = [
    path('core/api/v1/', include('core.urls')),
    path('internal/api/v1/', include('internal_app_1.urls')),
    path('user/api/v1/', include('userapi_1.urls')),
    ...
]

For example, you can add an IS_INTERNAL_TOOLS environment variable and update urls.py like:

from os import environ

from django.urls import include, path

urlpatterns = [
    path('core/api/v1/', include('core.urls')),
    ...
]

if environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes'):
    urlpatterns.append(path('internal/api/v1/', include('internal_app_1.urls')))
else:
    urlpatterns.append(path('user/api/v1/', include('userapi_1.urls')))
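The same flag could also gate INSTALLED_APPS, so each container only registers the apps it actually serves. A sketch reusing the assumed IS_INTERNAL_TOOLS variable, with app names taken from the project tree above:

```python
# settings.py (sketch): reuse the IS_INTERNAL_TOOLS flag so each container
# only loads the apps it serves. The flag name is an assumption.
from os import environ

IS_INTERNAL_TOOLS = environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes')

INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'core',  # shared models ship in every image
]

if IS_INTERNAL_TOOLS:
    INSTALLED_APPS.append('internal_app_1')
else:
    INSTALLED_APPS += ['userapi_1', 'userapi_2']
```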
  • Pros:

    • All models are accessible in both services (a single shared data-access layer, so no duplicated work creating models twice)
    • Functionality is separated, so only the necessary features are reachable in each service
    • Easy to implement
  • Cons:

    • The whole source code is stored in both containers, even though half of it is unused
    • If you used two separate databases for the internal tools and the external API, you would have to create all tables in both (but that doesn't look like your case)
    • Since it is still a monolith, the internal and production parts depend heavily on the common core, and it is impossible to deploy an updated core on its own