Automatically generating natural keys

Posted 2024-10-08 06:40:27

I'm studying a way to serialize part of the data in database A and deserialize it in database B (a sort of save/restore between different installations), and I've had a look at Django natural keys to avoid problems due to duplicated IDs.

The only issue is that I would have to add a custom manager and a new method to all my models. Is there a way to make Django automatically generate natural keys by looking at unique=True or unique_together fields?
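Django has no built-in switch for this, but the pattern being asked about can be sketched in plain Python. Everything below is a hypothetical stand-in (there is no Django here, so it runs anywhere): `unique_fields` plays the role of `unique=True` / `Meta.unique_together`, which a real implementation would read from `model._meta` instead.

```python
# Hypothetical sketch: a mixin that builds natural_key() from declared
# unique fields, the way a generic Django manager/method pair could.
# "unique_fields" stands in for unique=True / Meta.unique_together.

class NaturalKeyMixin:
    unique_fields = ()  # e.g. ("username",) or ("app_label", "codename")

    def natural_key(self):
        # The natural key is just the tuple of the unique field values.
        return tuple(getattr(self, f) for f in self.unique_fields)

class User(NaturalKeyMixin):
    unique_fields = ("username",)

    def __init__(self, pk, username):
        self.pk = pk
        self.username = username

# Two rows from different installations: different pks, same natural key,
# which is exactly what lets deserialization match them up.
a = User(pk=1, username="alice")
b = User(pk=42, username="alice")
assert a.natural_key() == b.natural_key() == ("alice",)
```

A real Django version would additionally need a manager exposing `get_by_natural_key()` so that `loaddata` can resolve the key back to a row.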


Comments (4)

一向肩并 2024-10-15 06:40:27

Please note this answer has nothing to do with Django, but hopefully it gives you another alternative to think about.

You didn't mention your database; however, SQL Server has a BINARY_CHECKSUM() function that computes a value over the data held in a row. Think of it as a hash across all the fields in the row.
This checksum method can be used to update one database from another by checking whether local row checksum <> remote row checksum.

The SQL below will update a local database from a remote database. It won't insert new rows; for that, use insert ... where id > @MaxLocalID

SELECT  delivery_item_id, BINARY_CHECKSUM(*) AS bc
INTO    #DI
FROM    [REMOTE.NETWORK.LOCAL].YourDatabase.dbo.delivery_item di


SELECT  delivery_item_id, BINARY_CHECKSUM(*) AS bc
INTO    #DI_local
FROM    delivery_item di

-- Get rid of items that already match
DELETE  FROM #DI_local
WHERE   delivery_item_id IN (SELECT l.delivery_item_id
                             FROM   #DI x
                                    INNER JOIN #DI_local l
                                        ON  l.delivery_item_id = x.delivery_item_id
                                        AND l.bc = x.bc)

DROP TABLE #DI

UPDATE  DI
SET     engineer_id = x.engineer_id,
        ... -- Set other fields here
FROM    delivery_item DI
        INNER JOIN #DI_local L
            ON DI.delivery_item_id = L.delivery_item_id
        INNER JOIN [REMOTE.NETWORK.LOCAL].YourDatabase.dbo.delivery_item x
            ON x.delivery_item_id = L.delivery_item_id

DROP TABLE #DI_local

For the above to work, you will need a linked server between your local database and the remote database:

-- Create linked server if you don't have one already 
IF NOT EXISTS ( SELECT  srv.name
                FROM    sys.servers srv
                WHERE   srv.server_id != 0
                        AND srv.name = N'REMOTE.NETWORK.LOCAL' ) 
    BEGIN
        EXEC master.dbo.sp_addlinkedserver @server = N'REMOTE.NETWORK.LOCAL',
        @srvproduct = N'SQL Server'

        EXEC master.dbo.sp_addlinkedsrvlogin
        @rmtsrvname = N'REMOTE.NETWORK.LOCAL',
        @useself = N'False', @locallogin = NULL,
        @rmtuser = N'your user name',
        @rmtpassword = 'your password'
    END
GO
怪我闹别瞎闹 2024-10-15 06:40:27

In that case you should use a GUID as your key. The database can generate these for you automatically (look up SQL Server's uniqueidentifier type). We have 50+ warehouses, all inserting data remotely and sending their data up to our primary database using SQL Server replication. They all use a GUID as the primary key, as this is guaranteed to be unique. It works very well.
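A plain-Python illustration of why GUIDs sidestep the duplicate-ID problem (in Django terms this suggestion maps to a UUIDField primary key with default=uuid.uuid4; the simulation below just uses the standard uuid module directly):

```python
import uuid

# Simulate two installations generating primary keys independently.
site_a = {uuid.uuid4() for _ in range(1000)}
site_b = {uuid.uuid4() for _ in range(1000)}

# Unlike auto-increment integers, the two key sets do not overlap
# (uuid4 draws from a 122-bit random space), so rows can be merged
# between installations without remapping IDs.
assert not (site_a & site_b)
```

With sequential integer keys, both sites would have generated 1, 2, 3, ... and every merged row would collide.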

薯片软お妹 2024-10-15 06:40:27

My solution has nothing to do with natural keys but uses pickle/unpickle.
It's not the most efficient way, but it's simple and easy to adapt to your code. I don't know whether it works with a complex DB structure, but if that's not your case, give it a try!

when connected to db A:

import pickle

records_a = your_model.objects.filter(...)
with open("pickled.records_a.txt", "wb") as f:
    pickle.dump(records_a, f)

then move the file and when connected to db B run:

import pickle

# 'rb' is required here: the file was written in binary mode
with open('pickled.records_a.txt', 'rb') as f:
    records_a = pickle.load(f)

for r in records_a:
    r.id = None  # clearing the pk makes save() insert a new row
    r.save()
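The round trip can be exercised without a database; the Record class below is a hypothetical stand-in for a Django model instance, but it shows the same dump → load → clear-id flow:

```python
import io
import pickle

class Record:
    """Hypothetical stand-in for a Django model instance."""
    def __init__(self, id, name):
        self.id, self.name = id, name

# "Database A": dump the selected records to a byte stream.
buf = io.BytesIO()
pickle.dump([Record(1, "ham"), Record(2, "spam")], buf)

# "Database B": load, then clear ids so save() would INSERT, not UPDATE.
buf.seek(0)
restored = pickle.load(buf)
for r in restored:
    r.id = None

assert [r.name for r in restored] == ["ham", "spam"]
assert all(r.id is None for r in restored)
```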

Hope this helps

就此别过 2024-10-15 06:40:27

Make a custom base model by extending the models.Model class, and write your generic manager inside it along with a custom .save() method; then edit your models to extend the custom base model. This will have no side effect on your DB table structure or old saved data, except when you update some old rows. If you have old data, try making a fake (no-op) update to all your records.
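A minimal sketch of that layout, using hypothetical plain-Python stand-ins for models.Manager and models.Model so it runs without Django (a real base model would also set abstract = True in its Meta):

```python
class NaturalKeyManager:          # stand-in for a models.Manager subclass
    def get_by_natural_key(self, *key):
        # in real Django: look the row up by its unique field values
        raise NotImplementedError

class BaseModel:                  # stand-in for an abstract models.Model
    objects = NaturalKeyManager()
    saved = False

    def save(self):
        # shared bookkeeping for every model lives here, written once
        self.saved = True

class Invoice(BaseModel):         # existing models only change their base class
    pass

inv = Invoice()
inv.save()
assert inv.saved  # table schema and old rows are untouched by this refactor
```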
