Spring Data MongoDB: E11000 duplicate key error on a replica set collection, even after checking whether data with the unique key already exists

Posted 2025-02-09 08:14:17


The issue that I am facing is intermittent and occurs only in the production environment during heavy loads.
MongoDB is set up in replica set mode with an arbiter.

I have a UserDetail class:

@Builder
@Data
@Document(collection = "user_details")
public class UserDetail {

    private List<String> params;

    @Id
    private ObjectId id;

    @Indexed(name = "idx_us_de_on_user_id", unique = true)
    private UUID userId;

}

In the service layer before saving, I first check whether it exists or not through the following code:

UserDetail userDetail =
                Optional.ofNullable(
                        mongoTemplateReadPreferencePrimary
                                .query(UserDetail.class)
                                .matching(
                                        query(where("userId").is(userId))
                                )
                                .firstValue()
                )
                .orElse(
                        UserDetail.builder()
                                .userId(userId)
                                .params(new ArrayList<>())
                                .build()
                );

// perform business logic

// then try to save userDetail
userDetailRepository.save(userDetail);

The repository layer is as follows:

@Repository
public interface UserDetailRepository extends MongoRepository<UserDetail, ObjectId> {

    Optional<UserDetail> findByUserId(UUID userId);

}

But in the production environment, during heavy loads, I sometimes get the error:

Write error: WriteError{code=11000, message='E11000 duplicate key error collection: database_name.user_details index: userId_1 dup key: { userId: UUID("b5261508-fdc8-4f30-b358-5d37374cf9f9") }', details={}}

I tried using readConcern as primary but that did not help.


凉城已无爱 2025-02-16 08:14:17


In general, you have a race condition: two concurrent requests updating the same UserDetail both find that the entity does not exist and try to do an insert, and one of them will throw a duplicate key error.

Options:

  • Implement Persistable with isNew() returning false, so save always performs an upsert
  • Implement the business logic with two tries: attempt the save, and make the second try (loading the existing document) only if you get a duplicate key exception
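A minimal, self-contained sketch of the second option. The ConcurrentHashMap here is a hypothetical stand-in for the collection's unique index on userId; in the real service you would instead catch Spring's org.springframework.dao.DuplicateKeyException around the repository call (preferably insert(...) rather than save(...), so the losing writer fails cleanly instead of racing).

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

public class UpsertWithRetry {

    // Stand-in for the Mongo duplicate key error (E11000).
    static class DuplicateKeyException extends RuntimeException {}

    record UserDetail(UUID userId) {}

    // Stand-in for the collection plus its unique index on userId.
    private final Map<UUID, UserDetail> store = new ConcurrentHashMap<>();

    // Simulates an insert against a unique index: atomically fails if the key exists.
    UserDetail insert(UserDetail d) {
        if (store.putIfAbsent(d.userId(), d) != null) {
            throw new DuplicateKeyException();
        }
        return d;
    }

    // First try: insert a fresh document. Second try, only on a duplicate
    // key error: read the document the concurrent winner just created
    // (equivalent to findByUserId(userId) in the repository).
    UserDetail getOrCreate(UUID userId) {
        try {
            return insert(new UserDetail(userId));
        } catch (DuplicateKeyException e) {
            return store.get(userId);
        }
    }

    public static void main(String[] args) {
        UpsertWithRetry repo = new UpsertWithRetry();
        UUID id = UUID.randomUUID();
        UserDetail first = repo.getOrCreate(id);
        UserDetail second = repo.getOrCreate(id); // loser of the race falls back to the read
        System.out.println(first == second); // both callers end up with the same document
    }
}
```

The key point is that the duplicate key error stops being a failure and becomes the signal that another request already created the document, so the second attempt can simply load it.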