validates_uniqueness_of failing on Heroku?

Posted on 2024-10-21 22:41:53


In my User model, I have:

validates_uniqueness_of :fb_uid (I'm using Facebook Connect).

However, at times, I'm getting duplicate rows upon user sign up. This is Very Bad.

The creation times of the two records are within 100 ms of each other. I haven't been able to determine whether it happens across two separate requests (Heroku logging sucks, only goes back so far, and it's only happened twice).

Two things:

  • Sometimes the request takes some time, because I query FB API for name info, friends, and picture.
  • I'm using bigint to store fb_uid (backend is postgres).

I haven't been able to replicate in dev.

Any ideas would be extremely appreciated.

The sign-in function:

def self.create_from_cookie(fb_cookie, remote_ip = nil)
    return nil unless fb_cookie
    return nil unless fb_hash = authenticate_cookie(fb_cookie)
    uid = fb_hash["uid"].join.to_i

    #Make user and set data
    fb_user = FacebookUser.new
    fb_user.fb_uid = uid
    fb_user.fb_authorized = true
    fb_user.email_confirmed = true
    fb_user.creation_ip = remote_ip
    fb_name_data, fb_friends_data, fb_photo_data, fb_photo_ext = fb_user.query_data(fb_hash)
    return nil unless fb_name_data
    fb_user.set_name(fb_name_data)
    fb_user.set_photo(fb_photo_data, fb_photo_ext)

    #Save user and friends to the db
    return nil unless fb_user.save
    fb_user.set_friends(fb_friends_data)
    return fb_user
end
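The failure mode can be seen without Rails at all: validates_uniqueness_of runs a SELECT before the INSERT, so two concurrent sign-ups can both pass the check before either row exists, and the slow FB API calls in create_from_cookie widen that window. A minimal plain-Ruby sketch of the same check-then-insert gap (the sleep stands in for the API latency, the array for the users table):

```ruby
# Two concurrent "requests" sign up the same fb_uid. Each one checks for
# an existing row (the validation's SELECT), then pauses to model the
# Facebook API calls, then inserts. Both pass the check before either
# inserts, so a duplicate row appears despite the uniqueness check.
rows = []      # stands in for the users table
uid  = 12345

threads = 2.times.map do
  Thread.new do
    exists = rows.include?(uid) # validates_uniqueness_of's SELECT
    sleep 0.05                  # FB API latency: both threads pass the check
    rows << uid unless exists   # the INSERT
  end
end
threads.each(&:join)

puts rows.length                # => 2 -- duplicate rows slipped through
```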

Comments (3)

陌上芳菲 2024-10-28 22:41:53


I'm not terribly familiar with Facebook Connect, but is it possible to get two of the same uid if two requests are posted in very quick succession, before either request has completed? (Otherwise known as a race condition.) validates_uniqueness_of can still suffer from this sort of race condition; details can be found here:

http://apidock.com/rails/ActiveModel/Validations/ClassMethods/validates_uniqueness_of

Because this check is performed outside the database there is still a chance that duplicate values will be inserted in two parallel transactions. To guarantee against this you should create a unique index on the field. See add_index for more information.

You can really make sure this will never happen by adding a database constraint. Add this to a database migration and then run it:

add_index :users, :fb_uid, :unique => true

Now a user would get an error instead of being able to complete the request, which is usually preferable to generating illegal data in your database which you have to debug and clean out manually.
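In a Rails 3-era app that could look like the following sketch. The migration class name is illustrative, and the rescue clause assumes the unique index is in place so that Postgres raises ActiveRecord::RecordNotUnique on the losing insert:

```ruby
# Migration: enforce uniqueness at the database level, not just in the model.
class AddUniqueIndexToUsersFbUid < ActiveRecord::Migration
  def self.up
    add_index :users, :fb_uid, :unique => true
  end

  def self.down
    remove_index :users, :fb_uid
  end
end

# In the model: if a parallel request wins the race, fall back to the row
# it created instead of surfacing the constraint violation to the user.
def self.create_from_cookie(fb_cookie, remote_ip = nil)
  return nil unless fb_cookie
  return nil unless fb_hash = authenticate_cookie(fb_cookie)
  uid = fb_hash["uid"].join.to_i

  fb_user = FacebookUser.new
  fb_user.fb_uid = uid
  # ... populate from the FB API as before ...
  fb_user.save!
  fb_user
rescue ActiveRecord::RecordNotUnique
  # A concurrent request inserted this fb_uid first; return that row.
  find_by_fb_uid(uid)
end
```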

爱冒险 2024-10-28 22:41:53


From Ruby on Rails v3.0.5 Module ActiveRecord::Validations::ClassMethods
http://s831.us/dK6mFQ

Concurrency and integrity

Using this [validates_uniqueness_of] validation method in conjunction with ActiveRecord::Base#save does not guarantee the absence of duplicate record insertions, because uniqueness checks on the application level are inherently prone to race conditions. For example, suppose that two users try to post a Comment at the same time, and a Comment's title must be unique. At the database-level, the actions performed by these users could be interleaved in the following manner: ...

晨与橙与城 2024-10-28 22:41:53


It seems like there is some sort of race condition in your code. To check this, I would first change the code so that the Facebook values are extracted first, and only then create the new user object.

Then I would highly suggest writing a test to check whether your function is executed only once. It looks like it's being executed twice.

Beyond that, there seems to be a race condition while waiting for the Facebook results.
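A minimal way to check the "executed twice" theory is an invocation counter around the sign-in path. This is a plain-Ruby sketch (`Signup` is a stand-in for the real model, not code from the question):

```ruby
# Count how many times the sign-in entry point runs; if one browser
# submission increments it twice, the duplication is in the app's own
# call path rather than in two genuinely separate requests.
class Signup
  @calls = 0
  class << self
    attr_reader :calls

    def create_from_cookie(cookie)
      @calls += 1              # instrumentation
      "user-for-#{cookie}"     # stand-in for the created user
    end
  end
end

Signup.create_from_cookie("fb_cookie")
puts Signup.calls              # => 1 when the path runs exactly once
```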
