Many useless queries made by DataMapper
I have a problem with DataMapper (I'm using it with Sinatra).
I have a very basic app with 3 models.
Here's the code.
class Level
  include DataMapper::Resource

  property :id, Serial
  property :name, String, :required => true, :unique => true, :lazy => true
  property :description, Text, :lazy => true

  timestamps :at
end

class Player
  include DataMapper::Resource

  property :id, Serial
  property :name, String, :required => true, :lazy => true

  timestamps :at

  belongs_to :game
end

class Game
  include DataMapper::Resource

  property :id, Serial

  has n, :players
  belongs_to :level
  belongs_to :current_player, 'Player', :required => false
end
Here's a basic route:
get '/' do
  DataMapper::logger.debug 'Creating level'
  level = Level.create(:name => "One")

  DataMapper::logger.debug 'Creating game'
  game = Game.create(:level => level)

  DataMapper::logger.debug 'Adding players'
  alice = Player.create(:name => 'Alice', :game => game)
  bob = Player.create(:name => 'Bob', :game => game)

  DataMapper::logger.debug 'Setting game current player'
  game.current_player = alice
  game.save

  'ok'
end
My problem is that when I look at the DataMapper log file, I find it has made many useless queries and I don't understand why!
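For reference, the SQL statements below come from DataMapper's logger, enabled with something along these lines (the exact setup and adapter URI aren't shown here, so treat this as an assumption):

require 'data_mapper'

# Log every statement DataMapper issues, at debug level
DataMapper::Logger.new($stdout, :debug)

# Adapter/URI is an assumption; any supported database would do
DataMapper.setup(:default, 'sqlite::memory:')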
Here's the log output:
~ Creating level
~ (0.000062) SELECT "id" FROM "levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.002241) INSERT INTO "levels" ("name", "created_at", "updated_at") VALUES ('One', '2012-01-15T18:15:28+01:00', '2012-01-15T18:15:28+01:00')
~ Creating game
~ (0.000048) SELECT "id" FROM "levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.001747) INSERT INTO "games" ("level_id") VALUES (1)
~ Adding players
~ (0.000050) SELECT "id" FROM "levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.003762) INSERT INTO "players" ("name", "created_at", "updated_at", "game_id") VALUES ('Alice', '2012-01-15T18:15:28+01:00', '2012-01-15T18:15:28+01:00', 1)
~ (0.000085) SELECT "id" FROM "levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.001820) INSERT INTO "players" ("name", "created_at", "updated_at", "game_id") VALUES ('Bob', '2012-01-15T18:15:28+01:00', '2012-01-15T18:15:28+01:00', 1)
~ Setting game current player
~ (0.000078) SELECT "id" FROM "levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.001826) UPDATE "games" SET "current_player_id" = 1 WHERE "id" = 1
As you can see, there are a lot of queries against the Level model. I really don't understand why DataMapper is doing these.
Thanks a lot in advance for your help.
PS: You may think this is not a big deal, but I actually simplified the model structure before posting here. The actual models are more complex and are full of those useless queries.
Here's a short excerpt from my real DataMapper log file; it happens when I save an instance of my Game model.
~ (0.001640) UPDATE "asd_games" SET "updated_at" = '2012-01-15T17:51:27+01:00', "current_player_id" = 3, "current_action_id" = 3 WHERE "id" = 1
~ (0.000079) SELECT "id", "body" FROM "asd_actions" WHERE "id" = 3 ORDER BY "id"
~ (0.000083) SELECT "id", "name", "description" FROM "asd_levels" WHERE "id" = 1 ORDER BY "id"
~ (0.000057) SELECT "id" FROM "asd_levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.000075) SELECT "id" FROM "asd_levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.000083) SELECT "id" FROM "asd_levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.000082) SELECT "id" FROM "asd_levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
~ (0.000084) SELECT "id" FROM "asd_levels" WHERE "name" = 'One' ORDER BY "id" LIMIT 1
2 Answers
The extra SELECTs are being made to check the :unique => true constraint on the Level class. This check seems to be made on every database call.
One way to avoid this would be, instead of using create when creating your model objects (which immediately saves the model to the database), to use new and then save the whole object graph with a single call to save on a suitable object when they're all ready (see the docs on creating and saving models).
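A minimal sketch of what that could look like for the route above (an illustration of the approach, not the answer's original snippet; it assumes that saving the game also persists the new level and the players built through game.players):

get '/' do
  # Build the whole object graph in memory with new instead of create
  level = Level.new(:name => "One")
  game  = Game.new(:level => level)
  alice = game.players.new(:name => 'Alice')
  bob   = game.players.new(:name => 'Bob')

  # One save persists the level, the game and both players,
  # so the uniqueness validation on Level#name runs only once
  game.save

  # current_player needs Alice's id, so it is set after the first save
  game.current_player = alice
  game.save

  'ok'
end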
So the models are not immediately persisted, but are all done together, and the uniqueness check is only done once.
Another possibility would be to set :auto_validation => false on the :name property. With that change (still using create) there are still multiple calls to the database, but the check isn't made on each call; in fact it doesn't look like it's being made at all, which rather defeats the object of using :unique => true in the first place.
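For reference, a sketch of the property declaration with auto-validation turned off (assuming dm-validations is in use; adapt to your actual Level model):

class Level
  include DataMapper::Resource

  property :id, Serial
  # :auto_validation => false stops dm-validations from generating the
  # automatic presence/uniqueness checks for this property, so no extra
  # SELECT is issued on every save
  property :name, String, :required => true, :unique => true,
                          :lazy => true, :auto_validation => false
  property :description, Text, :lazy => true

  timestamps :at
end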
I ran into the same issue, where I have a User and a Job, and the Job belongs_to the User.
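The models look roughly like this (a minimal sketch with assumed property names, not the exact snippets):

class User
  include DataMapper::Resource

  property :id,   Serial
  property :name, String, :required => true

  has n, :jobs
end

class Job
  include DataMapper::Resource

  property :id,    Serial
  property :title, String

  belongs_to :user
end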
Every time I save the Job, two queries are run: an extra SELECT in addition to the expected UPDATE.
Because I am confident that my update contains no bad data (nothing is tainted by user input), I was able to stop the SELECT from running by using the bang (!) version of save.
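In code, that just means calling save! instead of save (a sketch; job here is any already-persisted Job instance, and the lookup is hypothetical):

job = Job.get(1)            # hypothetical lookup by id
job.title = 'Updated title'

# save! skips validations and hooks, so the extra SELECT backing the
# validation check is never issued; only the UPDATE is run
job.save!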
http://datamapper.org/docs/create_and_destroy.html describes what actually happens when you use the bang method instead of the non-bang method, so you will want to review that and see if it's safe for your use case.