Query works fine in phpMyAdmin, but not from the application (C#)
There are a few similar questions on SO and elsewhere, but they are mostly about PHP, which I don't understand. I'm trying to restore a database with 62 tables like this:
string query = @"SET SQL_MODE= 'NO_AUTO_VALUE_ON_ZERO'; CREATE DATABASE " + dbName + " DEFAULT CHARACTER SET utf8 COLLATE utf8_general_ci; USE " + dbName + ";" + Environment.NewLine;
using (StreamReader reader = File.OpenText(@"C:\b.sql"))
{
string line = reader.ReadToEnd();
query += line; // almost 1,700 lines.
}
// up to this point the query builds correctly, and it runs fine in phpMyAdmin.
MySqlCommand c = new MySqlCommand(query, conn);
c.ExecuteReader();
// but when executed here, it throws: "Fatal error encountered during command execution."
Why is this so? If it's because of the length of the query, then how can I execute such large queries from the application?
2 Answers
Try this to check the error:

Edit

For better performance, you can use this class. I haven't tried it myself; hopefully it works well:
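The code snippets from this answer did not survive on this page. A plausible reconstruction, assuming the "class" referred to is `MySqlScript` from the MySql.Data (Connector/NET) library, which is designed to run multi-statement scripts: wrap the execution in a try/catch so the inner `MySqlException` message (more specific than "Fatal error encountered during command execution") is visible.

```csharp
// Sketch (untested): surface the real error and let MySqlScript handle
// the multi-statement dump instead of a single MySqlCommand.
// Assumes `conn` is an open MySqlConnection and `query` holds the full script.
try
{
    MySqlScript script = new MySqlScript(conn, query);
    int executed = script.Execute(); // number of statements executed
    Console.WriteLine("Executed {0} statements.", executed);
}
catch (MySqlException ex)
{
    // The exception message here usually names the failing statement
    // or server limit, unlike the generic "Fatal error" wrapper.
    Console.WriteLine(ex.Message);
}
```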
I'm not sure whether there is a limit to the length of a query string for MySQL. My first thought would be to break the giant 1,700-line set of queries into individual queries.
This way you could create the database and then run each query individually. Without knowing your data, this may help you pinpoint any issues. If you surround each `ExecuteNonQuery` with a try/catch block, you can catch any queries that fail and write them to a log file to look at once the batch is done.
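The approach above can be sketched as follows. Note that MySQL's `max_allowed_packet` setting does cap the size of a single command sent to the server, which is one reason splitting a large dump helps. This is a naive illustration: splitting on a trailing semicolon breaks if string literals in the dump contain `";\n"`, and the file path and log path are placeholders.

```csharp
// Sketch: split the dump into individual statements, run each one with
// ExecuteNonQuery, and log failures instead of aborting the whole batch.
// Assumes `conn` is an open MySqlConnection with the target database selected.
string[] statements = File.ReadAllText(@"C:\b.sql")
    .Split(new[] { ";\r\n", ";\n" }, StringSplitOptions.RemoveEmptyEntries);

using (StreamWriter log = new StreamWriter(@"C:\restore_errors.log"))
{
    foreach (string statement in statements)
    {
        if (string.IsNullOrWhiteSpace(statement)) continue;
        try
        {
            using (MySqlCommand cmd = new MySqlCommand(statement, conn))
            {
                cmd.ExecuteNonQuery(); // DDL/INSERT: no result set expected
            }
        }
        catch (MySqlException ex)
        {
            // Record the failing statement and the server's error message,
            // then continue with the rest of the batch.
            log.WriteLine("FAILED: {0}", statement);
            log.WriteLine("  -> {0}", ex.Message);
        }
    }
}
```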