Export table data to a text file + C# + SQL Server
I am trying to export SQL table data to a text file with a '~' delimiter in C# code.
When the data is small it's fine. When it's huge, it throws an Out of memory exception.
My Code:
public static void DataTableToTextFile(DataTable dtToText, string filePath)
{
    int i = 0;
    StreamWriter sw = null;
    try
    {
        sw = new StreamWriter(filePath, false);

        /* For column names */
        for (i = 0; i < dtToText.Columns.Count - 1; i++)
        {
            sw.Write(dtToText.Columns[i].ColumnName + '~');
        }
        sw.Write(dtToText.Columns[i].ColumnName + '~');
        sw.WriteLine();

        /* For data in the rows */
        foreach (DataRow row in dtToText.Rows)
        {
            object[] array = row.ItemArray;
            for (i = 0; i < array.Length - 1; i++)
            {
                sw.Write(array[i].ToString() + '~');
            }
            sw.Write(array[i].ToString() + '~');
            sw.WriteLine();
        }

        sw.Close();
    }
    catch (Exception ex)
    {
        throw new Exception("");
    }
}
Is there a better way to do this in a stored procedure or BCP command?
2 Answers
If there's no specific reason for using the ~ delimiter format, you might try using the DataTable WriteXml function (http://msdn.microsoft.com/en-us/library/system.data.datatable.writexml.aspx)
For example:
dtToText.WriteXml(@"c:\data.xml");
If you need to convert this text back to a DataTable later you can use ReadXml (http://msdn.microsoft.com/en-us/library/system.data.datatable.readxml.aspx)
If you really need to make the existing code work, I'd probably try closing and calling Dispose on the StreamWriter at a set interval, then reopen and append to the existing text.
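For example, a rough sketch of that approach (the helper name and the 10,000-row interval are illustrative, and only the row-writing loop is shown; the header handling would stay as in the question):

using System.Data;
using System.IO;

public static class DelimitedExport
{
    // Sketch: dispose the StreamWriter every N rows, then reopen it in
    // append mode so the output written so far is preserved.
    public static void WriteRowsWithPeriodicReopen(DataTable dtToText, string filePath)
    {
        const int reopenInterval = 10000; // arbitrary interval
        int rowCount = 0;
        StreamWriter sw = new StreamWriter(filePath, false);
        try
        {
            foreach (DataRow row in dtToText.Rows)
            {
                sw.WriteLine(string.Join("~", row.ItemArray));
                if (++rowCount % reopenInterval == 0)
                {
                    sw.Dispose();                          // close and release the current writer
                    sw = new StreamWriter(filePath, true); // reopen, appending to the existing file
                }
            }
        }
        finally
        {
            sw.Dispose();
        }
    }
}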
I realize that this question is years old, but I recently experienced a similar problem. The solution: Briefly, I think that you're running into problems with the Windows Large Object Heap. A relevant link:
https://www.simple-talk.com/dotnet/.net-framework/the-dangers-of-the-large-object-heap/
To summarize the above article: when you allocate chunks of memory more than 85K long (which seems likely to happen behind the scenes in your StreamWriter object if the values in your DataTable are large enough), they go onto a separate heap, the Large Object Heap (LOH). Memory chunks in the LOH are deallocated normally when their lifetime expires, but the heap is not compacted. The net result is that a System.OutOfMemoryException is thrown, not because there isn't actually enough memory, but because there isn't enough contiguous memory in the heap at some point.
If you're using .NET Framework 4.5.1 or later (which won't work on Visual Studio 2010 or before; it might work on VS2012), you can use this command:
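// Presumably the command meant here: set the LOH compaction mode to CompactOnce
System.Runtime.GCSettings.LargeObjectHeapCompactionMode = System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;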
This command forces LOH compaction to happen at the next garbage collection. Just put that command as the first line in your function; it will be set to CompactOnce every time this function is called, which will cause LOH compaction at some indeterminate point after the function is called.
If you don't have .NET 4.5.1, it gets uglier. The problem is that the memory allocation isn't explicit; it's happening behind the scenes in your StreamWriter, most likely. Try calling GC.Collect(), forcing garbage collection, from time to time--perhaps every 3rd time this function is called.
A warning: lots of people will advise you that calling GC.Collect() directly is a bad idea and will slow down your application--and they're right. I just don't know a better way to handle this problem.
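A minimal sketch of that workaround, assuming a static call counter is added next to the export method (the counter field and the interval of 3 are illustrative):

private static int exportCallCount = 0;

public static void DataTableToTextFile(DataTable dtToText, string filePath)
{
    // Force a full garbage collection every 3rd call, per the suggestion above.
    exportCallCount++;
    if (exportCallCount % 3 == 0)
    {
        GC.Collect();
    }

    // ... existing export code from the question ...
}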