Efficient way to create a CSV string from a List/SortedList in C#?

Posted 2024-10-07 14:26:17

I have an application that implements asynchronous SOAP. Every 50-100 ms I receive data that is converted into a SortedList<double,double> object. I also have a predefined IList<double> which contains all the possible keys in that SortedList.

I need to iterate through the IList and check whether the SortedList contains each key. If it does, I write that value to the CSV string; if not, I write 0.0 to the CSV string.

Note: The IList has 400 keys. The SortedList will generally be much smaller than 400, around 100 at most.

        string MyText = timestamp.ToString("HH:mm:ss");
        for (int i = 0; i < AllKeys.Count; i++)
        {
            double info;
            if (MySortedList.TryGetValue(AllKeys[i], out info))
            {
                MyText += "," + info;
            }
            else
            {
                MyText += ",0.0";
            }
        }
        MyText += "\n";

        File.AppendAllText(filePath, MyText);

I am currently using the above code to create the CSV string before writing it to my file. However, I am finding that this code causes my application to lag.

I need help improving the efficiency so that storing the incoming data takes less than 50 ms. Some additional things:

  • I do not have to write to a csv file; I just need to store the data fast. (I can convert from a serialized file to my csv file later; a rough sketch of this follows the list.)
  • I have considered using LINQ, but I am not familiar with the queries and don't know how much more efficient it will be
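
As a rough sketch of the first bullet (my illustration only: the BinaryLog class, the data.bin path, and the fixed record layout of timestamp ticks followed by one double per expected key are assumptions, not part of the question), the raw values could be dumped in binary and converted to CSV offline:

using System;
using System.Collections.Generic;
using System.IO;

static class BinaryLog
{
    // Opened once; every sample appends one fixed-size record.
    static readonly BinaryWriter Writer =
        new BinaryWriter(File.Open("data.bin", FileMode.Append, FileAccess.Write));

    public static void LogSample(DateTime timestamp,
                                 IList<double> allKeys,
                                 SortedList<double, double> sample)
    {
        Writer.Write(timestamp.Ticks);           // 8 bytes for the timestamp
        for (int i = 0; i < allKeys.Count; i++)  // 400 * 8 bytes of values
        {
            double value;
            Writer.Write(sample.TryGetValue(allKeys[i], out value) ? value : 0.0);
        }
    }

    public static void Close()
    {
        Writer.Flush();
        Writer.Close();
    }
}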

Edit: I have solved my performance issue by using Conrad's suggestion of creating a StreamWriter object. I simply created a static StreamWriter object, wrote all my text to it, and closed the StreamWriter when communication terminated.
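
A minimal sketch of what that fix might look like (the CsvLogger wrapper, its method names, and the assumption of a single writer thread are mine, not from the original post):

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

static class CsvLogger
{
    // Opened once per session and reused for every row, instead of
    // re-opening the file with File.AppendAllText on every message.
    static StreamWriter writer;

    public static void Start(string filePath)
    {
        writer = new StreamWriter(filePath, append: true);
    }

    public static void WriteRow(DateTime timestamp,
                                IList<double> allKeys,
                                SortedList<double, double> values)
    {
        var row = new StringBuilder(timestamp.ToString("HH:mm:ss"));
        for (int i = 0; i < allKeys.Count; i++)
        {
            double info;
            row.Append(',').Append(values.TryGetValue(allKeys[i], out info) ? info : 0.0);
        }
        writer.WriteLine(row.ToString());
    }

    // Call when communication terminates; flushes buffered text and
    // releases the file handle.
    public static void Stop()
    {
        writer.Close();
    }
}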

3 Answers

谈场末日恋爱 2024-10-14 14:26:17

Here are some thoughts.

1) Use a StreamWriter to write instead of File. This will be faster than the two-step process of writing to memory and then to a file.

2) If at all possible, parallelize the work. For example, you could use one thread to process the incoming message and another thread to write it out (see the sketch after this list).

3) I don't think the intent of LINQ is to improve performance, but rather to make manipulating data easier.
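
A rough sketch of point 2, assuming .NET 4 or later; the CsvWriterQueue name and its shape are illustrative, not something from the original answer. The receive thread only enqueues a formatted line, and a single background task owns the StreamWriter:

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

public sealed class CsvWriterQueue : IDisposable
{
    private readonly BlockingCollection<string> lines = new BlockingCollection<string>();
    private readonly Task writerTask;

    public CsvWriterQueue(string filePath)
    {
        writerTask = Task.Factory.StartNew(() =>
        {
            using (var writer = new StreamWriter(filePath, append: true))
            {
                // Blocks until a line arrives; ends after CompleteAdding().
                foreach (string line in lines.GetConsumingEnumerable())
                    writer.WriteLine(line);
            }
        }, TaskCreationOptions.LongRunning);
    }

    // Called from the receive thread; returns almost immediately.
    public void Enqueue(string csvLine)
    {
        lines.Add(csvLine);
    }

    public void Dispose()
    {
        lines.CompleteAdding();  // signal that no more lines will be added
        writerTask.Wait();       // drain, flush, and close the file
    }
}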

雅心素梦 2024-10-14 14:26:17

I'm sure I haven't come up with the most efficient algorithm, but here is a starting point, at least. If nothing else, you will notice the use of StringBuilder rather than concatenating strings. This alone is likely to garner you some performance benefit.

This algorithm assumes that the SortedList keys (data in the code below) and the AllKeys list are both ordered the same way (low-to-high).

var textBuilder = new StringBuilder(timestamp.ToString("HH:mm:ss"));

var index = 0;
foreach (double key in data.Keys)
{
    // Emit 0.0 for every expected key that precedes this received key.
    while (AllKeys[index] < key)
    {
        textBuilder.Append(",0.0");
        index++;
    }

    textBuilder.Append(",").Append(data[key]);
    index++;
}

// Pad out any expected keys that come after the last received key.
while (index < AllKeys.Count)
{
    textBuilder.Append(",0.0");
    index++;
}

MyText = textBuilder.Append("\n").ToString();

Just looking at the above, I'm sure there is a bug, but not sure what or where without spending more time and/or testing.

A possible LINQ solution is more declarative:

var textBuilder = new StringBuilder(timestamp.ToString("HH:mm:ss"));

var values = AllKeys.Select(
    key => data.ContainsKey(key) ? data[key].ToString() : "0.0")
    .ToArray();

var joined = String.Join(",", values);
var MyText = textBuilder.Append(",").Append(joined).Append("\n").ToString();

More can be included in the LINQ expression using the Aggregate extension method, but you'd have to use string concatenation in the accumulator, so I haven't shown that here.

过度放纵 2024-10-14 14:26:17

I agree with the answer from Conrad - yet another idea for improving performance would be to do a reverse lookup, i.e. take each element from the SortedList and look it up in the other list (of course, I would recommend a dictionary instead of a list for faster lookups).
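
One way to read that suggestion, as a rough sketch that reuses the question's AllKeys, MySortedList, and timestamp variables (the keyIndex map is my addition): build a key-to-column dictionary once, then per message iterate only the ~100 received entries to fill a pre-zeroed row.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;

// Built once, up front: maps each expected key to its CSV column position.
// Dictionary lookups are O(1), versus a binary or linear search per key.
Dictionary<double, int> keyIndex =
    AllKeys.Select((k, i) => new { k, i }).ToDictionary(x => x.k, x => x.i);

// Per message: walk the ~100 received pairs instead of probing all 400 keys.
double[] row = new double[AllKeys.Count];   // every slot defaults to 0.0
foreach (KeyValuePair<double, double> kvp in MySortedList)
{
    row[keyIndex[kvp.Key]] = kvp.Value;
}

var line = new StringBuilder(timestamp.ToString("HH:mm:ss"));
foreach (double v in row)
{
    line.Append(',').Append(v);
}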
