Uploading large files to WSS v3

Posted 2024-08-05 20:40:06


I built a WSSv3 application that uploads files in small chunks; as each piece of data arrives, I temporarily keep it in a SQL Server 2005 image data type field, for performance reasons**.

The problem comes when the upload ends: I need to move the data from my SQL Server into a SharePoint document library through the WSSv3 object model.

Right now, I can think of two approaches:

SPFileCollection.Add(string, (byte[])reader[0]); // reads the whole BLOB into memory: OutOfMemoryException

and

SPFile file = folder.Files.Add("filename", new byte[] { });
using (Stream stream = file.OpenBinaryStream())
{
    // ... init vars and stuff ...
    long offset = 0;
    long bytes;
    while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0)
    {
        stream.Write(buffer, 0, (int)bytes); // Timeout issues
        offset += bytes;                     // advance through the BLOB
    }
    file.SaveBinary(stream);
}
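
A note on the second approach: for reader.GetBytes to stream the image column instead of buffering the entire row, the SqlDataReader generally has to be opened with CommandBehavior.SequentialAccess. A minimal sketch of that setup, where the table and column names ("UploadChunks", "Content") are assumptions:

using System.Data;
using System.Data.SqlClient;

// Sketch only: open the reader so GetBytes streams the BLOB rather than
// materializing the whole row. Table/column names are assumed.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT Content FROM UploadChunks WHERE UploadId = @id", conn))
{
    cmd.Parameters.AddWithValue("@id", uploadId);
    conn.Open();
    using (SqlDataReader reader =
        cmd.ExecuteReader(CommandBehavior.SequentialAccess))
    {
        if (reader.Read())
        {
            // reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE) can now be
            // called in a loop without pulling the full BLOB into memory.
        }
    }
}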

Is there any other way to complete this task successfully?

** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice performance degrading as the file grows (>100 MB).


Comments (3)

神爱温柔 2024-08-12 20:40:06


I ended up with the following code:


myFolder.Files.Add("filename", 
   new DataRecordStream(dataReader, 
      dataReader.GetOrdinal("Content"), length));

You can find the DataRecordStream implementation here. It's basically a Stream that reads its data from a DbDataRecord through GetBytes.

This approach is similar to OpenBinaryStream()/SaveBinary(stream), but it doesn't keep the whole byte[] in memory while the data is transferred. At some point, the DataRecordStream will be read by Microsoft.SharePoint.SPFile.CloneStreamToSPFileStream in 64 KB chunks.
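
The linked implementation may no longer be reachable; the following is a minimal sketch of what such a DataRecordStream could look like, assuming a read-only, forward-only wrapper around IDataRecord.GetBytes (the member layout is an assumption, not the code the link pointed to):

using System;
using System.Data;
using System.IO;

// Sketch: a read-only, forward-only Stream over the BLOB column of an
// IDataRecord. Reconstructed under assumptions; not the original code.
public class DataRecordStream : Stream
{
    private readonly IDataRecord record;
    private readonly int ordinal;  // ordinal of the BLOB column
    private readonly long length;  // total BLOB length in bytes
    private long position;         // current read offset within the BLOB

    public DataRecordStream(IDataRecord record, int ordinal, long length)
    {
        this.record = record;
        this.ordinal = ordinal;
        this.length = length;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return length; } }

    public override long Position
    {
        get { return position; }
        set { throw new NotSupportedException(); }
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // GetBytes copies up to 'count' bytes of the BLOB, starting at
        // 'position', into 'buffer' at 'offset', returning the bytes copied.
        long read = record.GetBytes(ordinal, position, buffer, offset, count);
        position += read;
        return (int)read;
    }

    public override void Flush() { }

    public override long Seek(long offset, SeekOrigin origin)
    { throw new NotSupportedException(); }

    public override void SetLength(long value)
    { throw new NotSupportedException(); }

    public override void Write(byte[] buffer, int offset, int count)
    { throw new NotSupportedException(); }
}

Since both DbDataRecord and SqlDataReader implement IDataRecord, a constructor like this would accept the dataReader from the snippet above.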

Thank you all for valuable infos!

只是我以为 2024-08-12 20:40:06


The first thing I would say is that SharePoint is really, really not designed for this. It stores all files in its own database so that's where these large files are going. This is not a good idea for lots of reasons: scalability, cost, backup/restore, performance, etc... So I strongly recommend using file shares instead.

You can increase the timeout of the web request by changing the executionTimeout attribute of the httpRuntime element in web.config.
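
For example (an illustrative value only; executionTimeout is expressed in seconds, and ASP.NET only enforces it when compilation debug mode is off):

<!-- web.config: raise the ASP.NET request execution timeout.
     3600 seconds is an illustrative figure, not a recommendation. -->
<configuration>
  <system.web>
    <httpRuntime executionTimeout="3600" />
  </system.web>
</configuration>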

Apart from that, I'm not sure what else to suggest. I haven't heard of such large files being stored in SharePoint. If you absolutely must do this, try also asking on Server Fault.

陈独秀 2024-08-12 20:40:06


As mentioned previously, storing large files in Sharepoint is generally a bad idea. See this article for more information: http://blogs.msdn.com/joelo/archive/2007/11/08/what-not-to-store-in-sharepoint.aspx

With that said, it is possible to use external storage for BLOBs, which may or may not help your performance issues -- Microsoft released a half-complete external BLOB storage provider that does the trick, but it unfortunately works at the farm level and affects all uploads. Ick.

Fortunately, since you can implement your own external BLOB provider, you may be able to write something to better handle these specific files. See this article for details: http://207.46.16.252/en-us/magazine/2009.06.insidesharepoint.aspx

Whether or not this would be worth the overhead depends on how much of a problem you're having. :)
