How to stream large binary data to SQL Server

Posted 2024-10-09 22:49:16

I have a web service written in ASP.NET MVC with a SQL Server backend. Clients can send large amounts of data in calls to the web service, on the order of tens of megabytes each.

On the server side I want to read the HTTP request stream and write it to a VARBINARY field in a SQL table. How can I do this without reading the entire stream into an in-memory array?

The reason I am concerned about memory is that this system needs to scale to thousands of concurrent clients (at least).

I'm currently using Linq to SQL and it doesn't look like it supports streaming of data, because the Linq Binary type can only be initialized using a byte array.

Is there some other way to do this using Linq to SQL?

If this is not possible using Linq to SQL, how can I do this using ADO.NET or some other approach?

2 Answers

去了角落 2024-10-16 22:49:16

You can stream the data to disk, and then use OPENROWSET to perform a bulk import of the data. For example:

INSERT INTO YOURTABLE(binaryColumnName)
SELECT * FROM
OPENROWSET(BULK N'C:\data.bin', SINGLE_BLOB) AS blob

You can use temp file APIs to avoid having to manage the lifetime of the data on disk ... it will be deleted once no longer in use.
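A minimal C# sketch of that approach, copying the request body to a temp file in small chunks and then running the OPENROWSET import (the table and column names follow the snippet above; the connection string and class name are placeholders). One caveat: OPENROWSET(BULK ...) resolves the path on the SQL Server machine, so this assumes the web and database servers share a file system (same box or a UNC path).

using System.Data.SqlClient;
using System.IO;
using System.Web;

public static class BulkBlobImport
{
    // Copies the HTTP request body to a temp file in small chunks, then
    // bulk-imports it with OPENROWSET. Never holds the whole upload in memory.
    public static void SaveRequestBody(HttpRequest request, string connectionString)
    {
        string tempPath = Path.GetTempFileName();
        try
        {
            using (var file = new FileStream(tempPath, FileMode.Create, FileAccess.Write))
            {
                byte[] buffer = new byte[81920]; // 80 KB chunks; the size is arbitrary
                int read;
                while ((read = request.InputStream.Read(buffer, 0, buffer.Length)) > 0)
                    file.Write(buffer, 0, read);
            }

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = conn.CreateCommand())
            {
                // OPENROWSET(BULK ...) does not accept the file name as a
                // parameter; tempPath is server-generated, not client input.
                cmd.CommandText =
                    "INSERT INTO YOURTABLE(binaryColumnName) " +
                    "SELECT * FROM OPENROWSET(BULK N'" + tempPath + "', SINGLE_BLOB) AS blob";
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
        finally
        {
            File.Delete(tempPath); // the "temp file" cleanup the answer mentions
        }
    }
}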

日久见人心 2024-10-16 22:49:16

I believe you can use the .WRITE clause of the UPDATE statement to do chunked data insertion. If you're using LINQ, you'd create two commands from the data context object: the first to create (INSERT) the row that will contain the data, and the second an UPDATE statement that uses the .WRITE clause. Parameterize the data and offset values, then loop, executing the UPDATE statement as many times as you have "chunks".

INSERT INTO XrayImages (HeaderId, ImageBytes) VALUES(@headerId, @imageValue)

UPDATE XrayImages SET ImageBytes.WRITE(@imageChunk, NULL, @chunkLength) WHERE ImageId = @imageId;
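A rough ADO.NET sketch of that two-command pattern, reading the source stream chunk by chunk. The table, column, and parameter names follow the answer's SQL; the assumption that ImageId is an IDENTITY column and the 80 KB chunk size are mine. Passing NULL as the .WRITE offset appends each chunk to the end of the existing value.

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class ChunkedBlobWriter
{
    public static void Write(Stream source, int headerId, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1) Create the row with a zero-length (non-NULL) value; .WRITE
            //    cannot be applied to a NULL column.
            int imageId;
            using (var insert = new SqlCommand(
                "INSERT INTO XrayImages (HeaderId, ImageBytes) VALUES (@headerId, 0x); " +
                "SELECT CAST(SCOPE_IDENTITY() AS INT);", conn))
            {
                insert.Parameters.AddWithValue("@headerId", headerId);
                imageId = (int)insert.ExecuteScalar();
            }

            // 2) Append the stream to the row, one chunk per UPDATE.
            using (var update = new SqlCommand(
                "UPDATE XrayImages SET ImageBytes.WRITE(@imageChunk, NULL, @chunkLength) " +
                "WHERE ImageId = @imageId;", conn))
            {
                var chunkParam = update.Parameters.Add("@imageChunk", SqlDbType.VarBinary, -1);
                var lengthParam = update.Parameters.Add("@chunkLength", SqlDbType.Int);
                update.Parameters.AddWithValue("@imageId", imageId);

                byte[] buffer = new byte[81920];
                int read;
                while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // Only send the bytes actually read in this chunk.
                    byte[] chunk = new byte[read];
                    Array.Copy(buffer, chunk, read);
                    chunkParam.Value = chunk;
                    lengthParam.Value = read;
                    update.ExecuteNonQuery();
                }
            }
        }
    }
}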

For more information, please check out this MSDN Article: http://msdn.microsoft.com/en-us/library/bb399384.aspx


I'm pretty sure that by the time your Controller code is called, the entire request has already been received (and is in memory). I'm basing this on the fact that the Request.Files collection already knows how many files there are and their lengths; with multi-part forms, the only way I know to determine that is by reading the entire request. I don't know of any way to change this in MVC, so you may have to write a handler that does the actual file upload itself, along the lines of the sketch below.
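As a sketch of such a handler: on .NET 4.0 and later, HttpRequest.GetBufferlessInputStream() hands you the body as it arrives instead of after ASP.NET has buffered it all. The handler name and the wiring to the chunked writer above are illustrative.

using System.Web;

public class UploadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Read the upload as it arrives rather than after full buffering
        // (requires ASP.NET 4.0+).
        using (var body = context.Request.GetBufferlessInputStream())
        {
            // Hypothetical wiring to the chunked .WRITE sketch above.
            ChunkedBlobWriter.Write(body, 1, "your-connection-string");
        }
        context.Response.StatusCode = 201;
    }

    public bool IsReusable
    {
        get { return true; }
    }
}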

There are a lot of variables here that would need to be considered. Depending on how many of those thousands of concurrent clients are uploading at once, you'd need multiple gigabit NICs between the database server and the web server, and I/O capacity on the database server beyond anything I've experienced.

I would seriously consider saving the files to a file-system location and keeping the meta information in the database, especially if you can keep the files on the web server. Otherwise, I think that using SQL Server's FILESTREAM feature would be more scalable than a pure table solution.
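For reference, a minimal FILESTREAM sketch, assuming ImageBytes has been declared as a VARBINARY(MAX) FILESTREAM column and the row already exists with a non-NULL value (neither of which the plain-table examples above require). SQL Server hands back a file path and a transaction context, and SqlFileStream then moves the bytes through the file system rather than through T-SQL. Note that FILESTREAM file access requires Windows (integrated) authentication.

using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class FileStreamBlobWriter
{
    public static void Write(Stream source, int imageId, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // FILESTREAM access is only valid inside an explicit transaction.
            using (var tx = conn.BeginTransaction())
            {
                string path;
                byte[] txContext;
                using (var cmd = new SqlCommand(
                    "SELECT ImageBytes.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT() " +
                    "FROM XrayImages WHERE ImageId = @imageId;", conn, tx))
                {
                    cmd.Parameters.AddWithValue("@imageId", imageId);
                    using (var reader = cmd.ExecuteReader())
                    {
                        reader.Read();
                        path = reader.GetString(0);
                        txContext = (byte[])reader[1];
                    }
                }

                // Write straight through the NT file system; the upload never
                // passes through a T-SQL parameter.
                using (var dest = new SqlFileStream(path, txContext, FileAccess.Write))
                {
                    source.CopyTo(dest); // .NET 4.0+; chunk manually on 3.5
                }

                tx.Commit();
            }
        }
    }
}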

Hope that helps!
