Windows Azure: unable to upload a 34 MB file to a blob

Posted 2024-08-28 13:04:47


I was trying to upload a 34 MB file to a blob, but it gives me this error:

    XML Parsing Error: no element found
Location: http://127.0.0.1:83/Default.aspx
Line Number 1, Column 1:

What should I do, and how can I fix it?

I am able to upload small files of around 500 KB, but I have a 34 MB file that needs to go into my blob container.

I tried it with:

    protected void ButUpload_click(object sender, EventArgs e)
    {
        // store the uploaded file as a blob
        if (uplFileUpload.HasFile)
        {
            string name = uplFileUpload.FileName;

            // get a reference to the cloud blob container
            CloudBlobContainer blobContainer = cloudBlobClient.GetContainerReference("documents");

            // use the uploaded file's name as the blob name
            string UploadDocName = name;

            // get the blob reference and set the metadata properties
            CloudBlob blob = blobContainer.GetBlobReference(UploadDocName);
            blob.Metadata["FILETYPE"] = "text";
            blob.Properties.ContentType = uplFileUpload.PostedFile.ContentType;

            // upload the blob to storage
            blob.UploadFromStream(uplFileUpload.FileContent);
        }
    }

But I am not able to upload it. Can anyone tell me how to do that?


Comments (4)

糖果控 2024-09-04 13:04:47


Blobs larger than 64MB must be uploaded using block blobs. You break the file into blocks, upload all the blocks (associating each block with a unique string identifier), and at the very end you post the list of block IDs to the blob to commit the entire batch in one go.

Uploading in blocks is also recommended for large blobs less than 64MB in size. It is very easy for a hiccup in the network connection or routing through the internet to lose a frame or two in a very large upload, which will corrupt or invalidate the entire upload. Use smaller blocks to reduce your exposure to cosmic events.

More info in this discussion thread: http://social.msdn.microsoft.com/Forums/en-NZ/windowsazure/thread/f4575746-a695-40ff-9e49-ffe4c99b28c7
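The block-upload approach described above could be sketched roughly as follows against the same v1 StorageClient API the question uses. This is a hypothetical sketch, not the poster's code: the 1 MB block size, the zero-padded index scheme for block IDs, and the reuse of the question's `blobContainer`/`uplFileUpload` names are all assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

// Get a block blob reference instead of a plain CloudBlob
CloudBlockBlob blockBlob = blobContainer.GetBlockBlobReference(UploadDocName);

const int blockSize = 1 * 1024 * 1024;   // 1 MB per block (an assumption)
Stream input = uplFileUpload.FileContent;
var blockIds = new List<string>();
byte[] buffer = new byte[blockSize];
int read, index = 0;

while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
    // Block IDs must be base64 strings, all of equal length,
    // hence the zero-padded index before encoding
    string blockId = Convert.ToBase64String(
        Encoding.UTF8.GetBytes(index.ToString("d6")));

    using (var ms = new MemoryStream(buffer, 0, read))
    {
        blockBlob.PutBlock(blockId, ms, null);   // upload one block
    }

    blockIds.Add(blockId);
    index++;
}

// Commit the ordered list of block IDs as the blob's content in one go
blockBlob.PutBlockList(blockIds);
```

Because each `PutBlock` call is an independent request, a failed block can be retried on its own instead of restarting the whole 34 MB transfer.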

╭⌒浅淡时光〆 2024-09-04 13:04:47


I would start by dropping some logging into the project to try to track the problem down. It may not be happening where you think; there might also be a permissions error. Try adding some dummy data into the database, and if that still fails you have found a likely culprit.

But track it down yourself with some debugging, logging and code review. I bet you can get to the bottom of the problem sooner that way, and it will also help make your code more robust.

悲欢浪云 2024-09-04 13:04:47


You can use blobs here, but I think it's an issue with your web request size. You can change this setting in web.config by increasing the maxRequestLength attribute on the &lt;httpRuntime&gt; element. Also, if you are sending chunks of 500 KB, you are wasting bandwidth and hurting performance; send bigger chunks, such as 1-2 MB per chunk. See my Silverlight or HTML5 based upload control for chunked uploads: Pick Your Azure File Upload Control: Silverlight and TPL or HTML5 and AJAX
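The request-size limit mentioned above is likely the root cause: ASP.NET's default maxRequestLength is 4096 KB (4 MB), which explains why a 500 KB upload works while 34 MB fails before the page code even runs. A sketch of the web.config change, with the 50 MB limits chosen as an assumption:

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; 51200 KB = 50 MB -->
    <httpRuntime maxRequestLength="51200" executionTimeout="3600" />
  </system.web>
  <!-- On IIS 7+, the request filtering limit (in bytes) must also be raised -->
  <system.webServer>
    <security>
      <requestFiltering>
        <requestLimits maxAllowedContentLength="52428800" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```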

赴月观长安 2024-09-04 13:04:47


Use the Blob Transfer Utility to download and upload all your blob files.

It's a tool for handling thousands of (small/large) blob transfers in an effective way.

Binaries and source code, here: http://bit.ly/blobtransfer
