Add Cache-Control and Expires headers to Azure Storage blobs

I'm using Azure Storage to serve up static file blobs but I'd like to add a Cache-Control and Expires header to the files/blobs when served up to reduce bandwidth costs.

Applications like CloudXplorer and Cerebrata's Cloud Storage Studio give options to set metadata properties on containers and blobs but get upset when trying to add Cache-Control.

Anyone know if it's possible to set these headers for files?
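
For reference, the Cache-Control value can also be set at upload time instead of being patched in afterwards. Below is a minimal sketch, assuming the current v12 Azure.Storage.Blobs SDK; the connection string, container name, and file name are placeholders:

    // Minimal sketch (Azure.Storage.Blobs v12 assumed): set Cache-Control while uploading a blob.
    // The connection string, container, and file names are placeholders.
    using System.Threading.Tasks;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    class UploadWithCacheControl
    {
        static async Task Main()
        {
            var container = new BlobContainerClient(
                "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net",
                "static-files");
            await container.CreateIfNotExistsAsync();
            BlobClient blob = container.GetBlobClient("site.css");

            var options = new BlobUploadOptions
            {
                HttpHeaders = new BlobHttpHeaders
                {
                    ContentType = "text/css",
                    CacheControl = "public, max-age=2592000" // returned on every GET of this blob
                }
            };
            await blob.UploadAsync("site.css", options); // overload taking a local file path
        }
    }

The same BlobHttpHeaders type appears in the answers below when updating blobs that already exist.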

Comments (9)

递刀给你 2024-10-16 09:06:57

I had to run a batch job on about 600k blobs and found 2 things that really helped:

  1. Running the operation from a worker role in the same data center. The speed between Azure services is great as long as they are in the same affinity group. Plus there are no data transfer costs.
  2. Running the operation in parallel. The Task Parallel Library (TPL) in .net v4 makes this really easy. Here is the code to set the cache-control header for every blob in a container in parallel:

    // get the info for every blob in the container
    var blobInfos = cloudBlobContainer.ListBlobs(
        new BlobRequestOptions() { UseFlatBlobListing = true });
    Parallel.ForEach(blobInfos, (blobInfo) =>
    {
        // get the blob properties
        CloudBlob blob = cloudBlobContainer.GetBlobReference(blobInfo.Uri.ToString());
        blob.FetchAttributes();
    
        // set cache-control header if necessary
        if (blob.Properties.CacheControl != YOUR_CACHE_CONTROL_HEADER)
        {
            blob.Properties.CacheControl = YOUR_CACHE_CONTROL_HEADER;
            blob.SetProperties();
        }
    });
    
黄昏下泛黄的笔记 2024-10-16 09:06:57

Here's an updated version of Joel Fillmore's answer using .NET 5 and v12 of Azure.Storage.Blobs. (Aside: wouldn't it be nice if default header properties could be set on the parent container?)

Instead of creating a website and using a WorkerRole, you can use Azure "WebJobs": you can run any executable on demand on a website in the same datacenter where your storage account is located to set cache headers or any other header field.

  1. Create a throw-away, temporary website in the same datacenter as your storage account. Don't worry about affinity groups; create an empty ASP.NET site or any other simple site. The content is unimportant. I needed to use at least a B1 service plan, otherwise the WebJob aborted after 5 minutes.
  2. Create a console program using the code below which works with the updated Azure Storage APIs. Compile it for release, and then zip the executable and all required DLLs into a .zip file, or just publish it from VisualStudio and skip #3 below.
  3. Create a WebJob and upload the .zip file from step #2.
  4. Run the WebJob. Everything written to the console is available to view in the log file created and accessible from the WebJob control page.
  5. Delete the temporary website, or change it to a Free tier (under "Scale Up").

The code below runs a separate task for each container, and I'm getting up to 100K headers updated per minute (depending on time of day?). No egress charges.

using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

namespace AzureHeaders
{
    class Program
    {
        private static string connectionString = "DefaultEndpointsProtocol=https;AccountName=REPLACE_WITH_YOUR_CONNECTION_STRING";
        private static string newCacheControl = "public, max-age=7776001"; // 3 months
        private static string[] containersToProcess = { "container1", "container2" };

        static async Task Main(string[] args)
        {
            BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);

            var tasks = new List<Task>();
            foreach (var container in containersToProcess)
            {
                BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(container);
                tasks.Add(Task.Run(() => UpdateHeaders(containerClient, 1000)));  // I have no idea what segmentSize should be!
            }
            Task.WaitAll(tasks.ToArray());
        }

        private static async Task UpdateHeaders(BlobContainerClient blobContainerClient, int? segmentSize)
        {
            int processed = 0;
            int failed = 0;
            try
            {
                // Call the listing operation and return pages of the specified size.
                var resultSegment = blobContainerClient.GetBlobsAsync()
                    .AsPages(default, segmentSize);

                // Enumerate the blobs returned for each page.
                await foreach (Azure.Page<BlobItem> blobPage in resultSegment)
                {
                    var tasks = new List<Task>();

                    foreach (BlobItem blobItem in blobPage.Values)
                    {
                        BlobClient blobClient = blobContainerClient.GetBlobClient(blobItem.Name);
                        tasks.Add(UpdateOneBlob(blobClient));
                        processed++;
                    }
                    Task.WaitAll(tasks.ToArray());
                    Console.WriteLine($"Container {blobContainerClient.Name} processed: {processed}");
                }
            }
            catch (RequestFailedException e)
            {
                Console.WriteLine(e.Message);
                failed++;
            }
            Console.WriteLine($"Container {blobContainerClient.Name} processed: {processed}, failed: {failed}");
        }

        private static async Task UpdateOneBlob(BlobClient blobClient) {
            Response<BlobProperties> propertiesResponse = await blobClient.GetPropertiesAsync();
            BlobHttpHeaders httpHeaders = new BlobHttpHeaders
            {
                // copy any existing headers you wish to preserve
                ContentType = propertiesResponse.Value.ContentType,
                ContentHash = propertiesResponse.Value.ContentHash,
                ContentEncoding = propertiesResponse.Value.ContentEncoding,
                ContentDisposition = propertiesResponse.Value.ContentDisposition,
                // update CacheControl
                CacheControl = newCacheControl  
            };
            await blobClient.SetHttpHeadersAsync(httpHeaders);
        }
    }
}
绾颜 2024-10-16 09:06:57

The latest version of Cerebrata Cloud Storage Studio, v2011.04.23.00, supports setting cache-control on individual blob objects. Right click on the blob object, choose "View/Edit Blob Properties" then set the value for the Cache-Control attribute. (e.g. public, max-age=2592000).

If you check the HTTP headers of the blob object using curl, you'll see the cache-control header returned with the value you set.
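
If you want to script that check rather than run curl by hand, a rough equivalent is an HTTP HEAD request; this sketch uses a hypothetical blob URL and assumes the blob is publicly readable (or that the URL carries a SAS token):

    // Rough equivalent of `curl -I <blob-url>`: print the Cache-Control header the blob returns.
    // The URL is a placeholder; the blob must be publicly readable or the URL must include a SAS token.
    using System;
    using System.Net.Http;
    using System.Threading.Tasks;

    class CheckCacheControl
    {
        static async Task Main()
        {
            using var http = new HttpClient();
            var head = new HttpRequestMessage(HttpMethod.Head,
                "https://youraccount.blob.core.windows.net/container/image.png");
            using HttpResponseMessage response = await http.SendAsync(head);

            // Headers.CacheControl is parsed from the Cache-Control response header.
            Console.WriteLine($"Cache-Control: {response.Headers.CacheControl}");
        }
    }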

与之呼应 2024-10-16 09:06:57

Sometimes, the simplest answer is the best one. If you just want to manage a small amount of blobs, you can use Azure Management to change the headers/metadata for your blobs.

  1. Click on Storage, then click on the storage account name.
  2. Click the Containers tab, then click on a container.
  3. Click on a blob, then click on Edit at the bottom of the screen.

In that edit window, you can customize the Cache Control, Content Encoding, Content Language, and more.

Note: you cannot currently edit this data from the Azure Portal

挖个坑埋了你 2024-10-16 09:06:57

Here's an updated version of Joel Fillmore's answer consuming WindowsAzure.Storage v9.3.3. Note that ListBlobsSegmentedAsync returns a page size of 5,000 which is why the BlobContinuationToken is used.

    public async Task BackfillCacheControlAsync()
    {
        var container = await GetCloudBlobContainerAsync();
        BlobContinuationToken continuationToken = null;

        do
        {
            var blobInfos = await container.ListBlobsSegmentedAsync(string.Empty, true, BlobListingDetails.None, null, continuationToken, null, null);
            continuationToken = blobInfos.ContinuationToken;
            foreach (var blobInfo in blobInfos.Results)
            {
                var blockBlob = (CloudBlockBlob)blobInfo;
                var blob = await container.GetBlobReferenceFromServerAsync(blockBlob.Name);
                if (blob.Properties.CacheControl != "public, max-age=31536000")
                {
                    blob.Properties.CacheControl = "public, max-age=31536000";
                    await blob.SetPropertiesAsync();
                }
            }               
        }
        while (continuationToken != null);
    }

    private async Task<CloudBlobContainer> GetCloudBlobContainerAsync()
    {
        var storageAccount = CloudStorageAccount.Parse(_appSettings.AzureStorageConnectionString);
        var blobClient = storageAccount.CreateCloudBlobClient();
        var container = blobClient.GetContainerReference("uploads");
        return container;
    }
慵挽 2024-10-16 09:06:57

This might be too late to answer, but recently I wanted to do the same thing in a different manner: I had a list of images and needed to apply the header using a PowerShell script (with the help of the Azure Storage assembly).
Hope someone will find this useful in the future.

A complete explanation is given in Set Azure blob cache-control using powershell script.

Add-Type -Path "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.3\ref\Microsoft.WindowsAzure.StorageClient.dll"

$accountName = "[azureaccountname]"
$accountKey = "[azureaccountkey]"
$blobContainerName = "images"

$storageCredentials = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey -ArgumentList $accountName,$accountKey
$storageAccount = New-Object Microsoft.WindowsAzure.CloudStorageAccount -ArgumentList $storageCredentials,$true
#$blobClient = $storageAccount.CreateCloudBlobClient()
$blobClient =  [Microsoft.WindowsAzure.StorageClient.CloudStorageAccountStorageClientExtensions]::CreateCloudBlobClient($storageAccount)

$cacheControlValue = "public, max-age=604800"

echo "Setting cache control: $cacheControlValue"

Get-Content "imagelist.txt" | foreach {     
    $blobName = "$blobContainerName/$_".Trim()
    echo $blobName
    $blob = $blobClient.GetBlobReference($blobName)
    $blob.Properties.CacheControl = $cacheControlValue
    $blob.SetProperties()
}
旧时光的容颜 2024-10-16 09:06:57

Here is a batch/unix script for everyone who is not sitting on a Windows machine with PowerShell. The following script loops through all blobs and sets the Content-Cache property (the Cache-Control HTTP header) on each blob individually.

Unfortunately, there is no good way to set properties on several blobs simultaneously, so this is a time-consuming task. It usually takes around 1–2 seconds per blob. However, as Jay Borseth points out, the process is significantly faster if you run it from a server in the same data center as your storage account.

# Update Azure Blob Storage blob's cache-control headers
# /content-cache properties
# 
# Quite slow, since there is no `az storage blob update-batch`
#
# Created by Jon Tingvold, March 2021
#
#
# If you want progress, you need to install pv:
# >>> brew install pv  # Mac
# >>> sudo apt install pv  # Ubuntu
#

set -e  # exit when any command fails

AZURE_BLOB_CONNECTION_STRING='DefaultEndpointsProtocol=https;EndpointSuffix=core.windows.net;AccountName=XXXXXXXXXXXX;AccountKey=XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX=='
CONTAINER_NAME=main

BLOB_PREFIX='admin/'
CONTENT_CACHE='max-age=3600'
NUM_RESULTS=10000000  # Defaults to 5000

BLOB_NAMES=$(az storage blob list --connection-string $AZURE_BLOB_CONNECTION_STRING --container-name $CONTAINER_NAME --query '[].name' --output tsv --num-results $NUM_RESULTS --prefix $BLOB_PREFIX)
NUMBER_OF_BLOBS=$(echo $BLOB_NAMES | wc -w)

echo "Ask Azure for files in Blob Storage ..."
echo "Set content-cache on $NUMBER_OF_BLOBS blobs ..."

for BLOB_NAME in $BLOB_NAMES
do
  az storage blob update --connection-string $AZURE_BLOB_CONNECTION_STRING --container-name $CONTAINER_NAME --name $BLOB_NAME --content-cache $CONTENT_CACHE > /dev/null;
  echo "$BLOB_NAME"

# If you don't have pv installed, remove everything after `done` below
done | cat | pv -pte --line-mode --size $NUMBER_OF_BLOBS > /dev/null
苍风燃霜 2024-10-16 09:06:57

Set the storage blob Cache-Control property with a PowerShell script

https://gallery.technet.microsoft.com/How-to-set-storage-blob-4774aca5

# create CloudBlobClient 
Add-Type -Path "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.3\ref\Microsoft.WindowsAzure.StorageClient.dll" 
$storageCredentials = New-Object Microsoft.WindowsAzure.StorageCredentialsAccountAndKey -ArgumentList $StorageName,$StorageKey 
$blobClient =   New-Object Microsoft.WindowsAzure.StorageClient.CloudBlobClient($BlobUri,$storageCredentials) 
#set Properties and Metadata 
$cacheControlValue = "public, max-age=60480" 
foreach ($blob in $blobs) 
{ 
  #set Metadata 
  $blobRef = $blobClient.GetBlobReference($blob.Name) 
  $blobRef.Metadata.Add("abcd","abcd") 
  $blobRef.SetMetadata() 

  #set Properties 
  $blobRef.Properties.CacheControl = $cacheControlValue 
  $blobRef.SetProperties() 
}