What is the fastest way to create 250,000,000,000 rows of data?
Hello all,
I wanted to create items on Azure Cosmos DB, each of which has location coordinates.
Yesterday I created the method below on my API:
[HttpGet("createzone")]
public async Task<ActionResult<string>> CreateZones()
{
    Container container = database.GetContainer(containerId);
    double x = 38.988599999993035;
    double y = 26.003399999999992;
    int id = 524962;
    while (y < 45.0000000001)
    {
        Console.WriteLine("x: " + x + " y: " + y);
        Zone newZone = new Zone
        {
            Id = "Zone." + id.ToString(),
            Title = "Zone" + id.ToString(),
            Owner = null,
            Coordinate = new Coordinate { Latitude = x, Longitude = y },
            PartitionKey = "id." + id.ToString()
        };
        await container.CreateItemAsync<Zone>(newZone, new PartitionKey(newZone.PartitionKey));
        x += 0.0002;
        id += 1;
        if (x > 42)
        {
            y += 0.0002;
            x = 36;
        }
    }
    return "done";
}
According to my calculations, it will take more than 3 years to generate the data if it continues like this.
What I need:
Creating areas shown as squares on the map in the application, and ensuring that all of these areas have a unique number. Like this:
I have the same visual in my app now, but creating the areas is taking too much time.
How can I create these items in the fastest way?
Comments (1)
It does not look like you have implemented this using the bulk import pattern with Cosmos DB. There's a lot to gain from that: the original method awaits each insert one at a time, so throughput is bounded by per-request latency instead of provisioned throughput. While doing this, also monitor the RUs consumed during execution, and consider increasing RUs for the duration of the operation.
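A minimal sketch of the bulk pattern with the .NET SDK v3, adapted to the loop from the question: enable `AllowBulkExecution` on the `CosmosClient`, dispatch `CreateItemAsync` calls without awaiting each one, and await them in batches with `Task.WhenAll`. The batch size of 10,000 and the `endpoint`/`key`/`databaseId`/`containerId` variables are assumptions for illustration, not part of the original code.

```csharp
// Sketch only: requires the Microsoft.Azure.Cosmos NuGet package and a live
// Cosmos DB account; endpoint, key, databaseId, and containerId are placeholders.
CosmosClient client = new CosmosClient(endpoint, key,
    new CosmosClientOptions { AllowBulkExecution = true });
Container container = client.GetContainer(databaseId, containerId);

double x = 38.988599999993035;
double y = 26.003399999999992;
int id = 524962;
List<Task> tasks = new List<Task>(10000);

while (y < 45.0000000001)
{
    Zone newZone = new Zone
    {
        Id = "Zone." + id.ToString(),
        Title = "Zone" + id.ToString(),
        Owner = null,
        Coordinate = new Coordinate { Latitude = x, Longitude = y },
        PartitionKey = "id." + id.ToString()
    };

    // Queue the insert instead of awaiting it; in bulk mode the SDK groups
    // pending operations into batched requests behind the scenes.
    tasks.Add(container.CreateItemAsync(newZone, new PartitionKey(newZone.PartitionKey))
        .ContinueWith(t =>
        {
            if (t.IsFaulted)
                Console.WriteLine("Failed " + newZone.Id + ": " + t.Exception?.Message);
        }));

    // Flush in batches so the pending-task list stays bounded.
    if (tasks.Count == 10000)
    {
        await Task.WhenAll(tasks);
        tasks.Clear();
    }

    x += 0.0002;
    id += 1;
    if (x > 42)
    {
        y += 0.0002;
        x = 36;
    }
}
await Task.WhenAll(tasks);
```

Even with bulk mode, expect failed inserts when provisioned RUs are exhausted (HTTP 429), so the per-task fault check above matters; retry policy and RU scaling are the other levers to tune.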