Data migration from CSV (S3) to DynamoDB using AWS DMS
I am facing an issue when migrating a table that contains JSON in a column which should be inserted as a DynamoDB Map. The source data is in the following format:
userId:       TU0001
brandId:      TEST BRAND
createdBy:    {"channel":{"S":"website"},"date":{"S": "2022-06-13T08:16:26.300Z"},"userId":{"S":"TU0001"}}
modifiedBy:   {"channel":{"S":"website"},"date":{"S": "2022-06-13T015:26:10.200Z"},"userId":{"S":"TU0001"}}
preferences:  {"Colour": {"S": "Red" },"Size":{"S": "XL" }}
version:      1
The table in DynamoDB has the same structure, except that the JSON values are stored as a Map (dynamodb-map). When we migrate using DMS, it inserts the JSON as a string value and not as a Map as expected.
I have defined the transformation rule as follows:
"attribute-mappings": [
{
"target-attribute-name": "userId",
"attribute-type": "scalar",
"attribute-sub-type": "string",
"value": "${userId}"
},
{
"target-attribute-name": "brandId",
"attribute-type": "scalar",
"attribute-sub-type": "string",
"value": "${brandId}"
},
{
"target-attribute-name": "createdBy",
"attribute-type": "document",
"attribute-sub-type": "dynamodb-map",
"value": {
"M": {
"S": "${createdBy}"
}
}
},
{
"target-attribute-name": "modifiedBy",
"attribute-type": "document",
"attribute-sub-type": "dynamodb-map",
"value": {
"M": {
"S": "${modifiedBy}"
}
}
},
{
"target-attribute-name": "preferences",
"attribute-type": "document",
"attribute-sub-type": "dynamodb-map",
"value": {
"M": {
"S": "${preferences}"
}
}
},
{
"target-attribute-name": "version",
"attribute-type": "scalar",
"attribute-sub-type": "number",
"value": "${version}"
}
]
I also tried defining the map as below, but it adds an empty map in DynamoDB.

"value": {
    "M": "${preferences}"
}
Hope someone can help.
Comments (1)
I think DMS does not support direct conversion of JSON strings into DynamoDB maps; here it is inserting the JSON data as strings instead of converting it into the DynamoDB Map or List type. However, you can process the data before or after the migration to make sure the JSON is correctly parsed into a DynamoDB Map.
Before migration: pre-process the source data into a form DMS can handle, or read the source yourself, transform the JSON into a DynamoDB map, and write it to DynamoDB directly (a sketch of this is shown below).

OR

After migration: retrieve the string data, convert it into a Map, update the item, and write it back to DynamoDB.
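For the pre-migration route, a minimal sketch could look like the following. It assumes the CSV column names match the sample row, the JSON cells are properly quoted in the CSV, and it uses the csv-parse package; the bucket, key, and table names are placeholders you would replace.

// Pre-migration alternative: read the CSV from S3 yourself and write the items
// with the low-level DynamoDB client, so the JSON columns become native maps.
const AWS = require('aws-sdk');
const { parse } = require('csv-parse/sync'); // npm install csv-parse

const s3 = new AWS.S3();
const dynamodb = new AWS.DynamoDB();

async function migrate() {
  // Placeholder bucket and key -- adjust to your source location.
  const obj = await s3.getObject({ Bucket: 'my-source-bucket', Key: 'users.csv' }).promise();
  const rows = parse(obj.Body.toString('utf-8'), { columns: true });

  for (const row of rows) {
    await dynamodb.putItem({
      TableName: 'UserBrandPreferences', // placeholder table name
      Item: {
        userId: { S: row.userId },
        brandId: { S: row.brandId },
        // The CSV cells already contain DynamoDB-JSON, so parsing them
        // yields the content of the M attribute directly.
        createdBy: { M: JSON.parse(row.createdBy) },
        modifiedBy: { M: JSON.parse(row.modifiedBy) },
        preferences: { M: JSON.parse(row.preferences) },
        version: { N: String(row.version) },
      },
    }).promise();
  }
}

migrate().catch(console.error);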
Below is a Node.js example, using the AWS SDK, of the post-processing that can be done after migration:
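This is a minimal sketch of that post-processing, assuming the table is keyed on userId and brandId and that createdBy, modifiedBy, and preferences are the attributes DMS wrote as strings; the table name is a placeholder, so adjust it to your setup.

const AWS = require('aws-sdk');

const docClient = new AWS.DynamoDB.DocumentClient();

// Placeholders -- adjust to your table and key schema.
const TABLE_NAME = 'UserBrandPreferences';
const MAP_ATTRIBUTES = ['createdBy', 'modifiedBy', 'preferences'];

async function fixItem(item) {
  const updates = {};
  for (const attr of MAP_ATTRIBUTES) {
    if (typeof item[attr] === 'string') {
      // The string holds DynamoDB-JSON (e.g. {"Colour":{"S":"Red"}}), so parse
      // it and unmarshall it into a plain object; the DocumentClient then
      // writes it back as a native map (M) attribute.
      updates[attr] = AWS.DynamoDB.Converter.unmarshall(JSON.parse(item[attr]));
    }
  }
  if (Object.keys(updates).length === 0) return;

  const names = Object.keys(updates);
  await docClient.update({
    TableName: TABLE_NAME,
    Key: { userId: item.userId, brandId: item.brandId },
    UpdateExpression: 'SET ' + names.map((_, i) => `#a${i} = :v${i}`).join(', '),
    ExpressionAttributeNames: Object.fromEntries(names.map((n, i) => [`#a${i}`, n])),
    ExpressionAttributeValues: Object.fromEntries(names.map((n, i) => [`:v${i}`, updates[n]])),
  }).promise();
}

async function main() {
  // Scan the whole table page by page and fix each item in place.
  let lastKey;
  do {
    const params = { TableName: TABLE_NAME };
    if (lastKey) params.ExclusiveStartKey = lastKey;
    const page = await docClient.scan(params).promise();
    for (const item of page.Items) {
      await fixItem(item);
    }
    lastKey = page.LastEvaluatedKey;
  } while (lastKey);
}

main().catch(console.error);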
Hope this will help you.