How to write an ingest pipeline for Elasticsearch to load a CSV file as nested JSON?
I have a CSV file that has the following format:
company_id | year | sales | buys | location |
---|---|---|---|---|
3 | 2020 | 230 | 112 | europe |
3 | 2019 | 234 | 231 | europe |
2 | 2020 | 443 | 351 | usa |
2 | 2019 | 224 | 256 | usa |
and when I import it into Elasticsearch I end up having one entry for each line.
However, I would like to import it in the format below:
[
{"company_id" : 3,
"location" : "europe",
"2020" : {"sales" : 230, "buys" : 112},
"2019" : {"sales" : 234, "buys" : 231}
},
{"company_id" : 2,
"location" : "usa",
"2020" : {"sales" : 443, "buys" : 351},
"2019" : {"sales" : 224, "buys" : 256}
}
]
Is there a way to write the ingest pipeline (processor) in order to achieve this?
Thanks in advance for your valuable answers.
Comments (1)
At the ingest pipeline level you'll only be able to handle one document (i.e. one row) at a time, so in order to aggregate the way you want, you need to do it at the Logstash level using the aggregate filter. If your rows are correctly sorted by location, you can use the following example from the official documentation.
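The documentation example referenced above isn't reproduced on this page, so here is a minimal sketch of what such a Logstash pipeline could look like for this CSV. The file path, index name, and integer conversions are illustrative assumptions, not part of the original answer:

input {
  file {
    # hypothetical path; adjust to your environment
    path => "/path/to/companies.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["company_id", "year", "sales", "buys", "location"]
    skip_header => true
  }
  aggregate {
    # all consecutive rows sharing the same company_id are merged into one map
    task_id => "%{company_id}"
    code => "
      map['company_id'] ||= event.get('company_id')
      map['location']   ||= event.get('location')
      map[event.get('year')] = {
        'sales' => event.get('sales').to_i,
        'buys'  => event.get('buys').to_i
      }
      event.cancel()
    "
    # when a row with a new company_id arrives, emit the previous map as an event
    push_previous_map_as_event => true
    timeout => 3
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "companies"
  }
}

Note that push_previous_map_as_event relies on all rows of a given company being consecutive in the file, and the aggregate filter requires Logstash to run with a single worker (pipeline.workers: 1), otherwise rows may be processed out of order.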
One word of caution, though: if you add year as a field, your mapping will keep growing as years go by and you potentially risk mapping explosion.
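One common way to avoid that (a suggestion on my part, not part of the original answer) is to keep the mapping static by storing the year as a value inside an array of objects instead of using it as a field name:

{"company_id" : 3,
 "location" : "europe",
 "years" : [
   {"year" : 2020, "sales" : 230, "buys" : 112},
   {"year" : 2019, "sales" : 234, "buys" : 231}
 ]
}

With a nested mapping on years, the set of fields stays fixed no matter how many years accumulate.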