Custom routing via NGINX - reading from a third-party source

Posted 2025-02-12 06:37:54


I am new to nginx, and am wondering if it can help me to solve a use-case we've encountered.

I have n nodes, which are reading from a Kafka topic with the same group ID, which means that each node holds disjoint data, partitioned by some key.

Nginx has no way of knowing a priori which node holds the data for a given key. But we could build an API, or keep a Redis instance, that can tell us the node for a given key.

Is there a way Nginx can incorporate third-party information of this kind to route requests?

I'd also welcome any answers, even if they don't involve Nginx.


Comments (1)

风尘浪孓 2025-02-19 06:38:02


Nginx has no way of knowing a priori which node has data corresponding to which keys

Nginx doesn't need to know. You would need to do this in the Kafka Streams RPC layer with Interactive Queries. (Spring Kafka has an InteractiveQueryService interface, btw, that can be used from Spring Web.)

If you want to present users with a single address for the KStreams HTTP/RPC endpoints, then that would be a standard Nginx upstream definition for a reverse proxy, which would route to any of the backend servers; those servers in turn communicate among themselves to fetch the necessary key/value and return the response back to the client.
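An upstream definition of that kind might look like the sketch below. The backend names and ports are placeholders; the point is that Nginx can route to any node, because the KStreams instances fetch remote state from each other via Interactive Queries.

```nginx
# Hypothetical node names/ports - substitute your own.
upstream kstreams_backends {
    server node-a:8080;
    server node-b:8080;
    server node-c:8080;
}

server {
    listen 80;

    location / {
        # Any backend can answer; it proxies internally to the key's owner.
        proxy_pass http://kstreams_backends;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```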

I have no idea how Kafka partitions

You could look at the source code and see that it uses a murmur2 hash, which is available in Lua and can therefore be used inside Nginx.
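For reference, Kafka's default partitioner hashes the key bytes with murmur2 and takes the (sign-cleared) result modulo the partition count. Here is a Python sketch of that logic, ported from the Java implementation in Kafka's Utils.murmur2; verify it against your client library before relying on it for routing.

```python
def murmur2(data: bytes) -> int:
    """32-bit murmur2 hash, ported from Kafka's Utils.murmur2 (seed 0x9747b28c)."""
    length = len(data)
    m = 0x5bd1e995
    mask = 0xffffffff
    h = (0x9747b28c ^ length) & mask

    # Mix the key four bytes at a time (little-endian words, as in the Java code).
    for i in range(length // 4):
        k = int.from_bytes(data[i * 4:i * 4 + 4], "little")
        k = (k * m) & mask
        k ^= k >> 24
        k = (k * m) & mask
        h = (h * m) & mask
        h ^= k

    # Fold in the remaining 1-3 bytes (mirrors the Java switch fall-through).
    tail = length & ~3
    rem = length % 4
    if rem == 3:
        h ^= data[tail + 2] << 16
    if rem >= 2:
        h ^= data[tail + 1] << 8
    if rem >= 1:
        h ^= data[tail]
        h = (h * m) & mask

    # Final avalanche.
    h ^= h >> 13
    h = (h * m) & mask
    h ^= h >> 15
    return h


def partition_for_key(key: bytes, num_partitions: int) -> int:
    """Kafka default partitioner: clear the sign bit, then modulo partition count."""
    return (murmur2(key) & 0x7fffffff) % num_partitions
```

With this, a Lua (or any other) routing layer could map a key straight to its partition, and from there to the node consuming that partition.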

But again, this is a rabbit hole you should probably avoid.


Another option: use Kafka Connect to dump the data to Redis (or whatever database you want), then write a very similar HTTP API service, and (optionally) point Nginx at that.
