Sending a series of batched API requests

Posted 2025-02-05 21:14:08 · 616 characters · 3 views · 0 comments


I'm looking for a performant approach to send circa 1,000+ requests in batches, e.g. 6 in parallel; when those 6 have completed, send the next 6.

Sending in batches prevents the browser's request queue from fully blocking any other API calls that may occur while the batch requests are in progress.

I have done this previously with RxJS (example below), but I'm wondering whether there is an equivalent fetch/Promise-based approach?

// Array of observables
const urls = [
  this.http.get('url1'),
  this.http.get('url2'),
  this.http.get('url3'),
  ...
];


bufferedRequests(urls) {
  from(urls).pipe(
    bufferCount(6),
    concatMap(buffer => forkJoin(buffer))
  ).subscribe(
    res => console.log(res),
    err => console.log(err),
    () => console.log('complete')
  );
}


Comments (1)

丘比特射中我 2025-02-12 21:14:08


I used Bottleneck a while ago.

It allows you to bottleneck your requests with a client side rate limiter. You can choose how many requests to send out per minute and how many concurrent ones can run too.

You can set up a limiter:

const limiter = new Bottleneck({
  maxConcurrent: 1,
  minTime: 333 // run at most ~3 requests per second, i.e. wait 333 ms before starting the next one
});

Then wrap your function with it:

const wrapped = limiter.wrap(myFunction);

wrapped(arg1, arg2)
.then((result) => {
  /* handle result */
});

In your case, I'd write a function that wraps around the fetch requests and returns a promise. Then, I would wrap that with the limiter. Here's an example:

const throttledGetMyData = limiter.wrap(yourFetchFunction);

async function getAllTheData(requests) {
  // Kick off every request through the limiter; Bottleneck queues
  // them and releases them according to maxConcurrent / minTime.
  const allThePromises = requests.map(request => throttledGetMyData(request));
  try {
    const results = await Promise.all(allThePromises);
    console.log(results);
  } catch (err) {
    console.log(err);
  }
}
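For completeness, the fetch/Promise-based equivalent the question asks about can also be sketched without any library. The helper below (the name `runInBatches` is my own) takes an array of Promise-returning functions, runs them `batchSize` at a time, and waits for each batch to finish before starting the next, mirroring `bufferCount(6)` + `concatMap(forkJoin)`:

```javascript
// Minimal Promise-based batching sketch. Each entry in `tasks` is a
// function that returns a Promise (e.g. () => fetch(url)), so nothing
// starts until its batch comes up.
async function runInBatches(tasks, batchSize = 6) {
  const results = [];
  for (let i = 0; i < tasks.length; i += batchSize) {
    // Start up to `batchSize` tasks in parallel...
    const batch = tasks.slice(i, i + batchSize).map(task => task());
    // ...and wait for the whole batch before moving on.
    results.push(...(await Promise.all(batch)));
  }
  return results;
}
```

Usage with fetch would look something like `runInBatches(urls.map(url => () => fetch(url).then(r => r.json())), 6)`. Note that, like `forkJoin`/`Promise.all`, one rejected request fails the whole batch; `Promise.allSettled` is an alternative if partial results are acceptable.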