Dynamic fetch requests in parallel

Posted on 2025-02-07 00:01:01


I would like to know how I can run dynamic fetch requests in parallel. I have been trying to do this for 12 hours and I can't figure it out; I have looked everywhere on Google and Stack Overflow, and I really am tired.


for (const fileindex of filelist) {
    let dcmFilename = `slice_${fileindex}.dcm`
    slices.push({ index: fileindex, filename: dcmFilename })
    fd.append(dcmFilename, files[fileindex], dcmFilename);
    fd.append('slices', JSON.stringify(slices));
    loopcount += 1
    if (filecount == 24 || loopcount == filelist.length) {

        if (cursorPos !== false) {
            let scaleFactor = Tegaki.scaleFactor;
            cursorPos.x = parseInt(cursorPos.x / scaleFactor);
            cursorPos.y = parseInt(cursorPos.y / scaleFactor);
            fd.append('x', cursorPos.x);
            fd.append('y', cursorPos.y)
        }

        // Get layer id from index
        if (index !== -1) {
            type = this.layerTypes[index]['id']
        }
        // switch mode from heuristic to pytorch vertebrae for vertebral bone
        if (type === 'vertebral-bone') {
            mode = 'PyTorch-Vertebrae'
        }

        // Post to endpoint
        let domain = window.location.origin.replace(':8080', ':5000')
        let list = await fetch(`${domain}/segment?mode=${mode}&type=${type}`, { method: 'POST', body: fd })
        let result = await list.json()
        // do something with result
        // finish then continue the loop and create new body and send a new request
        // clear formdata and continue loop
        fd = new FormData()
    }
}

I have the following fetch request that I send, where the body is generated for each request. It is never the same body; the body gets generated dynamically by the functions. Is there a way to send all the requests at once, wait for them to return their responses, and then proceed with the rest of my code?


Comments (1)

耳钉梦 2025-02-14 00:01:01


Instead of the for/of loop with await in it, you can use filelist.map() with an async callback.

Since .map() just blindly iterates the array without waiting for any returned promises, it will return an array of promises from all the async callbacks that .map() called. Those promises will initially be unfulfilled and all the fetch() operations will be "in-flight" at the same time. You can then use await Promise.all(...) on that returned array of promises to know when they are all done:

await Promise.all(filelist.map(async fileindex => {
    let dcmFilename = `slice_${fileindex}.dcm`
    slices.push({ index: fileindex, filename: dcmFilename })
    fd.append(dcmFilename, files[fileindex], dcmFilename);
    fd.append('slices', JSON.stringify(slices));
    loopcount += 1
    if (filecount == 24 || loopcount == filelist.length) {

        if (cursorPos !== false) {
            let scaleFactor = Tegaki.scaleFactor;
            cursorPos.x = parseInt(cursorPos.x / scaleFactor);
            cursorPos.y = parseInt(cursorPos.y / scaleFactor);
            fd.append('x', cursorPos.x);
            fd.append('y', cursorPos.y)
        }

        // Get layer id from index
        if (index !== -1) {
            type = this.layerTypes[index]['id']
        }
        // switch mode from heuristic to pytorch vertebrae for vertebral bone
        if (type === 'vertebral-bone') {
            mode = 'PyTorch-Vertebrae'
        }

        // Post to endpoint
        let domain = window.location.origin.replace(':8080', ':5000')

        let list = await fetch(`${domain}/segment?mode=${mode}&type=${type}`, { method: 'POST', body: fd })
        let result = await list.json()
        // do something with result
        // finish then continue the loop and create new body and send a new request
    }
}));

Note #1: Since you are now running multiple fetch operations in parallel, any code processing their results must not share variables. You have a few variables in this code whose declarations are not shown. Those declarations should probably be inside this loop, with let or const, so that each iteration gets its own variable and nothing is shared across iterations unless the variable is explicitly supposed to accumulate across iterations. Suspicious variables in this loop with no local declaration that may be affected by parallel operation are filecount, loopcount, cursorPos, index, mode, fd and type. You don't show the whole execution context here, so we can't see enough to make a complete recommendation on these.
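
As an illustration of that point only, here is a minimal sketch (not the answer's code) of what per-iteration state could look like. It assumes files, mode and type are available from the surrounding scope as in the question, and it ignores the 24-file batching for the moment:

await Promise.all(filelist.map(async (fileindex) => {
    // Everything this request needs is created inside the callback, so the
    // parallel iterations cannot clobber each other's state.
    const fd = new FormData();
    const dcmFilename = `slice_${fileindex}.dcm`;
    fd.append(dcmFilename, files[fileindex], dcmFilename);
    fd.append('slices', JSON.stringify([{ index: fileindex, filename: dcmFilename }]));

    const domain = window.location.origin.replace(':8080', ':5000');
    const response = await fetch(`${domain}/segment?mode=${mode}&type=${type}`, {
        method: 'POST',
        body: fd
    });
    const result = await response.json();
    // do something with result for this one slice
}));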

Note #2: The code that contains this if (filecount == 24 || loopcount == filelist.length) looks like it may be prone to problems. If you're trying to run some code when all the iterations are done, it would be better to run that code after the Promise.all() than trying to detect when your last iteration is done. Remember, your iterations will not necessarily proceed in order now because you're running them in parallel. They will launch in order, but not necessarily complete in order.
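
To make Note #2 concrete, a hedged sketch of that restructuring might look like the following. It keeps the 24-file batch size from the question, assumes files, mode and type come from the surrounding scope, and leaves out the cursorPos and layer-type handling for brevity; the "all done" work simply runs after Promise.all() resolves instead of being guarded by a last-iteration check:

// Split the file list into batches of 24 up front (batch size taken from the question).
const BATCH_SIZE = 24;
const batches = [];
for (let i = 0; i < filelist.length; i += BATCH_SIZE) {
    batches.push(filelist.slice(i, i + BATCH_SIZE));
}

const domain = window.location.origin.replace(':8080', ':5000');

// One request per batch, all in flight at the same time.
const allResults = await Promise.all(batches.map(async (batch) => {
    const fd = new FormData();                      // fresh body per batch
    const slices = batch.map((fileindex) => {
        const dcmFilename = `slice_${fileindex}.dcm`;
        fd.append(dcmFilename, files[fileindex], dcmFilename);
        return { index: fileindex, filename: dcmFilename };
    });
    fd.append('slices', JSON.stringify(slices));

    const response = await fetch(`${domain}/segment?mode=${mode}&type=${type}`, {
        method: 'POST',
        body: fd
    });
    return response.json();
}));

// Every batch has completed (or Promise.all has rejected) by this point, so
// there is no need to detect the "last iteration" inside the loop body.
console.log('received', allResults.length, 'batch results');

Promise.all() preserves input order, so allResults[i] corresponds to batches[i] even though the individual responses may arrive out of order.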
