FastAPI: client requests stay pending

Posted on 2025-01-23 02:43:14


I have created a simple fullstack app that needs to do some long calculations on the backend for a specific request. In order to follow the progress of those calculations, I'm sending polling requests to the backend at a regular interval.

The problem is that those polling requests stay pending until I manually reload my backend.

#Server side
Initial request

import uuid
from typing import Optional

from fastapi import BackgroundTasks, File, UploadFile

@app.post("/", status_code=202)
async def read_root(
    background_tasks: BackgroundTasks,
    file: Optional[UploadFile] = File(None),
):
    task_id = str(uuid.uuid4())
    # launch_process does the calculations and keeps track of the progress
    background_tasks.add_task(launch_process, task_id, file)
    return {"id": task_id}

I'm using Next.js for the frontend, but for demonstration purposes I'm using vanilla JS here, since the problem occurs there as well.

const progressDiv = document.getElementById('progressDiv')

const sendFile = async (data) => {

  let progress = 0
  progressDiv.textContent = progress

  const res1 = await fetch("http://localhost:8000", {
    method: 'post',
    body: data,
  })

  const jsonData = await res1.json()
  const id = jsonData.id

  // Poll every 3 seconds; the interval clears itself once
  // the task reports 100% progress.
  const idInterval = setInterval(() => {
    fetch(`http://localhost:8000/polling/${id}`)
      .then(res => res.json())
      .then(json => {
        progress = json.progress
        progressDiv.textContent = progress

        if (progress === 100) {
          clearInterval(idInterval)
        }
      })
  }, 3000)
}

None of the polling requests go through; they all stay pending:

@app.get("/polling/{task_id}")
def polling(task_id: str):
    # the store keeps track of all the task data
    progress = store[task_id]["progress"]
    return {"progress": progress}
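(The `store` itself is not shown in the question. A minimal sketch of what it might look like, assuming a module-level dict updated by the background task; the names and step count here are hypothetical:)

```python
# Hypothetical sketch of the shared progress store (not in the original post).
# A module-level dict like this only works on a single-process dev server;
# multiple workers would each get their own copy.
store = {}

def long_synchronous_function(content, task_id):
    total_steps = 10
    store[task_id] = {"progress": 0}
    for step in range(1, total_steps + 1):
        # ... heavy work on `content` would happen here ...
        store[task_id]["progress"] = int(step * 100 / total_steps)

long_synchronous_function(b"example", "demo-task")
print(store["demo-task"]["progress"])  # 100
```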


绝世的画 2025-01-30 02:43:14


Thanks to @Chris's suggestion, I understood where my problem came from.

My background task function (launch_process()) performed some heavy synchronous computations, and those computations were blocking the main thread (the event loop).
Thus, although my initial POST request got a response, all subsequent requests stayed pending, blocked by the ongoing launch_process() call.

The way I solved it was to use fastapi.concurrency.run_in_threadpool.

I changed my code from:

async def launch_process(task_id, file):
    content = await file.read()
    long_synchronous_function(content, task_id)
    return

To:

from fastapi.concurrency import run_in_threadpool

async def launch_process(task_id, file):
    content = await file.read()
    await run_in_threadpool(lambda: long_synchronous_function(content, task_id))
    return
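run_in_threadpool hands the blocking call to a worker thread, so the event loop stays free to serve the polling requests in the meantime. A stdlib-only sketch of the same idea, using asyncio.to_thread (Python 3.9+) in place of FastAPI's helper; the function names and timings here are illustrative:

```python
import asyncio
import time

def long_synchronous_function(n):
    # Stands in for the heavy synchronous computation.
    time.sleep(0.2)
    return n * 2

async def heartbeat(results):
    # Stands in for the polling requests: it can only make
    # progress while the event loop is free.
    for _ in range(3):
        results.append("tick")
        await asyncio.sleep(0.1)

async def main():
    results = []
    # Offloading to a thread keeps the event loop responsive,
    # so the heartbeat keeps ticking while the computation runs.
    work = asyncio.create_task(asyncio.to_thread(long_synchronous_function, 21))
    await heartbeat(results)
    results.append(await work)
    return results

print(asyncio.run(main()))  # ['tick', 'tick', 'tick', 42]
```

Note that run_in_threadpool also accepts positional arguments directly, so the lambda in the answer above could equivalently be written as `await run_in_threadpool(long_synchronous_function, content, task_id)`.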

This post also helped me (it gives several solutions to the problem).
