[THUDM/ChatGLM-6B][BUG/Help] How can api.py be run with more than 1 worker?

2024-05-10 772 views

The original api.py ends with:

```python
if __name__ == '__main__':
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
    model.eval()
    uvicorn.run(app, host='127.0.0.1', port=8000, workers=1)
```

This runs without errors.

If I change the last line to

```python
uvicorn.run("api:app", host='127.0.0.1', port=8000, workers=5)
```

to raise the worker count, the server starts, but every request fails with `NameError: name 'model' is not defined`:

```
Traceback (most recent call last):
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 428, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\applications.py", line 276, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\exceptions.py", line 79, in __call__
    raise exc
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\middleware\exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 21, in __call__
    raise e
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\middleware\asyncexitstack.py", line 18, in __call__
    await self.app(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\starlette\routing.py", line 66, in app
    response = await func(request)
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\routing.py", line 237, in app
    raw_response = await run_endpoint_function(
  File "C:\Users\Administrator\AppData\Local\Programs\Python\Python310\lib\site-packages\fastapi\routing.py", line 163, in run_endpoint_function
    return await dependant.call(**values)
  File "C:\ChatGLM-6B\api.py", line 32, in create_item
    response, history = model.chat(tokenizer,
NameError: name 'model' is not defined
INFO:     127.0.0.1:64602 - "POST / HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
```

For now I have had to set workers back to 1, but that is too slow. How can I run with more than one worker? Thanks.
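A note on the likely cause (my reading, not stated in the thread): with `workers > 1`, uvicorn spawns fresh worker processes that *import* the module named by the `"api:app"` string. The `if __name__ == '__main__':` block never executes in those workers, so `model` and `tokenizer` are never defined there. The mechanism can be shown with a self-contained toy (a throwaway module standing in for api.py, a fresh interpreter standing in for a worker):

```python
import os
import subprocess
import sys
import tempfile
import textwrap

def worker_sees_model(module_source):
    """Write module_source to disk, then import it from a *fresh*
    interpreter -- which is what each uvicorn worker does with "api:app" --
    and report what that process sees for `model`."""
    with tempfile.TemporaryDirectory() as tmp:
        with open(os.path.join(tmp, "apimod.py"), "w") as f:
            f.write(module_source)
        out = subprocess.run(
            [sys.executable, "-c", "import apimod; print(apimod.model)"],
            cwd=tmp, capture_output=True, text=True,
        )
        # On success, stdout holds the value; on failure, return the error line.
        return out.stdout.strip() or out.stderr.strip().splitlines()[-1]

# Like the original api.py: `model` only exists when the file runs as a script.
guarded = textwrap.dedent("""\
    if __name__ == '__main__':
        model = "loaded"
""")

# The fix: define `model` at module (import) level, so every worker gets one.
module_level = 'model = "loaded"\n'

print(worker_sees_model(guarded))       # an AttributeError: the guard never ran
print(worker_sees_model(module_level))  # loaded
```

So the usual fix is to move the `tokenizer`/`model` loading out of the `__main__` guard to module level (or into a FastAPI startup hook), keeping only the `uvicorn.run("api:app", ...)` call inside the guard. The caveat: each worker then loads its own full copy of ChatGLM-6B onto the GPU, so N workers need roughly N times the VRAM.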

Environment
- OS:
- Python:
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`) :

Answers


OK, thanks.


Any progress on this?


Didn't solve it; I ended up making multiple copies of api.py and running them as separate processes.
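That multiple-process workaround is usually fronted by a plain reverse proxy so clients see a single endpoint. A hypothetical sketch, assuming three copies of api.py already listening on ports 8000–8002 (names and ports are illustrative, not from the thread):

```nginx
# Round-robin across several independent api.py processes.
upstream chatglm_workers {
    server 127.0.0.1:8000;
    server 127.0.0.1:8001;
    server 127.0.0.1:8002;
}

server {
    listen 80;
    location / {
        proxy_pass http://chatglm_workers;
    }
}
```

On a single GPU this only helps hide per-request latency; each process still holds its own copy of the model in VRAM.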


Duplicate of #701