Moves models to another file, adds description to views and adds basic LCEL support #148
Conversation
src/dialog/llm/lcel_default.py
Outdated
def process(self, input):
    processed_input = self.preprocess(input)
    self.generate_prompt(processed_input)
    chain = self.prompt | self.llm | self.parser
I left a comment about this PR in the issue, but I'll paste it here as well:
This is a good first step towards adopting LCEL, since it makes only small changes relative to the default_llm, but it does not close the issue. I think we should discuss the roadmap further in the issue (where I gave several ideas, and it would be good to hear others) to figure out how to really get an LCEL interface that langchain users will easily understand.
For example, I think that AbstractLLM forces a flow that langchain's chains were already designed for (input data, a retrieval phase, prompt generation, branches and if-else cases, LLM calls, post-processing, and output parsers). In my opinion, to really make dialog flexible with LCEL, the roadmap should move toward removing this custom class in favor of building chains entirely out of langchain objects (runnables) at every step, since that is what guarantees new functionality like streaming, async, parallelism, retries/fallbacks, access to intermediate results, and tracing with langsmith. In that direction, I imagine dialog exposing in the API any chain the user wants to build or even combine, as long as the input/output keep the same meaning, with a default chain available (using runnables in all steps). Building a new class that truly follows LCEL will demand more changes and probably several PRs.
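To make the "runnables in every step" idea concrete, here is a minimal sketch of such a chain, assuming recent langchain packages. The retrieval function, prompt text, and placeholder names are hypothetical illustrations, not code from the dialog codebase:

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnableLambda, RunnablePassthrough
from langchain_openai import ChatOpenAI

def retrieve_docs(question: str) -> str:
    # Hypothetical retrieval step; dialog would query its document store here.
    return "relevant context for: " + question

prompt = ChatPromptTemplate.from_template(
    "Answer using the context below.\n\nContext: {context}\n\nQuestion: {question}"
)

# Every step is a Runnable, so the composed chain gets .invoke(), .stream(),
# .ainvoke(), .batch(), retries/fallbacks, and LangSmith tracing for free.
chain = (
    {"context": RunnableLambda(retrieve_docs), "question": RunnablePassthrough()}
    | prompt
    | ChatOpenAI()
    | StrOutputParser()
)

# answer = chain.invoke("some question")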
@lgabs we are building chains into classes because:
- classes are easily inheritable and modifiable; every developer knows a little OOP, and our focus here is to make LLMs very accessible to users;
- we are making small changes so we don't break backward compatibility with existing code;
- the user is supposed to implement their own LLMs using whichever methods they want, so giving them a class with methods for pre/post-processing and running the LLM gets them there quicker.
If a user wants to take a different approach to the LLMs, they are free to modify their own version of the LLM however they want, or even modify the source code for talkd. (A sketch of that extension point is shown below.)
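As an illustration of that class-based extension point, here is a hedged sketch. The hook names (preprocess, postprocess) follow this discussion, and the import path for AbstractLLM is a hypothetical placeholder; the exact base class in dialog may differ:

from dialog.llm import AbstractLLM  # hypothetical import path

class MyCustomLLM(AbstractLLM):
    def preprocess(self, input):
        # e.g. normalize the incoming message before the prompt is generated
        return input.strip()

    def postprocess(self, output):
        # e.g. sanitize or decorate the model output before returning it
        return output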
I tried to run it locally and found the issue below.
src/dialog/llm/lcel_default.py
Outdated
)
self.prompt = ChatPromptTemplate.from_template(
    "\n".join([
        SystemMessagePromptTemplate.from_template(header),
This is breaking because the join over the list expects strings, not langchain template objects:
dialog-1 | INFO: 192.168.65.1:47093 - "POST /ask HTTP/1.1" 500 Internal Server Error
dialog-1 | ERROR: Exception in ASGI application
dialog-1 | Traceback (most recent call last):
dialog-1 | File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 426, in run_asgi
dialog-1 | result = await app( # type: ignore[func-returns-value]
dialog-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
dialog-1 | return await self.app(scope, receive, send)
dialog-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
dialog-1 | await super().__call__(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
dialog-1 | await self.middleware_stack(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
dialog-1 | raise exc
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
dialog-1 | await self.app(scope, receive, _send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 83, in __call__
dialog-1 | await self.app(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
dialog-1 | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
dialog-1 | raise exc
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
dialog-1 | await app(scope, receive, sender)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 758, in __call__
dialog-1 | await self.middleware_stack(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 778, in app
dialog-1 | await route.handle(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 299, in handle
dialog-1 | await self.app(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 79, in app
dialog-1 | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
dialog-1 | raise exc
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
dialog-1 | await app(scope, receive, sender)
dialog-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 74, in app
dialog-1 | response = await func(request)
dialog-1 | ^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 299, in app
dialog-1 | raise e
dialog-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 294, in app
dialog-1 | raw_response = await run_endpoint_function(
dialog-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
dialog-1 | return await dependant.call(**values)
dialog-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/app/src/main.py", line 98, in ask_question_to_llm
dialog-1 | ai_message = llm_instance.process(message.message)
dialog-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
dialog-1 | File "/app/src/dialog/llm/lcel_default.py", line 108, in process
dialog-1 | self.generate_prompt(processed_input)
dialog-1 | File "/app/src/dialog/llm/lcel_default.py", line 55, in generate_prompt
dialog-1 | "\n".join([
dialog-1 | TypeError: sequence item 0: expected str instance, SystemMessagePromptTemplate found
It would be better to keep using langchain's system and human message templates, as we already do in the default LLM, or to borrow some ideas from langchain's Q&A documentation.
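One possible fix along those lines is sketched below: compose the message templates with ChatPromptTemplate.from_messages instead of string-joining template objects. This is a sketch only; it assumes header holds the system prompt text (as in the snippet above), and the {user_message} placeholder name is hypothetical. Depending on the langchain version pinned in the repo, the imports may live under langchain.prompts instead of langchain_core.prompts:

from langchain_core.prompts import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

# Inside generate_prompt, replacing the "\n".join([...]) over template objects:
self.prompt = ChatPromptTemplate.from_messages([
    SystemMessagePromptTemplate.from_template(header),
    HumanMessagePromptTemplate.from_template("{user_message}"),
])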
Fixing this.
good catch, will fix this
@lgabs I'm removing the close statement from this PR so we can keep discussing LCEL, but I'm strongly against changing the method signatures right now.
This PR moves the BaseModels used into a models file, adds a new LLM with basic LCEL support, and adds descriptions to the views.