JSON Format for LangGraph .invoke() #85
Unanswered · brian-nowak asked this question in Q&A · Replies: 1 comment, 1 reply
-
Hey, nice! I haven’t looked at this and don’t have access to a Databricks environment. It sounds like this might be a general LangGraph issue, so it may get a better review in the discussions on the main LangGraph repo. Is there documentation specifically about running LangGraph on the Databricks model serving / Apps endpoint? That might be a good starting point. I’m curious about this capability too.
-
Hi there! @JoshuaC215 this is a great project, and it’s very helpful to see a full implementation of a LangGraph agent.
I am trying to do something very similar while working within the constraints of Databricks. I want to use their new serving feature, Apps, which supports Streamlit projects as well as models (including LangGraph graphs) served in Databricks.
My current blocker: the model serving endpoint requires a JSON payload, but I can’t figure out how to include the thread_id that LangGraph requires for its MemorySaver checkpointer.
Finally coming to my question =) As you implemented the API piece, did you ever come across a way to pass raw JSON into the model with a thread_id and have it work with the MemorySaver? Put another way, when we do `app.invoke({"messages": messages}, config)`, is there a JSON representation of messages + config that you are aware of?
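One way to picture this (a sketch, not a Databricks contract): flatten the messages and the thread_id into a single JSON body on the client, then split it back into the `(input, config)` pair on the server before calling `invoke`. The top-level field names in the payload below are assumptions; the `{"configurable": {"thread_id": ...}}` shape is the standard LangGraph convention for checkpointers, and LangChain’s message coercion accepts plain role/content dicts, so the messages list can travel as ordinary JSON.

```python
import json

# Hypothetical raw JSON body a client might POST to a serving endpoint.
# The field names ("messages", "thread_id") are assumptions for this sketch,
# not a documented Databricks payload schema.
raw_body = json.dumps({
    "messages": [{"role": "user", "content": "What is LangGraph?"}],
    "thread_id": "abc-123",
})

def to_invoke_args(body: str):
    """Split a flat JSON payload into the (input, config) pair that a
    compiled LangGraph app expects:

        app.invoke({"messages": messages}, config)

    MemorySaver resolves the checkpoint for a conversation via
    config["configurable"]["thread_id"].
    """
    data = json.loads(body)
    graph_input = {"messages": data["messages"]}
    config = {"configurable": {"thread_id": data["thread_id"]}}
    return graph_input, config

graph_input, config = to_invoke_args(raw_body)
# Server side, one would then call: app.invoke(graph_input, config)
```

So rather than finding a single JSON encoding of `messages + config`, the server-side wrapper reconstructs `config` from a thread_id field the client sends alongside the messages.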