
Conversation

@samie (Contributor) commented Mar 12, 2025

- Parse documents using Apache Tika
- Create prompt with document content

[Screenshot 2025-03-12]

Still in draft mode for discussion.
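The two steps above can be sketched roughly as follows. This is a hypothetical prompt layout, not the PR's actual template (which isn't shown here); step 1 would use Apache Tika (e.g. `new Tika().parseToString(file)`) to extract the document text, represented below by a stand-in string so the sketch stays self-contained.

```java
public class DocumentPrompt {

    // Step 2: build a chat prompt that embeds the parsed document content.
    // The wrapper text and delimiters are illustrative assumptions.
    static String buildPrompt(String documentText, String question) {
        return "Answer the question using only the document below.\n\n"
             + "--- DOCUMENT ---\n" + documentText + "\n--- END DOCUMENT ---\n\n"
             + "Question: " + question;
    }

    public static void main(String[] args) {
        // Stand-in for step 1 (in the PR this text would come from Tika parsing).
        String documentText = "Ollama4j is a Java client for the Ollama API.";
        System.out.println(buildPrompt(documentText, "What is Ollama4j?"));
    }
}
```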
@samie (Contributor, Author) commented Mar 12, 2025

@amithkoujalgi With some models the prompting does not work and I get the stack trace below; with other models it works fine. I'm not sure whether it is related to my prompt here, and I haven't looked further yet. Any ideas?

Exception in thread "Thread-2482" java.lang.RuntimeException: java.lang.NullPointerException: Cannot invoke "io.github.ollama4j.models.chat.OllamaChatMessage.getToolCalls()" because the return value of "io.github.ollama4j.models.chat.OllamaChatResponseModel.getMessage()" is null
	at io.github.ollama4j.webui.service.ChatService.ask(ChatService.java:88)
	at io.github.ollama4j.webui.views.chat.ChatWithDocumentView.lambda$onSubmit$3(ChatWithDocumentView.java:192)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: java.lang.NullPointerException: Cannot invoke "io.github.ollama4j.models.chat.OllamaChatMessage.getToolCalls()" because the return value of "io.github.ollama4j.models.chat.OllamaChatResponseModel.getMessage()" is null
	at io.github.ollama4j.models.request.OllamaChatEndpointCaller.callSync(OllamaChatEndpointCaller.java:121)
	at io.github.ollama4j.models.request.OllamaChatEndpointCaller.call(OllamaChatEndpointCaller.java:76)
	at io.github.ollama4j.OllamaAPI.chatStreaming(OllamaAPI.java:816)
	at io.github.ollama4j.OllamaAPI.chat(OllamaAPI.java:789)
	at io.github.ollama4j.webui.service.ChatService.ask(ChatService.java:84)
	... 2 more

@amithkoujalgi (Collaborator) commented
@samie It looks like the issue originates in the Ollama4j library, where OllamaChatResponseModel.getMessage() returns null because the model didn't return a message. Throwing an NPE isn't the right behavior when the model doesn't respond; instead, we should just return an empty string.
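A minimal sketch of the null-check being suggested. The real OllamaChatEndpointCaller internals aren't reproduced here, so `ResponseModel` and `Message` are simplified stand-ins for the library's classes; the point is only the defensive fallback to an empty string instead of dereferencing a null message.

```java
import java.util.List;

public class NullSafeResponse {

    // Simplified stand-ins for OllamaChatResponseModel / OllamaChatMessage.
    record Message(String content, List<String> toolCalls) {}
    record ResponseModel(Message message) {}

    // Instead of calling response.message().toolCalls() directly (which
    // throws an NPE when the model returns no message), check for null
    // and fall back to an empty string.
    static String extractContent(ResponseModel response) {
        Message msg = (response == null) ? null : response.message();
        if (msg == null || msg.content() == null) {
            return ""; // model returned no message: empty string, not an NPE
        }
        return msg.content();
    }

    public static void main(String[] args) {
        // A response whose message is null no longer crashes the caller.
        System.out.println(extractContent(new ResponseModel(null)));
        System.out.println(extractContent(
                new ResponseModel(new Message("hi", List.of()))));
    }
}
```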

Also, could you try updating to the latest version of Ollama4j? A few fixes went in recently, so this issue may already be addressed. I can't recall the specific fixes since I've been juggling multiple projects.
