Ollama chat with streaming mode using TextArea component

3 views (last 30 days)
I referred to the following URL for my implementation.
That example shows how to "Print stream to HTML UI Component", but I'd like to use a TextArea component instead.
My addChat callback function is:
function addChat(app, content)
    % Append the streamed token to the TextArea's Value and refresh the UI
    app.TextArea.Value = [app.TextArea.Value, content];
    pause(0.1);
    drawnow
end
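For reference, here is a variant of the callback I am considering, assuming TextArea.Value stores one cell per line, so each streamed token would be appended to the last line rather than added as a new element (the Value{end} indexing is my assumption):
function addChat(app, content)
    % Sketch: append the streamed token to the last line of the TextArea
    % instead of adding a new element (assumes Value is a cell array of
    % character vectors).
    if isempty(app.TextArea.Value)
        app.TextArea.Value = {char(content)};
    else
        app.TextArea.Value{end} = [app.TextArea.Value{end}, char(content)];
    end
    drawnow limitrate   % allow the UI to refresh between tokens
end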
To set up the Ollama chat:
app.chat = ollamaChat("gemma3:12b", ...
systemPrompt, ...
Temperature=0.2, ...
StreamFun=@(x) app.addChat(x), ...
TimeOut=600);
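For what it's worth, here is a minimal command-line sketch of the same setup, intended only to check whether streaming works outside App Designer (it assumes the llms-with-matlab add-on and a local Ollama server with the gemma3:12b model available; the system prompt, the test prompt, and the fprintf callback are just placeholders):
% Minimal sketch outside the app: the fprintf callback only confirms
% that tokens arrive incrementally via StreamFun.
chat = ollamaChat("gemma3:12b", ...
    "You are a helpful assistant.", ...   % placeholder system prompt
    Temperature=0.2, ...
    StreamFun=@(x) fprintf("%s", x), ...
    TimeOut=600);
[txt, response] = generate(chat, "Say hello in one short sentence.");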
Then, to generate text, I do the following:
prompt = "Context:" + join(selectedDocs, " ") + newline + ...
"Answer the following question: " + query;
app.history = addUserMessage(app.history, prompt);
[text, response] = generate(app.chat, app.history);
app.addChat(text);
app.history = addResponseMessage(app.history, response);
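In case it matters, app.history is created in the startup function, something like the sketch below (messageHistory is my assumption for the conversation container from the same add-on that works with addUserMessage/addResponseMessage):
% Sketch of the history setup (e.g. in startupFcn). messageHistory is the
% conversation container from the llms-with-matlab add-on (my assumption).
app.history = messageHistory;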
However, the text retrieved from the Ollama chat is not displayed in the TextArea in streaming mode.
I would appreciate any help in finding a solution.
Iori

Answers (0)

Release

R2024b
