I installed the Cline plugin in VSCode (Visual Studio Code) on a Mac and ran a quick functional check.
It worked with APIs such as OpenAI's, but when I tried to run Qwen2.5-Coder-32B-Instruct locally through Ollama, the message below kept looping and the task never ran.
[ERROR] You did not use a tool in your previous response! Please retry with a tool use.

# Reminder: Instructions for Tool Use

Tool uses are formatted using XML-style tags. The tool name is enclosed in opening and closing tags, and each parameter is similarly enclosed within its own set of tags. Here's the structure:

<tool_name>
<parameter1_name>value1</parameter1_name>
<parameter2_name>value2</parameter2_name>
...
</tool_name>

For example:

<attempt_completion>
<result>
I have completed the task...
</result>
</attempt_completion>

Always adhere to this format for all tool uses to ensure proper parsing and execution.

# Next Steps

If you have completed the user's task, use the attempt_completion tool. If you require additional information from the user, use the ask_followup_question tool. Otherwise, if you have not completed the task and do not need additional information, then proceed with the next step of the task. (This is an automated message, so do not respond to it conversationally.)

<environment_details>
# VSCode Visible Files
(No visible files)
︙
︙
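Before touching any settings, it may be worth confirming that the model itself responds in Ollama, so you know the problem is specific to the Cline–Ollama combination. A minimal check, assuming the local tag is qwen2.5-coder:32b (confirm yours with ollama list):

ollama list
ollama run qwen2.5-coder:32b "Write FizzBuzz in Python."

If that returns a normal answer, the model itself is fine, and the looping error points at how Cline talks to Ollama.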
The same symptom had already been reported in Cline's GitHub issues. The likely cause is Ollama's small default context window: Cline's system prompt, which contains the tool-use instructions, is far longer than the default, so it gets truncated before the model ever sees it.
If you increase the model context size in Ollama it will work again. To increase the size, create a custom Modelfile, then apply it to the model.
Example contents for Modelfile:
# Modelfile
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
This example will increase it to 32K (choose lower for less VRAM). Apply it to the model with the command:
ollama create -f Modelfile qwen2.5-coder:7b
Source: Cline not sending the task to locally hosted Ollama · Issue #511 · cline/cline · GitHub
First, check the model's current Modelfile:
ollama show [model name] --modelfile
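The output is the Modelfile that Ollama generated for the model. Abridged, it should look something like this (the path and template vary by environment, and the qwen2.5-coder:32b tag is my example):

# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM qwen2.5-coder:32b
FROM /Users/.../blobs/sha256-...
TEMPLATE """..."""
PARAMETER stop <|im_start|>

If no PARAMETER num_ctx line appears here, the model is running with Ollama's default context length (2048 tokens at the time of writing), which is exactly the condition the issue describes.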
Following the issue, create a Modelfile in the current directory with the following contents:
# Modelfile
FROM [model name]
PARAMETER num_ctx 32768
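For the model in this post, the concrete file would look like this (again assuming the tag is qwen2.5-coder:32b; substitute whatever ollama list shows):

# Modelfile
FROM qwen2.5-coder:32b
PARAMETER num_ctx 32768

A 32K context noticeably increases memory use for the KV cache, so if VRAM is tight, a smaller value such as 16384 is a reasonable first try.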
Then run the following from the terminal:
ollama create -f Modelfile [model name]
After that, Cline worked without any problems.
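If you want to verify that the new setting actually took effect, ollama show can print the model's parameters (the model tag is again my example):

ollama show qwen2.5-coder:32b --parameters
# the output should now include a line like:
# num_ctx 32768

Note that running ollama create with the model's existing name overwrites that tag; use a new name (for example qwen2.5-coder:32b-32k) if you want to keep the original definition, and point Cline at that name instead.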