If you’ve been working with Copilot Studio’s autonomous agents, you’ve probably encountered the frustrating “Incomplete” status. The agent runs, but doesn’t complete successfully, leaving you wondering what went wrong. I’ve dealt with this issue multiple times, and here are two common scenarios with practical solutions.
The most puzzling scenario is when your agent run shows “Incomplete” status with 0 completed steps. The agent literally does nothing – no actions, no progress, just stops dead in its tracks. All you see is the trigger input and rationale, which doesn’t help much.
Here’s the trick most people miss: the Activity map view hides the actual error. Click on the “View as” dropdown and switch from “Activity map” to “Transcript”. This reveals the Copilot interaction details where the real error message lives.
In my case, I found the error code OpenAIMaxTokenLengthExceeded buried in the error message. The agent was choking on too much input data before it could even start processing.
The culprit was a lengthy JSON payload from the trigger input. When you’re passing complex data structures from Power Automate flows or other sources, it’s easy to exceed the token limits without realising it. The agent receives this massive input, tries to process it, and immediately fails.
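If you want a quick sanity check on payload size before triggering the agent, a rough character-based heuristic is usually enough. This is an assumption – roughly four characters per token for English/JSON text – and real tokenisers vary by model, so treat it only as an order-of-magnitude guide:

```python
# Rough sanity check for payload size before handing it to an agent.
# Assumption: ~4 characters per token for English/JSON text; actual
# tokenisation varies by model, so this is an order-of-magnitude guide.
def estimate_tokens(text: str) -> int:
    return len(text) // 4

# A verbose trigger payload adds up quickly:
payload = '{"records": [' + ",".join(
    '{"id": %d, "status": "open"}' % i for i in range(500)
) + "]}"
print(f"~{estimate_tokens(payload)} tokens")
```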
I went back to the Power Automate flow that triggers the agent and added an AI Prompt step before the agent call. This step cleans up and simplifies the JSON input, removing unnecessary fields and condensing the data structure. Think of it as pre-processing your data to make it digestible for the agent.
The key is being selective about what information the agent actually needs versus what you’re passing along “just in case.”
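As a minimal sketch of that pre-processing idea in Python (the field names here are hypothetical – in practice the cleanup happened in an AI Prompt step, and you would substitute whichever fields your agent actually consumes):

```python
import json

# Hypothetical field whitelist: keep only what the agent needs,
# drop the "just in case" metadata that inflates the token count.
NEEDED_FIELDS = {"title", "description", "priority"}  # assumed names

def simplify_payload(raw_json: str) -> str:
    record = json.loads(raw_json)
    slim = {k: v for k, v in record.items() if k in NEEDED_FIELDS}
    # Compact separators also shave off whitespace tokens.
    return json.dumps(slim, separators=(",", ":"))

raw = json.dumps({
    "title": "Printer offline",
    "description": "3rd floor printer not responding",
    "priority": "High",
    "createdBy": {"id": "guid-here", "displayName": "Someone"},
    "auditTrail": ["entry"] * 50,
})
print(simplify_payload(raw))
```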
A Power Platform Community forum thread has also discussed a similar issue with the “An error has occurred. Error code: OpenAIMaxTokenLengthExceeded” error.
The second scenario involves agents that start processing but fail during Dataverse operations. You’ll see some completed steps, but the run ends with “Incomplete” status and a generic HTTP 400 error from the Dataverse connector.
The error message “connectorRequestFailure: The connector 'Microsoft Dataverse' returned an HTTP error with code 400” tells you almost nothing useful. Here’s how to dig deeper:
Use the trim() function in the custom value hack (as mentioned in my blog post) instead of selecting from the list – this makes the step dynamic and allows you to paste in the Row Item data.

In my situation, the agent was populating a Choice column with a value that wasn’t in the valid options list. The agent “thought” it was providing a reasonable value, but Dataverse rejected it because it wasn’t one of the predefined choices.
I updated the agent’s general instructions and knowledge sources to explicitly specify the valid options for that Choice column. Instead of letting the agent guess, I gave it a clear list of acceptable values.
The lesson here is that agents need explicit guidance about data constraints, especially for Choice columns, lookups, and other fields with restricted values.
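As a belt-and-braces measure alongside clearer instructions, you can also validate a proposed value against the column’s option set before it ever reaches Dataverse. A sketch, assuming a hypothetical status column with made-up options:

```python
# Hypothetical guard: check a proposed Choice value against the column's
# predefined options before calling Dataverse, instead of letting the
# request fail with an opaque HTTP 400. The option labels are assumptions.
VALID_STATUS_CHOICES = {"New", "In Progress", "Resolved", "Closed"}

def validate_choice(value: str, valid: set) -> str:
    cleaned = value.strip()  # mirrors the trim() trick: stray whitespace breaks matching
    if cleaned not in valid:
        raise ValueError(
            f"'{cleaned}' is not a valid choice; expected one of {sorted(valid)}"
        )
    return cleaned

print(validate_choice("  Resolved ", VALID_STATUS_CHOICES))  # -> Resolved
```

Failing fast like this turns a generic connector error into a message that names the bad value and the acceptable options.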
Both scenarios highlight the importance of being specific with your agent instructions. Don’t assume the agent will figure out data constraints or input limitations on its own.
For triggers, consider what data the agent actually needs and pre-process complex inputs. For Dataverse operations, document the valid values for constrained fields in your knowledge sources.
The Copilot Studio interface doesn’t always surface errors clearly, but with these troubleshooting techniques, you can get to the root cause faster and build more reliable agents. Once the changes are done, you can re-run the agent by resubmitting the particular cloud flow run.
Original Post http://linnzawwin.blogspot.com/2025/07/troubleshooting-copilot-studio.html