feat: include available connectors in search/execute tool descriptions #165
shashi-stackone merged 8 commits into main from
Conversation
Pull request overview
This PR improves LLM trigger reliability for the tool_search / tool_execute “search_and_execute” mode by dynamically injecting the set of available connector keys into the tools’ descriptions, so the model knows what is searchable/executable up front.
Changes:
- Extend `_create_search_tool()` and `_create_execute_tool()` to optionally append an "Available connectors: …" line to each tool's description.
- Update `StackOneToolSet._build_tools()` to best-effort discover connectors via `fetch_tools()` and pass them into the tool constructors.
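The description-injection step can be sketched as follows (a minimal illustration; `build_description` is a hypothetical helper name, not the SDK's actual function):

```python
# Hypothetical helper illustrating the change: append an
# "Available connectors: ..." line when connectors were discovered.
def build_description(base: str, connectors: list[str]) -> str:
    if not connectors:
        return base  # discovery failed or empty: keep the generic text
    return f"{base} Available connectors: {', '.join(connectors)}."
```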
```python
all_tools = self.fetch_tools(account_ids=self._account_ids)
connectors = sorted(all_tools.get_connectors())
if connectors:
    connectors_str = ", ".join(connectors)
```
_build_tools() now calls fetch_tools() during tool definition building. This introduces an eager network call (and potentially a large /mcp catalog fetch) even before the LLM decides to use tool_search/tool_execute, and it will likely be repeated when tool_execute runs (since _ExecuteTool fetches tools on first execute). Consider caching connector discovery results (e.g., per account_ids) and/or seeding execute_tool's internal tools cache from the already-fetched all_tools to avoid duplicate catalog fetches.
Suggested change:

```diff
-all_tools = self.fetch_tools(account_ids=self._account_ids)
-connectors = sorted(all_tools.get_connectors())
-if connectors:
-    connectors_str = ", ".join(connectors)
+# Cache connector discovery results per account_ids to avoid
+# repeated catalog fetches when building tools.
+if not hasattr(self, "_connector_cache"):
+    self._connector_cache = {}
+cache_key = (
+    tuple(sorted(self._account_ids)) if self._account_ids else None
+)
+if cache_key in self._connector_cache:
+    connectors_str = self._connector_cache[cache_key]
+else:
+    all_tools = self.fetch_tools(account_ids=self._account_ids)
+    connectors = sorted(all_tools.get_connectors())
+    if connectors:
+        connectors_str = ", ".join(connectors)
+        self._connector_cache[cache_key] = connectors_str
```
```python
    if connectors:
        connectors_str = ", ".join(connectors)
except Exception:
    logger.debug("Could not discover connectors for tool descriptions")
```
The connector discovery exception handler logs a generic debug message but drops the underlying exception/context. Including the exception (and stack trace via exc_info) would make it much easier to diagnose discovery failures without changing the best-effort behavior.
Suggested change:

```diff
-logger.debug("Could not discover connectors for tool descriptions")
+logger.debug(
+    "Could not discover connectors for tool descriptions",
+    exc_info=True,
+)
```
```diff
+connector_line = f" Available connectors: {connectors}." if connectors else ""
 description = (
     "Search for available tools by describing what you need. "
     "Returns matching tool names, descriptions, and parameter schemas. "
     "Use the returned parameter schemas to know exactly what to pass "
-    "when calling tool_execute."
+    f"when calling tool_execute.{connector_line}"
```
The connectors string is appended verbatim into the tool description with no length/size cap. If an account has many connectors, this can bloat tool definitions and may exceed downstream tool-description limits or increase prompt size unnecessarily. Consider truncating/limiting the number of connectors included (e.g., first N + “and X more”).
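One way to bound the rendered list, as a hedged sketch (the helper name and cutoff are assumptions, not part of the PR):

```python
# Hypothetical helper: render at most `max_shown` connectors and
# summarize the remainder as "and X more" to cap description size.
def format_connectors(connectors: list[str], max_shown: int = 20) -> str:
    shown = connectors[:max_shown]
    rest = len(connectors) - len(shown)
    rendered = ", ".join(shown)
    if rest > 0:
        rendered += f", and {rest} more"
    return rendered
```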
```diff
+connector_line = f" Available connectors: {connectors}." if connectors else ""
 description = (
     "Execute a tool by name with the given parameters. "
     "Use tool_search first to find available tools. "
     "The parameters field must match the parameter schema returned "
-    "by tool_search. Pass parameters as a nested object matching "
-    "the schema structure."
+    f"by tool_search. Pass parameters as a nested object matching the schema structure.{connector_line}"
```
Same as tool_search: the connectors list is appended into this tool’s description without any bounding/truncation. If connector count is high, this can significantly increase tool definition size and potentially exceed provider limits; consider limiting the rendered list and/or total description length.
1 issue found across 2 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
```xml
<file name="stackone_ai/toolset.py">
  <violation number="1" location="stackone_ai/toolset.py:602">
    P2: Top-level timeout precedence is value-dependent, so explicitly
    passing `timeout=60.0` is ignored when `execute.timeout` is set.
  </violation>
</file>
```
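The reported precedence bug can be illustrated with a small sketch (names are illustrative, not the SDK's internals): a value-dependent check cannot distinguish an explicitly passed default from an omitted argument, while a sentinel can.

```python
# Sentinel distinguishes "argument omitted" from "default passed explicitly".
_DEFAULT_TIMEOUT = 60.0
_UNSET = object()

def resolve_timeout(top_level=_UNSET, execute_timeout=None) -> float:
    # An explicit top-level value always wins, even when it equals the
    # default -- a value-dependent check like `top_level != 60.0` would
    # wrongly fall through to execute_timeout here.
    if top_level is not _UNSET:
        return top_level
    if execute_timeout is not None:
        return execute_timeout
    return _DEFAULT_TIMEOUT
```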
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
1 issue found across 2 files (changes from recent commits).
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
```xml
<file name="examples/search_tool_example.py">
  <violation number="1" location="examples/search_tool_example.py:30">
    P2: This example now runs API calls on import instead of behind a main
    entrypoint, causing side effects and import-time failures in
    unconfigured environments.
  </violation>
</file>
```
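A minimal sketch of the suggested fix, assuming the example's setup is moved behind an entrypoint (the body is illustrative, not the example's actual code):

```python
import os

def main() -> None:
    # All API calls live here, so importing this module has no side effects.
    api_key = os.environ.get("STACKONE_API_KEY")
    if not api_key:
        print("Set STACKONE_API_KEY to run this example.")
        return
    # ... construct the toolset and exercise the API here ...

if __name__ == "__main__":
    main()
```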
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```
@@ -1,24 +1,15 @@
#!/usr/bin/env python
```
Removing example code as it's not needed for the new API. Most of the code is removed to avoid duplication.
```python
print(f"Top {len(results_limited)} matches from the full catalog:")
for r in results_limited:
    print(f"  [{r.similarity_score:.2f}] {r.id}")
    print(f"  {r.description}")
```
Not aligned to new SDK API
```python
# Get all available tools using MCP-backed fetch_tools()
all_tools = toolset.fetch_tools(account_ids=_account_ids)
print(f"Total tools available: {len(all_tools)}")
def main() -> None:
```
Don't we just want to have `STACKONE_API_KEY=` as the default config?
This example is the old one, trimmed to use the new API. Can we delete this example altogether?
Yes, the API key alone can be enough, but referring to issue #167 we should optionally be able to pass it here too.
willleeney left a comment
Why are we using the `account_ids` as the default credentials rather than the `STACKONE_API_KEY`?
@willleeney We are using |
Summary
- `_build_tools()` now discovers available connectors via `fetch_tools()` and injects them into the `tool_search` and `tool_execute` descriptions.

A recurring issue with Search/Execute is that search doesn't reliably trigger in the first place because the LLM doesn't know what's searchable. By dynamically listing connector keys in the tool description (e.g. "Available connectors: bamboohr, calendly, jotform"), the LLM has the context it needs to trigger search confidently.

Connector discovery is best-effort; if it fails, descriptions remain generic.
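The best-effort discovery described here can be sketched as follows (a minimal illustration; the toolset's actual internals may differ):

```python
import logging

logger = logging.getLogger(__name__)

def discover_connectors(fetch_tools) -> list[str]:
    # Any failure is swallowed: descriptions then stay generic, while the
    # exception is preserved in the debug log via exc_info.
    try:
        return sorted(fetch_tools().get_connectors())
    except Exception:
        logger.debug("Could not discover connectors", exc_info=True)
        return []
```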
New things

- Configurable execution timeout via `StackOneToolSet(timeout=120)` or in the execute config `execute={"timeout": 120}`.

Resolves:
#166
StackOneHQ/stackone-ai-node#355
Summary by cubic

Include available connectors in `tool_search`/`tool_execute` descriptions and add a configurable execution timeout with account-scoped execution. Added a Workday example and updated examples to pass `STACKONE_API_KEY`; discovery is best-effort and the default timeout is 60s.

New Features

- Discover connectors via `fetch_tools()` with safe fallback and debug logging.
- Add `timeout` to `StackOneToolSet(...)` and `execute.timeout`; top-level param takes precedence and is applied to `httpx.request`.
- Add `examples/workday_integration.py`; update `examples/test_examples.py` to include it; simplify `search_tool_example.py` to the new SDK API and pass `STACKONE_API_KEY`.

Bug Fixes

- Use `execute.account_ids` when building and executing tools.

Written for commit 04b8bd0. Summary will update on new commits.