feat: add skills support and marketplace for discovering and importing extensions #1022
Gkrumbach07 wants to merge 12 commits into main from
Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the settings. Use the following commands to manage reviews, or the checkboxes below for quick actions:
Walkthrough

Adds a marketplace feature: backend Gin handlers for sources, catalogs, Git scans, and installed-item management; frontend UI, types, API proxy routes, React Query hooks, and import dialog; Kubernetes ConfigMap and CRD schema; the operator injects installed items into pods; the runner clones/materializes marketplace items and exposes skills in workflow metadata.
Sequence Diagram(s)

sequenceDiagram
participant User as User (Browser)
participant Frontend as Frontend App
participant FrontProxy as Next.js API
participant Backend as Backend API
participant ConfigMap as Kubernetes ConfigMap
participant Git as Git Repository
participant K8s as Kubernetes API
User->>Frontend: Open Marketplace page
Frontend->>FrontProxy: GET /api/marketplace/sources
FrontProxy->>Backend: GET /marketplace/sources (forwarded)
Backend->>ConfigMap: Read `marketplace-sources` ConfigMap
ConfigMap-->>Backend: sources.json
Backend-->>FrontProxy: MarketplaceSource[]
FrontProxy-->>Frontend: MarketplaceSource[]
Frontend->>FrontProxy: GET /api/marketplace/sources/:idx/catalog
FrontProxy->>Backend: GET /marketplace/sources/:idx/catalog
Backend->>Git: Fetch catalog JSON (30s timeout, cache 5m)
Git-->>Backend: catalog JSON
Backend-->>FrontProxy: Catalog items
FrontProxy-->>Frontend: Catalog items
Frontend->>FrontProxy: POST /api/projects/{proj}/marketplace/scan
FrontProxy->>Backend: POST /projects/{proj}/marketplace/scan
Backend->>Git: Shallow clone repo (2m), scan .claude and frontmatter
Git-->>Backend: ScanResult
Backend-->>FrontProxy: ScanResult
FrontProxy-->>Frontend: ScanResult
Frontend->>FrontProxy: POST /api/projects/{proj}/marketplace/items
FrontProxy->>Backend: POST /projects/{proj}/marketplace/items
Backend->>K8s: Read/Update ProjectSettings.spec.installedItems
K8s-->>Backend: Updated CR
Backend-->>FrontProxy: Success
FrontProxy-->>Frontend: Success
sequenceDiagram
participant Operator as Operator
participant K8s as Kubernetes API
participant Pod as Session Pod
participant Runner as Ambient Runner
participant Git as Git Repository
participant Workspace as Workspace FS
Operator->>K8s: Create session pod
Operator->>K8s: GET ProjectSettings.spec.installedItems
K8s-->>Operator: installedItems JSON
Operator->>Pod: Inject INSTALLED_ITEMS_JSON env
Pod->>Runner: Startup
Runner->>Git: Parallel clone marketplace repos per installedItems
Git-->>Runner: Clones
Runner->>Workspace: Materialize .claude/{skills,commands,agents} (symlinks)
Runner->>Runner: resolve_workspace_paths includes marketplace dirs
Runner->>Runner: /workflow-metadata scans aggregated .claude dirs
Runner-->>Pod: Return metadata (commands, agents, skills)
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~50 minutes

🚥 Pre-merge checks: ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
🧪 Generate unit tests (beta)
```go
ctx, cancel := context.WithTimeout(c.Request.Context(), 2*time.Minute)
defer cancel()

cmd := exec.CommandContext(ctx, "git", "clone", "--depth", "1", "-b", req.Branch, req.URL, tmpDir)
```
Check failure: Code scanning / CodeQL — Command built from user-controlled sources (Critical)
Mitigated in 7e0189b: endpoint moved behind auth (ValidateProjectContext), URL scheme restricted to HTTPS/SSH only, and exec.CommandContext passes args as separate parameters (no shell invocation).
```go
}

// Check for CLAUDE.md
if _, err := os.Stat(filepath.Join(scanRoot, "CLAUDE.md")); err == nil {
```
Check failure: Code scanning / CodeQL — Uncontrolled data used in path expression (High)
Copilot Autofix — AI, 4 days ago
To fix the problem in general, the code should not trust `req.Path` even after simple string checks; instead it should enforce that the final resolved scan directory lives within the temporary clone directory. This is typically done by (1) constructing the candidate path with `filepath.Join`, (2) resolving both the base directory and candidate to absolute, cleaned paths, and (3) verifying that the candidate has the base as a prefix (using a path-aware check that avoids prefix tricks like `/tmp/x` vs `/tmp/xevil`).

The best minimal change here is:

- Keep the existing basic validation on `req.Path` to reject obvious bad inputs (`..`, absolute paths).
- After cloning to `tmpDir`, compute `baseAbs, err := filepath.Abs(tmpDir)`.
- When `req.Path` is non-empty, compute `candidate := filepath.Join(tmpDir, req.Path)`, then `scanRootAbs, err := filepath.Abs(candidate)`.
- Verify that `scanRootAbs` is inside `baseAbs`. A robust way without extra dependencies is to use `filepath.Rel(baseAbs, scanRootAbs)` and ensure it does not start with `..` and is not absolute. If validation fails, return a 400 error.
- Use `scanRootAbs` as `scanRoot` thereafter, so that all subsequent file operations (`os.Stat`, `os.ReadFile`) operate on a verified safe root.

Concretely in components/backend/handlers/marketplace.go:

- After successfully creating `tmpDir`, compute its absolute path for later checks.
- Replace the scan-root determination block (lines 330–334) with a version that resolves and validates the path using `filepath.Abs` and `filepath.Rel`. This keeps the existing behavior (scanning either the clone root or a subdirectory) but guarantees that the chosen `scanRoot` cannot escape the temp dir.

No new imports are required; `filepath` and `strings` are already imported.
```diff
@@ -315,6 +315,14 @@
 	}
 	defer os.RemoveAll(tmpDir)

+	// Resolve temp directory to an absolute path for safe comparisons
+	tmpDirAbs, err := filepath.Abs(tmpDir)
+	if err != nil {
+		log.Printf("ScanGitSource: failed to resolve temp dir: %v", err)
+		c.JSON(http.StatusInternalServerError, gin.H{"error": "Failed to resolve temp directory"})
+		return
+	}
+
 	// Clone the repo
 	ctx, cancel := context.WithTimeout(c.Request.Context(), 2*time.Minute)
 	defer cancel()
@@ -328,9 +336,22 @@
 	}

 	// Determine scan root
-	scanRoot := tmpDir
+	scanRoot := tmpDirAbs
 	if req.Path != "" {
-		scanRoot = filepath.Join(tmpDir, req.Path)
+		candidate := filepath.Join(tmpDirAbs, req.Path)
+		scanRootAbs, err := filepath.Abs(candidate)
+		if err != nil {
+			log.Printf("ScanGitSource: failed to resolve scan path %q: %v", candidate, err)
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid scan path"})
+			return
+		}
+		// Ensure the scan root stays within the cloned repository directory
+		rel, err := filepath.Rel(tmpDirAbs, scanRootAbs)
+		if err != nil || strings.HasPrefix(rel, "..") || filepath.IsAbs(rel) {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path: must be within cloned repository"})
+			return
+		}
+		scanRoot = scanRootAbs
 	}

 	result := ScanResult{
```
Mitigated in 7e0189b: path is validated to reject `..` traversal and absolute paths before use. All path operations are scoped within the temp clone directory.
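The `filepath.Rel`-based containment check that both the autofix and the mitigation describe can be captured in a small standalone helper. This is a sketch, not code from the PR; the helper name `insideBase` and the example paths are illustrative:

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// insideBase resolves userPath against base and reports whether the result
// stays within base. It rejects any relative path that climbs upward, using
// a separator-aware prefix check so a sibling dir like "..foo" is not
// confused with traversal.
func insideBase(base, userPath string) (string, bool) {
	baseAbs, err := filepath.Abs(base)
	if err != nil {
		return "", false
	}
	candidate, err := filepath.Abs(filepath.Join(baseAbs, userPath))
	if err != nil {
		return "", false
	}
	rel, err := filepath.Rel(baseAbs, candidate)
	if err != nil || rel == ".." ||
		strings.HasPrefix(rel, ".."+string(filepath.Separator)) ||
		filepath.IsAbs(rel) {
		return "", false
	}
	return candidate, true
}

func main() {
	// A benign subdirectory, a direct escape, and an escape hidden behind Clean.
	for _, p := range []string{"plugins/skills", "../etc/passwd", "a/../../outside"} {
		resolved, ok := insideBase("/tmp/clone", p)
		fmt.Printf("%-18s ok=%v resolved=%s\n", p, ok, resolved)
	}
}
```

Note that `filepath.Join` already calls `Clean`, so `a/../../outside` collapses before the `Rel` check runs; the check still catches it because the cleaned result sits outside the base.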
```go
// Check for .ambient/ambient.json (workflow)
ambientJSONPath := filepath.Join(scanRoot, ".ambient", "ambient.json")
if data, err := os.ReadFile(ambientJSONPath); err == nil {
```
Check failure: Code scanning / CodeQL — Uncontrolled data used in path expression (High)
Copilot Autofix — AI, 4 days ago
In general, when building filesystem paths from user input, normalize the resulting path and verify it is confined within a known safe base directory. For multi-component paths, this is commonly done by computing an absolute path from the base plus user input, and then checking that the resulting path still has the safe base as a prefix.

In this specific handler, the best minimal fix is to (1) normalize `req.Path` relative to `tmpDir` using `filepath.Abs(filepath.Join(tmpDir, req.Path))`, (2) verify that the resulting absolute path still resides under `tmpDir` (e.g., with a prefix check), and (3) use that validated path as `scanRoot`. This keeps existing behavior (the user can still choose a subdirectory in the cloned repo) but adds canonicalization and a robust containment check. We should also adjust the existing `req.Path` validation to include a leading path-separator rejection (to avoid odd absolute-path behavior on some platforms) and rely primarily on the containment check for security rather than string heuristics alone.

Concretely, in `ScanGitSource` in components/backend/handlers/marketplace.go:

- After `tmpDir` is created and before `scanRoot` is used, compute `safeBase, err := filepath.Abs(tmpDir)` and check the error. Use `safeBase` for subsequent containment checks.
- Replace the existing `req.Path` validation block and `scanRoot` derivation with logic that:
  - Rejects `req.Path` if it contains `..` or if it is an absolute path or starts with a path separator (for extra safety).
  - If `req.Path` is non-empty, computes `candidateRoot, err := filepath.Abs(filepath.Join(safeBase, req.Path))`.
  - Verifies `candidateRoot` starts with `safeBase + string(os.PathSeparator)` or equals `safeBase`; if not, rejects the request.
  - Sets `scanRoot` to `candidateRoot` (or `safeBase` if `req.Path` is empty).
- All subsequent uses of `scanRoot` (including building `ambientJSONPath` for `os.ReadFile`) then operate on a path guaranteed to be within the temporary clone directory.

No new methods or imports are required; we reuse `filepath` and `os` already imported in this file.
```diff
@@ -300,10 +300,12 @@
 		return
 	}

-	// Validate path doesn't escape the clone directory
-	if req.Path != "" && (strings.Contains(req.Path, "..") || filepath.IsAbs(req.Path)) {
-		c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path: must be relative without parent directory references"})
-		return
+	// Basic validation that the requested path is relative and does not contain parent directory references.
+	if req.Path != "" {
+		if strings.Contains(req.Path, "..") || filepath.IsAbs(req.Path) || strings.HasPrefix(req.Path, string(os.PathSeparator)) {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path: must be relative without parent directory references"})
+			return
+		}
 	}

 	// Create temp directory
@@ -315,6 +317,14 @@
 	}
 	defer os.RemoveAll(tmpDir)

+	// Normalize the temp directory path for later containment checks.
+	safeBase, err := filepath.Abs(tmpDir)
+	if err != nil {
+		log.Printf("ScanGitSource: failed to resolve temp dir: %v", err)
+		c.JSON(http.StatusInternalServerError, gin.H{"error": "Failed to resolve temp directory"})
+		return
+	}
+
 	// Clone the repo
 	ctx, cancel := context.WithTimeout(c.Request.Context(), 2*time.Minute)
 	defer cancel()
@@ -327,10 +337,22 @@
 		return
 	}

-	// Determine scan root
-	scanRoot := tmpDir
+	// Determine scan root, ensuring it stays within the cloned repository directory.
+	scanRoot := safeBase
 	if req.Path != "" {
-		scanRoot = filepath.Join(tmpDir, req.Path)
+		candidateRoot, err := filepath.Abs(filepath.Join(safeBase, req.Path))
+		if err != nil {
+			log.Printf("ScanGitSource: failed to resolve scan root for path %q: %v", req.Path, err)
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path"})
+			return
+		}
+		// Ensure the resolved path is within the temporary clone directory.
+		safePrefix := safeBase + string(os.PathSeparator)
+		if candidateRoot != safeBase && !strings.HasPrefix(candidateRoot, safePrefix) {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path: must stay within cloned repository"})
+			return
+		}
+		scanRoot = candidateRoot
 	}

 	result := ScanResult{
```
Mitigated in 7e0189b: path is validated to reject `..` traversal and absolute paths before use. All path operations are scoped within the temp clone directory.
```go
// parseFrontmatter reads YAML frontmatter from a markdown file and extracts name and description.
func parseFrontmatter(path string) (name, description string) {
	f, err := os.Open(path)
```
Check failure: Code scanning / CodeQL — Uncontrolled data used in path expression (High)
Copilot Autofix — AI, 4 days ago
In general, to fix uncontrolled path usage you must either validate the user-supplied path string (disallowing path separators and .. if it should be a single component) or ensure that, after joining it with a base directory and normalizing, the resulting absolute path is still within that base directory.
In this case, req.Path is used to select a subdirectory under tmpDir to scan. The correct approach is to: (1) compute scanRootCandidate := filepath.Join(tmpDir, req.Path); (2) resolve both tmpDir (the base) and scanRootCandidate to absolute, cleaned paths; and (3) confirm that the absolute scanRoot begins with the absolute tmpDir plus a path separator (or is equal to it). If the check fails, reject the request with a 400 Bad Request and do not proceed. Once this constraint is enforced, all later uses of scanRoot—including paths flowing into parseFrontmatter—are guaranteed to reside within the temporary checkout, eliminating the uncontrolled path traversal.
Concretely, in ScanGitSource around lines 330–334, replace the simple filepath.Join(tmpDir, req.Path) assignment with logic that computes and validates an absolute scanRoot relative to an absolute tmpDir. On error or if the prefix check fails, respond with an HTTP error and return. No changes are needed to parseFrontmatter itself, since it will only be called on validated paths derived from scanRoot. We can implement this using only the existing filepath and strings imports; no new imports are necessary.
```diff
@@ -330,7 +330,26 @@
 	// Determine scan root
 	scanRoot := tmpDir
 	if req.Path != "" {
-		scanRoot = filepath.Join(tmpDir, req.Path)
+		baseAbs, err := filepath.Abs(tmpDir)
+		if err != nil {
+			c.JSON(http.StatusInternalServerError, gin.H{"error": "failed to resolve scan root"})
+			return
+		}
+
+		candidate := filepath.Join(tmpDir, req.Path)
+		scanRootAbs, err := filepath.Abs(candidate)
+		if err != nil {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "invalid path"})
+			return
+		}
+
+		// Ensure the resolved scan root stays within the cloned repository directory
+		if scanRootAbs != baseAbs && !strings.HasPrefix(scanRootAbs, baseAbs+string(os.PathSeparator)) {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "path traversal outside repository is not allowed"})
+			return
+		}
+
+		scanRoot = scanRootAbs
 	}

 	result := ScanResult{
```
Mitigated in 7e0189b: path is validated to reject `..` traversal and absolute paths before use. All path operations are scoped within the temp clone directory.
Actionable comments posted: 13
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@components/backend/handlers/marketplace.go`:
- Around line 68-78: The global catalogCache currently holds only one source via
the single cacheKey which evicts other sources; change catalogCache to hold a
map of entries keyed by source (e.g., CatalogURL) by replacing
cacheKey/data/cachedAt with entries map[string]catalogEntry and a catalogEntry
struct { data json.RawMessage; cachedAt time.Time }, preserve mu and
mktCatalogCacheTTL, then update all accesses (reads/writes/expiry checks) in
functions that reference mktCatalogCache, catalogCache, cacheKey, data, and
cachedAt to look up the entry by the source key, create/refresh entries when
stale, and delete or skip absent keys as needed so multiple source catalogs can
be cached concurrently.
- Around line 245-251: The git clone call in ScanGitSource passes unvalidated
req.URL and req.Branch into exec.CommandContext (git clone), exposing
SSRF/local-file risks and malformed branch names; before calling
exec.CommandContext (where you construct cmd with req.URL and req.Branch) parse
and validate req.URL using net/url to allow only https scheme (reject file://,
git://, ssh://, localhost, 127.0.0.1, and private IPs/subnets), and sanitize
req.Branch against a strict regex (e.g., allow only [A-Za-z0-9._/-] and
reasonable length) or reject unsafe characters; on validation failure return a
clear 400 error and avoid invoking exec.CommandContext.
- Around line 501-512: The current removal loop in UninstallItem filters only by
itemId and may remove items from different sources; update the loop over
existingItems (where itemMap is derived) to also read sourceUrl from each item
(e.g., src, ok := itemMap["sourceUrl"].(string)) and only exclude items when
both id == itemID AND src == sourceURL (the incoming uninstall source URL
variable); otherwise append the item to filtered. Ensure you keep the existing
type-assertion guard for itemMap and handle missing sourceUrl safely by treating
it as non-matching so only the exact itemId+sourceUrl pair is removed.
- Around line 438-461: The loop appends newItems into existingItems without
deduplication, causing duplicate installedItems; modify the code around
existingItems/newItems handling to skip any new item whose sourceUrl+itemId
already exists in obj.Object["spec"]["installedItems"]. Build a set/map of keys
from installedItems (read each entry's "sourceUrl" and "itemId"), then for each
item in newItems compute key := item.SourceURL + "|" + item.ItemID and only
append the entry if the key is not present; use the same unique key logic when
comparing to existing unstructured entries (check "sourceUrl" and "itemId") and
add newly appended keys to the set to avoid duplicates within the same loop.
In `@components/backend/routes.go`:
- Around line 19-22: The POST /marketplace/scan route is currently
unauthenticated and calls handlers.ScanGitSource; protect it by moving its
registration behind the project/user auth middleware (e.g., the same middleware
applying ValidateProjectContext()) or by attaching an
authentication/authorization middleware to that route so only authenticated
projects/users can call ScanGitSource; additionally consider adding
rate-limiting or an allowlist check inside handlers.ScanGitSource to further
mitigate SSRF/resource-abuse.
In `@components/frontend/src/components/import-source-dialog.tsx`:
- Around line 73-76: The onSuccess handler currently replaces the user’s
preselection by calling setSelectedIds(new Set(data.items.map(i => i.id))) —
change it to preserve prefillItems (or intersect with it) instead of selecting
all: when prefillItems (prop/state) is present, compute the intersection between
new discovered IDs (data.items.map(i => i.id)) and prefillItems and pass that
set to setSelectedIds; otherwise fall back to selecting all discovered IDs;
leave setImportAsWorkflow(data.isWorkflow) as-is. Ensure you reference the
prefillItems variable and the setSelectedIds / onSuccess handler when making
this change.
- Around line 42-48: When the dialog opens it must sync state from props: add a
useEffect that runs when open becomes true and updates gitUrl via
setGitUrl(prefillUrl) and selectedIds via setSelectedIds(new Set(prefillItems)),
and also reset branch, path, and importAsWorkflow as needed (use setBranch,
setPath, setImportAsWorkflow or call resetForm inside the effect). Place the
effect near the existing state declarations and ensure its dependency array
includes open, prefillUrl, and prefillItems so form fields reflect incoming
props each time the dialog is opened.
In `@components/manifests/base/marketplace-sources-configmap.yaml`:
- Around line 1-16: The ConfigMap named "marketplace-sources" is placed outside
the kustomization root and breaks the base/core manifest build; move the
marketplace-sources ConfigMap manifest into the base kustomization's reachable
files (for example place this YAML under components/manifests/base/core/) or
alternatively add it to an upper-level kustomization's resources so base/core
can reference it without referencing upward; ensure the ConfigMap filename and
the resource entry in the kustomization.yaml match the "marketplace-sources"
metadata name so the build can load it.
In `@components/operator/internal/handlers/sessions.go`:
- Around line 919-932: The code currently computes INSTALLED_ITEMS_JSON and
appends it only to the init container env list (base); compute the
installedItems JSON once (e.g. into a local installedItemsJSON string) when
reading ProjectSettings (psGVR/psObj) and, after successful serialization,
append a corev1.EnvVar{Name:"INSTALLED_ITEMS_JSON", Value: installedItemsJSON}
to both the init container env list (base) and the main runner env list (the
variable that builds the ambient runner/env for the runner process—add to that
list as well) so the runner
(ambient_runner/bridges/claude/bridge.py::_clone_marketplace_items) receives it;
preserve the existing error logging and errors.IsNotFound handling.
In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py`:
- Around line 114-120: The current logic sets clone_dir = marketplace_base /
repo_name which causes collisions for different registries or branches; change
clone_dir to incorporate a stable unique key derived from source_url and
source_branch (for example include the branch name or a short hash of
source_url+source_branch) so each checkout gets its own path (use repo_name as a
base but append branch/hash). Replace the early-return when clone_dir.exists()
with reconciliation: open the existing checkout (clone_dir), fetch and checkout
the requested branch (or pull updates) so the workspace is updated instead of
silently skipping; update any code that calls repo_name/clone_dir to use the new
path derivation and ensure _inject_token() is still used for clone_url.
- Around line 122-126: The code clones the repo root when has_workflow is true
and then skips materialization, which ignores the workflow's
installed_item.model.sourcePath so subdirectory workflows are never discovered;
update the branch of code that calls _run_git when has_workflow (and the similar
block around the later 168-176 region) to detect installed_item.model.sourcePath
and either clone/checkout into (or copy/move the repo's) subdirectory: combine
clone_dir with that sourcePath (e.g., use clone_dir / sourcePath as the
effective workspace for downstream operations) or extract that subpath after
cloning so downstream discovery (setup_marketplace_dirs,
content_workflow_metadata) sees the .claude located in the workflow's
sourcePath. Ensure you reference installed_item.model.sourcePath, has_workflow,
_run_git, and clone_dir when making the change.
- Around line 61-69: The _inject_token function currently embeds tokens into
HTTPS clone URLs (and uses substring host checks); stop writing credentials into
the URL and instead return the clean URL while using an ephemeral auth mechanism
at clone time. Change _inject_token (or replace its callers) to parse the URL
(e.g., via urllib.parse), validate the exact hostname against allowed values (do
not use substring matches), never insert tokens into the returned URL, and
ensure origin remains the clean HTTPS URL; perform authentication when invoking
git (e.g., configure a temporary credential helper, use git -c
http.extraheader="Authorization: token $TOKEN" for the single clone operation,
or use GIT_ASKPASS) so tokens are not persisted to .git/config. Ensure host
comparison uses exact netloc matching and remove any code paths that write
credentials into the URL string.
In `@components/runners/ambient-runner/ambient_runner/endpoints/content.py`:
- Around line 531-562: The early return when _find_active_workflow_dir() yields
no workflow_dir prevents marketplace scanning; instead of returning immediately,
set the default ambient config (the same object returned today) and continue: if
not workflow_dir, assign ambient_config to the default config and skip scanning
the workflow's .claude/ (i.e., don't call _scan_claude_dir on wf_path), but
still run the marketplace scanning logic that uses _get_workspace_path() and
_scan_claude_dir(item_dir, source_label=...). Update references to
wf_path/_parse_ambient_config so they only run when workflow_dir exists, and
ensure commands, agents, skills are initialized empty before marketplace
scanning so marketplace items are returned when no workflow is selected.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: d58667ad-78ac-43f3-ac67-383e1b8a44d6
📒 Files selected for processing (19)
- components/backend/handlers/marketplace.go
- components/backend/routes.go
- components/frontend/src/app/projects/[name]/marketplace/page.tsx
- components/frontend/src/app/projects/[name]/sessions/[sessionName]/components/sessions-sidebar.tsx
- components/frontend/src/app/projects/[name]/sessions/[sessionName]/lib/types.ts
- components/frontend/src/components/import-source-dialog.tsx
- components/frontend/src/components/session/MessagesTab.tsx
- components/frontend/src/services/api/marketplace.ts
- components/frontend/src/services/api/workflows.ts
- components/frontend/src/services/queries/use-marketplace.ts
- components/frontend/src/types/marketplace.ts
- components/frontend/src/types/workflow.ts
- components/manifests/base/core/kustomization.yaml
- components/manifests/base/crds/projectsettings-crd.yaml
- components/manifests/base/marketplace-sources-configmap.yaml
- components/operator/internal/handlers/sessions.go
- components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py
- components/runners/ambient-runner/ambient_runner/endpoints/content.py
- components/runners/ambient-runner/ambient_runner/platform/workspace.py
```go
type catalogCache struct {
	mu       sync.RWMutex
	data     json.RawMessage
	cachedAt time.Time
	cacheKey string
}

var (
	mktCatalogCache    = &catalogCache{}
	mktCatalogCacheTTL = 5 * time.Minute
)
```
🧹 Nitpick | 🔵 Trivial
Global catalog cache stores only one source at a time.
The catalogCache uses a single cacheKey field, meaning switching between source indices will evict the previous source's cached catalog. If users frequently switch between multiple marketplace sources, this could lead to unnecessary repeated fetches.
Consider using a map-based cache if multiple sources need to be cached concurrently:
```go
type catalogCache struct {
	mu      sync.RWMutex
	entries map[string]catalogEntry // keyed by CatalogURL
}

type catalogEntry struct {
	data     json.RawMessage
	cachedAt time.Time
}
```

Also applies to: 152-163
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/backend/handlers/marketplace.go` around lines 68 - 78, The global
catalogCache currently holds only one source via the single cacheKey which
evicts other sources; change catalogCache to hold a map of entries keyed by
source (e.g., CatalogURL) by replacing cacheKey/data/cachedAt with entries
map[string]catalogEntry and a catalogEntry struct { data json.RawMessage;
cachedAt time.Time }, preserve mu and mktCatalogCacheTTL, then update all
accesses (reads/writes/expiry checks) in functions that reference
mktCatalogCache, catalogCache, cacheKey, data, and cachedAt to look up the entry
by the source key, create/refresh entries when stale, and delete or skip absent
keys as needed so multiple source catalogs can be cached concurrently.
Acknowledged. Single-source cache is acceptable for MVP since we currently have one registry source. Will switch to a per-source map if we add more sources.
```go
cmd := exec.CommandContext(ctx, "git", "clone", "--depth", "1", "-b", req.Branch, req.URL, tmpDir)
cmd.Env = append(os.Environ(), "GIT_TERMINAL_PROMPT=0")
if output, err := cmd.CombinedOutput(); err != nil {
	log.Printf("ScanGitSource: git clone failed: %v, output: %s", err, string(output))
	c.JSON(http.StatusBadRequest, gin.H{"error": "Failed to clone repository"})
	return
}
```
Consider validating Git URL and branch inputs.
The req.URL and req.Branch are user-provided and passed directly to git clone. While exec.CommandContext doesn't invoke a shell (mitigating command injection), consider:
- SSRF risk: users could specify internal/private URLs (e.g., `git://internal-host/...`, `file:///etc/...`).
- Branch validation: branches with unusual characters could cause unexpected behavior.
Consider adding URL scheme validation (allow only https://) and branch name sanitization:
🛡️ Suggested validation

```diff
 func ScanGitSource(c *gin.Context) {
 	var req ScanRequest
 	if err := c.ShouldBindJSON(&req); err != nil {
 		c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid request body"})
 		return
 	}
 	if req.URL == "" || req.Branch == "" {
 		c.JSON(http.StatusBadRequest, gin.H{"error": "url and branch are required"})
 		return
 	}
+	// Validate URL scheme to prevent SSRF
+	if !strings.HasPrefix(req.URL, "https://") {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "Only HTTPS URLs are allowed"})
+		return
+	}
```
Verify each finding against the current code and only fix it if needed.
In `@components/backend/handlers/marketplace.go` around lines 245 - 251, The git
clone call in ScanGitSource passes unvalidated req.URL and req.Branch into
exec.CommandContext (git clone), exposing SSRF/local-file risks and malformed
branch names; before calling exec.CommandContext (where you construct cmd with
req.URL and req.Branch) parse and validate req.URL using net/url to allow only
https scheme (reject file://, git://, ssh://, localhost, 127.0.0.1, and private
IPs/subnets), and sanitize req.Branch against a strict regex (e.g., allow only
[A-Za-z0-9._/-] and reasonable length) or reject unsafe characters; on
validation failure return a clear 400 error and avoid invoking
exec.CommandContext.
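A hedged sketch of the validation this comment asks for, using `net/url` plus the conservative branch alphabet the prompt suggests. `validateScanInput` is a hypothetical helper name; real SSRF hardening would also resolve the host and blocklist localhost and private ranges, which is omitted here:

```go
package main

import (
	"fmt"
	"net/url"
	"regexp"
	"strings"
)

// branchRe allows only the characters the review suggests for branch names.
var branchRe = regexp.MustCompile(`^[A-Za-z0-9._/-]{1,255}$`)

// validateScanInput enforces an HTTPS-only URL (rejecting file://, git://,
// ssh:// and other SSRF vectors) and a restricted branch name before either
// value reaches `git clone`.
func validateScanInput(rawURL, branch string) error {
	u, err := url.Parse(rawURL)
	if err != nil {
		return fmt.Errorf("invalid url: %w", err)
	}
	if u.Scheme != "https" {
		return fmt.Errorf("only https:// URLs are allowed, got %q", u.Scheme)
	}
	if u.Host == "" {
		return fmt.Errorf("url is missing a host")
	}
	// Reject leading '-' so a branch cannot be mistaken for a git flag.
	if strings.HasPrefix(branch, "-") {
		return fmt.Errorf("branch must not start with '-'")
	}
	if !branchRe.MatchString(branch) {
		return fmt.Errorf("branch contains disallowed characters")
	}
	return nil
}

func main() {
	cases := []struct{ url, branch string }{
		{"https://github.com/org/repo.git", "main"},
		{"file:///etc/passwd", "main"},
		{"https://github.com/org/repo.git", "main; rm -rf /"},
	}
	for _, c := range cases {
		fmt.Printf("%s %q -> %v\n", c.url, c.branch, validateScanInput(c.url, c.branch))
	}
}
```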
components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py — outdated comment (resolved)
```python
repo_name = source_url.split("/")[-1].removesuffix(".git")
clone_dir = marketplace_base / repo_name
clone_url = _inject_token(source_url)

if clone_dir.exists():
    logger.info(f"Marketplace: {repo_name} already exists, skipping")
    return
```
Use a unique checkout directory per source and branch.
clone_dir = marketplace_base / repo_name collapses different registries and different branches of the same repo into the same path, and the exists() fast-path silently skips the later install. That will serve the wrong contents as soon as two installed sources share a basename, or when a persisted workspace needs to pick up new items/branches on a later session. Derive the directory from a stable key like sourceUrl + sourceBranch and reconcile existing checkouts instead of returning early.
🧰 Tools
🪛 Ruff (0.15.6)
[warning] 119-119: Logging statement uses f-string
(G004)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py`
around lines 114 - 120, The current logic sets clone_dir = marketplace_base /
repo_name which causes collisions for different registries or branches; change
clone_dir to incorporate a stable unique key derived from source_url and
source_branch (for example include the branch name or a short hash of
source_url+source_branch) so each checkout gets its own path (use repo_name as a
base but append branch/hash). Replace the early-return when clone_dir.exists()
with reconciliation: open the existing checkout (clone_dir), fetch and checkout
the requested branch (or pull updates) so the workspace is updated instead of
silently skipping; update any code that calls repo_name/clone_dir to use the new
path derivation and ensure _inject_token() is still used for clone_url.
if has_workflow:
    rc, _, err = await _run_git(
        "clone", "--depth", "1", "-b", branch,
        clone_url, str(clone_dir),
    )
Workflow imports ignore sourcePath.
When a group contains a workflow item, the code clones the repo root and then skips the materialization path for that item type. The installed-item model carries `sourcePath`, but downstream discovery in `platform/workspace.py::setup_marketplace_dirs()` and `endpoints/content.py::content_workflow_metadata()` only looks for `.claude` at the top of each marketplace directory. Workflows that live under a subdirectory will therefore import successfully yet never be discovered at runtime.
Also applies to: 168-176
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py`
around lines 122 - 126, The code clones the repo root when has_workflow is true
and then skips materialization, which ignores the workflow's
installed_item.model.sourcePath so subdirectory workflows are never discovered;
update the branch of code that calls _run_git when has_workflow (and the similar
block around the later 168-176 region) to detect installed_item.model.sourcePath
and either clone/checkout into (or copy/move the repo's) subdirectory: combine
clone_dir with that sourcePath (e.g., use clone_dir / sourcePath as the
effective workspace for downstream operations) or extract that subpath after
cloning so downstream discovery (setup_marketplace_dirs,
content_workflow_metadata) sees the .claude located in the workflow's
sourcePath. Ensure you reference installed_item.model.sourcePath, has_workflow,
_run_git, and clone_dir when making the change.
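Of the two options in that prompt, the symlink variant is the smaller change and can be sketched as follows. Hedged: `expose_workflow_claude` is a hypothetical helper, and a sparse-checkout may be preferable for large repos.

```python
from pathlib import Path

def expose_workflow_claude(clone_dir: Path, source_path: str) -> None:
    """Surface a subdirectory workflow's .claude at the clone root so
    discovery that scans marketplace/*/.claude can find it."""
    if not source_path:
        return  # workflow already lives at the repo root
    nested = clone_dir / source_path / ".claude"
    root_link = clone_dir / ".claude"
    if nested.is_dir() and not root_link.exists():
        root_link.symlink_to(nested, target_is_directory=True)
```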
Good catch. For workflow items, the full repo is cloned so sourcePath determines the scan root but the entire repo content is available. The _find_active_workflow_dir() in content.py handles locating the workflow within the clone. Will add explicit sourcePath handling if needed.
Actionable comments posted: 5
♻️ Duplicate comments (4)
components/frontend/src/components/import-source-dialog.tsx (1)
42-48: ⚠️ Potential issue | 🟠 Major
Add `useEffect` to sync form state when dialog opens with new props.
`useState(prefillUrl)` captures the initial value only. If the parent updates `prefillUrl` or `prefillItems` while the dialog stays mounted (closed), reopening it will show stale values. Add an effect to sync state when `open` becomes `true`.
🔧 Proposed fix
-import { useState, useMemo } from "react";
+import { useState, useMemo, useEffect } from "react";

 // After the useState declarations (around line 48):
+useEffect(() => {
+  if (open) {
+    setGitUrl(prefillUrl);
+    setSelectedIds(new Set(prefillItems));
+    setBranch("main");
+    setPath("");
+    setImportAsWorkflow(false);
+    scanMutation.reset();
+  }
+}, [open, prefillUrl, prefillItems]);

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/frontend/src/components/import-source-dialog.tsx` around lines 42 - 48, When the dialog opens with new props the local state (gitUrl, branch, path, selectedIds, importAsWorkflow) can be stale because useState(prefillUrl) and useState(new Set(prefillItems)) only initialize once; add a useEffect that watches the dialog's open prop and when open becomes true reset state by calling setGitUrl(prefillUrl), setBranch("main") or keep current default, setPath(""), setSelectedIds(new Set(prefillItems)), and setImportAsWorkflow(false) (or the intended default) so the form reflects the latest prefillUrl/prefillItems; reference the gitUrl/setGitUrl, selectedIds/setSelectedIds, prefillUrl, prefillItems, and open in the effect.

components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py (1)
135-143: ⚠️ Potential issue | 🟡 Minor
Different registries with the same repository basename still collide.
While appending the branch name (line 137) helps distinguish branches, two different registries hosting a repo with the same basename (e.g., `github.com/org1/helpers.git` and `gitlab.com/org2/helpers.git`) will still map to the same directory. Consider including a hash or domain prefix in `dir_name`.
🛠️ Suggested approach
+import hashlib
+
 repo_name = source_url.split("/")[-1].removesuffix(".git")
-dir_name = f"{repo_name}-{branch}" if branch != "main" else repo_name
+# Include URL hash to avoid collisions between different registries
+url_hash = hashlib.sha256(source_url.encode()).hexdigest()[:8]
+dir_name = f"{repo_name}-{url_hash}" if branch == "main" else f"{repo_name}-{branch}-{url_hash}"

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py` around lines 135 - 143, The current dir_name logic (repo_name, branch) causes collisions for identical repo basenames hosted on different registries; update the directory naming in the block using repo_name, branch, dir_name and clone_dir so it incorporates a registry/domain or short hash derived from source_url (e.g., include the parsed netloc or a short sha1/sha256 of source_url) — e.g., compute registry = parsed netloc or url_hash = short_hash(source_url) and build dir_name = f"{registry}-{repo_name}-{branch}" or f"{repo_name}-{branch}-{url_hash}" to ensure unique clone directories across registries and keep the existing exists check/logging paths intact.

components/backend/handlers/marketplace.go (2)
68-73: 🧹 Nitpick | 🔵 Trivial
Use per-source catalog cache entries instead of a single global slot.
Line 68-Line 73 and Line 152-Line 163 keep only one cached catalog at a time; switching sources constantly evicts cache and forces repeat remote fetches.
As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
Also applies to: 152-163
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 68 - 73, The current catalogCache struct (mu, data, cachedAt, cacheKey) holds a single global cached catalog which causes evictions when switching sources; change it to store per-source entries (e.g., replace data/cacheKey with a map[string]entry where key is source ID and entry contains json.RawMessage and cachedAt) and update all accessors used in handlers (lookup/set logic around catalogCache usage in the marketplace handler functions) to read/write by source key under the same mu (or use RWMutex per-entry) so each source keeps its own cached catalog and avoids repeated remote fetches.
232-235: ⚠️ Potential issue | 🟠 Major
Strengthen Git URL validation beyond string-prefix checks.
Line 232-Line 235 can still reach internal targets (`https://localhost/...`, private IPs, and SSH-style internal hosts). This keeps SSRF/internal-network access risk open in `git clone`.
Proposed hardening direction
-	if !strings.HasPrefix(req.URL, "https://") && !strings.HasPrefix(req.URL, "git@") {
-		c.JSON(http.StatusBadRequest, gin.H{"error": "Only HTTPS and SSH Git URLs are allowed"})
-		return
-	}
+	parsed, err := url.Parse(req.URL)
+	if err != nil || parsed.Scheme != "https" || parsed.Hostname() == "" {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "Only valid HTTPS Git URLs are allowed"})
+		return
+	}
+	if isLoopbackOrPrivateHost(parsed.Hostname()) {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "URL host is not allowed"})
+		return
+	}

As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
Also applies to: 257-257
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 232 - 235, The current prefix check using req.URL and strings.HasPrefix (in the marketplace handler) is insufficient for SSRF: parse the incoming req.URL with net/url.Parse for HTTPS URLs and, for SSH-style git URLs (git@... or ssh://), extract the host portion (after '@' and before ':' or path) instead of relying on prefix checks; then resolve/validate the host by rejecting loopback names (localhost), literal IPs that are in private/reserved ranges (check with net.ParseIP and net.IP.IsLoopback/containment against RFC1918/CIDR ranges), and reject hosts that resolve to private/internal IPs (use net.LookupIP and validate each resulting IP); ensure the validation logic replaces the strings.HasPrefix branch and is applied for both HTTPS and SSH git formats (references: req.URL, strings.HasPrefix checks, and the validation block in the marketplace handler).
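The host-validation logic from that prompt can be sketched in Python for readability (the real handler is Go and would use net/url plus net.LookupIP; `is_allowed_git_url` is a hypothetical helper, and a complete fix still re-checks after DNS resolution):

```python
import ipaddress
from urllib.parse import urlparse

def is_allowed_git_url(raw_url: str) -> bool:
    """Reject non-HTTPS schemes and literal loopback/private/reserved hosts.

    Hostnames that merely *resolve* to private IPs still need a DNS check
    (net.LookupIP on the Go side); this covers schemes and literal IPs.
    """
    parsed = urlparse(raw_url)
    if parsed.scheme != "https" or not parsed.hostname:
        return False
    host = parsed.hostname
    if host == "localhost":
        return False
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        return True  # not a literal IP; resolve and re-check before cloning
    return not (ip.is_loopback or ip.is_private or ip.is_reserved or ip.is_link_local)
```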
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@components/backend/handlers/marketplace.go`:
- Around line 436-488: The handler currently accepts newItems (slice of
InstalledItem) and persists fields without validation; add server-side
validation inside the loop that processes newItems (before building entry and
appending to existingItems) to reject malformed or unsafe values: validate
SourceURL with url.ParseRequestURI (return 400 on invalid), ensure ItemID and
ItemName meet allowed patterns/lengths (e.g., regex and max length), restrict
ItemType to an allowlist, and sanitize SourcePath and FilePath using
filepath.Clean and reject paths that are absolute or contain ".." (or otherwise
disallowed characters); implement these checks in the same scope that computes
key := item.SourceURL + "\x00" + item.ItemID and reference InstalledItem,
newItems, key, entry, existingItems so invalid items are skipped or cause a
BadRequest response before persisting.
In `@components/frontend/src/components/import-source-dialog.tsx`:
- Around line 53-66: scannedItems is recreated every render causing the grouped
useMemo to recompute; wrap the scannedItems expression in useMemo so it is
stable (e.g. replace const scannedItems = scanMutation.data?.items ?? [] with
const scannedItems = useMemo(() => scanMutation.data?.items ?? [],
[scanMutation.data?.items]) ), keep grouped as useMemo([... , [scannedItems]]),
and similarly memoize isWorkflow if needed.
In `@components/frontend/src/services/queries/use-marketplace.ts`:
- Around line 58-67: The uninstall mutation in useUninstallItem currently calls
marketplaceApi.uninstallItem with only projectName and itemId, which can remove
sibling installs because installs are keyed by (sourceUrl, itemId); update
mutationFn in useUninstallItem to accept and forward sourceUrl as well (add
sourceUrl to the typed parameter object) and call
marketplaceApi.uninstallItem(projectName, itemId, sourceUrl) so the API receives
the full key and only the intended install is removed; ensure any callers and
the marketplaceApi.uninstallItem signature are updated accordingly to accept the
optional/required sourceUrl parameter.
In `@components/operator/internal/handlers/sessions.go`:
- Around line 1198-1207: Extract the ProjectSettings lookup and installedItems
JSON serialization into a single helper/local variable (e.g., compute
installedItemsJSON once using types.GetProjectSettingsResource(),
config.DynamicClient.Resource(psGVR).Namespace(...).Get(...), and
unstructured.NestedSlice) and then reuse that value when appending
corev1.EnvVar{Name: "INSTALLED_ITEMS_JSON", Value: installedItemsJSON} in both
the init-container env builder and the main env builder where you currently
append to base; replace the duplicated blocks with a reference to the shared
installedItemsJSON (ensure nil/empty checks remain and only append the env var
when serialization succeeded).
In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py`:
- Around line 145-154: When has_workflow is true and you clone the repo via
_run_git, ensure the marketplace discovery can see a workflow that lives under a
non-root sourcePath: either perform a sparse-checkout that only checks out the
repository path at sourcePath/.claude (modify the _run_git invocation to
init/sparse-checkout and set core.sparseCheckout and add sourcePath/.claude), or
after the normal clone (in the same block that checks rc != 0) create a symlink
at str(clone_dir)/.claude pointing to str(clone_dir)/{sourcePath}/.claude when
sourcePath is non-empty; update the code around has_workflow, _run_git, and
clone_dir to create the symlink (or add sparse-checkout) so content.py’s
marketplace discovery (which expects marketplace/*/.claude at repo root) will
find the workflow.
---
Duplicate comments:
In `@components/backend/handlers/marketplace.go`:
- Around line 68-73: The current catalogCache struct (mu, data, cachedAt,
cacheKey) holds a single global cached catalog which causes evictions when
switching sources; change it to store per-source entries (e.g., replace
data/cacheKey with a map[string]entry where key is source ID and entry contains
json.RawMessage and cachedAt) and update all accessors used in handlers
(lookup/set logic around catalogCache usage in the marketplace handler
functions) to read/write by source key under the same mu (or use RWMutex
per-entry) so each source keeps its own cached catalog and avoids repeated
remote fetches.
- Around line 232-235: The current prefix check using req.URL and
strings.HasPrefix (in the marketplace handler) is insufficient for SSRF: parse
the incoming req.URL with net/url.Parse for HTTPS URLs and, for SSH-style git
URLs (git@... or ssh://), extract the host portion (after '@' and before ':' or
path) instead of relying on prefix checks; then resolve/validate the host by
rejecting loopback names (localhost), literal IPs that are in private/reserved
ranges (check with net.ParseIP and net.IP.IsLoopback/containment against
RFC1918/CIDR ranges), and reject hosts that resolve to private/internal IPs (use
net.LookupIP and validate each resulting IP); ensure the validation logic
replaces the strings.HasPrefix branch and is applied for both HTTPS and SSH git
formats (references: req.URL, strings.HasPrefix checks, and the validation block
in the marketplace handler).
In `@components/frontend/src/components/import-source-dialog.tsx`:
- Around line 42-48: When the dialog opens with new props the local state
(gitUrl, branch, path, selectedIds, importAsWorkflow) can be stale because
useState(prefillUrl) and useState(new Set(prefillItems)) only initialize once;
add a useEffect that watches the dialog's open prop and when open becomes true
reset state by calling setGitUrl(prefillUrl), setBranch("main") or keep current
default, setPath(""), setSelectedIds(new Set(prefillItems)), and
setImportAsWorkflow(false) (or the intended default) so the form reflects the
latest prefillUrl/prefillItems; reference the gitUrl/setGitUrl,
selectedIds/setSelectedIds, prefillUrl, prefillItems, and open in the effect.
In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py`:
- Around line 135-143: The current dir_name logic (repo_name, branch) causes
collisions for identical repo basenames hosted on different registries; update
the directory naming in the block using repo_name, branch, dir_name and
clone_dir so it incorporates a registry/domain or short hash derived from
source_url (e.g., include the parsed netloc or a short sha1/sha256 of
source_url) — e.g., compute registry = parsed netloc or url_hash =
short_hash(source_url) and build dir_name = f"{registry}-{repo_name}-{branch}"
or f"{repo_name}-{branch}-{url_hash}" to ensure unique clone directories across
registries and keep the existing exists check/logging paths intact.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 7e72a98b-dcc2-4ca0-ba33-4cbe42c94489
📒 Files selected for processing (10)
components/backend/handlers/marketplace.go
components/backend/routes.go
components/frontend/src/components/import-source-dialog.tsx
components/frontend/src/services/api/marketplace.ts
components/frontend/src/services/queries/use-marketplace.ts
components/manifests/base/core/kustomization.yaml
components/manifests/base/core/marketplace-sources-configmap.yaml
components/operator/internal/handlers/sessions.go
components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py
components/runners/ambient-runner/ambient_runner/endpoints/content.py
var newItems []InstalledItem
if err := c.ShouldBindJSON(&newItems); err != nil {
	c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid request body"})
	return
}

gvr := GetProjectSettingsResource()
obj, err := k8sDyn.Resource(gvr).Namespace(project).Get(c.Request.Context(), "projectsettings", v1.GetOptions{})
if err != nil {
	log.Printf("InstallItems: failed to get ProjectSettings for %s: %v", project, err)
	c.JSON(http.StatusInternalServerError, gin.H{"error": "Failed to read project settings"})
	return
}

// Get existing items
existingItems, _, _ := unstructured.NestedSlice(obj.Object, "spec", "installedItems")

// Build a set of existing (sourceUrl, itemId) pairs for dedup
existingKeys := make(map[string]bool)
for _, item := range existingItems {
	if m, ok := item.(map[string]interface{}); ok {
		src, _ := m["sourceUrl"].(string)
		id, _ := m["itemId"].(string)
		existingKeys[src+"\x00"+id] = true
	}
}

// Convert new items to unstructured and append (skip duplicates)
for _, item := range newItems {
	key := item.SourceURL + "\x00" + item.ItemID
	if existingKeys[key] {
		continue
	}
	entry := map[string]interface{}{
		"sourceUrl": item.SourceURL,
		"itemId":    item.ItemID,
		"itemType":  item.ItemType,
	}
	if item.SourceBranch != "" {
		entry["sourceBranch"] = item.SourceBranch
	}
	if item.SourcePath != "" {
		entry["sourcePath"] = item.SourcePath
	}
	if item.ItemName != "" {
		entry["itemName"] = item.ItemName
	}
	if item.FilePath != "" {
		entry["filePath"] = item.FilePath
	}
	existingItems = append(existingItems, entry)
	existingKeys[key] = true
}
Validate InstalledItem payload server-side before persisting.
Line 436-Line 488 accepts and stores unvalidated sourceUrl, sourcePath, itemType, and identifiers. Malformed or unsafe values can be persisted and later break or abuse runner-side clone/import flows.
Proposed validation guard
var newItems []InstalledItem
if err := c.ShouldBindJSON(&newItems); err != nil {
c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid request body"})
return
}
+for i, item := range newItems {
+ if item.SourceURL == "" || item.ItemID == "" || item.ItemType == "" {
+ c.JSON(http.StatusBadRequest, gin.H{"error": fmt.Sprintf("Invalid item at index %d", i)})
+ return
+ }
+ if !strings.HasPrefix(item.SourceURL, "https://") {
+ c.JSON(http.StatusBadRequest, gin.H{"error": "sourceUrl must be HTTPS"})
+ return
+ }
+ if item.SourcePath != "" && (filepath.IsAbs(item.SourcePath) || strings.Contains(item.SourcePath, "..")) {
+ c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid sourcePath"})
+ return
+ }
+ switch item.ItemType {
+ case "skill", "command", "agent", "workflow":
+ default:
+ c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid itemType"})
+ return
+ }
+}

As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/backend/handlers/marketplace.go` around lines 436 - 488, The
handler currently accepts newItems (slice of InstalledItem) and persists fields
without validation; add server-side validation inside the loop that processes
newItems (before building entry and appending to existingItems) to reject
malformed or unsafe values: validate SourceURL with url.ParseRequestURI (return
400 on invalid), ensure ItemID and ItemName meet allowed patterns/lengths (e.g.,
regex and max length), restrict ItemType to an allowlist, and sanitize
SourcePath and FilePath using filepath.Clean and reject paths that are absolute
or contain ".." (or otherwise disallowed characters); implement these checks in
the same scope that computes key := item.SourceURL + "\x00" + item.ItemID and
reference InstalledItem, newItems, key, entry, existingItems so invalid items
are skipped or cause a BadRequest response before persisting.
const scannedItems = scanMutation.data?.items ?? [];
const isWorkflow = scanMutation.data?.isWorkflow ?? false;

const grouped = useMemo(() => {
  const groups: Record<string, DiscoveredItem[]> = {
    skill: [],
    command: [],
    agent: [],
  };
  for (const item of scannedItems) {
    (groups[item.type] ??= []).push(item);
  }
  return groups;
}, [scannedItems]);
Wrap scannedItems in useMemo to stabilize the dependency.
The linter correctly flags that scannedItems (line 53) is recalculated on every render, which can cause the grouped memo (line 56) to recompute unnecessarily.
🔧 Proposed fix
-const scannedItems = scanMutation.data?.items ?? [];
+const scannedItems = useMemo(
+ () => scanMutation.data?.items ?? [],
+ [scanMutation.data?.items]
+);

🧰 Tools
🪛 GitHub Check: Frontend Lint and Type Check
[warning] 53-53:
The 'scannedItems' logical expression could make the dependencies of useMemo Hook (at line 66) change on every render. To fix this, wrap the initialization of 'scannedItems' in its own useMemo() Hook
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/components/import-source-dialog.tsx` around lines 53
- 66, scannedItems is recreated every render causing the grouped useMemo to
recompute; wrap the scannedItems expression in useMemo so it is stable (e.g.
replace const scannedItems = scanMutation.data?.items ?? [] with const
scannedItems = useMemo(() => scanMutation.data?.items ?? [],
[scanMutation.data?.items]) ), keep grouped as useMemo([... , [scannedItems]]),
and similarly memoize isWorkflow if needed.
export function useUninstallItem() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: ({
      projectName,
      itemId,
    }: {
      projectName: string;
      itemId: string;
    }) => marketplaceApi.uninstallItem(projectName, itemId),
Pass sourceUrl in uninstall mutation to avoid deleting sibling installs.
Line 61-Line 67 remove by itemId only. Because installs are keyed by (sourceUrl, itemId), this can uninstall multiple items when IDs overlap across sources.
Proposed fix
export function useUninstallItem() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: ({
projectName,
+ sourceUrl,
itemId,
}: {
projectName: string;
+ sourceUrl: string;
itemId: string;
- }) => marketplaceApi.uninstallItem(projectName, itemId),
+ }) => marketplaceApi.uninstallItem(projectName, itemId, sourceUrl),
onSuccess: (_data, variables) => {
void queryClient.invalidateQueries({
queryKey: marketplaceKeys.installed(variables.projectName),
});
},
});
}As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
export function useUninstallItem() {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: ({
      projectName,
      sourceUrl,
      itemId,
    }: {
      projectName: string;
      sourceUrl: string;
      itemId: string;
    }) => marketplaceApi.uninstallItem(projectName, itemId, sourceUrl),
    onSuccess: (_data, variables) => {
      void queryClient.invalidateQueries({
        queryKey: marketplaceKeys.installed(variables.projectName),
      });
    },
  });
}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/services/queries/use-marketplace.ts` around lines 58
- 67, The uninstall mutation in useUninstallItem currently calls
marketplaceApi.uninstallItem with only projectName and itemId, which can remove
sibling installs because installs are keyed by (sourceUrl, itemId); update
mutationFn in useUninstallItem to accept and forward sourceUrl as well (add
sourceUrl to the typed parameter object) and call
marketplaceApi.uninstallItem(projectName, itemId, sourceUrl) so the API receives
the full key and only the intended install is removed; ensure any callers and
the marketplaceApi.uninstallItem signature are updated accordingly to accept the
optional/required sourceUrl parameter.
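The composite-key behavior that prompt asks for is easy to state as a tiny sketch (Python here for brevity; `uninstall` is a hypothetical stand-in for the backend's filter, which keys installs by sourceUrl plus itemId):

```python
def uninstall(installed: list[dict], source_url: str, item_id: str) -> list[dict]:
    """Remove only the install matching both halves of the composite key,
    leaving same-id items from other sources untouched."""
    return [
        it for it in installed
        if not (it.get("sourceUrl") == source_url and it.get("itemId") == item_id)
    ]
```

Filtering on `itemId` alone would have removed both entries in the example below.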
components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py
…g extensions

Add support for Claude Code skills (.claude/skills/), a workspace-scoped Marketplace page for browsing and importing skills/commands/agents from external registries, and per-item installation stored in ProjectSettings CR.

Key changes:

Runner:
- Parse .claude/skills/*/SKILL.md in workflow metadata endpoint
- Extract shared _scan_claude_dir() helper for commands/agents/skills
- Add setup_marketplace_dirs() to include /workspace/marketplace/ in add_dirs
- Sparse/shallow clone installed marketplace items at session startup
- Parallel cloning via asyncio.gather() for multiple sources

Backend:
- New marketplace-sources ConfigMap with AI Helpers as default registry
- Extend ProjectSettings CRD with installedItems array
- New marketplace.go handlers: list sources, fetch catalog, scan Git repos, install/uninstall items per workspace
- ScanGitSource endpoint: clone, discover skills/commands/agents/workflows

Operator:
- Read installedItems from ProjectSettings CR and pass as INSTALLED_ITEMS_JSON env var to runner container

Frontend:
- WorkflowSkill type + skills in workflow metadata and autocomplete
- Marketplace page with Browse (curated catalogs) and Installed tabs
- Universal ImportSourceDialog: scan Git URL, discover items, select and import
- Marketplace API layer, React Query hooks, sidebar navigation
- MARKETPLACE_CATEGORY_COLORS shared constant
- Move POST /marketplace/scan behind ValidateProjectContext auth
- Validate Git URL scheme (only HTTPS/SSH) and path (no .. traversal)
- Add INSTALLED_ITEMS_JSON to runner container env vars (was only on init-hydrate)
- Use git credential helper instead of embedding tokens in clone URLs
- Deduplicate items on install (by sourceUrl + itemId)
- Filter uninstall by sourceUrl + itemId (not just itemId)
- Fix marketplace scanning to work without active workflow
- Use unique clone dir per source+branch to avoid collisions
- Fix import dialog to respect prefillItems instead of select-all override
7e0189b to 7bf19ef (Compare)
…ency

- Hoist ProjectSettings CR read above both init-hydrate and runner container env closures — single K8s API call instead of two
- Fix skills source field to only include 'source' when source_label is non-empty, matching commands/agents behavior
Actionable comments posted: 4
♻️ Duplicate comments (7)
components/operator/internal/handlers/sessions.go (1)
922-935: ⚠️ Potential issue | 🟠 Major
Read `installedItems` once and reuse the same value for both containers.
These two blocks perform separate live reads of `ProjectSettings.spec.installedItems`. If one read observes a newer version—or one fails while the other succeeds—the init container and runner can start with different `INSTALLED_ITEMS_JSON` values, which makes marketplace setup nondeterministic. The runner block also drops failures silently. Compute the JSON once before building either env list and append that same value to both containers.
Also applies to: 1201-1211
components/frontend/src/components/import-source-dialog.tsx (1)
42-48: ⚠️ Potential issue | 🟠 Major
Sync the form from the latest prefill props when the dialog opens.
`useState(prefillUrl/prefillItems)` only captures the first render. Because `MarketplacePage` keeps this component mounted, opening the dialog from a different catalog card later still shows whatever was left in local state instead of the new source/item props. Refresh `gitUrl`, `selectedIds`, and the rest of the form from the latest props each time `open` flips to `true`.
Also applies to: 127-134
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/frontend/src/components/import-source-dialog.tsx` around lines 42 - 48, The component initializes local state (gitUrl, branch, path, selectedIds, importAsWorkflow) from props (prefillUrl, prefillItems) only on first render; fix this by adding a useEffect that watches the dialog's open flag and the prefill props and resets those states when open becomes true—e.g., useEffect(() => { if (open) { setGitUrl(prefillUrl); setSelectedIds(new Set(prefillItems)); setBranch(prefillBranchOrDefault); setPath(prefillPathOrEmpty); setImportAsWorkflow(prefillImportAsWorkflowOrFalse); } }, [open, prefillUrl, prefillItems, /* other prefill props */]);—apply the same reset logic for the other state hooks referenced around lines 127-134 so the form reflects the latest prefill each time the dialog opens.

components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py (2)
109-117: ⚠️ Potential issue | 🟠 Major
Derive a stable, sanitized checkout key and reconcile existing clones.
`repo_name` + raw `branch` is still not unique across different source URLs, and branch names containing `/` turn this into a nested path. The `clone_dir.exists()` fast-path is also too aggressive: on a persisted workspace it skips both repo refresh and any new item materialization for that source. Key the directory off a stable hash of `sourceUrl + branch`, and refresh/rebuild existing checkouts instead of returning early.
Verify each finding against the current code and only fix it if needed. In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py` around lines 109 - 117, The current checkout key (repo_name + branch) can collide and create nested paths; replace it with a sanitized, stable key derived from a short hash of source_url + branch (e.g., use repo_name + "-" + short_hash) and ensure any "/" in branch is sanitized so it never creates nested directories; compute dir_name from that key and set clone_dir = marketplace_base / dir_name (use source_url and branch to build the hash). Also change the fast-path in the existing conditional that uses clone_dir.exists(): do not return early—when clone_dir.exists() trigger a refresh/rebuild of the checkout (pull, re-clone, or remove-and-clone) and continue materialization so persisted workspaces still get refreshed and new items are materialized; keep the original variables repo_name, branch, source_url, marketplace_base, clone_dir and clone_url to locate where to apply these changes.
119-123: ⚠️ Potential issue | 🟠 Major

Workflow installs still ignore non-root `sourcePath`s.

When a group contains a workflow item, this code clones the repo root and skips materialization entirely. Downstream discovery in `components/runners/ambient-runner/ambient_runner/endpoints/content.py` only scans `marketplace/*/.claude`, so a workflow whose `.claude` directory lives under `sourcePath/.claude` will install successfully and then never be discovered at runtime.

Also applies to: 172-173
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py` around lines 119 - 123, The workflow-handling branch in the Claude bridge uses has_workflow to clone the repo root via _run_git (clone_url, clone_dir, branch) and then skips materialization, which ignores non-root workflow sourcePath; change the logic so that when a workflow item has a non-root sourcePath you either perform a sparse checkout or clone the repo and then copy/check out only the sourcePath into clone_dir (or adjust clone_dir to point at the sourcePath subdirectory) so the .claude directory ends up at the expected path for downstream discovery; update the same handling used at the other occurrence (lines referenced around 172-173) so both locations respect the workflow's sourcePath.

components/frontend/src/services/api/marketplace.ts (1)
61-66: ⚠️ Potential issue | 🟠 Major

Pass `sourceUrl` in uninstall requests to avoid cross-source deletions.

Line 65 deletes by `itemId` path only. Installs are keyed by `(sourceUrl, itemId)`, so this can remove unintended sibling installs when IDs overlap.

Proposed fix
```diff
 export async function uninstallItem(
   projectName: string,
-  itemId: string
+  itemId: string,
+  sourceUrl: string
 ): Promise<void> {
-  await apiClient.delete(`/projects/${projectName}/marketplace/items/${itemId}`);
+  await apiClient.delete(
+    `/projects/${projectName}/marketplace/items/${itemId}`,
+    { params: { sourceUrl } }
+  );
 }
```

Also update callers (e.g., `useUninstallItem`) to provide `sourceUrl` in mutation variables. As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/frontend/src/services/api/marketplace.ts` around lines 61 - 66, The uninstallItem function currently deletes by itemId only, which can remove installs from the wrong source when IDs collide; modify uninstallItem(projectName: string, itemId: string) to accept a sourceUrl (e.g., uninstallItem(projectName: string, itemId: string, sourceUrl: string)) and include sourceUrl in the delete request (as a query parameter or body as expected by the backend) so the server can key by (sourceUrl, itemId); then update callers such as useUninstallItem to supply sourceUrl in their mutation variables and adjust any types/signatures accordingly.

components/backend/handlers/marketplace.go (2)
232-235: ⚠️ Potential issue | 🟠 Major

Harden Git URL validation against internal-network SSRF.

Prefix checks still allow targets like `https://127.0.0.1/...`, `https://*.svc/...`, or `git@internal-host:...`. This endpoint can be abused to probe internal services.

Proposed hardening direction
```diff
-	// Validate URL scheme (only allow https:// and git:// to prevent SSRF with file://, ftp://, etc.)
-	if !strings.HasPrefix(req.URL, "https://") && !strings.HasPrefix(req.URL, "git@") {
+	// Validate and restrict URL/host (scheme + disallow localhost/private/internal hosts)
+	if err := validateGitSourceURL(req.URL); err != nil {
 		c.JSON(http.StatusBadRequest, gin.H{"error": "Only HTTPS and SSH Git URLs are allowed"})
 		return
 	}
```

```go
// Add helper: parse host, reject localhost, loopback, link-local, private CIDRs, and cluster-local hostnames.
func validateGitSourceURL(raw string) error { /* ... */ }
```

As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
Also applies to: 257-257
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 232 - 235, The current prefix-based check for req.URL in the handler is insufficient to prevent SSRF to internal addresses; add a new helper function validateGitSourceURL(raw string) error and call it from the handler instead of the simple strings.HasPrefix checks. validateGitSourceURL should parse the URL/SSH target, extract the hostname/IP, reject localhost/127.0.0.0/8, ::1, link-local addresses, and all private CIDRs, and also block cluster-local hostnames (e.g., *.svc, *.cluster.local) and bare internal hostnames; return a clear error for any rejected host so the handler can return http.StatusBadRequest. Ensure you use this helper when validating req.URL (and any other uses noted at the same location) so all repo fetches are vetted centrally.
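To make the hardening direction concrete, here is a sketch of the host checks in Python (the real helper would be Go, and would additionally need to parse `git@host:path` SSH URLs, which this sketch simply rejects):

```python
import ipaddress
from urllib.parse import urlparse

# Cluster-local and loopback hostname suffixes to block (illustrative list).
BLOCKED_SUFFIXES = (".svc", ".cluster.local", "localhost")

def is_allowed_git_host(raw_url: str) -> bool:
    """Reject loopback, link-local, private, and cluster-local targets (sketch)."""
    parsed = urlparse(raw_url)
    if parsed.scheme not in ("https", "git"):
        return False
    host = parsed.hostname or ""
    if host == "localhost" or any(host.endswith(s) for s in BLOCKED_SUFFIXES):
        return False
    try:
        ip = ipaddress.ip_address(host)
    except ValueError:
        # Hostname, not an IP literal; require a dot so bare internal names are rejected.
        return "." in host
    return not (ip.is_loopback or ip.is_private or ip.is_link_local or ip.is_reserved)
```

Note that hostname checks alone do not stop DNS rebinding; a production helper would also resolve the host and validate the resulting addresses.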
436-488: ⚠️ Potential issue | 🟠 Major

Validate `InstalledItem` payload fields before persisting.

`InstallItems` currently stores user-provided `sourceUrl`, `sourcePath`, `filePath`, and `itemType` without server-side validation. Invalid or unsafe values can corrupt persisted state and create downstream path/sync risks.

Proposed validation guard
```diff
 var newItems []InstalledItem
 if err := c.ShouldBindJSON(&newItems); err != nil {
 	c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid request body"})
 	return
 }
+for i, item := range newItems {
+	if item.SourceURL == "" || item.ItemID == "" || item.ItemType == "" {
+		c.JSON(http.StatusBadRequest, gin.H{"error": fmt.Sprintf("Invalid item at index %d", i)})
+		return
+	}
+	if err := validateGitSourceURL(item.SourceURL); err != nil {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid sourceUrl"})
+		return
+	}
+	if item.SourcePath != "" && (filepath.IsAbs(item.SourcePath) || strings.Contains(filepath.Clean(item.SourcePath), "..")) {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid sourcePath"})
+		return
+	}
+	if item.FilePath != "" && (filepath.IsAbs(item.FilePath) || strings.Contains(filepath.Clean(item.FilePath), "..")) {
+		c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid filePath"})
+		return
+	}
+	switch item.ItemType {
+	case "skill", "command", "agent", "workflow":
+	default:
+		c.JSON(http.StatusBadRequest, gin.H{"error": fmt.Sprintf("Invalid itemType at index %d", i)})
+		return
+	}
+}
```

As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 436 - 488, InstallItems is persisting user-supplied InstalledItem fields without validation; before converting items to unstructured in the InstallItems handler (the loop that reads newItems and builds entry maps), validate each InstalledItem: parse and validate SourceURL with url.Parse and allow only safe schemes (e.g., http, https, git), enforce ItemType is one of the accepted enum values, reject or normalize empty/oversized fields (set max length), ensure SourcePath and FilePath are not absolute and do not contain path traversal segments (no ".." or leading "/" ), and sanitize ItemName and SourceBranch (strip control chars / enforce regex). If any item fails validation, return c.JSON(http.StatusBadRequest, gin.H{"error": "<field> invalid"}) and skip persisting; only append validated entries to existingItems.
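The path-traversal rules above (reject absolute paths and any `..` escape, allow empty) can be expressed compactly; a hypothetical Python sketch of the same checks:

```python
import posixpath

# Accepted item types per the proposed guard (illustrative).
ALLOWED_ITEM_TYPES = {"skill", "command", "agent", "workflow"}

def is_safe_relative_path(p: str) -> bool:
    """True for empty or relative paths that cannot escape their root."""
    if p == "":
        return True
    if p.startswith("/"):
        return False
    # Normalize first so "a/../../x" is caught even though it has no leading "..".
    norm = posixpath.normpath(p)
    return norm != ".." and not norm.startswith("../")
```

Normalizing before checking matters: a naive substring test on the raw path would also reject harmless segments like `a/b/../c`, while missing nothing dangerous.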
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@components/backend/handlers/marketplace.go`:
- Around line 259-261: In ScanGitSource, stop logging the raw git command output
(the output variable from cmd.CombinedOutput) because it can contain credentials
or sensitive remote details; instead log only the error value and a
sanitized/truncated form of the output. Replace the log.Printf call that
mentions output with a log that includes err and at most a small,
URL/credential-stripped snippet (implement a helper like sanitizeGitOutput or
sanitizeRepoOutput that removes credentials from URLs and truncates to e.g. 200
chars), and keep the HTTP response generic (e.g. "Failed to clone repository")
as already done.
In `@components/frontend/src/app/projects/`[name]/marketplace/page.tsx:
- Around line 146-154: HandleUninstall and rendering use itemId alone which
causes duplicate keys and ambiguous uninstalls; change to use a composite
identity (e.g., `${sourceUrl}:${itemId}` or an object { sourceUrl, itemId }
matching backend identity) throughout: update handleUninstall signature/usage to
accept the composite id or both sourceUrl and itemId and pass both to
uninstallMutation (replace { projectName, itemId } with { projectName,
sourceUrl, itemId } or the composite key the backend expects), update any React
keys and spinner logic to use the composite key (where items are
mapped/rendered), and make the same change for the other occurrence referenced
(the block around lines 293-318) so keys, loading states, and API calls
consistently use the composite identity.
- Around line 137-143: The import handler only forwards source.url and item id,
causing curated sources' branch/path to be lost; update handleImportFromCatalog
(and the similar handler at the other location) to also forward
MarketplaceSource.branch and MarketplaceSource.path into the import dialog's
prefill state (e.g., add setPrefillBranch(source.branch) and
setPrefillPath(source.path) or extend the prefillUrl/prefill state to include
branch/path) so the dialog re-scans the correct branch/subdirectory before
opening via setImportDialogOpen(true).
In `@components/frontend/src/components/import-source-dialog.tsx`:
- Around line 97-109: The current handleImport flow only builds InstalledItem
entries from selectedIds in scannedItems, preventing import of repos that are a
single workflow with no discovered items; update handleImport (and the similar
handlers around the other ranges) to detect when there are no selected items but
the scan result indicates a workflow (use ScanResult.isWorkflow and
ScanResult.workflowName), and in that case construct one InstalledItem using
gitUrl/branch/path, itemId derived from the repo (or empty/placeholder),
itemType set to "workflow" (or based on importAsWorkflow), and itemName from
ScanResult.workflowName; ensure UI enabling logic that disables the action when
selectedIds is empty is adjusted so workflow-only repos can still trigger the
import.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: df0e87e8-c377-4691-bc44-024d0251eac9
📒 Files selected for processing (19)
components/backend/handlers/marketplace.go
components/backend/routes.go
components/frontend/src/app/projects/[name]/marketplace/page.tsx
components/frontend/src/app/projects/[name]/sessions/[sessionName]/components/sessions-sidebar.tsx
components/frontend/src/app/projects/[name]/sessions/[sessionName]/lib/types.ts
components/frontend/src/components/import-source-dialog.tsx
components/frontend/src/components/session/MessagesTab.tsx
components/frontend/src/services/api/marketplace.ts
components/frontend/src/services/api/workflows.ts
components/frontend/src/services/queries/use-marketplace.ts
components/frontend/src/types/marketplace.ts
components/frontend/src/types/workflow.ts
components/manifests/base/core/kustomization.yaml
components/manifests/base/core/marketplace-sources-configmap.yaml
components/manifests/base/crds/projectsettings-crd.yaml
components/operator/internal/handlers/sessions.go
components/runners/ambient-runner/ambient_runner/bridges/claude/bridge.py
components/runners/ambient-runner/ambient_runner/endpoints/content.py
components/runners/ambient-runner/ambient_runner/platform/workspace.py
```go
if output, err := cmd.CombinedOutput(); err != nil {
	log.Printf("ScanGitSource: git clone failed: %v, output: %s", err, string(output))
	c.JSON(http.StatusBadRequest, gin.H{"error": "Failed to clone repository"})
```
Avoid logging raw git clone output from user-supplied repos.
Line 260 logs full command output, which can include credential-bearing URLs or sensitive remote error details.
Proposed fix
```diff
 if output, err := cmd.CombinedOutput(); err != nil {
-	log.Printf("ScanGitSource: git clone failed: %v, output: %s", err, string(output))
+	_ = output // optionally inspect/redact before logging
+	log.Printf("ScanGitSource: git clone failed: %v", err)
 	c.JSON(http.StatusBadRequest, gin.H{"error": "Failed to clone repository"})
 	return
 }
```

As per coding guidelines, "Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity."
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```go
if output, err := cmd.CombinedOutput(); err != nil {
	_ = output // optionally inspect/redact before logging
	log.Printf("ScanGitSource: git clone failed: %v", err)
	c.JSON(http.StatusBadRequest, gin.H{"error": "Failed to clone repository"})
	return
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/backend/handlers/marketplace.go` around lines 259 - 261, In
ScanGitSource, stop logging the raw git command output (the output variable from
cmd.CombinedOutput) because it can contain credentials or sensitive remote
details; instead log only the error value and a sanitized/truncated form of the
output. Replace the log.Printf call that mentions output with a log that
includes err and at most a small, URL/credential-stripped snippet (implement a
helper like sanitizeGitOutput or sanitizeRepoOutput that removes credentials
from URLs and truncates to e.g. 200 chars), and keep the HTTP response generic
(e.g. "Failed to clone repository") as already done.
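A sketch of the suggested sanitization helper, written in Python for brevity (the backend helper would be Go); the userinfo-stripping regex and 200-character cap mirror the prompt's suggestion:

```python
import re

def sanitize_git_output(output: str, limit: int = 200) -> str:
    """Redact URL userinfo (user:token@) from git output, then truncate for logging."""
    redacted = re.sub(r"://[^/\s@]+@", "://<redacted>@", output)
    return redacted[:limit]
```

This keeps enough context to debug clone failures without leaking credential-bearing remote URLs into logs.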
```tsx
const handleImportFromCatalog = (
  item: MarketplaceCatalogItem,
  source: MarketplaceSource
) => {
  setPrefillUrl(source.url);
  setPrefillItems([item.id]);
  setImportDialogOpen(true);
```
Preserve the curated source branch/path when opening the import dialog.
`MarketplaceSource` carries `branch` and optional `path`, but this flow only forwards the URL and item id. The dialog then rescans `main` at the repo root, so any curated source that points at a release branch or subdirectory will import the wrong content or fail to find the preselected item.
Also applies to: 346-355
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/app/projects/`[name]/marketplace/page.tsx around
lines 137 - 143, The import handler only forwards source.url and item id,
causing curated sources' branch/path to be lost; update handleImportFromCatalog
(and the similar handler at the other location) to also forward
MarketplaceSource.branch and MarketplaceSource.path into the import dialog's
prefill state (e.g., add setPrefillBranch(source.branch) and
setPrefillPath(source.path) or extend the prefillUrl/prefill state to include
branch/path) so the dialog re-scans the correct branch/subdirectory before
opening via setImportDialogOpen(true).
```tsx
const handleUninstall = (itemId: string, itemName: string) => {
  uninstallMutation.mutate(
    { projectName, itemId },
    {
      onSuccess: () => toast.success(`Removed "${itemName}"`),
      onError: (error) =>
        toast.error(error instanceof Error ? error.message : "Remove failed"),
    }
  );
```
Use a composite identity for installed items.
Installed items are not uniquely identified by `itemId` alone. Using only `itemId` here gives duplicate React keys, shows the spinner on multiple rows when ids collide, and makes uninstall requests ambiguous for two sources that expose the same command/skill id. Key and uninstall by at least `sourceUrl` + `itemId` (and whatever else the backend uses for identity).
Also applies to: 293-318
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/app/projects/`[name]/marketplace/page.tsx around
lines 146 - 154, HandleUninstall and rendering use itemId alone which causes
duplicate keys and ambiguous uninstalls; change to use a composite identity
(e.g., `${sourceUrl}:${itemId}` or an object { sourceUrl, itemId } matching
backend identity) throughout: update handleUninstall signature/usage to accept
the composite id or both sourceUrl and itemId and pass both to uninstallMutation
(replace { projectName, itemId } with { projectName, sourceUrl, itemId } or the
composite key the backend expects), update any React keys and spinner logic to
use the composite key (where items are mapped/rendered), and make the same
change for the other occurrence referenced (the block around lines 293-318) so
keys, loading states, and API calls consistently use the composite identity.
Add route handlers to proxy marketplace API calls from the frontend to the backend:

- GET /api/marketplace/sources → backend marketplace sources ConfigMap
- GET /api/marketplace/sources/[idx]/catalog → remote catalog data.json
- POST /api/projects/[name]/marketplace/scan → scan Git repo for items
- GET/POST /api/projects/[name]/marketplace/items → list/install items
- DELETE /api/projects/[name]/marketplace/items/[itemId] → uninstall
All marketplace proxy routes were manually setting X-User-ID/X-User-Groups which doesn't match the auth forwarding pattern used by the rest of the app. Switch to buildForwardHeadersAsync() which correctly forwards X-Forwarded-User, X-Forwarded-Access-Token, Authorization, and handles mock auth for local development.
The ai-helpers data.json uses {tools: {skills: [...], commands: [...], agents: [...]}}
but the frontend expects {items: [...]}. Add normalizeCatalog() to flatten
the nested structure into a consistent {items: [...]} array with category
field. Also handles flat array and pre-normalized formats.
Fixes 'Query data cannot be undefined' error for catalog queries.
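The normalization this commit describes can be sketched as follows (a Python stand-in for the TypeScript `normalizeCatalog`; the `tools`/`items` key names follow the commit message, everything else is illustrative):

```python
def normalize_catalog(raw: object) -> dict:
    """Flatten {tools: {skills: [...], ...}}, a bare list, or {items: [...]} into {items: [...]}."""
    if isinstance(raw, list):
        return {"items": raw}
    if isinstance(raw, dict):
        if "items" in raw:
            return {"items": raw["items"]}  # already normalized
        tools = raw.get("tools")
        if isinstance(tools, dict):
            items = []
            for category, entries in tools.items():
                for entry in entries or []:
                    # Tag each entry with the tools-map key it came from.
                    items.append({**entry, "category": category})
            return {"items": items}
    return {"items": []}  # unknown shape: return an empty, defined value
```

Always returning a defined `{items: [...]}` object is what avoids React Query's "Query data cannot be undefined" error for unexpected catalog shapes.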
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
components/operator/internal/handlers/sessions.go (1)
1206-1215: ⚠️ Potential issue | 🟠 Major

Keep `INSTALLED_ITEMS_JSON` authoritative in the runner env.

This value comes from `ProjectSettings`, but the following `replaceOrAppendEnvVars(base, userEnvVars)` can replace it. That lets a session start with a different marketplace set than the workspace actually has installed.

Suggested fix
```diff
-	if installedItemsJSON != "" {
-		base = append(base, corev1.EnvVar{Name: "INSTALLED_ITEMS_JSON", Value: installedItemsJSON})
-	}
-
 	// Inject registry-defined env vars (e.g. RUNNER_TYPE, RUNNER_STATE_DIR)
 	base = appendNonConflictingEnvVars(base, registryEnvVars)

 	// Apply user-provided environmentVariables with replace-if-exists
 	// semantics (overrides are intentional for the runner container)
 	base = replaceOrAppendEnvVars(base, userEnvVars)
+
+	if installedItemsJSON != "" {
+		base = replaceOrAppendEnvVars(base, []corev1.EnvVar{
+			{Name: "INSTALLED_ITEMS_JSON", Value: installedItemsJSON},
+		})
+	}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/operator/internal/handlers/sessions.go` around lines 1206 - 1215, INSTALLED_ITEMS_JSON must be authoritative and not be overridden by userEnvVars: move the logic so that replaceOrAppendEnvVars(base, userEnvVars) is applied before adding INSTALLED_ITEMS_JSON, or alternatively ensure you append/set INSTALLED_ITEMS_JSON after calling replaceOrAppendEnvVars so the final base contains corev1.EnvVar{Name: "INSTALLED_ITEMS_JSON", Value: installedItemsJSON} that wins; update the block around appendNonConflictingEnvVars, replaceOrAppendEnvVars, and the installedItemsJSON append (references: installedItemsJSON, base, appendNonConflictingEnvVars, replaceOrAppendEnvVars) accordingly.
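The precedence argument is easy to check with a toy model of `replaceOrAppendEnvVars` (a Python stand-in; the operator code is Go): for any given name, whichever list is applied last wins, so the authoritative `INSTALLED_ITEMS_JSON` must be applied after `userEnvVars`.

```python
def replace_or_append(base, extra):
    """Replace-if-exists by env var name, else append (models replaceOrAppendEnvVars)."""
    out = list(base)
    for var in extra:
        for i, existing in enumerate(out):
            if existing["name"] == var["name"]:
                out[i] = var  # replace in place, preserving position
                break
        else:
            out.append(var)
    return out
```

Applying `[{"name": "INSTALLED_ITEMS_JSON", ...}]` as the final `extra` list guarantees it overrides any user-supplied value, which is exactly the reordering the suggested fix performs.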
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@components/frontend/src/app/api/marketplace/sources/`[idx]/catalog/route.ts:
- Around line 9-11: The upstream path uses the raw params.idx when building the
fetch URL, so reserved characters can alter the backend path; before
interpolating into `${BACKEND_URL}/marketplace/sources/${idx}/catalog` encode
idx (e.g., via encodeURIComponent(String(idx))) and use that encoded value in
the fetch call; locate the const { idx } = await params and replace/interpose an
encodedIdx variable (used when calling fetch) while leaving
buildForwardHeadersAsync and BACKEND_URL usage unchanged.
In `@components/frontend/src/app/api/marketplace/sources/route.ts`:
- Around line 4-15: The GET handler calls
fetch(`${BACKEND_URL}/marketplace/sources`) with no deadline; wrap that fetch in
an AbortController timeout (e.g., 5–10s) inside the GET function: create an
AbortController, pass signal to fetch, set a timer to call controller.abort()
and clear the timer after fetch completes, and catch AbortError to return a 504
(or appropriate) Response.json error; update the same pattern in the sibling
marketplace catalog route and keep buildForwardHeadersAsync as-is when passing
headers to fetch.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: b5d153f5-dcd7-41b8-bd05-3fecbe0d2ba1
📒 Files selected for processing (7)
components/frontend/src/app/api/marketplace/sources/[idx]/catalog/route.ts
components/frontend/src/app/api/marketplace/sources/route.ts
components/frontend/src/app/api/projects/[name]/marketplace/items/[itemId]/route.ts
components/frontend/src/app/api/projects/[name]/marketplace/items/route.ts
components/frontend/src/app/api/projects/[name]/marketplace/scan/route.ts
components/operator/internal/handlers/sessions.go
components/runners/ambient-runner/ambient_runner/endpoints/content.py
```ts
const { idx } = await params;
const headers = await buildForwardHeadersAsync(request);
const response = await fetch(`${BACKEND_URL}/marketplace/sources/${idx}/catalog`, { headers });
```
URL-encode `idx` before building the upstream path.

`idx` is interpolated raw here, so reserved characters can change the backend path/query instead of staying inside the segment. The project-scoped proxies already encode their params; this one should do the same.
Suggested fix
```diff
-  const response = await fetch(`${BACKEND_URL}/marketplace/sources/${idx}/catalog`, { headers });
+  const response = await fetch(
+    `${BACKEND_URL}/marketplace/sources/${encodeURIComponent(idx)}/catalog`,
+    { headers }
+  );
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const { idx } = await params;
const headers = await buildForwardHeadersAsync(request);
const response = await fetch(
  `${BACKEND_URL}/marketplace/sources/${encodeURIComponent(idx)}/catalog`,
  { headers }
);
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/app/api/marketplace/sources/`[idx]/catalog/route.ts
around lines 9 - 11, The upstream path uses the raw params.idx when building the
fetch URL, so reserved characters can alter the backend path; before
interpolating into `${BACKEND_URL}/marketplace/sources/${idx}/catalog` encode
idx (e.g., via encodeURIComponent(String(idx))) and use that encoded value in
the fetch call; locate the const { idx } = await params and replace/interpose an
encodedIdx variable (used when calling fetch) while leaving
buildForwardHeadersAsync and BACKEND_URL usage unchanged.
```ts
export async function GET(request: Request) {
  try {
    const headers = await buildForwardHeadersAsync(request);
    const response = await fetch(`${BACKEND_URL}/marketplace/sources`, { headers });
    const data = await response.text();
    return new Response(data, {
      status: response.status,
      headers: { "Content-Type": "application/json" },
    });
  } catch (error) {
    console.error("Failed to fetch marketplace sources:", error);
    return Response.json({ error: "Failed to fetch marketplace sources" }, { status: 500 });
```
🧩 Analysis chain
🏁 Script executed:
```shell
fd route.ts -p "*marketplace*"
```

Repository: ambient-code/platform
Length of output: 165
🏁 Script executed:
```shell
ls -la components/frontend/src/app/api/marketplace/
```

Repository: ambient-code/platform
Length of output: 232
🏁 Script executed:
```shell
cat -n components/frontend/src/app/api/marketplace/sources/route.ts
```

Repository: ambient-code/platform
Length of output: 834
🏁 Script executed:
```shell
rg "fetch.*marketplace" --type ts --type tsx
```

Repository: ambient-code/platform
Length of output: 92
🏁 Script executed:
```shell
rg "AbortSignal.timeout|signal.*AbortSignal" --type ts --type tsx -A 2 -B 2
```

Repository: ambient-code/platform
Length of output: 92
🏁 Script executed:
```shell
find components/frontend/src/app/api/marketplace -name "route.ts" -o -name "route.js"
```

Repository: ambient-code/platform
Length of output: 200
🏁 Script executed:
```shell
rg "fetch.*BACKEND_URL" --type ts -A 3 -B 2
```

Repository: ambient-code/platform
Length of output: 46508
🏁 Script executed:
rg "AbortSignal.timeout|signal.*AbortSignal|TimeoutError" --type tsRepository: ambient-code/platform
Length of output: 130
🏁 Script executed:
ls -la components/frontend/src/app/api/Repository: ambient-code/platform
Length of output: 813
🏁 Script executed:
cat -n components/frontend/src/app/api/marketplace/sources/\[idx\]/catalog/route.tsRepository: ambient-code/platform
Length of output: 967
Bound the upstream proxy call with a timeout.
This fetch runs on the request path with no deadline. If the backend or cluster networking stalls, the frontend handler stays open indefinitely; the same unbounded pattern is repeated in the sibling marketplace catalog route.
Minimal fix:

```diff
-    const response = await fetch(`${BACKEND_URL}/marketplace/sources`, { headers });
+    const response = await fetch(`${BACKEND_URL}/marketplace/sources`, {
+      headers,
+      signal: AbortSignal.timeout(10_000),
+    });
     const data = await response.text();
     return new Response(data, {
       status: response.status,
       headers: { "Content-Type": "application/json" },
     });
   } catch (error) {
+    if (error instanceof Error && error.name === "TimeoutError") {
+      return Response.json({ error: "Marketplace backend timed out" }, { status: 504 });
+    }
     console.error("Failed to fetch marketplace sources:", error);
     return Response.json({ error: "Failed to fetch marketplace sources" }, { status: 500 });
   }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
export async function GET(request: Request) {
  try {
    const headers = await buildForwardHeadersAsync(request);
    const response = await fetch(`${BACKEND_URL}/marketplace/sources`, {
      headers,
      signal: AbortSignal.timeout(10_000),
    });
    const data = await response.text();
    return new Response(data, {
      status: response.status,
      headers: { "Content-Type": "application/json" },
    });
  } catch (error) {
    if (error instanceof Error && error.name === "TimeoutError") {
      return Response.json({ error: "Marketplace backend timed out" }, { status: 504 });
    }
    console.error("Failed to fetch marketplace sources:", error);
    return Response.json({ error: "Failed to fetch marketplace sources" }, { status: 500 });
  }
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@components/frontend/src/app/api/marketplace/sources/route.ts` around lines 4
- 15, The GET handler calls fetch(`${BACKEND_URL}/marketplace/sources`) with no
deadline; wrap that fetch in an AbortController timeout (e.g., 5–10s) inside the
GET function: create an AbortController, pass signal to fetch, set a timer to
call controller.abort() and clear the timer after fetch completes, and catch
AbortError to return a 504 (or appropriate) Response.json error; update the same
pattern in the sibling marketplace catalog route and keep
buildForwardHeadersAsync as-is when passing headers to fetch.
Actionable comments posted: 1
♻️ Duplicate comments (4)
components/backend/handlers/marketplace.go (4)
518-533: ⚠️ Potential issue | 🟠 Major: Use one composite identity for install and uninstall.

`existingKeys` currently treats `sourceUrl + itemId` as unique, but `InstalledItem` already carries `sourceBranch`, `sourcePath`, `itemType`, and `filePath`. Two valid items from the same repo can share an id across types or branches, so one gets silently dropped on install, and delete stays ambiguous when the request only keys on `itemId`. Use the same composite key everywhere: at minimum `sourceUrl + sourceBranch + itemType + itemId`, and preferably include the path as well.

Also applies to: 593-607
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 518 - 533, The dedup key construction using sourceUrl + itemId (existingKeys and the new item key built from item.SourceURL + "\x00" + item.ItemID) is too coarse and causes collisions; update both the existingKeys population and the new-item key computation to use the same composite identity including sourceBranch, sourcePath (or filePath), itemType and itemId (e.g. join InstalledItem.SourceURL, InstalledItem.SourceBranch, InstalledItem.ItemType, InstalledItem.ItemID, and path with a null separator) so installs and uninstalls use the identical composite key; locate and change the code around existingKeys, the loop over existingItems, and the newItems loop (and the similar block at the other occurrence noted around lines 593-607) to build and compare that composite key consistently.
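The composite key the comment asks for can be sketched as a small helper. `InstalledItem` here is a stand-in for the struct in marketplace.go, with the field set taken from the review text; the function name is illustrative, not from the PR:

```go
package main

import (
	"fmt"
	"strings"
)

// InstalledItem is a stand-in for the struct in marketplace.go; the
// field set comes from the review comment above.
type InstalledItem struct {
	SourceURL    string
	SourceBranch string
	SourcePath   string
	ItemType     string
	ItemID       string
}

// compositeKey joins every identity-bearing field with a NUL separator,
// so items that differ only by branch, path, or type never collide.
func compositeKey(it InstalledItem) string {
	return strings.Join([]string{
		it.SourceURL, it.SourceBranch, it.SourcePath, it.ItemType, it.ItemID,
	}, "\x00")
}

func main() {
	a := InstalledItem{SourceURL: "https://example.com/repo.git", SourceBranch: "main", ItemType: "skill", ItemID: "fmt"}
	b := a
	b.ItemType = "command" // same repo and id, different type
	fmt.Println(compositeKey(a) != compositeKey(b)) // true
}
```

Using the same helper for both the `existingKeys` population and the delete lookup removes the ambiguity the comment describes.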
324-326: ⚠️ Potential issue | 🟠 Major: Stop logging raw `git clone` output.

`CombinedOutput()` often includes the remote URL and auth failure details. For user-supplied repositories, that can leak credentials or sensitive remote metadata into server logs.

Suggested fix:

```diff
 if output, err := cmd.CombinedOutput(); err != nil {
-	log.Printf("ScanGitSource: git clone failed: %v, output: %s", err, string(output))
+	_ = output
+	log.Printf("ScanGitSource: git clone failed: %v", err)
 	c.JSON(http.StatusBadRequest, gin.H{"error": "Failed to clone repository"})
 	return
 }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 324 - 326, The handler's git clone error path (in ScanGitSource where cmd.CombinedOutput() is called) is logging raw CombinedOutput which can contain remote URLs or credentials; change the error handling to stop printing the full output in log.Printf and instead log a sanitized message: include the error.Error() and a short, redacted/trimmed diagnostic (e.g., fixed prefix like "git clone failed, output omitted" or at most a short, constant-length sanitized snippet with URLs removed), and return the same client-facing JSON (c.JSON) without exposing output; ensure you remove or redact any use of string(output) in logs and only surface non-sensitive info.
501-553: ⚠️ Potential issue | 🟠 Major: Validate `InstalledItem` fields before persisting them.

This handler writes `sourceUrl`, `sourceBranch`, `sourcePath`, `filePath`, `itemType`, and identifiers straight from the request into `spec.installedItems`. Because the runner later consumes that CR to clone/materialize content, callers can bypass `ScanGitSource` entirely and persist values the scan endpoint would have rejected.

Minimal guard:

```diff
+func validateInstalledItem(item InstalledItem) error {
+	if item.SourceURL == "" || item.ItemID == "" || item.ItemType == "" {
+		return fmt.Errorf("sourceUrl, itemId, and itemType are required")
+	}
+	switch item.ItemType {
+	case "skill", "command", "agent", "workflow":
+	default:
+		return fmt.Errorf("invalid itemType %q", item.ItemType)
+	}
+	if item.SourcePath != "" && (filepath.IsAbs(item.SourcePath) || strings.Contains(item.SourcePath, "..")) {
+		return fmt.Errorf("invalid sourcePath")
+	}
+	if item.FilePath != "" && (filepath.IsAbs(item.FilePath) || strings.Contains(item.FilePath, "..")) {
+		return fmt.Errorf("invalid filePath")
+	}
+	return nil
+}
+
 // Convert new items to unstructured and append (skip duplicates)
 for _, item := range newItems {
+	if err := validateInstalledItem(item); err != nil {
+		c.JSON(http.StatusBadRequest, gin.H{"error": err.Error()})
+		return
+	}
 	key := item.SourceURL + "\x00" + item.ItemID
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 501 - 553, The handler currently appends raw InstalledItem objects from newItems into spec.installedItems; add validation before persisting: for each item in newItems (the slice bound into variable newItems) validate SourceURL (well-formed URL and allowed host/scheme), ItemID/ItemType (non-empty and ItemType in allowed enum), and that SourceBranch/SourcePath/FilePath contain no path-traversal or invalid characters and match existing ScanGitSource rules; reuse or call the ScanGitSource validation function (or extract its logic) to perform the same checks, and if any item fails validation respond with HTTP 400 and a clear error instead of appending; only append items to existingItems after they pass validation and continue to dedupe using existingKeys.
297-322: ⚠️ Potential issue | 🟠 Major: Reject private/loopback clone targets, not just non-HTTPS schemes.

The prefix check blocks `file://`, but it still permits targets like `https://10.x.x.x/...` and `git@internal-host:repo.git`. That leaves `ScanGitSource` as an SSRF primitive against internal services from the backend. Parse the host from both URL and scp-style SSH remotes, then reject loopback/private/link-local destinations before invoking `git clone`.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@components/backend/handlers/marketplace.go` around lines 297 - 322, The current validation in ScanGitSource only checks URL prefixes and misses private/loopback targets; update the handler to parse the remote host (handle both URL form via url.Parse(req.URL) and SCP-style SSH like "git@host:repo" by extracting the substring after '@' and before ':'), then resolve that hostname to IPs (use net.LookupIP or similar) and reject any addresses in loopback, private, link-local or unique-local ranges (e.g., 127.0.0.0/8, 10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16, 169.254.0.0/16, ::1, fe80::/10, fc00::/7) before calling exec.CommandContext to run "git clone"; return a Bad Request if any resolved IP is in those ranges or if the host cannot be safely resolved. Ensure you perform this check for both HTTPS and SSH (scp-style) inputs and reference req.URL, req.Branch, ScanGitSource and the git clone invocation when locating where to add the validation.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@components/backend/handlers/marketplace.go`:
- Around line 487-488: The handler currently uses
unstructured.NestedSlice(obj.Object, "spec", "installedItems") and returns
whatever comes back (nil when absent), causing {"items": null}; change the
response to always serialize an array by checking the returned items and, if
nil, replacing it with an empty slice before calling c.JSON(http.StatusOK,
gin.H{"items": items}); update the code path around unstructured.NestedSlice and
the c.JSON call so "items" is guaranteed to be an array (e.g., []interface{}{})
when spec.installedItems is missing.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Organization UI
Review profile: ASSERTIVE
Plan: Pro
Run ID: 503f5f99-9daf-409f-9307-c4dc1b1f1be4
📒 Files selected for processing (2)
- components/backend/handlers/marketplace.go
- components/frontend/src/services/api/marketplace.ts
Frontend:
- Compact card tiles in 3-4 column grid (was 2-col with tall cards)
- Direct install from browse tab (no modal) with check/spinner states
- Installed tab uses same compact grid layout instead of bare list
- Search/filter controls on both tabs
Backend:
- ScanGitSource scans both .claude/{type}/ and {type}/ patterns
(supports ai-helpers layout alongside standard Claude Code layout)
- Deduplication by type:id when both patterns match
```go
	filepath.Join(scanRoot, ".claude", "skills"),
	filepath.Join(scanRoot, "skills"),
} {
	entries, err := os.ReadDir(skillsDir)
```
Check failure (Code scanning / CodeQL): Uncontrolled data used in path expression. Severity: High.

Copilot Autofix (AI, 4 days ago)
In general, this kind of issue is fixed by ensuring that any user-supplied path is validated/normalized and then checked to be within a specific "safe" directory before being used in file system operations. This typically means: (1) rejecting absolute paths, (2) cleaning the path to collapse `.` and `..` segments, and (3) verifying via prefix comparison that the resulting absolute path is inside a known base directory.

For this handler, the best fix with minimal behavior change is to update the existing validation and `scanRoot` computation so that:

- We compute `scanRoot` as an absolute path within the cloned temporary directory (using `filepath.Join` and `filepath.Abs`).
- We reject any `req.Path` that would cause `scanRoot` to escape the clone directory, by checking that the cleaned, absolute `scanRoot` still has `tmpDir` as its prefix.
- We also keep rejecting absolute `req.Path` directly.

Concretely:

- Remove or refactor the current `req.Path` validation at lines 303–307 into a form that uses `filepath.Clean` and a prefix check instead of just `strings.Contains("..")`.
- After `tmpDir` is created (and before using `scanRoot` anywhere), compute a temporary `candidateRoot := filepath.Join(tmpDir, req.Path)` and then `absScanRoot, err := filepath.Abs(candidateRoot)`.
- If `err != nil` or `!strings.HasPrefix(absScanRoot, tmpDir+string(os.PathSeparator)) && absScanRoot != tmpDir`, return a 400 error indicating an invalid path.
- Set `scanRoot = absScanRoot` (or `scanRoot = tmpDir` if `req.Path` is empty) and use this `scanRoot` for all subsequent path joins.

We already import `path/filepath`, `os`, and `strings`, so no new imports are necessary. All changes happen in `ScanGitSource` in components/backend/handlers/marketplace.go.
```go
	filepath.Join(scanRoot, ".claude", "commands"),
	filepath.Join(scanRoot, "commands"),
} {
	entries, err := os.ReadDir(commandsDir)
```
Check failure (Code scanning / CodeQL): Uncontrolled data used in path expression. Severity: High.

Copilot Autofix (AI, 4 days ago)
To fix this, we should validate and constrain req.Path before using it to compute scanRoot. Since req.Path is meant to select a subdirectory within the cloned repository, we should treat it as a possibly multi-component relative path and ensure that the resolved absolute path stays inside tmpDir. The standard pattern is to compute absScanRoot := filepath.Abs(filepath.Join(tmpDir, req.Path)), then verify that this path is still under tmpDir using a prefix check on cleaned/absolute paths. If the check fails or path resolution errors, we reject the request with HTTP 400 and do not proceed.
Concretely, in ScanGitSource in components/backend/handlers/marketplace.go, replace the current “Determine scan root” block (lines 330–334) with logic that: (1) initializes scanRoot := tmpDir, (2) if req.Path is non-empty, joins it to tmpDir, calls filepath.Abs on the result, obtains absTmpDir := filepath.Abs(tmpDir), and checks that absScanRoot has absTmpDir as a path prefix. If not, return a 400 error such as "Invalid path". If valid, set scanRoot = absScanRoot. This keeps all later uses of scanRoot (CLAUDE.md, .ambient, .claude/*, skills, commands) unchanged in behavior for legitimate inputs but prevents directory traversal. We should also handle errors from filepath.Abs. No new imports are needed because path/filepath and strings are already imported.
```diff
@@ -330,7 +330,23 @@
 	// Determine scan root
 	scanRoot := tmpDir
 	if req.Path != "" {
-		scanRoot = filepath.Join(tmpDir, req.Path)
+		// Ensure that the requested path stays within the cloned repository root
+		absTmpDir, err := filepath.Abs(tmpDir)
+		if err != nil {
+			log.Printf("ScanGitSource: failed to resolve tmpDir: %v", err)
+			c.JSON(http.StatusInternalServerError, gin.H{"error": "Internal server error"})
+			return
+		}
+		absScanRoot, err := filepath.Abs(filepath.Join(tmpDir, req.Path))
+		if err != nil {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path"})
+			return
+		}
+		if !strings.HasPrefix(absScanRoot, absTmpDir+string(os.PathSeparator)) && absScanRoot != absTmpDir {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "Invalid path"})
+			return
+		}
+		scanRoot = absScanRoot
 	}

 	result := ScanResult{
```
```go
	filepath.Join(scanRoot, ".claude", "agents"),
	filepath.Join(scanRoot, "agents"),
} {
	entries, err := os.ReadDir(agentsDir)
```
Check failure (Code scanning / CodeQL): Uncontrolled data used in path expression. Severity: High.

Copilot Autofix (AI, 4 days ago)
In general, the fix is to validate and constrain the user-provided path (`req.Path`) before using it to build filesystem paths. A common pattern is to resolve the combined path (`tmpDir` + `req.Path`) to an absolute, cleaned path and ensure it still resides within the intended base directory (`tmpDir`). If the check fails, reject the request.

The best way to fix this snippet without changing behavior is:

- After `tmpDir` is defined (not shown) and before `scanRoot` is used, compute the absolute path of `tmpDir`.
- If `req.Path` is non-empty, join it to `tmpDir`, resolve the result to an absolute path, and verify that this absolute path starts with the absolute `tmpDir` path (plus a path separator, to avoid prefix tricks).
- If resolving fails or the resulting path is outside `tmpDir`, return an HTTP 400 error.
- Use the validated absolute path as `scanRoot` for the rest of the handler.

Concretely, inside `ScanGitSource` around lines 330–334, instead of directly setting `scanRoot := tmpDir` and then `scanRoot = filepath.Join(tmpDir, req.Path)`, we will:

- Resolve `tmpDirAbs, err := filepath.Abs(tmpDir)` and handle errors by returning 500.
- Initialize `scanRoot := tmpDirAbs`.
- If `req.Path != ""`, compute `candidate := filepath.Join(tmpDirAbs, req.Path)`, resolve `candidateAbs := filepath.Abs(candidate)`, and check that `candidateAbs` has prefix `tmpDirAbs` with `strings.HasPrefix` and, ideally, a separator check.
- If the check passes, set `scanRoot = candidateAbs`; otherwise, respond with 400 "Invalid path".

All subsequent uses of `scanRoot` and the derived `skillsDir`, `commandsDir`, and `agentsDir` will then be safe because they are anchored to the validated directory.
```diff
@@ -328,9 +328,27 @@
 	}

 	// Determine scan root
-	scanRoot := tmpDir
+	tmpDirAbs, err := filepath.Abs(tmpDir)
+	if err != nil {
+		log.Printf("failed to resolve tmpDir: %v", err)
+		c.JSON(http.StatusInternalServerError, gin.H{"error": "internal error"})
+		return
+	}
+	scanRoot := tmpDirAbs
 	if req.Path != "" {
-		scanRoot = filepath.Join(tmpDir, req.Path)
+		candidate := filepath.Join(tmpDirAbs, req.Path)
+		candidateAbs, err := filepath.Abs(candidate)
+		if err != nil {
+			log.Printf("failed to resolve scan path: %v", err)
+			c.JSON(http.StatusBadRequest, gin.H{"error": "invalid path"})
+			return
+		}
+		// Ensure the requested path stays within the cloned repository root
+		if candidateAbs != tmpDirAbs && !strings.HasPrefix(candidateAbs+string(os.PathSeparator), tmpDirAbs+string(os.PathSeparator)) {
+			c.JSON(http.StatusBadRequest, gin.H{"error": "invalid path"})
+			return
+		}
+		scanRoot = candidateAbs
 	}

 	result := ScanResult{
```
…cted The Import button was disabled when 'Import as Workflow' was checked but no individual items were selected. Now workflow imports work with zero selections — creates a single workflow entry for the whole bundle.
- Click a card to open a side panel with full description, source link,
allowed tools, and install button
- Source file_path links to the repo on GitHub (tree/{branch}/{path})
- Installed tab shows clickable source repo link
- Install button on cards uses stopPropagation to not trigger detail view
- Rename 'Commands' button to 'Skills' in ChatInputBox toolbar
(commands and skills are merged as they use the same / trigger)
- Add Claude Code built-in skills (batch, simplify, debug, claude-api,
loop) to workflow metadata response
- Fix _scan_claude_dir to scan both .claude/{type}/ and {type}/ patterns
with deduplication (supports ai-helpers registry layout)
Review Queue Status
Action needed: Fix failing CI checks; Resolve merge conflicts
Summary

Add support for Claude Code skills (`.claude/skills/`), a workspace-scoped Marketplace page for browsing and importing skills/commands/agents from external registries, and per-item installation stored in the ProjectSettings CR.

Architecture
- A ConfigMap (`marketplace-sources`) holds curated registry sources (AI Helpers by default)
- Installed items are stored on the ProjectSettings CR (`spec.installedItems`)
- The runner materializes installed items into `add_dirs` → Claude Code auto-loads them

Changes
Runner (Python)
- Expose `.claude/skills/*/SKILL.md` in the workflow metadata endpoint
- `_scan_claude_dir()` helper for commands/agents/skills
- `setup_marketplace_dirs()` adds `/workspace/marketplace/` to `add_dirs`
- Concurrent setup via `asyncio.gather()`

Backend (Go)
- `marketplace-sources` ConfigMap with AI Helpers as default registry
- `installedItems` array added to ProjectSettings CRD
- `marketplace.go`: list sources, fetch catalog, scan Git repos, install/uninstall items
- `ScanGitSource` endpoint: shallow clone → discover skills/commands/agents/workflows → return results

Operator (Go)
- Read `installedItems` from ProjectSettings CR → pass as `INSTALLED_ITEMS_JSON` env var to runner

Frontend (Next.js)
- `WorkflowSkill` type + skills merged into `/` autocomplete
- `ImportSourceDialog`: scan Git URL → discover items → checkbox select → import

Verification
- `.claude/skills/` → metadata endpoint returns skills
- `GET /marketplace/sources` → returns AI Helpers from ConfigMap
- `POST /marketplace/scan` with Git URL → discovers items
- Installed items materialized into `add_dirs`