PLAT-8152 Add scene-trees:view and scene-trees:export commands#176

Open
brentav wants to merge 6 commits into master from export-scene-structure

Conversation

@brentav
Contributor

@brentav commented Feb 19, 2026

Summary

Adds a scene-trees command category with two subcommands for working with a scene's item tree:

  • scene-trees:view <sceneId> — Fetches all scene items for a scene, builds them into a tree, and prints
    a visual ASCII tree dump to stdout.
  • scene-trees:export <sceneId> [--output <path>] — Fetches the scene item tree and serializes it as a
    JSON file inside a ZIP archive (defaults to <sceneId>.zip).

New supporting library code in src/lib/:

  • tree-node.ts — Generic TreeNode<T> class.
  • tree-builder.ts — Utility methods for building a tree from a flat array.
  • tree-serializer.ts — Utility methods for serializing a TreeNode structure to JSON/ZIP.
  • scene-items.ts — Helper methods for fetching scene items for a scene.
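
To make the shape of these utilities concrete, here is a minimal sketch. The `buildTreeFromFlat` signature matches its call sites in this PR; the internals below are illustrative only, not the vetted implementations from `vertex-api-utils-node`:

```typescript
// Sketch of the tree utilities described above (illustrative, not the
// actual implementation).
class TreeNode<T> {
  public readonly children: TreeNode<T>[] = [];
  constructor(public readonly data: T) {}
}

function buildTreeFromFlat<T>(
  items: T[],
  getId: (item: T) => string,
  getParentId: (item: T) => string | undefined
): TreeNode<T>[] {
  // Index every item by id first so parent lookups are O(1).
  const byId = new Map<string, TreeNode<T>>();
  for (const item of items) {
    byId.set(getId(item), new TreeNode(item));
  }
  // Attach each node to its parent; items without a resolvable parent
  // become roots (a scene normally has exactly one).
  const roots: TreeNode<T>[] = [];
  for (const item of items) {
    const node = byId.get(getId(item))!;
    const parentId = getParentId(item);
    const parent = parentId === undefined ? undefined : byId.get(parentId);
    if (parent) {
      parent.children.push(node);
    } else {
      roots.push(node);
    }
  }
  return roots;
}
```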

Dependencies: Added archiver (ZIP creation) and @types/archiver.

Test Plan

  • Run yarn test — all tests should pass with ≥60% statement/line coverage.
  • To exercise the CLI manually against a real environment:

View the tree in the terminal

./bin/run scene-trees:view <sceneId>

Export to a ZIP (default path: <sceneId>.zip)

./bin/run scene-trees:export <sceneId>

Export to a custom path

./bin/run scene-trees:export <sceneId> --output my-tree.zip

  • Unzip the output and inspect <sceneId>.json to verify the tree structure.

Release Notes

New commands: scene-trees:view and scene-trees:export

You can now inspect and export the scene item tree for any scene directly from the CLI.

  • vertex scene-trees:view <sceneId> prints a hierarchical ASCII view of the scene item tree to your terminal.
  • vertex scene-trees:export <sceneId> downloads the full scene item tree and saves it as a JSON file inside a ZIP archive. Use --output <path> to control the destination.

Possible Regressions

  • No existing code paths were functionally modified.
  • Some library dependencies were upgraded and some Sonar findings were addressed, but those changes should not alter behavior.
  • No elevated risks for regressions.

Dependencies

None.

@brentav brentav requested a review from a team as a code owner February 19, 2026 19:48
@@ -0,0 +1,75 @@
import { TreeNode } from './tree-node';
Contributor Author

I copied in a few of these tree utilities from the vertex-api-utils-node project. There are some issues with adding that library as a dependency right now. But the code for these has been previously vetted in that project.

@sonarqubecloud


export default class Create extends BaseCommand {
- public static description = `Create an export for a scene.`;
+ public static readonly description = `Create an export for a scene.`;
Contributor Author

I resolved a sonar finding on all these static values. It's a widespread pattern in this repo, so it resulted in a lot of files being changed. But there should be no effective functional change with all these.

@@ -0,0 +1,53 @@
import { flags } from '@oclif/command';
Contributor Author

This file and the next one are the primary functional additions.

Contributor

@MadisonEhlers-Vertex left a comment

I had a few suggestions.

  1. Rename export to not conflict with the existing scene export concepts. "Scene Extraction"?

  2. Consider changing the extraction output to a CLI-friendly format (nit).

import { fetchSceneItemTree } from '../../lib/scene-items';
import { serializeTreeToZipFile } from '../../lib/tree-serializer';

export default class Export extends BaseCommand {
Contributor

Export has a different meaning in our system. Should this be called Scene Extraction or something similar?

import { TreeNode } from './tree-node';

interface SerializedNode<T> {
data: T;
Contributor

💡 I know this may be a bit more work, but with this extraction type, a consumer of this CLI would have to then write another script to parse this format and make it into a CLI friendly format. Maybe this can/should be outside the scope of this work, but could this extraction process also export the same format of the .json expected input? That way consumers could run an extraction process on a scene, and have a .json file in a cli friendly format.

Contributor Author

Are you saying support non-compressed JSON output in addition to zipped?

Contributor

It could also be included in the zip, I just mean a format that the create scene cli can read and understand. But up to you.
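
As a rough illustration of the idea being discussed (the actual create-scene input format isn't shown in this thread, so the flat row shape below is purely hypothetical):

```typescript
interface SerializedNode<T> {
  data: T;
  children: SerializedNode<T>[];
}

// Hypothetical flat row; the real CLI input format may differ.
interface FlatRow<T> {
  depth: number;
  data: T;
}

// Depth-first walk that emits one row per node, in the order a CLI
// consumer could read them back.
function flattenForCli<T>(root: SerializedNode<T>, depth = 0): FlatRow<T>[] {
  const rows: FlatRow<T>[] = [{ depth, data: root.data }];
  for (const child of root.children) {
    rows.push(...flattenForCli(child, depth + 1));
  }
  return rows;
}
```

A second JSON entry in the same ZIP, produced by a transform like this, would let consumers feed the extraction straight back into the create-scene flow.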

@danschultz danschultz requested a review from Copilot March 5, 2026 18:33
Copilot AI left a comment

Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.


@danschultz danschultz requested review from Copilot and removed request for Copilot March 5, 2026 19:01
Copilot AI left a comment

Pull request overview

Copilot reviewed 46 out of 47 changed files in this pull request and generated 8 comments.



Comment on lines +38 to +44
): Promise<TreeNode<SceneItemData>> {
const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
return buildTreeFromFlat(
allSceneItems,
(rec) => rec.id,
(rec) => rec.relationships.parent?.data.id
)[0];
Copilot AI Mar 5, 2026

fetchSceneItemTree always returns the first root ([0]), which can be undefined (empty scene) or arbitrarily wrong (multiple roots). Consider handling 0/>1 roots explicitly (e.g., throw a clear error when none/multiple, or change the return type to TreeNode<SceneItemData> | undefined / return all roots).

Suggested change
- ): Promise<TreeNode<SceneItemData>> {
-   const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
-   return buildTreeFromFlat(
-     allSceneItems,
-     (rec) => rec.id,
-     (rec) => rec.relationships.parent?.data.id
-   )[0];
+ ): Promise<TreeNode<SceneItemData> | undefined> {
+   const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
+   const roots = buildTreeFromFlat(
+     allSceneItems,
+     (rec) => rec.id,
+     (rec) => rec.relationships.parent?.data.id
+   );
+   if (roots.length === 0) {
+     return undefined;
+   }
+   return roots[0];

Comment on lines +142 to +145
return node
? node(this)
: `[TreeNode ${role} ${childInfo}, data=[${
data ? data(this.data) : JSON.stringify(this.data).slice(0, 160)
Copilot AI Mar 5, 2026

JSON.stringify(this.data) can return undefined (e.g., if data is undefined, a function, or a symbol), which will cause .slice(...) to throw at runtime. Consider coercing to a safe string first (e.g., store the stringify result in a variable and fall back to String(this.data) or '' when it’s undefined).

Suggested change
- return node
-   ? node(this)
-   : `[TreeNode ${role} ${childInfo}, data=[${
-       data ? data(this.data) : JSON.stringify(this.data).slice(0, 160)
+ const stringifiedData = JSON.stringify(this.data);
+ const safeDataString =
+   stringifiedData === undefined ? String(this.data) : stringifiedData;
+ return node
+   ? node(this)
+   : `[TreeNode ${role} ${childInfo}, data=[${
+       data ? data(this.data) : safeDataString.slice(0, 160)

Comment on lines +257 to +267
await new Promise<void>((resolve, reject) => {
const fileStream = fs.createWriteStream(filePath);
const archive = archiver('zip');

fileStream.on('close', resolve);
archive.on('error', reject);

archive.pipe(fileStream);
archive.append(jsonBuffer, { name: entryName });
archive.finalize().catch(reject);
});
Copilot AI Mar 5, 2026

The ZIP write promise never rejects if fileStream errors (e.g., permission/ENOENT), which can lead to a hung command. Add a fileStream.on('error', reject) handler (and consider handling archive warnings if you want non-fatal cases surfaced consistently).

Comment on lines +76 to +107
// write the JSON root opening
out.write('{"data":');
out.write(JSON.stringify(root.data));
out.write(',"children":[');

// helper to write a node and its subtree
async function writeNode(node: TreeNode<T>): Promise<void> {
// open this node
out.write('{"data":');
out.write(JSON.stringify(node.data));
out.write(',"children":[');

// write each child (comma-separated)
for (let i = 0; i < node.children.length; i++) {
if (i > 0) out.write(',');
// eslint-disable-next-line no-await-in-loop
await writeNode(node.children[i]);
}

// close this node
out.write(']}');
}

// write top-level children
for (let i = 0; i < root.children.length; i++) {
if (i > 0) out.write(',');
// eslint-disable-next-line no-await-in-loop
await writeNode(root.children[i]);
}

// close the JSON
out.write(']}');
Copilot AI Mar 5, 2026

This writes potentially large amounts of data to a stream without respecting backpressure (write() return value / drain), which can cause high memory usage for big trees. Consider a small helper that awaits drain when out.write(...) returns false, and use it for each write.

Suggested change
- // write the JSON root opening
- out.write('{"data":');
- out.write(JSON.stringify(root.data));
- out.write(',"children":[');
- // helper to write a node and its subtree
- async function writeNode(node: TreeNode<T>): Promise<void> {
-   // open this node
-   out.write('{"data":');
-   out.write(JSON.stringify(node.data));
-   out.write(',"children":[');
-   // write each child (comma-separated)
-   for (let i = 0; i < node.children.length; i++) {
-     if (i > 0) out.write(',');
-     // eslint-disable-next-line no-await-in-loop
-     await writeNode(node.children[i]);
-   }
-   // close this node
-   out.write(']}');
- }
- // write top-level children
- for (let i = 0; i < root.children.length; i++) {
-   if (i > 0) out.write(',');
-   // eslint-disable-next-line no-await-in-loop
-   await writeNode(root.children[i]);
- }
- // close the JSON
- out.write(']}');
+ async function writeChunk(chunk: string): Promise<void> {
+   if (!out.write(chunk)) {
+     await new Promise<void>((resolve, reject) => {
+       out.once('drain', resolve);
+       out.once('error', reject);
+     });
+   }
+ }
+ // write the JSON root opening
+ await writeChunk('{"data":');
+ await writeChunk(JSON.stringify(root.data));
+ await writeChunk(',"children":[');
+ // helper to write a node and its subtree
+ async function writeNode(node: TreeNode<T>): Promise<void> {
+   // open this node
+   await writeChunk('{"data":');
+   await writeChunk(JSON.stringify(node.data));
+   await writeChunk(',"children":[');
+   // write each child (comma-separated)
+   for (let i = 0; i < node.children.length; i++) {
+     if (i > 0) await writeChunk(',');
+     // eslint-disable-next-line no-await-in-loop
+     await writeNode(node.children[i]);
+   }
+   // close this node
+   await writeChunk(']}');
+ }
+ // write top-level children
+ for (let i = 0; i < root.children.length; i++) {
+   if (i > 0) await writeChunk(',');
+   // eslint-disable-next-line no-await-in-loop
+   await writeNode(root.children[i]);
+ }
+ // close the JSON
+ await writeChunk(']}');

const chunks: Buffer[] = [];
const collector = new Transform({
transform(chunk, _enc, callback) {
chunks.push(Buffer.from(chunk));
Copilot AI Mar 5, 2026

Buffer.from(chunk) will copy when chunk is already a Buffer, increasing memory overhead for large payloads. Prefer pushing the existing Buffer when possible (and only convert/copy when chunk is a string/typed array).

Suggested change
- chunks.push(Buffer.from(chunk));
+ if (Buffer.isBuffer(chunk)) {
+   chunks.push(chunk);
+ } else {
+   chunks.push(Buffer.from(chunk));
+ }

}

/*
* Streamly serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON
Copilot AI Mar 5, 2026

Correct typo: 'Streamly' should be 'Stream-serialize' (or similar).

Suggested change
- * Streamly serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON
+ * Stream-serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON

Comment on lines +1 to +4
/**
* These functions serialize and deserialize a TreeNode<T> to/from
* a file or Buffer. The serialized format is JSON, and it can be
* optionally gzipped for smaller size.
Copilot AI Mar 5, 2026

The module header describes file/buffer JSON (+ gzip) but this file also exports ZIP serialization (serializeTreeToZipFile). Consider updating the top-level doc comment to mention ZIP output and the JSON-in-ZIP format so callers understand the full API surface.

Comment on lines +248 to +252
export async function serializeTreeToZipFile<T>({
root,
filePath,
entryName,
}: SerializeZipFileOptions<T>): Promise<void> {
Copilot AI Mar 5, 2026

This new ZIP export path has no direct unit test coverage in the lib layer (e.g., verifying the zip entry name/content or error propagation). Consider adding tests for serializeTreeToZipFile (and ideally buffer/file serialize/deserialize) using a temp directory and reading back the produced ZIP to validate the JSON entry.
