PLAT-8152 Add scene-trees:view and scene-trees:export commands#176
Conversation
```diff
@@ -0,0 +1,75 @@
+import { TreeNode } from './tree-node';
```
I copied in a few of these tree utilities from the vertex-api-utils-node project. There are some issues with adding that library as a dependency right now. But the code for these has been previously vetted in that project.
```diff
 export default class Create extends BaseCommand {
-  public static description = `Create an export for a scene.`;
+  public static readonly description = `Create an export for a scene.`;
```
I resolved a sonar finding on all these static values. It's a widespread pattern in this repo, so it resulted in a lot of files being changed. But there should be no effective functional change with all these.
```diff
@@ -0,0 +1,53 @@
+import { flags } from '@oclif/command';
```
This file and the next one are the primary functional additions.
```ts
import { fetchSceneItemTree } from '../../lib/scene-items';
import { serializeTreeToZipFile } from '../../lib/tree-serializer';

export default class Export extends BaseCommand {
```
Export has a different meaning in our system. Should this be called Scene Extraction or something similar?
```ts
import { TreeNode } from './tree-node';

interface SerializedNode<T> {
  data: T;
```
💡 I know this may be a bit more work, but with this extraction format, a consumer of this CLI would have to write another script to parse it into a CLI-friendly format. Maybe this can/should be outside the scope of this work, but could the extraction process also emit the same .json format the CLI expects as input? That way consumers could run an extraction on a scene and end up with a .json file in a CLI-friendly format.
Are you saying support non-compressed JSON output in addition to zipped?
It could also be included in the zip; I just mean a format that the create-scene CLI can read and understand. But up to you.
Pull request overview
Copilot reviewed 46 out of 47 changed files in this pull request and generated 8 comments.
```ts
): Promise<TreeNode<SceneItemData>> {
  const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
  return buildTreeFromFlat(
    allSceneItems,
    (rec) => rec.id,
    (rec) => rec.relationships.parent?.data.id
  )[0];
```
fetchSceneItemTree always returns the first root ([0]), which can be undefined (empty scene) or arbitrarily wrong (multiple roots). Consider handling 0/>1 roots explicitly (e.g., throw a clear error when none/multiple, or change the return type to TreeNode<SceneItemData> | undefined / return all roots).
Suggested change:

```diff
-): Promise<TreeNode<SceneItemData>> {
-  const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
-  return buildTreeFromFlat(
-    allSceneItems,
-    (rec) => rec.id,
-    (rec) => rec.relationships.parent?.data.id
-  )[0];
+): Promise<TreeNode<SceneItemData> | undefined> {
+  const allSceneItems = await fetchAllSceneItemsForScene(client, sceneId);
+  const roots = buildTreeFromFlat(
+    allSceneItems,
+    (rec) => rec.id,
+    (rec) => rec.relationships.parent?.data.id
+  );
+  if (roots.length === 0) {
+    return undefined;
+  }
+  return roots[0];
```
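To make the failure mode concrete, here is a standalone sketch of the multiple-roots case. The `buildTreeFromFlat` below is a hypothetical minimal reimplementation for illustration only (the record shape and node type are assumptions, not the PR's `tree-builder.ts`): two parentless records yield two roots, so taking `[0]` silently drops one of them.

```typescript
// Hypothetical minimal model of buildTreeFromFlat (illustration only):
// link each record to its parent by id; records without a parent become roots.
interface Node<T> {
  data: T;
  children: Node<T>[];
}

function buildTreeFromFlat<T>(
  records: T[],
  getId: (rec: T) => string,
  getParentId: (rec: T) => string | undefined
): Node<T>[] {
  const nodes = new Map<string, Node<T>>();
  for (const rec of records) {
    nodes.set(getId(rec), { data: rec, children: [] });
  }
  const roots: Node<T>[] = [];
  for (const rec of records) {
    const node = nodes.get(getId(rec))!;
    const parentId = getParentId(rec);
    const parent = parentId === undefined ? undefined : nodes.get(parentId);
    if (parent) {
      parent.children.push(node);
    } else {
      roots.push(node);
    }
  }
  return roots;
}

// Two parentless records produce two roots; [0] would drop "b" entirely.
type Rec = { id: string; parent?: string };
const roots = buildTreeFromFlat<Rec>(
  [{ id: 'a' }, { id: 'b' }, { id: 'a1', parent: 'a' }],
  (r) => r.id,
  (r) => r.parent
);
```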
```ts
return node
  ? node(this)
  : `[TreeNode ${role} ${childInfo}, data=[${
      data ? data(this.data) : JSON.stringify(this.data).slice(0, 160)
```
JSON.stringify(this.data) can return undefined (e.g., if data is undefined, a function, or a symbol), which will cause .slice(...) to throw at runtime. Consider coercing to a safe string first (e.g., store the stringify result in a variable and fall back to String(this.data) or '' when it’s undefined).
Suggested change:

```diff
-return node
-  ? node(this)
-  : `[TreeNode ${role} ${childInfo}, data=[${
-      data ? data(this.data) : JSON.stringify(this.data).slice(0, 160)
+const stringifiedData = JSON.stringify(this.data);
+const safeDataString =
+  stringifiedData === undefined ? String(this.data) : stringifiedData;
+return node
+  ? node(this)
+  : `[TreeNode ${role} ${childInfo}, data=[${
+      data ? data(this.data) : safeDataString.slice(0, 160)
```
```ts
await new Promise<void>((resolve, reject) => {
  const fileStream = fs.createWriteStream(filePath);
  const archive = archiver('zip');

  fileStream.on('close', resolve);
  archive.on('error', reject);

  archive.pipe(fileStream);
  archive.append(jsonBuffer, { name: entryName });
  archive.finalize().catch(reject);
});
```
The ZIP write promise never rejects if fileStream errors (e.g., permission/ENOENT), which can lead to a hung command. Add a fileStream.on('error', reject) handler (and consider handling archive warnings if you want non-fatal cases surfaced consistently).
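A minimal sketch of the suggested wiring, reduced to just the stream/promise pattern (the archiver side is omitted here, and `writeToFile` is a hypothetical name for illustration, not the PR's function):

```typescript
import * as fs from 'fs';

// Sketch: write a buffer to a file, rejecting on filesystem errors.
function writeToFile(filePath: string, data: Buffer): Promise<void> {
  return new Promise<void>((resolve, reject) => {
    const fileStream = fs.createWriteStream(filePath);
    fileStream.on('close', () => resolve());
    // Previously missing: without this handler, an ENOENT or permission
    // error on the target path leaves the promise pending forever and
    // the command appears to hang.
    fileStream.on('error', reject);
    fileStream.end(data);
  });
}
```

In the PR's code the same `reject` would stay attached to `archive.on('error', ...)` as well; non-fatal archiver warnings could additionally be surfaced via `archive.on('warning', ...)` if they should be reported consistently.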
```ts
// write the JSON root opening
out.write('{"data":');
out.write(JSON.stringify(root.data));
out.write(',"children":[');

// helper to write a node and its subtree
async function writeNode(node: TreeNode<T>): Promise<void> {
  // open this node
  out.write('{"data":');
  out.write(JSON.stringify(node.data));
  out.write(',"children":[');

  // write each child (comma-separated)
  for (let i = 0; i < node.children.length; i++) {
    if (i > 0) out.write(',');
    // eslint-disable-next-line no-await-in-loop
    await writeNode(node.children[i]);
  }

  // close this node
  out.write(']}');
}

// write top-level children
for (let i = 0; i < root.children.length; i++) {
  if (i > 0) out.write(',');
  // eslint-disable-next-line no-await-in-loop
  await writeNode(root.children[i]);
}

// close the JSON
out.write(']}');
```
This writes potentially large amounts of data to a stream without respecting backpressure (write() return value / drain), which can cause high memory usage for big trees. Consider a small helper that awaits drain when out.write(...) returns false, and use it for each write.
Suggested change:

```diff
-// write the JSON root opening
-out.write('{"data":');
-out.write(JSON.stringify(root.data));
-out.write(',"children":[');
-// helper to write a node and its subtree
-async function writeNode(node: TreeNode<T>): Promise<void> {
-  // open this node
-  out.write('{"data":');
-  out.write(JSON.stringify(node.data));
-  out.write(',"children":[');
-  // write each child (comma-separated)
-  for (let i = 0; i < node.children.length; i++) {
-    if (i > 0) out.write(',');
-    // eslint-disable-next-line no-await-in-loop
-    await writeNode(node.children[i]);
-  }
-  // close this node
-  out.write(']}');
-}
-// write top-level children
-for (let i = 0; i < root.children.length; i++) {
-  if (i > 0) out.write(',');
-  // eslint-disable-next-line no-await-in-loop
-  await writeNode(root.children[i]);
-}
-// close the JSON
-out.write(']}');
+async function writeChunk(chunk: string): Promise<void> {
+  if (!out.write(chunk)) {
+    await new Promise<void>((resolve, reject) => {
+      out.once('drain', resolve);
+      out.once('error', reject);
+    });
+  }
+}
+// write the JSON root opening
+await writeChunk('{"data":');
+await writeChunk(JSON.stringify(root.data));
+await writeChunk(',"children":[');
+// helper to write a node and its subtree
+async function writeNode(node: TreeNode<T>): Promise<void> {
+  // open this node
+  await writeChunk('{"data":');
+  await writeChunk(JSON.stringify(node.data));
+  await writeChunk(',"children":[');
+  // write each child (comma-separated)
+  for (let i = 0; i < node.children.length; i++) {
+    if (i > 0) await writeChunk(',');
+    // eslint-disable-next-line no-await-in-loop
+    await writeNode(node.children[i]);
+  }
+  // close this node
+  await writeChunk(']}');
+}
+// write top-level children
+for (let i = 0; i < root.children.length; i++) {
+  if (i > 0) await writeChunk(',');
+  // eslint-disable-next-line no-await-in-loop
+  await writeNode(root.children[i]);
+}
+// close the JSON
+await writeChunk(']}');
```
```ts
const chunks: Buffer[] = [];
const collector = new Transform({
  transform(chunk, _enc, callback) {
    chunks.push(Buffer.from(chunk));
```
Buffer.from(chunk) will copy when chunk is already a Buffer, increasing memory overhead for large payloads. Prefer pushing the existing Buffer when possible (and only convert/copy when chunk is a string/typed array).
Suggested change:

```diff
-    chunks.push(Buffer.from(chunk));
+    if (Buffer.isBuffer(chunk)) {
+      chunks.push(chunk);
+    } else {
+      chunks.push(Buffer.from(chunk));
+    }
```
```ts
}

/*
 * Streamly serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON
```
Correct typo: 'Streamly' should be 'Stream-serialize' (or similar).
Suggested change:

```diff
- * Streamly serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON
+ * Stream-serialize a TreeNode<T> into a Buffer of JSON or gzipped JSON
```
```ts
/**
 * These functions serialize and deserialize a TreeNode<T> to/from
 * a file or Buffer. The serialized format is JSON, and it can be
 * optionally gzipped for smaller size.
```
The module header describes file/buffer JSON (+ gzip) but this file also exports ZIP serialization (serializeTreeToZipFile). Consider updating the top-level doc comment to mention ZIP output and the JSON-in-ZIP format so callers understand the full API surface.
```ts
export async function serializeTreeToZipFile<T>({
  root,
  filePath,
  entryName,
}: SerializeZipFileOptions<T>): Promise<void> {
```
This new ZIP export path has no direct unit test coverage in the lib layer (e.g., verifying the zip entry name/content or error propagation). Consider adding tests for serializeTreeToZipFile (and ideally buffer/file serialize/deserialize) using a temp directory and reading back the produced ZIP to validate the JSON entry.
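As a starting point for such coverage, here is a hypothetical round-trip check of the `{"data":...,"children":[...]}` shape. The serializer below is a stand-in reimplementation for illustration, not the PR's `tree-serializer.ts`; a real test would call `serializeTreeToZipFile` against a temp directory and read the produced ZIP back:

```typescript
// Stand-in recursive serializer producing the same JSON shape the PR streams out.
interface SimpleNode<T> {
  data: T;
  children: SimpleNode<T>[];
}

function serializeNode<T>(node: SimpleNode<T>): string {
  const children = node.children.map((c) => serializeNode(c)).join(',');
  return `{"data":${JSON.stringify(node.data)},"children":[${children}]}`;
}

const tree: SimpleNode<string> = {
  data: 'root',
  children: [
    { data: 'a', children: [] },
    { data: 'b', children: [{ data: 'b1', children: [] }] },
  ],
};

// The serialized form parses back into a structurally identical tree.
const parsed = JSON.parse(serializeNode(tree)) as SimpleNode<string>;
```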



Summary

Adds a scene-trees command category with two subcommands for working with a scene's item tree:

- `scene-trees:view <sceneId>` — Fetches all scene items for a scene, builds them into a tree, and prints a visual ASCII tree dump to stdout.
- `scene-trees:export <sceneId> [--output <path>]` — Fetches the scene item tree and serializes it as a JSON file inside a ZIP archive (defaults to `<sceneId>.zip`).

New supporting library code in `src/lib/`:

- `tree-node.ts` — Generic `TreeNode<T>` class.
- `tree-builder.ts` — Utility methods for building a tree from a flat array.
- `tree-serializer.ts` — Utility methods for serializing a `TreeNode` structure to JSON/ZIP.
- `scene-items.ts` — Helper methods for fetching scene items for a scene.

Dependencies: Added `archiver` (ZIP creation) and `@types/archiver`.

Test Plan

- View the tree in the terminal: `./bin/run scene-trees:view <sceneId>`
- Export to a ZIP (default path: `<sceneId>.zip`): `./bin/run scene-trees:export <sceneId>`
- Export to a custom path: `./bin/run scene-trees:export <sceneId> --output my-tree.zip`
- Unzip and open `<sceneId>.json` to verify the tree structure.

Release Notes

New commands: `scene-trees:view` and `scene-trees:export`

You can now inspect and export the scene item tree for any scene directly from the CLI.

- `vertex scene-trees:view <sceneId>` prints a hierarchical ASCII view of the scene item tree to your terminal.
- `vertex scene-trees:export <sceneId>` downloads the full scene item tree and saves it as a JSON file inside a ZIP archive. Use `--output <path>` to control the destination.

Possible Regressions

Dependencies

None.