Integrate FAH Local Builds with Universal Maker #10382
Conversation
Code Review
This pull request adds support for 'Universal Maker' local builds, integrates Secret Manager secret resolution for local environments, and refactors ABIU configuration. Key feedback includes fixing a blocking issue in the build watcher, adhering to style guides for YAML parsing and error handling, and optimizing the bundling process to avoid slow file copies of node_modules.
/gemini review
Code Review
This pull request introduces the "Universal Maker" for App Hosting local builds, enabling standalone binary builds through a new experiment. It also enhances secret management by implementing a utility to resolve secrets from Secret Manager during local builds, adding a security confirmation prompt, and introducing the --allow-local-build-secrets flag. Furthermore, the PR removes explicit "Automatic Base Image Updates" (ABIU) configuration flags, integrating ABIU status into the runtime selection instead. Review feedback suggests strengthening test assertions for binary execution, removing unknown type assertions to comply with the style guide, and generalizing build artifact packaging logic to remove framework-specific assumptions.
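The secrets behavior described above (resolve from Secret Manager only when the user opts in) reduces to a small gate over the flag and the confirmation prompt. All names below (`LocalSecretOptions`, `shouldResolveLocalSecrets`) are hypothetical illustrations of the described flow, not the PR's actual API:

```typescript
// Illustrative sketch of the local-secrets gate: secrets are only
// materialized into a local build when the user passes the new
// --allow-local-build-secrets flag or confirms the security prompt.
export interface LocalSecretOptions {
  allowLocalBuildSecrets: boolean; // the new CLI flag
  confirmedByPrompt: boolean; // result of the security confirmation prompt
  nonInteractive: boolean;
}

export function shouldResolveLocalSecrets(opts: LocalSecretOptions): boolean {
  if (opts.allowLocalBuildSecrets) return true;
  // In non-interactive mode there is nobody to confirm, so fail closed.
  if (opts.nonInteractive) return false;
  return opts.confirmedByPrompt;
}
```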
/gemini review
Code Review
This pull request introduces support for a new 'Universal Maker' standalone binary for local builds in App Hosting, controlled via an experimental flag. It adds the runUniversalMaker function to handle the binary execution, artifact management, and metadata parsing, while also updating the tar archive creation logic to support flattened directory structures for .apphosting outputs. Feedback focuses on improving the robustness of the build process by checking the binary's exit status, ensuring proper cleanup of stale artifacts and temporary directories, and removing hardcoded environment variables that might conflict with user configurations.
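The metadata-parsing step mentioned above amounts to reading `build_output.json` and converting parse failures into a `FirebaseError`. A self-contained sketch; the `UniversalMakerOutput` fields and the local `FirebaseError` stand-in are assumptions for illustration (the real definitions live in the PR):

```typescript
// Stand-in for the CLI's FirebaseError; the real class lives in the CLI codebase.
class FirebaseError extends Error {}

// Hypothetical shape of the Universal Maker metadata; the actual fields
// are defined by the PR's UniversalMakerOutput type.
interface UniversalMakerOutput {
  runCommand?: string;
  outputDirectory?: string;
}

function parseUniversalMakerOutput(raw: string): UniversalMakerOutput {
  try {
    return JSON.parse(raw) as UniversalMakerOutput;
  } catch (e) {
    // Surface a clear, actionable error instead of a raw SyntaxError.
    throw new FirebaseError(`Failed to parse build_output.json: ${(e as Error).message}`);
  }
}
```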
```ts
childProcess.spawnSync(
  process.env.UNIVERSAL_MAKER_BINARY,
  ["-application_dir", projectRoot, "-output_dir", projectRoot, "-output_format", "json"],
  {
    env: {
      ...process.env,
      X_GOOGLE_TARGET_PLATFORM: "fah",
      FIREBASE_OUTPUT_BUNDLE_DIR: "bundle_output",
      NPM_CONFIG_REGISTRY: "https://registry.npmjs.org/",
    },
    stdio: "inherit",
  },
);
```
The exit status of the spawnSync command should be checked. If the build fails, the process should stop and throw a FirebaseError to prevent the CLI from attempting to use incomplete or missing build artifacts. Additionally, consider removing the hardcoded NPM_CONFIG_REGISTRY as it may interfere with users' custom registry configurations.
```ts
const result = childProcess.spawnSync(
  process.env.UNIVERSAL_MAKER_BINARY,
  ["-application_dir", projectRoot, "-output_dir", projectRoot, "-output_format", "json"],
  {
    env: {
      ...process.env,
      X_GOOGLE_TARGET_PLATFORM: "fah",
      FIREBASE_OUTPUT_BUNDLE_DIR: "bundle_output",
    },
    stdio: "inherit",
  },
);
if (result.status !== 0) {
  throw new FirebaseError(`Universal Maker build failed with status ${result.status}`, {
    exit: result.status ?? 1,
  });
}
```

```ts
if (fs.existsSync(bundleOutput)) {
  if (!fs.existsSync(targetAppHosting)) {
    fs.mkdirSync(targetAppHosting, { recursive: true });
  }
  const files = fs.readdirSync(bundleOutput);
  for (const file of files) {
    fs.renameSync(path.join(bundleOutput, file), path.join(targetAppHosting, file));
  }
  fs.rmdirSync(bundleOutput);
}
```
The .apphosting directory should be cleared before moving new artifacts to prevent stale files from previous builds from persisting. Also, using fs.rmSync with recursive: true is more robust than fs.rmdirSync for cleaning up the temporary output directory.
```ts
if (fs.existsSync(bundleOutput)) {
  fs.rmSync(targetAppHosting, { recursive: true, force: true });
  fs.mkdirSync(targetAppHosting, { recursive: true });
  const files = fs.readdirSync(bundleOutput);
  for (const file of files) {
    fs.renameSync(path.join(bundleOutput, file), path.join(targetAppHosting, file));
  }
  fs.rmSync(bundleOutput, { recursive: true, force: true });
}
```

```ts
const outputRaw = fs.readFileSync(outputFilePath, "utf-8");
let umOutput: UniversalMakerOutput;
try {
  umOutput = JSON.parse(outputRaw) as UniversalMakerOutput;
} catch (e) {
  throw new FirebaseError(`Failed to parse build_output.json: ${(e as Error).message}`);
}
```
The build_output.json file is a temporary metadata file and should be deleted after it has been read and parsed to keep the project directory clean.
Suggested change:

```ts
const outputRaw = fs.readFileSync(outputFilePath, "utf-8");
let umOutput: UniversalMakerOutput;
try {
  umOutput = JSON.parse(outputRaw) as UniversalMakerOutput;
} catch (e) {
  throw new FirebaseError(`Failed to parse build_output.json: ${(e as Error).message}`);
} finally {
  fs.rmSync(outputFilePath, { force: true });
}
```
```ts
): Promise<string> {
  const tmpFile = tmp.fileSync({ prefix: `${config.backendId}-`, postfix: ".tar.gz" }).name;

  const isAppHostingDir =
    targetSubDir === ".apphosting" ||
    (!!targetSubDir && path.basename(targetSubDir) === ".apphosting");
  const targetDir = targetSubDir
    ? path.isAbsolute(targetSubDir)
      ? targetSubDir
      : path.join(rootDir, targetSubDir)
    : rootDir;
  const ignore = ["firebase-debug.log", "firebase-debug.*.log", ".git"];

  let archiveCwd = rootDir;
  let pathsToPack: string[];

  if (isAppHostingDir) {
    // create temporary directory to bundle things flattened
    const tempDir = tmp.dirSync({ unsafeCleanup: true }).name;
    fs.cpSync(targetDir, tempDir, { recursive: true });

    const rootPackageJson = path.join(rootDir, "package.json");
    if (fs.existsSync(rootPackageJson)) {
      fs.copyFileSync(rootPackageJson, path.join(tempDir, "package.json"));
    }
    const rootFiles = fs.readdirSync(rootDir);
    for (const file of rootFiles) {
      if (APPHOSTING_YAML_FILE_REGEX.test(file)) {
        fs.copyFileSync(path.join(rootDir, file), path.join(tempDir, file));
      }
    }

    const rdrFiles = await fsAsync.readdirRecursive({
      path: tempDir,
      ignore: ignore,
      isGitIgnore: false,
    });
    pathsToPack = rdrFiles.map((rdrf) => path.relative(tempDir, rdrf.name));
    archiveCwd = tempDir;
  } else {
    const rdrFiles = await fsAsync.readdirRecursive({
      path: targetDir,
      ignore: ignore,
      isGitIgnore: !targetSubDir, // Disable gitignore if we are anchored to a build output subdirectory
    });
    pathsToPack = rdrFiles.map((rdrf) => path.relative(rootDir, rdrf.name));

    if (targetSubDir) {
      const defaultFiles = fs.readdirSync(rootDir).filter((file) => {
        return APPHOSTING_YAML_FILE_REGEX.test(file);
      });
      for (const file of defaultFiles) {
        const relativePath = path.relative(rootDir, path.join(rootDir, file));
        if (!pathsToPack.includes(relativePath)) {
          pathsToPack.push(relativePath);
        }
      }
    }
  }

  // `tar` returns a `TypeError` if `pathsToPack` is empty. Let's check a few things.
  try {
    fs.statSync(archiveCwd);
  } catch (err: unknown) {
    if (err instanceof Error && "code" in err && err.code === "ENOENT") {
      throw new FirebaseError(`Could not read directory "${archiveCwd}"`);
    }
    throw err;
  }
  if (!pathsToPack.length) {
    throw new FirebaseError(
      `Cannot create a tar archive with 0 files from directory "${archiveCwd}"`,
    );
  }

  await tar.create(
    {
      gzip: true,
      file: tmpFile,
      cwd: archiveCwd,
      portable: true,
    },
    pathsToPack,
  );
  return tmpFile;
}
```
The temporary directory created by tmp.dirSync should be explicitly cleaned up using the removeCallback provided by the tmp library. This ensures that temporary files are removed even if the process stays alive for other tasks. Wrapping the logic in a try...finally block is the recommended pattern for this.
```ts
export async function createLocalBuildTarArchive(
  config: AppHostingSingle,
  rootDir: string,
  targetSubDir?: string,
): Promise<string> {
  const tmpFile = tmp.fileSync({ prefix: `${config.backendId}-`, postfix: ".tar.gz" }).name;
  const isAppHostingDir =
    targetSubDir === ".apphosting" ||
    (!!targetSubDir && path.basename(targetSubDir) === ".apphosting");
  const targetDir = targetSubDir
    ? path.isAbsolute(targetSubDir)
      ? targetSubDir
      : path.join(rootDir, targetSubDir)
    : rootDir;
  const ignore = ["firebase-debug.log", "firebase-debug.*.log", ".git"];
  let archiveCwd = rootDir;
  let pathsToPack: string[];
  let tempDirObj: tmp.DirResult | undefined;
  try {
    if (isAppHostingDir) {
      // create temporary directory to bundle things flattened
      tempDirObj = tmp.dirSync({ unsafeCleanup: true });
      const tempDir = tempDirObj.name;
      fs.cpSync(targetDir, tempDir, { recursive: true });
      const rootPackageJson = path.join(rootDir, "package.json");
      if (fs.existsSync(rootPackageJson)) {
        fs.copyFileSync(rootPackageJson, path.join(tempDir, "package.json"));
      }
      const rootFiles = fs.readdirSync(rootDir);
      for (const file of rootFiles) {
        if (APPHOSTING_YAML_FILE_REGEX.test(file)) {
          fs.copyFileSync(path.join(rootDir, file), path.join(tempDir, file));
        }
      }
      const rdrFiles = await fsAsync.readdirRecursive({
        path: tempDir,
        ignore: ignore,
        isGitIgnore: false,
      });
      pathsToPack = rdrFiles.map((rdrf) => path.relative(tempDir, rdrf.name));
      archiveCwd = tempDir;
    } else {
      const rdrFiles = await fsAsync.readdirRecursive({
        path: targetDir,
        ignore: ignore,
        isGitIgnore: !targetSubDir, // Disable gitignore if we are anchored to a build output subdirectory
      });
      pathsToPack = rdrFiles.map((rdrf) => path.relative(rootDir, rdrf.name));
      if (targetSubDir) {
        const defaultFiles = fs.readdirSync(rootDir).filter((file) => {
          return APPHOSTING_YAML_FILE_REGEX.test(file);
        });
        for (const file of defaultFiles) {
          const relativePath = path.relative(rootDir, path.join(rootDir, file));
          if (!pathsToPack.includes(relativePath)) {
            pathsToPack.push(relativePath);
          }
        }
      }
    }
    try {
      fs.statSync(archiveCwd);
    } catch (err: unknown) {
      if (err instanceof Error && "code" in err && err.code === "ENOENT") {
        throw new FirebaseError(`Could not read directory "${archiveCwd}"`);
      }
      throw err;
    }
    if (!pathsToPack.length) {
      throw new FirebaseError(
        `Cannot create a tar archive with 0 files from directory "${archiveCwd}"`,
      );
    }
    await tar.create(
      {
        gzip: true,
        file: tmpFile,
        cwd: archiveCwd,
        portable: true,
      },
      pathsToPack,
    );
    return tmpFile;
  } finally {
    tempDirObj?.removeCallback();
  }
}
```
Description
This takes the existing local builds solution and uses the Universal Maker binary (which runs all relevant buildpacks) instead of hackily running the apphosting adapter manually.
The Universal Maker is a better-supported tool and behaves more similarly to Cloud Build, so we can have more confidence in framework support and fidelity.
This CL is for quick-and-dirty testing.
Scenarios Tested
Created a local build with the
Sample Commands