feat: --source flag added #45


Open · wants to merge 7 commits into main from DC-4829-source-flag

Conversation

@aidankmcalister (Member) commented Aug 21, 2025

Summary by CodeRabbit

  • New Features

    • Database creation now derives a source from project environment variables and includes it as source/utm_source in JSON output, claim URLs and analytics.
  • UX Improvements

    • Derived source is propagated across interactive, flag, and JSON flows for consistent attribution.
    • Analytics events include the derived user agent when available; analytics failures are logged for troubleshooting.
  • Chores

    • Analytics now require explicit host and API key from the environment (no implicit fallbacks); public API signatures unchanged.


cloudflare-workers-and-pages bot commented Aug 21, 2025

Deploying with Cloudflare Workers

The latest updates on your project. Learn more about integrating Git with Workers.

✅ Deployment successful! (View logs)
  • Name: claim-db-worker
  • Latest Commit: 76d7ec7
  • Preview URLs: Commit Preview URL · Branch Preview URL
  • Updated (UTC): Aug 22 2025, 08:03 PM


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17134940199:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17134940199.


coderabbitai bot commented Aug 21, 2025

Walkthrough

Reads PRISMA_ACTOR_NAME and PRISMA_ACTOR_PROJECT from the project's .env to derive an optional userAgent and propagate it through region selection, database creation requests (utm_source), analytics events, the claim URL, and JSON output; also tightens the PostHog config to require an explicit host and API key from the environment, disabling analytics when either is missing.
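
For reference, a minimal sketch of that derivation (assuming dotenv.config() has already populated process.env; the helper name and the literal fallback string are illustrative, not code from this PR):

// Sketch only: derive an optional userAgent from the actor env vars, then
// fall back to the CLI name for utm_source when no userAgent can be derived.
function deriveUserAgent(env = process.env) {
  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = env;
  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
    : undefined;
}

const userAgent = deriveUserAgent();
const utmSource = userAgent ?? "create-db"; // the real code falls back to CLI_NAME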

Changes

  • Env reading & source derivation (create-db/index.js): Added readUserEnvFile() to parse a project .env (returns a map or an empty map). Derives userAgent as PRISMA_ACTOR_NAME/PRISMA_ACTOR_PROJECT when both exist.
  • Source propagation through control flow (create-db/index.js): Computes userAgent early in main() after args parsing and passes it to promptForRegion(defaultRegion, userAgent) and createDatabase(name, region, userAgent, returnJson) across interactive, flag, and JSON flows.
  • API request & output shaping (create-db/index.js): Request bodies and the claim URL use utm_source: userAgent (falling back to the existing CLI name when absent). JSON output includes a source field when userAgent is present.
  • Analytics integration (create-db/index.js): Analytics events (cli_command_ran, region_selected, database_creation_failed, invalid_json, etc.) now include userAgent when present; cli_command_ran logs has-source-from-env. console.error added for analytics capture failures.
  • PostHog config and error handling (create-db/analytics.js): Requires both POSTHOG_API_HOST and POSTHOG_API_KEY to enable analytics; builds POSTHOG_CAPTURE_URL as the trimmed POSTHOG_API_HOST + "/capture" with no proxy fallback; removes the default API key fallback; warns and early-returns when missing. Error logging restricted to development.
  • Public API / exports (create-db/*.js): No changes to exported/public function signatures (e.g., getRegions, validateRegion unchanged).

Sequence Diagram(s)

sequenceDiagram
  autonumber
  actor U as User
  participant C as CLI (create-db/index.js)
  participant E as EnvReader (.env)
  participant A as Analytics (create-db/analytics.js)
  participant S as API Service
  participant B as Browser (Claim URL)

  U->>C: run create-db [--json] [region/flags...]
  C->>C: parse args
  C->>E: readUserEnvFile()
  alt .env contains both keys
    E-->>C: {PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT}
    C->>C: userAgent = "NAME/PROJECT"
  else no userAgent
    E-->>C: {}
    C->>C: userAgent = undefined
  end

  C->>A: cli_command_ran {has-source-from-env?, userAgent?}

  alt interactive
    C->>U: promptForRegion(defaultRegion, userAgent)
    U-->>C: region
    C->>A: region_selected {region, userAgent?}
  else region provided via flag/JSON
    C->>A: region_selected {region, userAgent?}
  end

  C->>S: createDatabase {name, region, utm_source: userAgent || CLI_NAME}
  alt success
    S-->>C: db info + claimUrl(utm_source)
    C-->>U: output (JSON includes source when userAgent present)
    C->>B: open claimUrl (utm_source)
  else failure
    S-->>C: error
    C->>A: database_creation_failed {reason, userAgent?}
    C-->>U: error
  end





Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17135144702:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17135144702.

@coderabbitai bot left a comment

Actionable comments posted: 6

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
create-db/index.js (3)

96-116: Help output doesn’t document the new --source flag.

Users won’t discover the feature or its behavior from --help. Add the flag to Options and Examples.

   ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
   ${chalk.yellow("--list-regions")}                  List available regions and exit
   ${chalk.yellow("--help, -h")}                      Show this help message
+  ${chalk.yellow("--source, -s")}                    Derive utm_source from CTA_* in your project .env

 ...
   ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+  ${chalk.gray(`npx ${CLI_NAME} --source --region us-east-1`)}

318-329: utm_source: avoid undefined, keep backward-compat, and URL-encode.

  • If source is unset, the request body omits utm_source and the claim URL renders utm_source=undefined. That breaks existing attribution and pollutes links.
  • Fallback to CLI_NAME and URL-encode the value in the claim URL.
 async function createDatabase(name, region, source, returnJson = false ) {
   let s;
   if (!returnJson) {
     s = spinner();
     s.start("Creating your database...");
   }
 
+  const utmSource = source || CLI_NAME;
+
   const resp = await fetch(`${CREATE_DB_WORKER_URL}/create`, {
     method: "POST",
     headers: { "Content-Type": "application/json" },
-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: utmSource }),
   });
 ...
-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${encodeURIComponent(utmSource)}&utm_medium=cli`;

Also applies to: 422-422


543-557: Analytics: compute has-source-flag from parsed flags, not argv text.

rawArgs.includes("-s") misses combined short forms (e.g., -rs). Use Boolean(flags.source).

-        "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+        "has-source-flag": Boolean(flags.source),
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 1caa532 and a3733fe.

📒 Files selected for processing (1)
  • create-db/index.js (17 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (1)
create-db/index.js (1)

631-636: LGTM: source plumbed through interactive/JSON/non-interactive flows.

Passing source into promptForRegion and createDatabase across code paths looks consistent.

Also applies to: 660-666

@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
create-db/index.js (2)

102-115: Update help text to document the new --source flag (and example).

Users won’t discover the flag via --help. Add an option line and an example.

   ${chalk.yellow("--json, -j")}                      Output machine-readable JSON and exit
+  ${chalk.yellow("--source, -s")}                    Derive a UTM source from CTA_* vars in your .env and include in analytics and API calls
   ${chalk.yellow("--list-regions")}                  List available regions and exit
@@
   ${chalk.gray(`npx ${CLI_NAME} --json --region us-east-1`)}
+  ${chalk.gray(`npx ${CLI_NAME} --interactive --source`)}

318-329: Preserve utm_source when --source is absent.

Currently, utm_source is omitted from the JSON body when source is undefined. To keep existing attribution, fall back to CLI_NAME.

-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: source ?? CLI_NAME }),
♻️ Duplicate comments (6)
create-db/index.js (6)

170-173: Await and return on -h single-short-flag path.

Without await + return, downstream code may still run in some harnesses/mocks where process.exit is stubbed.

-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }

544-565: Analytics: use parsed flags for “has-source-flag” and reuse the helper.

Scanning rawArgs misses combos like -rs. Use Boolean(flags.source) and the helper to attach source.

-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
         "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
         "has-region-flag":
           rawArgs.includes("--region") || rawArgs.includes("-r"),
         "has-interactive-flag":
           rawArgs.includes("--interactive") || rawArgs.includes("-i"),
         "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
         "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
-        "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "has-source-flag": Boolean(flags.source),
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
       };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+      await captureWithSource("create_db:cli_command_ran", analyticsProps, source);

4-5: Use node: protocol for built-ins.

Prefer node: specifiers for core modules to avoid resolution ambiguity and align with Node guidance.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll a .env parser; rely on process.env (dotenv.config) instead.

This parser will mis-handle comments, whitespace, CRLF, export prefixes, and quoted/multiline values. You already call dotenv.config() at startup, so just derive the source from process.env.

Replace this helper with a simple derivation helper:

-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, '.env');
-  
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-  
-  const envContent = fs.readFileSync(envPath, 'utf8');
-  const envVars = {};
-  
-  envContent.split('\n').forEach(line => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith('#')) {
-      const [key, ...valueParts] = trimmed.split('=');
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join('=').replace(/^["']|["']$/g, '');
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-  
-  return envVars;
-}
+function deriveSourceFromProcessEnv() {
+  const { CTA_VERSION, CTA_FRAMEWORK, CTA_FRAMEWORK_VERSION } = process.env;
+  const parts = [];
+  if (CTA_VERSION) parts.push(`v${CTA_VERSION}`);
+  if (CTA_FRAMEWORK) parts.push(CTA_FRAMEWORK);
+  if (CTA_FRAMEWORK_VERSION) parts.push(`fv${CTA_FRAMEWORK_VERSION}`);
+  return parts.length ? parts.join("-") : undefined;
+}

302-313: DRY analytics: centralize “attach source if defined”.

The same “conditionally add source” payload logic repeats across 5 blocks. Extract a helper and use it here to reduce duplication and missed cases.

Add near the top (after the analytics import):

+function captureWithSource(event, props, maybeSource) {
+  const payload = maybeSource ? { ...props, source: maybeSource } : props;
+  return analytics.capture(event, payload);
+}

Then refactor these blocks:

Interactive region selection:

-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-    
-    if (source) {
-      analyticsProps.source = source;
-    }
-    
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      source
+    );

Rate-limit error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        source
+      );

Invalid JSON error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        source
+      );

API error:

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "api_error",
-        "error-message": result.error.message,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message },
+        source
+      );

Region selected via flag:

-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-        
-        if (source) {
-          analyticsProps.source = source;
-        }
-        
-        await analytics.capture("create_db:region_selected", analyticsProps);
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          source
+        );

Also applies to: 348-360, 383-395, 453-465, 590-601


523-541: Duplicate source derivation/validation; compute once via process.env and validate once.

You compute source from .env twice and perform two validations. Collapse into a single derivation from process.env (already populated by dotenv.config()), then validate once.

Derivation:

-          let source;
-    if (flags.source) {
-      const userEnvVars = readUserEnvFile();
-      const userCwd = process.cwd();
-      const envPath = path.join(userCwd, '.env');
-      
-      if (fs.existsSync(envPath)) {
-        const ctaVars = [];
-        if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`);
-        if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK);
-        if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`);
-        
-        if (ctaVars.length > 0) {
-          source = ctaVars.join('-');
-        }
-      }
-    }
+    let source = undefined;
+    if (flags.source) {
+      source = deriveSourceFromProcessEnv();
+    }

Validation:

-    if (flags.source) {
-      const userCwd = process.cwd();
-      const envPath = path.join(userCwd, '.env');
-      
-      if (!fs.existsSync(envPath)) {
-        console.error(chalk.red("Error: Source not configured correctly."));
-        process.exit(1);
-      }
-      
-      const userEnvVars = readUserEnvFile();
-      const ctaVars = [];
-      if (userEnvVars.CTA_VERSION) ctaVars.push(`v${userEnvVars.CTA_VERSION}`);
-      if (userEnvVars.CTA_FRAMEWORK) ctaVars.push(userEnvVars.CTA_FRAMEWORK);
-      if (userEnvVars.CTA_FRAMEWORK_VERSION) ctaVars.push(`fv${userEnvVars.CTA_FRAMEWORK_VERSION}`);
-      
-      if (ctaVars.length === 0) {
-        console.error(chalk.red("Error: Source not configured correctly."));
-        process.exit(1);
-      }
-    }
+    if (flags.source && !source) {
+      console.error(
+        chalk.red(
+          "Error: Source not configured correctly. Expected CTA_VERSION, CTA_FRAMEWORK and/or CTA_FRAMEWORK_VERSION in .env."
+        )
+      );
+      process.exit(1);
+    }

Also applies to: 604-623

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between a3733fe and 4993671.

📒 Files selected for processing (1)
  • create-db/index.js (18 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (4)
create-db/index.js (4)

129-137: Flag plumbing for --source looks good.

Allowed flags and shorthand mapping are correctly extended.


277-277: Signature change to accept source is appropriate.

Passing source into the region prompt enables consistent analytics enrichment across flows.


435-435: JSON output includes source context — good.

Returning source (or null) in JSON response is helpful for end-to-end validation in CI and downstream tooling.


632-636: Propagation of source through interactive/JSON and create calls looks consistent.

Passing source to promptForRegion and createDatabase in all paths ensures consistent analytics and request attribution.

Also applies to: 660-662, 666-666


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17136895926:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17136895926.


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17136985429:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17136985429.

@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
create-db/index.js (2)

310-321: Preserve UTM segmentation when source is absent.

Requests previously used CLI_NAME for utm_source; with this change, the key disappears when source is undefined. If the backend expects or benefits from a default, fall back to CLI_NAME.

-    body: JSON.stringify({ region, name, utm_source: source }),
+    body: JSON.stringify({ region, name, utm_source: source || CLI_NAME }),

524-547: DRY analytics: reuse captureWithSource; add “has-source-flag” if you ship the flag.

Replace manual payload building and optional source-mutation with the helper; also consider tracking has-source-flag separately from has-source-from-env.

-      const analyticsProps = {
-        command: CLI_NAME,
-        "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
-        "has-source-from-env": !!source,
-        "node-version": process.version,
-        platform: process.platform,
-        arch: process.arch,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+      await captureWithSource(
+        "create_db:cli_command_ran",
+        {
+          command: CLI_NAME,
+          "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
+          "has-region-flag": rawArgs.includes("--region") || rawArgs.includes("-r"),
+          "has-interactive-flag": rawArgs.includes("--interactive") || rawArgs.includes("-i"),
+          "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
+          "has-list-regions-flag": rawArgs.includes("--list-regions"),
+          "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+          "has-source-from-env": !!source && !rawArgs.includes("--source") && !rawArgs.includes("-s"),
+          "has-source-flag": rawArgs.includes("--source") || rawArgs.includes("-s"),
+          "node-version": process.version,
+          platform: process.platform,
+          arch: process.arch,
+        },
+        source
+      );
♻️ Duplicate comments (8)
create-db/index.js (8)

166-177: Bug: help short-flag path doesn’t await showHelp() or return.

Single short-flag branch should mirror the combined short-flags branch to avoid falling through.

       if (shorthandMap[short]) {
         const mappedFlag = shorthandMap[short];
-        if (mappedFlag === "help") showHelp();
+        if (mappedFlag === "help") { await showHelp(); return; }
         if (mappedFlag === "region") {
           const region = args[i + 1];
           if (!region || region.startsWith("-"))
             exitWithError("Missing value for -r flag.");
           flags.region = region;
           i++;
         } else {
           flags[mappedFlag] = true;
         }
       }

269-305: DRY analytics capture; introduce captureWithSource helper.

You repeat “attach source if defined” pattern. Centralize to avoid drift.

   try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-    
-    if (source) {
-      analyticsProps.source = source;
-    }
-    
-    await analytics.capture("create_db:region_selected", analyticsProps);
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      source
+    );
   } catch (error) {}

Add once near the top (after the analytics import):

function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}

4-5: Use node: protocol for built-ins.

Prefer node:fs and node:path to match Node’s guidance and avoid resolution ambiguity.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll .env parsing; use dotenv.parse or process.env.

The custom parser mishandles quotes, CRLF, inline comments, export prefixes, and values containing '='. Since dotenv.config() is already called, prefer process.env; minimally, swap in dotenv.parse.

Minimal, safer fix for this helper:

 function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, '.env');
-  
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-  
-  const envContent = fs.readFileSync(envPath, 'utf8');
-  const envVars = {};
-  
-  envContent.split('\n').forEach(line => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith('#')) {
-      const [key, ...valueParts] = trimmed.split('=');
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join('=').replace(/^["']|["']$/g, '');
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-  
-  return envVars;
+  const envPath = path.join(process.cwd(), ".env");
+  if (!fs.existsSync(envPath)) return {};
+  return dotenv.parse(fs.readFileSync(envPath, "utf8"));
 }

Better: delete this helper and read from process.env everywhere (populated by dotenv.config()).


340-352: Repeat pattern: use captureWithSource helper (rate-limit path).

Apply the helper to keep payload construction consistent.

-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        source
+      );

375-387: Repeat pattern: use captureWithSource helper (invalid JSON path).

-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-      
-      if (source) {
-        analyticsProps.source = source;
-      }
-      
-      await analytics.capture("create_db:database_creation_failed", analyticsProps);
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        source
+      );

570-581: Repeat pattern: use captureWithSource helper (region flag path).

-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-        
-        if (source) {
-          analyticsProps.source = source;
-        }
-        
-        await analytics.capture("create_db:region_selected", analyticsProps);
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          source
+        );

414-414: Bug: claim URL produces “utm_source=undefined” when source is missing.

This pollutes analytics and produces a literal “undefined” in links. Fall back to CLI_NAME and ensure proper encoding.

-  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${source}&utm_medium=cli`;
+  const utmSource = encodeURIComponent(source || CLI_NAME);
+  const claimUrl = `${CLAIM_DB_WORKER_URL}?projectID=${projectId}&utm_source=${utmSource}&utm_medium=cli`;
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 4993671 and f628681.

📒 Files selected for processing (1)
  • create-db/index.js (14 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (48-48)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (2)
create-db/index.js (2)

426-427: LGTM: include source in JSON output.

Returning source (or null) in --json aligns with observability/traceability goals.


594-599: LGTM: threaded source through interactive/JSON and default flows.

Passing source into promptForRegion and createDatabase maintains consistent tracing across modes.

Also applies to: 622-627


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164565333:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164565333.

@coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
create-db/index.js (1)

310-321: Preserve default attribution when env-derived source is absent.

When userAgent is undefined, utm_source drops from the payload entirely. If you previously attributed to the CLI, you can keep continuity with a fallback.

-    body: JSON.stringify({ region, name, utm_source: userAgent }),
+    body: JSON.stringify({ region, name, utm_source: userAgent || CLI_NAME }),
♻️ Duplicate comments (9)
create-db/index.js (9)

269-305: DRY analytics: factor “attach source if present” into a helper; also standardize field name.

This pattern appears multiple times. Extract a helper to avoid drift and use a consistent key (source) instead of user-agent (which can be confused with the HTTP header).

-  try {
-    const analyticsProps = {
-      command: CLI_NAME,
-      region: region,
-      "selection-method": "interactive",
-    };
-
-    if (userAgent) {
-      analyticsProps["user-agent"] = userAgent;
-    }
-
-    await analytics.capture("create_db:region_selected", analyticsProps);
-  } catch (error) {}
+  try {
+    await captureWithSource(
+      "create_db:region_selected",
+      { command: CLI_NAME, region, "selection-method": "interactive" },
+      userAgent
+    );
+  } catch {}

Add near the top-level (after the analytics import):

function captureWithSource(event, props, maybeSource) {
  const payload = maybeSource ? { ...props, source: maybeSource } : props;
  return analytics.capture(event, payload);
}

538-561: Analytics props: compute “has-*” from parsed flags; keep analytics failures silent for users.

  • Use flags instead of scanning rawArgs to avoid missing combined short flags and quoting edge cases.
  • Avoid logging analytics errors to stdout/stderr except in development, to not confuse CLI users.
-      const analyticsProps = {
+      const analyticsProps = {
         command: CLI_NAME,
         "full-command": `${CLI_NAME} ${rawArgs.join(" ")}`.trim(),
-        "has-region-flag":
-          rawArgs.includes("--region") || rawArgs.includes("-r"),
-        "has-interactive-flag":
-          rawArgs.includes("--interactive") || rawArgs.includes("-i"),
-        "has-help-flag": rawArgs.includes("--help") || rawArgs.includes("-h"),
-        "has-list-regions-flag": rawArgs.includes("--list-regions"),
-        "has-json-flag": rawArgs.includes("--json") || rawArgs.includes("-j"),
+        "has-region-flag": Boolean(flags.region),
+        "has-interactive-flag": Boolean(flags.interactive),
+        "has-help-flag": Boolean(flags.help),
+        "has-list-regions-flag": Boolean(flags["list-regions"]),
+        "has-json-flag": Boolean(flags.json),
         "has-source-from-env": !!userAgent,
         "node-version": process.version,
         platform: process.platform,
         arch: process.arch,
       };
@@
-      await analytics.capture("create_db:cli_command_ran", analyticsProps);
-    } catch (error) {
-      console.error("Error:", error.message);
-    }
+      await analytics.capture("create_db:cli_command_ran", analyticsProps);
+    } catch (error) {
+      if (process.env.NODE_ENV === "development") {
+        console.error("Analytics error:", error.message);
+      }
+    }

4-5: Use node: protocol for built-ins (or drop imports if the custom .env parser is removed).

Follow Node guidance and avoid resolution ambiguity by using node: specifiers.

-import fs from "fs";
-import path from "path";
+import fs from "node:fs";
+import path from "node:path";

62-85: Don’t hand-roll a .env parser; rely on dotenv/process.env and centralize source derivation.

This parser mishandles edge cases (quotes, CRLF, inline comments, export, multiline). You already call dotenv.config(); read from process.env and delete this function. Also, add a tiny helper to compute the source once.

Replace this block with:

-function readUserEnvFile() {
-  const userCwd = process.cwd();
-  const envPath = path.join(userCwd, ".env");
-
-  if (!fs.existsSync(envPath)) {
-    return {};
-  }
-
-  const envContent = fs.readFileSync(envPath, "utf8");
-  const envVars = {};
-
-  envContent.split("\n").forEach((line) => {
-    const trimmed = line.trim();
-    if (trimmed && !trimmed.startsWith("#")) {
-      const [key, ...valueParts] = trimmed.split("=");
-      if (key && valueParts.length > 0) {
-        const value = valueParts.join("=").replace(/^["']|["']$/g, "");
-        envVars[key.trim()] = value.trim();
-      }
-    }
-  });
-
-  return envVars;
-}
+function getSourceFromEnv() {
+  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = process.env;
+  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
+    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
+    : undefined;
+}

339-355: DRY analytics for rate-limit error path.

Use the same captureWithSource helper here.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "rate_limit",
-        "status-code": 429,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch (error) {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "rate_limit", "status-code": 429 },
+        userAgent
+      );
+    } catch {}

378-393: DRY analytics for invalid JSON error path.

Same duplication; use the helper.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region,
-        "error-type": "invalid_json",
-        "status-code": resp.status,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "invalid_json", "status-code": resp.status },
+        userAgent
+      );
+    } catch {}

456-471: DRY analytics for API error path.

Replace manual prop building with the shared helper.

-    try {
-      const analyticsProps = {
-        command: CLI_NAME,
-        region: region,
-        "error-type": "api_error",
-        "error-message": result.error.message,
-      };
-
-      if (userAgent) {
-        analyticsProps["user-agent"] = userAgent;
-      }
-
-      await analytics.capture(
-        "create_db:database_creation_failed",
-        analyticsProps
-      );
-    } catch (error) {}
+    try {
+      await captureWithSource(
+        "create_db:database_creation_failed",
+        { command: CLI_NAME, region, "error-type": "api_error", "error-message": result.error.message },
+        userAgent
+      );
+    } catch {}

529-536: Derive userAgent from process.env (or flags), not by re-reading .env. Also PR title vs code mismatch.

  • Replace readUserEnvFile() usage with a one-liner reading process.env (dotenv already ran).
  • PR title says “--source flag added” but there is no such flag parsed/helped here. Either implement it or update the title.
-    let userAgent;
-    const userEnvVars = readUserEnvFile();
-    if (userEnvVars.PRISMA_ACTOR_NAME && userEnvVars.PRISMA_ACTOR_PROJECT) {
-      userAgent = `${userEnvVars.PRISMA_ACTOR_NAME}/${userEnvVars.PRISMA_ACTOR_PROJECT}`;
-    }
+    // Flag (if implemented) should override ENV; otherwise pull from ENV.
+    let userAgent = /* flags.source ?? */ getSourceFromEnv();

Would you like a follow-up patch to add --source/-s (help text, allowed flags, and parsing) so it overrides the env-derived value?
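
As a sketch of that override (names follow the suggestions above; this is illustrative, not code in the PR):

// Hypothetical: an explicit --source/-s value takes precedence over the env-derived source.
function resolveSource(flags, env = process.env) {
  if (typeof flags.source === "string" && flags.source.length > 0) {
    return flags.source; // explicit flag value wins
  }
  const { PRISMA_ACTOR_NAME, PRISMA_ACTOR_PROJECT } = env;
  return PRISMA_ACTOR_NAME && PRISMA_ACTOR_PROJECT
    ? `${PRISMA_ACTOR_NAME}/${PRISMA_ACTOR_PROJECT}`
    : undefined;
}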


584-595: DRY analytics for region flag path.

Use the shared helper for consistency with the interactive path.

-      try {
-        const analyticsProps = {
-          command: CLI_NAME,
-          region: region,
-          "selection-method": "flag",
-        };
-
-        if (userAgent) {
-          analyticsProps["user-agent"] = userAgent;
-        }
-
-        await analytics.capture("create_db:region_selected", analyticsProps);
-      } catch (error) {}
+      try {
+        await captureWithSource(
+          "create_db:region_selected",
+          { command: CLI_NAME, region, "selection-method": "flag" },
+          userAgent
+        );
+      } catch {}
📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 8bfa0ae and 4666b46.

📒 Files selected for processing (2)
  • create-db/analytics.js (1 hunks)
  • create-db/index.js (15 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
create-db/index.js (1)
create-db/analytics.js (1)
  • analytics (50-50)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker
🔇 Additional comments (3)
create-db/index.js (3)

424-439: JSON output: field name is consistent and optional — LGTM.

Including source only when provided avoids changing the default contract for existing consumers.


605-610: Propagating userAgent through the JSON path — LGTM.

Passing the source into promptForRegion and createDatabase keeps attribution consistent in non-interactive mode.


634-639: Propagating userAgent through the interactive path — LGTM.

Attribution remains consistent across flows.


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164722487:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164722487.

@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (4)
create-db/analytics.js (4)

24-31: Bound the network call with a short timeout and make it fire-and-forget.

A hung analytics POST can stall CLI UX. Add an AbortController timeout (configurable) and mark the request keepalive.

Apply this diff:

-    try {
-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+    try {
+      const controller = new AbortController();
+      const timeoutMs = Number(process.env.ANALYTICS_TIMEOUT_MS ?? 2000);
+      const timeout = setTimeout(() => controller.abort(), timeoutMs);
+      const response = await fetch(POSTHOG_CAPTURE_URL, {
         method: "POST",
         headers: {
           "Content-Type": "application/json",
         },
-        body: JSON.stringify(payload),
+        body: JSON.stringify(payload),
+        keepalive: true,
+        signal: controller.signal,
       });
+      clearTimeout(timeout);

33-35: Include HTTP status code for easier debugging.

Status text can be empty; include the numeric code for clarity.

Apply this diff:

-      if (!response.ok) {
-        throw new EventCaptureError(eventName, response.statusText);
-      }
+      if (!response.ok) {
+        throw new EventCaptureError(
+          eventName,
+          `${response.status} ${response.statusText || "Unknown"}`
+        );
+      }

17-18: Avoid per-event random distinct_id; prefer a stable, privacy-safe identifier.

A new randomUUID per event prevents session/user-level aggregation. Consider a stable distinct_id (e.g., persisted machine/session ID, or an env-provided userAgent/source when privacy permits). Keep $process_person_profile: false to avoid PII.

I can draft a tiny helper that persists a UUID at ~/.config/prisma/create-db/machine-id (or respects XDG) and falls back to random when unwritable—say the word.
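
A possible shape for that helper, sketched under the assumptions above (the config path and fallback behavior are assumptions, not part of this PR):

import fs from "node:fs";
import os from "node:os";
import path from "node:path";
import { randomUUID } from "node:crypto";

// Return a persisted machine ID, creating it on first use; fall back to a fresh
// per-run UUID if the config location is unwritable.
function getStableDistinctId() {
  const base = process.env.XDG_CONFIG_HOME || path.join(os.homedir(), ".config");
  const file = path.join(base, "prisma", "create-db", "machine-id");
  try {
    return fs.readFileSync(file, "utf8").trim();
  } catch {
    const id = randomUUID();
    try {
      fs.mkdirSync(path.dirname(file), { recursive: true });
      fs.writeFileSync(file, id, "utf8");
    } catch {
      // Unwritable: keep the per-run UUID.
    }
    return id;
  }
}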


37-41: Optional: add a targeted debug toggle.

Logging only in NODE_ENV=development is fine. If you want opt-in visibility on CI without flipping NODE_ENV, consider also honoring DEBUG=create-db:analytics.

Proposed tweak inside catch:

if (
  process.env.NODE_ENV === "development" ||
  process.env.DEBUG === "create-db:analytics"
) {
  console.error("Analytics error:", error.message);
}
♻️ Duplicate comments (1)
create-db/analytics.js (1)

12-12: Nice: hardcoded PostHog key removed.

This addresses the prior review about shipping a baked-in API key. Good change.

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between 4666b46 and be9f976.

📒 Files selected for processing (1)
  • create-db/analytics.js (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: claim-db-worker
  • GitHub Check: Workers Builds: create-db-worker


Preview CLIs & Workers are live!

Test the CLIs locally under tag pr45-DC-4829-source-flag-17164819065:

npx create-db@pr45
npx create-pg@pr45
npx create-postgres@pr45

Worker URLs
• Create-DB Worker:
• Claim-DB Worker:

These will live as long as this PR exists under tag pr45-DC-4829-source-flag-17164819065.

@coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (3)
create-db/analytics.js (3)

24-31: Allow callers to provide a stable distinct_id to improve event correlation.

Right now every event gets a fresh UUID, which fragments user sessions. Prefer a caller-provided distinct_id when available (e.g., a persisted CLI install ID), falling back to a UUID.

Apply this diff:

-      distinct_id: randomUUID(),
+      // Prefer caller-provided distinct_id for stable correlation across events
+      distinct_id: properties?.distinct_id ?? randomUUID(),

33-50: Add a short timeout via AbortController to avoid hanging on network stalls.

Without a timeout, fetch can hang indefinitely and delay CLI exit. Abort after a few seconds; still silent-fail outside development.

Apply this diff:

-    try {
-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+    const controller = new AbortController();
+    const timeoutId = setTimeout(() => controller.abort(), 5000);
+    try {
+      const response = await fetch(POSTHOG_CAPTURE_URL, {
         method: "POST",
         headers: {
           "Content-Type": "application/json",
         },
-        body: JSON.stringify(payload),
+        body: JSON.stringify(payload),
+        signal: controller.signal,
       });
 
       if (!response.ok) {
         throw new EventCaptureError(eventName, response.statusText);
       }
-    } catch (error) {
+    } catch (error) {
       // Silently fail analytics to not disrupt user experience
       if (process.env.NODE_ENV === "development") {
         console.error("Analytics error:", error.message);
       }
-    }
+    } finally {
+      clearTimeout(timeoutId);
+    }

42-44: Include numeric status code; statusText can be empty in Node fetch.

This improves diagnostics while keeping behavior unchanged.

Apply this diff:

-        throw new EventCaptureError(eventName, response.statusText);
+        const text = response.statusText || "Unknown Status";
+        throw new EventCaptureError(
+          eventName,
+          `${response.status} ${text}`.trim()
+        );
♻️ Duplicate comments (1)
create-db/analytics.js (1)

11-21: Resolved: analytics now fail-closed without a baked-in key; thanks for addressing prior concerns.

You now gate on both POSTHOG_API_HOST and POSTHOG_API_KEY and early-return, with a dev-only warning. This removes the committed default key and avoids posting to undefined/capture. Looks good.
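
For clarity, the gating described above looks roughly like this (variable names taken from the walkthrough; the actual structure of analytics.js may differ):

// Sketch: analytics are enabled only when both env vars are present; otherwise
// capture() warns in development and returns early, with no fallback host or key.
const POSTHOG_API_HOST = process.env.POSTHOG_API_HOST?.trim();
const POSTHOG_API_KEY = process.env.POSTHOG_API_KEY;
const analyticsEnabled = Boolean(POSTHOG_API_HOST && POSTHOG_API_KEY);
const POSTHOG_CAPTURE_URL = analyticsEnabled
  ? `${POSTHOG_API_HOST.replace(/\/+$/, "")}/capture`
  : undefined;

async function capture(eventName, properties) {
  if (!analyticsEnabled) {
    if (process.env.NODE_ENV === "development") {
      console.warn("Analytics disabled: POSTHOG_API_HOST and POSTHOG_API_KEY are required");
    }
    return;
  }
  // ...POST { api_key: POSTHOG_API_KEY, event: eventName, properties } to POSTHOG_CAPTURE_URL
}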

📜 Review details

Configuration used: CodeRabbit UI

Review profile: ASSERTIVE

Plan: Pro

💡 Knowledge Base configuration:

  • MCP integration is disabled by default for public repositories
  • Jira integration is disabled by default for public repositories
  • Linear integration is disabled by default for public repositories

You can enable these sources in your CodeRabbit configuration.

📥 Commits

Reviewing files that changed from the base of the PR and between be9f976 and 76d7ec7.

📒 Files selected for processing (1)
  • create-db/analytics.js (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (2)
  • GitHub Check: Workers Builds: create-db-worker
  • GitHub Check: Workers Builds: claim-db-worker
🔇 Additional comments (1)
create-db/analytics.js (1)

34-34: Confirm Node runtime guarantees global fetch; otherwise polyfill or guard.

If the CLI runs on Node < 18, global fetch is undefined. Either enforce engines >= 18 or add a lazy import/polyfill.

Would you verify the repo’s engines.node and runtime target? If engines < 18 or unspecified, I can add a tiny guard like:

-      const response = await fetch(POSTHOG_CAPTURE_URL, {
+      const _fetch = globalThis.fetch ?? (await import("node-fetch")).default;
+      const response = await _fetch(POSTHOG_CAPTURE_URL, {
