
Fix finance memory leak remove resize listener leak and layout shift #5285

Open
codxbrexx wants to merge 6 commits into asyncapi:master from codxbrexx:fix-finance-memory-leak

Conversation

@codxbrexx (Contributor) commented Mar 27, 2026

Description

This fixes the Finance page memory leak by removing the page-level resize listener that was not being cleaned up correctly. It also replaces the JavaScript width check with responsive CSS so the page no longer shifts layout after load.
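The leak class being fixed here can be sketched outside React. All names below (FakeTarget, mountLeaky, mountFixed) are illustrative stand-ins for window and an effect body, not code from this PR: a resize handler registered without a matching removal outlives the page, while the fixed pattern returns a cleanup function.

```typescript
// Illustrative sketch of the leak pattern this PR fixes; FakeTarget,
// mountLeaky, and mountFixed are stand-ins, not code from the PR.

type Listener = () => void;

// Tiny stand-in for window's event target so the pattern is runnable here.
class FakeTarget {
  private listeners = new Set<Listener>();

  addEventListener(_type: string, fn: Listener): void {
    this.listeners.add(fn);
  }

  removeEventListener(_type: string, fn: Listener): void {
    this.listeners.delete(fn);
  }

  get count(): number {
    return this.listeners.size;
  }
}

// Leaky pattern: the handler is registered but never removed, so it survives
// past the component's lifetime and keeps its closure alive.
function mountLeaky(target: FakeTarget): void {
  target.addEventListener('resize', () => {});
}

// Fixed pattern: registration returns a cleanup, which is exactly what a
// useEffect callback should return.
function mountFixed(target: FakeTarget): () => void {
  const onResize: Listener = () => {};
  target.addEventListener('resize', onResize);
  return () => target.removeEventListener('resize', onResize);
}

const target = new FakeTarget();
mountLeaky(target);                // "unmount" happens, listener stays behind
const cleanup = mountFixed(target);
cleanup();                         // the fixed handler is removed
console.log(target.count);         // 1: only the leaked listener remains
```

Dropping the listener entirely, as this PR does by moving the width logic into responsive CSS, sidesteps the cleanup question altogether.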

Related issue(s)

Resolves #5114

Summary by CodeRabbit

  • New Features

    • Ambassador profiles now show social links only when available.
  • Refactor

    • Simplified finance page layout to use a fixed, responsive container.
  • Tests

    • Added end-to-end tests covering Ambassadors page rendering and links.
    • Added unit tests for ambassador data enrichment logic.
  • Chores

    • CI pipeline expanded to include build and E2E steps.
    • Updated tooling metadata and adjusted TypeScript config to exclude test tooling directory.

netlify Bot commented Mar 27, 2026

Deploy Preview for asyncapi-website ready!

Built without sensitive environment variables

Name | Link
🔨 Latest commit | f5160a5
🔍 Latest deploy log | https://app.netlify.com/projects/asyncapi-website/deploys/69faf9178450940008b13ac8
😎 Deploy Preview | https://deploy-preview-5285--asyncapi-website.netlify.app

@asyncapi-bot (Contributor) commented Mar 27, 2026

We require all PRs to follow the Conventional Commits specification.
More details 👇🏼

 No release type found in pull request title "Fix finance memory leak remove resize listener leak and layout shift". Add a prefix to indicate what kind of release this pull request corresponds to. For reference, see https://www.conventionalcommits.org/

Available types:
 - feat: A new feature
 - fix: A bug fix
 - docs: Documentation only changes
 - style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
 - refactor: A code change that neither fixes a bug nor adds a feature
 - perf: A code change that improves performance
 - test: Adding missing tests or correcting existing tests
 - build: Changes that affect the build system or external dependencies (example scopes: gulp, broccoli, npm)
 - ci: Changes to our CI configuration files and scripts (example scopes: Travis, Circle, BrowserStack, SauceLabs)
 - chore: Other changes that don't modify src or test files
 - revert: Reverts a previous commit

@coderabbitai Bot commented Mar 27, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: c5007f62-b5d5-4cb7-bfcd-8cc9ddc87fcf

📥 Commits

Reviewing files that changed from the base of the PR and between 464012d and f5160a5.

📒 Files selected for processing (2)
  • config/tools.json
  • pages/community/ambassadors/index.tsx

📝 Walkthrough

This PR removes client-side resize handling from the Finance page, adds a reusable ambassadors utility and conditional social-link rendering, introduces unit and E2E tests for ambassadors, updates tsconfig to exclude cypress, adjusts tool names in config/tools.json, and expands the Node.js PR testing workflow to run tests, build, and E2E steps.

Changes

Finance Page Layout Fix

pages/finance.tsx (Layout Refactor): Removed the useState, useRef, and useEffect resize tracking and the conditional Container rendering; replaced them with a fixed Tailwind container (w-full 2xl:mx-auto 2xl:max-w-screen-xl).

Ambassadors Social Links Enhancement

utils/ambassadors.ts (Data Shape / Utility): New addAdditionalUserInfo(user: Ambassador) clones the user, adds githubUrl, linkedinUrl, and twitterUrl, and sets img to the GitHub avatar when appropriate.
pages/community/ambassadors/index.tsx (Component Integration): Replaced the in-file helper with an import from utils/ambassadors; Twitter/GitHub/LinkedIn links now render only when their URLs exist.
tests/pages/community/ambassadors.test.ts (Unit Tests): New unit tests for addAdditionalUserInfo covering social URL generation and image handling across scenarios.
cypress/ambassadors.cy.ts (E2E Tests): New Cypress suite validating ambassador card rendering, conditional social links, link attributes, image loading, navigation, layout assertions, and edge cases.
tsconfig.json (TypeScript Config): Added cypress to the exclude array to avoid type-checking test files.
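A hedged sketch of what the extracted ambassadors utility plausibly looks like. The field names (github, linkedin, twitter, img) and the URL formats are assumptions based on the summary above, not the repository's actual code:

```typescript
// Hedged sketch of utils/ambassadors.ts; field names and URL formats are
// assumptions from the PR summary, not the actual source.

interface Ambassador {
  name: string;
  github?: string;
  linkedin?: string;
  twitter?: string;
  img?: string;
}

interface EnrichedAmbassador extends Ambassador {
  githubUrl?: string;
  linkedinUrl?: string;
  twitterUrl?: string;
}

function addAdditionalUserInfo(user: Ambassador): EnrichedAmbassador {
  const enriched: EnrichedAmbassador = { ...user }; // clone, never mutate input

  if (user.github) {
    enriched.githubUrl = `https://www.github.com/${user.github}`;
    // Fall back to the GitHub avatar when no explicit image is set.
    if (!enriched.img) {
      enriched.img = `https://www.github.com/${user.github}.png`;
    }
  }
  if (user.linkedin) {
    enriched.linkedinUrl = `https://www.linkedin.com/in/${user.linkedin}`;
  }
  if (user.twitter) {
    enriched.twitterUrl = `https://www.twitter.com/${user.twitter}`;
  }
  return enriched;
}

const sample = addAdditionalUserInfo({ name: 'Ada', github: 'ada' });
console.log(sample.githubUrl); // https://www.github.com/ada
```

Leaving absent handles as undefined is what lets the component render each social link conditionally.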

Tools Configuration Update

config/tools.json (Config Data): Replaced two "Liquid" entries with "AsyncAPI CLI" in the Bielu.AspNetCore.AsyncApi and ZenWave SDK technology lists (data/config-only changes).

CI Workflow Enhancement

.github/workflows/if-nodejs-pr-testing.yml (Workflow Steps): Replaced the single "Test" step with three sequential steps: "Run tests" (npm test --if-present), "Build project" (npm run build --if-present), and "Run E2E tests (Cypress)" (npm run test:e2e --if-present); steps are gated on package.json existence and matrix conditions where applicable.
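The three replacement steps described above could look roughly like this fragment (the `if` guard shown is an assumption about how the package.json gate is wired, not copied from the actual workflow file):

```yaml
# Hypothetical shape of the replacement steps; the `if` guards are assumptions.
- name: Run tests
  if: steps.packagejson.outputs.exists == 'true'
  run: npm test --if-present

- name: Build project
  if: steps.packagejson.outputs.exists == 'true'
  run: npm run build --if-present

- name: Run E2E tests (Cypress)
  if: steps.packagejson.outputs.exists == 'true'
  run: npm run test:e2e --if-present
```

The --if-present flag makes each step a no-op when the corresponding script is missing from package.json, so the workflow stays green for packages without tests or a build.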

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

Poem

🐰 I hopped through code with nimble paws,
Replaced the resize with Tailwind laws.
Ambassadors now wear proper links,
Tests chase bugs and CI winks—
A tiny rabbit cheers the cause.

🚥 Pre-merge checks | ✅ 4 | ❌ 2

❌ Failed checks (2 warnings)

  • Out of Scope Changes check ⚠️ Warning: Significant out-of-scope changes detected: ambassador component refactoring (moving a utility function), new E2E and unit tests, and workflow expansions are unrelated to the finance memory leak issue #5114. Resolution: consider separating the ambassador utility extraction and testing changes into a separate PR, keeping this PR focused solely on the finance.tsx memory leak and layout shift fix.
  • Docstring Coverage ⚠️ Warning: Docstring coverage is 64.29%, which is below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.
✅ Passed checks (4 passed)
  • Description Check ✅ Passed: Check skipped; CodeRabbit’s high-level summary is enabled.
  • Title check ✅ Passed: Title is partially related to the changeset; it correctly describes the finance memory leak fix but under-represents other significant changes like the ambassador refactoring and E2E testing additions.
  • Linked Issues check ✅ Passed: The PR resolves issue #5114 by removing the useState/useRef/useEffect resize handling in finance.tsx and replacing it with a CSS-based responsive layout, eliminating the resize listener leak and layout shift.
  • Description check ✅ Passed: The PR description accurately explains the core changes made to finance.tsx to address the memory leak and layout shift issues mentioned in #5114.


codecov Bot commented Mar 27, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 100.00%. Comparing base (8c71e84) to head (f5160a5).

Additional details and impacted files
@@            Coverage Diff            @@
##            master     #5285   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           22        22           
  Lines          830       830           
  Branches       159       159           
=========================================
  Hits           830       830           

☔ View full report in Codecov by Sentry.

@asyncapi-bot (Contributor) commented Mar 27, 2026

⚡️ Lighthouse report for the changes in this PR:

Category | Score
🔴 Performance | 45
🟢 Accessibility | 98
🟢 Best practices | 92
🟢 SEO | 100
🔴 PWA | 33

Lighthouse ran on https://deploy-preview-5285--asyncapi-website.netlify.app/


@princerajpoot20 (Member) commented

@codxbrexx please pay attention to the comments by the bot and AI.

@coderabbitai Bot (Contributor) left a comment

Actionable comments posted: 17

Note

Due to the large number of review comments, Critical and Major severity comments were prioritized as inline comments.

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (4)
.github/workflows/bounty-program-commands.yml (2)

130-135: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

removeLabel payload uses the wrong type for name.

Line 134 passes an array, but this endpoint expects a single label name string. This will cause label removal to fail at runtime.

Proposed fix
-              github.rest.issues.removeLabel({
+              await github.rest.issues.removeLabel({
                 issue_number: context.issue.number,
                 owner: context.repo.owner,
                 repo: context.repo.repo,
-                name: [BOUNTY_PROGRAM_LABELS[0].name]
+                name: BOUNTY_PROGRAM_LABELS[0].name
               })
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/bounty-program-commands.yml around lines 130 - 135, The
removeLabel call is passing an array for the name parameter which expects a
single string; update the github.rest.issues.removeLabel invocation (the
removeLabel call that uses BOUNTY_PROGRAM_LABELS) to pass a string not an array
by replacing name: [BOUNTY_PROGRAM_LABELS[0].name] with name:
BOUNTY_PROGRAM_LABELS[0].name and optionally guard against undefined (e.g.,
compute const label = BOUNTY_PROGRAM_LABELS[0]?.name and only call removeLabel
when label is a non-empty string).

51-56: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Add await to GitHub REST API calls to ensure requests complete before the step exits.

Without await, the workflow step can complete before these async calls finish, causing flaky command execution. The actions/github-script documentation explicitly requires you to await any async work in the script body.

Three locations need fixes:

  • Line 51: github.rest.issues.createComment()
  • Line 95: github.rest.issues.addLabels()
  • Line 130: github.rest.issues.removeLabel()
Proposed fix
-            github.rest.issues.createComment({
+            await github.rest.issues.createComment({
                 issue_number: context.issue.number,
                 owner: context.repo.owner,
                 repo: context.repo.repo,
                 body: commentText
               })
...
-            github.rest.issues.addLabels({
+            await github.rest.issues.addLabels({
               issue_number: context.issue.number,
               owner: context.repo.owner,
               repo: context.repo.repo,
               labels: [BOUNTY_PROGRAM_LABELS[0].name]
             })
...
-              github.rest.issues.removeLabel({
+              await github.rest.issues.removeLabel({
                 issue_number: context.issue.number,
                 owner: context.repo.owner,
                 repo: context.repo.repo,
                 name: BOUNTY_PROGRAM_LABELS[0].name
               })
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/bounty-program-commands.yml around lines 51 - 56, The
three GitHub REST API calls (github.rest.issues.createComment,
github.rest.issues.addLabels, and github.rest.issues.removeLabel) are invoked
without awaiting their returned promises; prepend each call with await so the
actions/github-script body waits for the requests to finish before the step
exits, i.e., change github.rest.issues.createComment(...),
github.rest.issues.addLabels(...), and github.rest.issues.removeLabel(...) to
await github.rest.issues.createComment(...), await
github.rest.issues.addLabels(...), and await
github.rest.issues.removeLabel(...).
components/tools/ToolsDashboard.tsx (1)

149-162: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

[toolsList] dependency causes the page to scroll back to the hash anchor on every filter change.

toolsList is recomputed each time any filter changes. With [toolsList] as the dependency, this effect fires on every filter interaction — so a user on /tools#CLIs who applies a language filter will have the page jerk-scroll back to #CLIs after each keystroke or toggle.

Since toolsList is a useMemo (evaluated synchronously during render), it is already populated when this effect first runs. The original [] dependency is the correct choice — the effect only needs to fire once, at mount, to honour the initial URL hash.

Based on learnings, if you intentionally keep [] (omitting toolsList), add an eslint-disable comment with a rationale note.

🐛 Proposed fix
- }, [toolsList]);
+ // Intentionally omitting toolsList: it is synchronously derived via useMemo and is
+ // already populated on first render; this effect only needs to run once at mount.
+ // eslint-disable-next-line react-hooks/exhaustive-deps
+ }, []);
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@components/tools/ToolsDashboard.tsx` around lines 149 - 162, The useEffect in
ToolsDashboard currently depends on toolsList which is recomputed on every
filter change causing repeated scroll-to-hash; change the effect dependency
array to [] so it only runs once on mount and restore the original mount-only
behavior, and if you intentionally omit toolsList add an inline
eslint-disable-next-line react-hooks/exhaustive-deps comment above the useEffect
with a brief rationale referencing that toolsList is a synchronous useMemo and
the effect only needs to run on initial mount to honor the URL hash.
.github/workflows/scripts/mailchimp/htmlContent.js (1)

1-22: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win

This centrally managed file will be overwritten—remove or fix it upstream.

The header on this file explicitly states it is centrally managed in asyncapi/.github and any local changes will be overwritten. Adding escapeHtml here means the fix won't persist after the next sync. Since this PR already removed .github/workflows/scripts/mailchimp/index.js, the mailchimp/htmlContent.js file is now dead code with no callers. Either update the central repo first, or delete the orphaned mailchimp template entirely.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/scripts/mailchimp/htmlContent.js around lines 1 - 22, The
file contains a centrally-managed escapeHtml function and a mailchimp template
(escapeHtml and module.exports) that are now orphaned because
.github/workflows/scripts/mailchimp/index.js was removed; either remove this
dead mailchimp/htmlContent.js file from the repo or apply the permanent fix
upstream in the central asyncapi/.github repo so the change persists—locate the
escapeHtml function and the module.exports export in this file and delete the
file locally (or mirror the same deletion/update in the central repository) so
there are no unused, overwritten assets left behind.
🟡 Minor comments (25)
markdown/blog/community-spotlight-hugo.md-20-20 (1)

20-20: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Hyphenate adjectival “open-source” for consistency.

At Line 20 and Line 100, use open-source when used as an adjective (e.g., open-source contributor, open-source values).

Also applies to: 100-100

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/blog/community-spotlight-hugo.md` at line 20, Replace the
unhyphenated adjectival phrases "open source contributor" and "open source
values" with the hyphenated form "open-source contributor" and "open-source
values" respectively; search for the literal strings "open source contributor"
and "open source values" in the markdown (e.g., the sentence containing "Hugo
Guerrero, a seasoned software developer, open source contributor, technical
marketer...") and update them to use the hyphenated adjective form for
consistency.
markdown/blog/publish-subscribe-semantics.md-57-57 (1)

57-57: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Use hyphenation for compound adjective in user-facing docs.

At Line 57, change event driven architecture to event-driven architecture for standard technical writing consistency.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/blog/publish-subscribe-semantics.md` at line 57, Change the phrase
"event driven architecture" to the hyphenated form "event-driven architecture"
in the document (replace the exact token "event driven architecture" at the
occurrence shown); update the user-facing sentence so it reads "In an
event-driven architecture" to follow standard compound-adjective hyphenation
rules.
markdown/blog/community-spotlight-hugo.md-84-84 (1)

84-84: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Markdown emphasis is inconsistent with surrounding interview quotes.

Line 84 uses **Hugo: ...** while adjacent speaker quotes use ***Hugo:** ...*. Align this line to the same pattern to avoid visual inconsistency in rendered output.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/blog/community-spotlight-hugo.md` at line 84, Replace the
inconsistent emphasis on the line that currently reads "**Hugo: We were singing
and dancing on the stage, so you can imagine.**" so it matches the surrounding
speaker-quote pattern (use the same asterisk pattern used by adjacent lines,
e.g. change to "***Hugo:** We were singing and dancing on the stage, so you can
imagine.*") so the rendered emphasis is consistent with other interview quotes.
markdown/docs/reference/bindings/websockets.md-37-38 (1)

37-38: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Correct Reference Object link syntax to use the declared reference.

Lines 37-38 should use reference-style links ([referenceObject]), otherwise the links are invalid and Line 58 remains unused.

Proposed fix
-<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
-<a name="operationBindingObjectHeaders"></a>`headers` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions of the HTTP headers to use when establishing the connection. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectHeaders"></a>`headers` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions of the HTTP headers to use when establishing the connection. This schema MUST be of type `object` and have a `properties` key.

Also applies to: 58-58

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/websockets.md` around lines 37 - 38, Update
the two inline Reference Object links so they use the declared reference-style
label instead of inline parentheses: in the `query` and `headers` rows replace
the `(referenceObject)` inline link with the reference-style
`[referenceObject]`, and make the same change for the other occurrence around
the `operationBindingObject` section (the unused Line 58 reference). This
ensures the declared `[referenceObject]` link label is used consistently and the
reference at the bottom is actually resolved.
markdown/docs/reference/bindings/websockets.md-32-32 (1)

32-32: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix heading-level jump in section structure.

Line 32 jumps from ## to #####. Use ### (or #### if there is an intermediate heading) to keep hierarchy consistent.

Proposed fix
-##### Fixed Fields
+### Fixed Fields
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/websockets.md` at line 32, The heading
"##### Fixed Fields" breaks the document hierarchy by jumping from a
higher-level "##" heading; update the heading to "### Fixed Fields" (or "####
Fixed Fields" if there's an intermediate level) so it follows the parent "##"
section; locate the "##### Fixed Fields" token in the markdown and replace it
with the appropriate level (prefer "### Fixed Fields" to restore consistent
structure).
markdown/docs/reference/bindings/solace.md-74-74 (1)

74-74: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Normalize heading style to ATX (remove trailing ##).

These headings use atx_closed, which conflicts with the configured MD003 style.

Proposed fix
-## Example with two destinations ##
+## Example with two destinations
...
-## Example with a wildcard subscription ##
+## Example with a wildcard subscription

Also applies to: 146-146

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/solace.md` at line 74, The level-2 heading
"## Example with two destinations ##" uses an atx_closed style; update it to
standard ATX by removing the trailing "##" so it reads "## Example with two
destinations" and do the same for the other occurrence noted (line 146) to
comply with the MD003 heading style.
markdown/docs/reference/bindings/solace.md-56-61 (1)

56-61: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Use descriptive link text instead of repeated “here.”

These links are user-facing docs and currently fail MD059; replacing generic text improves accessibility and scanability.

Proposed fix
-| `deliveryMode`             | Enum           | 'direct' or 'persistent'. This determines the quality of service for publishing messages as documented [here.](https://docs.solace.com/Get-Started/Core-Concepts-Message-Delivery-Modes.htm) Default is 'persistent'.                |
+| `deliveryMode`             | Enum           | 'direct' or 'persistent'. This determines the quality of service for publishing messages as documented in the [Solace message delivery modes guide](https://docs.solace.com/Get-Started/Core-Concepts-Message-Delivery-Modes.htm). Default is 'persistent'. |
-| `queue.accessType`         | Enum           | 'exclusive' or 'nonexclusive'. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Endpoints.htm#Queues) Only applicable when destinationType is 'queue'.                                                    |
+| `queue.accessType`         | Enum           | 'exclusive' or 'nonexclusive'. See [Solace queue endpoint access types](https://docs.solace.com/Messaging/Guaranteed-Msg/Endpoints.htm#Queues). Only applicable when destinationType is 'queue'. |
-| `queue.maxMsgSpoolSize`    | String         | The maximum amount of message spool that the given queue may use. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Message-Spooling.htm#max-spool-usage) Only applicable when destinationType is 'queue'. |
+| `queue.maxMsgSpoolSize`    | String         | The maximum amount of message spool that the given queue may use. See [Solace max spool usage](https://docs.solace.com/Messaging/Guaranteed-Msg/Message-Spooling.htm#max-spool-usage). Only applicable when destinationType is 'queue'. |
-| `queue.maxTtl`             | String         | The maximum TTL to apply to messages to be spooled. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Configuring-Queues.htm) Only applicable when destinationType is 'queue'.                             |
+| `queue.maxTtl`             | String         | The maximum TTL to apply to messages to be spooled. See [Solace queue TTL configuration](https://docs.solace.com/Messaging/Guaranteed-Msg/Configuring-Queues.htm). Only applicable when destinationType is 'queue'. |
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/solace.md` around lines 56 - 61, Replace the
generic link text "here." with descriptive link text for each referenced
resource to satisfy MD059: update the `deliveryMode` link to something like
"Solace message delivery modes", the `queue.accessType` link to "Solace queue
access types", the `queue.maxMsgSpoolSize` link to "Solace message spooling max
spool usage", and the `queue.maxTtl` link to "Solace queue TTL configuration"
(or similarly descriptive phrases) so each table cell uses meaningful,
accessible link text instead of repeated "here." while keeping the existing URLs
unchanged.
markdown/docs/reference/bindings/ibmmq.md-235-235 (1)

235-235: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Correct the spelling from "dependant" to "dependent".

The word "dependant" (noun meaning a person who relies on another) should be "dependent" (adjective meaning contingent or varying based on). The context describes values that vary by API, requiring the adjective form.

📝 Proposed fix
-<a name="messageBindingObjectExpiry"></a>`expiry` |  integer | The recommended setting the client should use for the TTL (Time-To-Live) of the message. This is a period of time expressed in milliseconds and set by the application that puts the message. `expiry` values are API dependant e.g., MQI and JMS use different units of time and default values for *`unlimited`*. General information on IBM MQ message expiry can be found on this [page](https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_latest/com.ibm.mq.ref.dev.doc/q097490_.html) in the IBM MQ Knowledge Center. | OPTIONAL [*`unlimited`*] | `expiry` value MUST be either `zero` (*`unlimited`*) or greater than zero.
+<a name="messageBindingObjectExpiry"></a>`expiry` |  integer | The recommended setting the client should use for the TTL (Time-To-Live) of the message. This is a period of time expressed in milliseconds and set by the application that puts the message. `expiry` values are API dependent e.g., MQI and JMS use different units of time and default values for *`unlimited`*. General information on IBM MQ message expiry can be found on this [page](https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_latest/com.ibm.mq.ref.dev.doc/q097490_.html) in the IBM MQ Knowledge Center. | OPTIONAL [*`unlimited`*] | `expiry` value MUST be either `zero` (*`unlimited`*) or greater than zero.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/ibmmq.md` at line 235, Update the
description text for the `expiry` messageBindingObject to use the correct
adjective form "dependent" instead of "dependant"; locate the string "expiry
values are API dependant e.g., MQI and JMS..." and change it to "expiry values
are API dependent e.g., MQI and JMS..." so the `expiry` field documentation
reads correctly.
markdown/docs/reference/bindings/googlepubsub.md-102-102 (1)

102-102: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix typo: "pertintent" → "pertinent".

📝 Proposed fix
-[PubsubMessage](https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage) details, alongside with pertintent
+[PubsubMessage](https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage) details, alongside with pertinent
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/googlepubsub.md` at line 102, Fix the typo
"pertintent" to "pertinent" in the sentence that references [PubsubMessage] in
markdown/docs/reference/bindings/googlepubsub.md so the phrase reads
"PubsubMessage details, alongside with pertinent". Locate the line containing
the [PubsubMessage] link and replace the misspelled word while preserving
surrounding punctuation and formatting.
markdown/docs/reference/bindings/http.md-38-38 (1)

38-38: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

The link [Reference Object](referenceObject) is missing proper syntax. Use reference-style link format.

🔧 Proposed fix
-<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.

Apply the same fix to line 78.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/http.md` at line 38, The markdown link for
"Reference Object" is using inline parenthesis syntax incorrectly; replace
`(referenceObject)` with the reference-style link `[referenceObject]` by
changing the text to `[Reference Object][referenceObject]` in the `query` line
(and apply the same change to the analogous occurrence around line 78) so the
reference-style link resolves correctly.
markdown/docs/reference/bindings/googlepubsub.md-119-119 (1)

119-119: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken markdown link syntax.

Line 119 has malformed link syntax with duplicate "Schema" text: [Schema]([Schema](https://...)). This will not render correctly.

🔧 Proposed fix
-[Schema]([Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema)) Object with AsyncAPI.
+[Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema) Object with AsyncAPI.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/googlepubsub.md` at line 119, The markdown
link on the line containing "[Schema]([Schema](https://...))" is malformed;
replace the duplicate text with a single proper link by changing the fragment to
a standard markdown link like "Schema" linked to the URL (i.e., convert
"[Schema]([Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema))"
into a single
"[Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema)")
so the AsyncAPI reference renders correctly.
markdown/docs/reference/bindings/mqtt5.md-29-29 (1)

29-29: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

The link syntax [Reference Object](referenceObject) is missing the proper reference format. It should either use a hash #referenceObject or reference-style link [referenceObject].

🔧 Proposed fix
-<a name="serverBindingObjectSessionExpiryInterval"></a>`sessionExpiryInterval` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) \| integer | Session Expiry Interval in seconds or a Schema Object containing the definition of the interval.
+<a name="serverBindingObjectSessionExpiryInterval"></a>`sessionExpiryInterval` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] \| integer | Session Expiry Interval in seconds or a Schema Object containing the definition of the interval.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/mqtt5.md` at line 29, The markdown link for
the Reference Object is malformed in the sessionExpiryInterval line; update the
link target for the "Reference Object" text (near the `sessionExpiryInterval`
entry) to use the proper reference format — e.g., change `(referenceObject)` to
`(#referenceObject)` or use the reference-style `[Reference
Object][referenceObject]` so the "Reference Object" link resolves correctly to
the referenceObject anchor.
markdown/docs/reference/bindings/kafka.md-25-29 (1)

25-29: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix table column count mismatch.

Line 29 has only 4 cells but the header defines 5 columns (Field Name, Type, Description, Applicability, Constraints). The missing "Constraints" column will cause incorrect table rendering.

🔧 Proposed fix
 Field Name | Type | Description | Applicability [default] | Constraints
 ---|:---:|:---:|:---:|---
 `schemaRegistryUrl` | string (url) | API URL for the Schema Registry used when producing Kafka messages (if a Schema Registry was used) | OPTIONAL | -
 `schemaRegistryVendor` | string | The vendor of Schema Registry and Kafka serdes library that should be used (e.g. `apicurio`, `confluent`, `ibm`, or `karapace`) | OPTIONAL | MUST NOT be specified if `schemaRegistryUrl` is not specified
-<a name="serverBindingObjectBindingVersion"></a>`bindingVersion` | string | The version of this binding. | OPTIONAL [`latest`]
+<a name="serverBindingObjectBindingVersion"></a>`bindingVersion` | string | The version of this binding. | OPTIONAL [`latest`] | -
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/kafka.md` around lines 25 - 29, The table
has five header columns but the `bindingVersion` row only has four cells,
breaking rendering; update the `bindingVersion` row (the line with `<a
name="serverBindingObjectBindingVersion"></a>\`bindingVersion\``) to include the
missing fifth "Constraints" cell (e.g., add `| -` or the appropriate constraint
text) so all rows match the header columns (refer to `schemaRegistryUrl` and
`schemaRegistryVendor` for examples of the five-column layout).
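Column-count mismatches like the one above are also mechanically checkable. A rough sketch that counts cells per row of a pipe-delimited markdown table written without outer pipes (it assumes no escaped `\|` inside cells, which holds for this Kafka table but not for every table in these docs):

```python
def cell_counts(table_lines: list[str]) -> list[int]:
    """Cells per row of a markdown table written without outer pipes."""
    # One more cell than there are "|" separators on each line.
    return [line.count("|") + 1 for line in table_lines]

rows = [
    "Field Name | Type | Description | Applicability [default] | Constraints",
    "`bindingVersion` | string | The version of this binding. | OPTIONAL [`latest`]",
]
print(cell_counts(rows))  # [5, 4] -> the second row is one cell short of the header
```

Any row whose count differs from the header's flags exactly the kind of rendering break described in this finding.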
markdown/docs/reference/bindings/anypointmq.md-36-37 (1)

36-37: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix duplicate anchor IDs for host and pathname fields.

Both lines define the same anchor ID serverObjectUrlFieldValueAnypointMQ, which will cause broken fragment navigation in the rendered documentation. Each field must have a unique anchor identifier.

🔗 Proposed fix to use unique anchor IDs
-<a name="serverObjectUrlFieldValueAnypointMQ"></a>`host`                         | e.g., `https://mq-us-east-1.anypoint.mulesoft.com`       | **REQUIRED**. MUST be the endpoint HOST part of the URL of the Anypoint MQ Broker REST API. Valid examples are `https://mq-us-east-1.anypoint.mulesoft.com` and `https://mq-eu-central-1.eu1.anypoint.mulesoft.com` (and _not_ `https://mq-us-east-1.anypoint.mulesoft.com/api`, or `https://.../api/v1`).
-<a name="serverObjectUrlFieldValueAnypointMQ"></a>`pathname`                         | e.g., `/api`       | **REQUIRED**. MUST be the endpoint path part of the URL of the Anypoint MQ Broker REST API _excluding_ the final major version indicator (e.g., `v1`). Valid examples are `/api` and (and _not_ `https://.../api/v1`).
+<a name="serverObjectHostFieldValueAnypointMQ"></a>`host`                         | e.g., `https://mq-us-east-1.anypoint.mulesoft.com`       | **REQUIRED**. MUST be the endpoint HOST part of the URL of the Anypoint MQ Broker REST API. Valid examples are `https://mq-us-east-1.anypoint.mulesoft.com` and `https://mq-eu-central-1.eu1.anypoint.mulesoft.com` (and _not_ `https://mq-us-east-1.anypoint.mulesoft.com/api`, or `https://.../api/v1`).
+<a name="serverObjectPathnameFieldValueAnypointMQ"></a>`pathname`                         | e.g., `/api`       | **REQUIRED**. MUST be the endpoint path part of the URL of the Anypoint MQ Broker REST API _excluding_ the final major version indicator (e.g., `v1`). Valid examples are `/api` and (and _not_ `https://.../api/v1`).
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/anypointmq.md` around lines 36 - 37, The two
markdown anchor tags for the `host` and `pathname` field rows both use the same
ID `serverObjectUrlFieldValueAnypointMQ`, causing duplicate fragment IDs; update
one of the anchors to a unique ID (e.g.,
`serverObjectUrlFieldValueAnypointMQ-host` and
`serverObjectUrlFieldValueAnypointMQ-pathname`) so each field (`host`,
`pathname`) has its own distinct anchor, and ensure any internal links or
references (if present) are updated to the new IDs.
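Duplicate anchor IDs are another class of error that can be caught with a quick scan. A minimal sketch that reports any `<a name="...">` defined more than once in a file (illustrative; it assumes anchors use exactly the `<a name="..."></a>` form these bindings docs use):

```python
import re
from collections import Counter

ANCHOR = re.compile(r'<a name="([^"]+)"></a>')

def duplicate_anchors(markdown: str) -> list[str]:
    """Anchor IDs defined more than once in the given document."""
    counts = Counter(ANCHOR.findall(markdown))
    return sorted(name for name, n in counts.items() if n > 1)

doc = (
    '<a name="serverObjectUrlFieldValueAnypointMQ"></a>`host` | ...\n'
    '<a name="serverObjectUrlFieldValueAnypointMQ"></a>`pathname` | ...\n'
)
print(duplicate_anchors(doc))  # ['serverObjectUrlFieldValueAnypointMQ']
```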
markdown/docs/reference/bindings/anypointmq.md-63-63 (1)

63-63: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix grammar and typo in example introduction.

The sentence has a grammatical error (missing subject in "where does not apply") and a typo ("anyointmq" should be "anypointmq").

📝 Proposed fix
-The following example shows two channels where does not apply any bindings and the second one applying `anyointmq` bindings.
+The following example shows two channels where the first does not apply any bindings and the second one applies `anypointmq` bindings.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/anypointmq.md` at line 63, Fix the sentence
in the example introduction: replace the incorrect and ungrammatical phrase "two
channels where does not apply any bindings and the second one applying
`anyointmq` bindings" with a correct version such as "The following example
shows two channels: the first does not apply any bindings, and the second
applies `anypointmq` bindings", and correct the typo `anyointmq` ->
`anypointmq`.
markdown/docs/reference/bindings/ros2.md-250-250 (1)

250-250: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix typo.

"strucute" should be "structure".

📝 Proposed fix
-- There is one file (head-asyncapi.yaml) that references the different standard/custom packages. This packages contains the strucute of its messages.
+- There is one file (head-asyncapi.yaml) that references the different standard/custom packages. This packages contains the structure of its messages.

As per static analysis, this is a spelling error.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/ros2.md` at line 250, Correct the spelling
typo in the documentation sentence that mentions head-asyncapi.yaml: change
"strucute" to "structure" so the line reads that the packages contain the
structure of their messages; update the sentence referencing head-asyncapi.yaml
accordingly.
markdown/docs/reference/bindings/2.x.x.md-165-165 (1)

165-165: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.

The images at lines 165 and 236 are missing alt text, which is important for accessibility (screen readers) and when images fail to load.

♿ Proposed fix
-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

For line 236:

-[<img src="/img/docs/SNS-HTTP.png" height = "400" width="600"/>](SNS-HTTP.png)
+[<img src="/img/docs/SNS-HTTP.png" alt="SNS to HTTP publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-HTTP.png)

Also applies to: 236-236

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/2.x.x.md` at line 165, The two inline images
are missing alt text; update the markup for the image at src
"/img/docs/SNS-SQS-Pub-Sub.png" (line 165) and the other image at line 236 to
include descriptive alt attributes (e.g., "SNS–SQS pub/sub architecture diagram"
or similar) so screen readers and failed-load scenarios get meaningful text;
ensure both the <img> tag(s) inside the link wrappers and any standalone
markdown image syntax include the alt text.
markdown/docs/reference/bindings/sqs.md-126-126 (1)

126-126: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.

Images at lines 126 and 178 are missing alt text, which is important for accessibility (screen readers) and when images fail to load.

♿ Proposed fix
-[<img src="/img/docs/SQS-Point-To-Point.png" height = "400" width="600"/>](SQS-Point-To-Point.png)
+[<img src="/img/docs/SQS-Point-To-Point.png" alt="SQS point-to-point pattern diagram" height = "400" width="600"/>](SQS-Point-To-Point.png)

For line 178:

-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

Also applies to: 178-178

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/sqs.md` at line 126, The two image embeds
(the img tag referencing SQS-Point-To-Point.png at line 126 and the second image
at line 178) are missing alt text; update each img tag to include a descriptive
alt attribute (e.g., alt="SQS point-to-point diagram" for SQS-Point-To-Point.png
and an appropriate description for the second image) so screen readers and
fallback rendering have meaningful text.
markdown/docs/reference/bindings/3.0.0.md-152-152 (1)

152-152: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix spelling error: "Identifer" → "Identifier".

Line 152 contains the same typo present in the 2.x.x bindings documentation: "Identifer Object" should be "Identifier Object".

🔤 Proposed fix
-We provide an Identifer Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field). 
+We provide an Identifier Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/3.0.0.md` at line 152, Typo in the docs:
change the phrase "Identifer Object" to "Identifier Object" in the markdown for
the bindings documentation (the sentence that starts "We provide an Identifer
Object...") so the word "Identifier" is spelled correctly wherever that exact
phrase appears in the 3.0.0 bindings reference (mirroring the same fix needed
from 2.x.x documentation).
markdown/docs/reference/bindings/2.x.x.md-152-152 (1)

152-152: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix spelling error: "Identifer" → "Identifier".

Line 152 contains a typo: "Identifer Object" should be "Identifier Object".

🔤 Proposed fix
-We provide an Identifer Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field). 
+We provide an Identifier Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/2.x.x.md` at line 152, Replace the
misspelled phrase "Identifer Object" with the correct "Identifier Object" in the
markdown section that currently reads "We provide an Identifer Object to support
providing the identifier..." so the sentence reads "We provide an Identifier
Object to support providing the identifier..." (search for the exact phrase
"Identifer Object" to locate the spot).
markdown/docs/reference/bindings/3.0.0.md-165-165 (1)

165-165: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.

The images at lines 165 and 251 are missing alt text, which is important for accessibility (screen readers) and when images fail to load. This is the same issue present in the 2.x.x bindings documentation.

♿ Proposed fix
-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

For line 251:

-[<img src="/img/docs/SNS-HTTP.png" height = "400" width="600"/>](SNS-HTTP.png)
+[<img src="/img/docs/SNS-HTTP.png" alt="SNS to HTTP publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-HTTP.png)

Also applies to: 251-251

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/3.0.0.md` at line 165, Add descriptive alt
text to the inline HTML <img> elements used in the markdown image links so
screen readers and broken-image scenarios are supported: update the <img
src="/img/docs/SNS-SQS-Pub-Sub.png" height="400" width="600"/> instance to
include an alt attribute (for example alt="SNS–SQS pub/sub architecture
diagram") and do the same for the other image occurrence referenced in the file
(the second <img .../> instance around line 251) so both image tags include
meaningful alt text.
markdown/docs/reference/bindings/sqs.md-54-54 (1)

54-54: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

Line 54 contains an empty link [SNS Operation Binding Object]() with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix

Since this document references SNS Operation Binding, the link should point to the appropriate SNS bindings page:

-| <a name="queueObjectName"></a>`name` | string | **Required.** The name of the queue. When an [SNS Operation Binding Object]() references an SQS queue by name, the identifier should be the one in this field.|
+| <a name="queueObjectName"></a>`name` | string | **Required.** The name of the queue. When an SNS Operation Binding Object references an SQS queue by name, the identifier should be the one in this field.|

Alternatively, add the proper link to the SNS bindings documentation (e.g., ../sns or the full path).

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/sqs.md` at line 54, Replace the broken empty
link `[SNS Operation Binding Object]()` on the `name` field line with a proper
link to the SNS bindings documentation; update the markdown so the anchor text
`[SNS Operation Binding Object]` points to the SNS bindings page (for example
use a relative path like `../sns` or the correct docs path to the SNS bindings
file) so the reference resolves correctly.
markdown/docs/reference/bindings/sqs.md-183-183 (1)

183-183: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

Line 183 contains an empty link [SNS Binding]() with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix
-The producer files looks like this (see the [SNS Binding]() for more).
+The producer files looks like this (see the SNS Binding for more).

Or add the proper link to the SNS bindings documentation (e.g., [SNS Binding](../sns) or the appropriate path).

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/sqs.md` at line 183, Replace the empty
markdown link "[SNS Binding]()" in the SQS bindings doc with the correct
relative path to the SNS bindings page (e.g., change "[SNS Binding]()" to "[SNS
Binding](../sns)" or the appropriate docs path) so the reference resolves;
locate the literal "[SNS Binding]()" in the markdown and update it accordingly.
markdown/docs/reference/bindings/sqs.md-113-113 (1)

113-113: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

Line 113 contains an empty link [SNS Operation Object]() with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix
-On an Operation Binding Object we support an array of Queue objects. Members of this array may be Queue Objects that define the *endpoint* field required by an [SNS Operation Object]() delivering by the SQS protocol or Queue Objects that define the Dead Letter Queue used by either the Redrive Policy of the SNS Subscription (see the SNS Binding Object) or the [Redrive Policy of the SQS Queue](`#redrive-policy`). The name of the Queue Object is used by an Identifier field on either the *endpoint* field of the SNS Operation Object of *deadLetterQueue* on the Redrive Policy to identify the required member of this array.
+On an Operation Binding Object we support an array of Queue objects. Members of this array may be Queue Objects that define the *endpoint* field required by an SNS Operation Object delivering by the SQS protocol or Queue Objects that define the Dead Letter Queue used by either the Redrive Policy of the SNS Subscription (see the SNS Binding Object) or the [Redrive Policy of the SQS Queue](`#redrive-policy`). The name of the Queue Object is used by an Identifier field on either the *endpoint* field of the SNS Operation Object of *deadLetterQueue* on the Redrive Policy to identify the required member of this array.

Or add the proper link to the SNS bindings documentation.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/sqs.md` at line 113, The link text "[SNS
Operation Object]()" is broken; update the markdown in the Operation Binding
Object description to point to the SNS bindings documentation by replacing the
empty link with the correct relative or absolute URL to the SNS Operation Object
docs (or the anchor for the SNS Binding Object), ensuring the sentence
referencing Queue objects, the *endpoint* field of the SNS Operation Object, and
the *deadLetterQueue* on the Redrive Policy correctly links to the SNS bindings
page so readers can navigate to the SNS Operation Object details.
markdown/docs/reference/bindings/jms.md-37-37 (1)

37-37: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix typo in protocolVersion valid values list.

The valid values list ends with 3.0`, `3.1. where 3.1. has a trailing period. This should be 3.1 (without the trailing period) to match the format of other version numbers in the list.

📝 Proposed fix
-<a name="serverObjectProtocolVersionFieldValueJMS"></a>`protocolVersion` | e.g., `3.1`                                                   | **OPTIONAL**, defaults to `3.1`. If present MUST be the version indicator of the JMS API. Valid values are `1.0`, `1.0.1`, `1.0.1a`, `1.0.2`, `1.0.2a`, `1.0.2b`, `1.1`, `2.0`, `2.0a`, `2.1`, or `3.0`, `3.1.`.
+<a name="serverObjectProtocolVersionFieldValueJMS"></a>`protocolVersion` | e.g., `3.1`                                                   | **OPTIONAL**, defaults to `3.1`. If present MUST be the version indicator of the JMS API. Valid values are `1.0`, `1.0.1`, `1.0.1a`, `1.0.2`, `1.0.2a`, `1.0.2b`, `1.1`, `2.0`, `2.0a`, `2.1`, `3.0`, or `3.1`.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/jms.md` at line 37, Update the valid-values
list for protocolVersion in the serverObjectProtocolVersionFieldValueJMS entry:
remove the stray trailing period from the last item so the list ends with `3.1`
(not `3.1.`) to match the other version formats; edit the line containing
protocolVersion / serverObjectProtocolVersionFieldValueJMS to replace `3.1.`
with `3.1`.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: c7b31e87-1694-4b60-aa20-92e46201c896

📥 Commits

Reviewing files that changed from the base of the PR and between 3648d65 and 9506e78.

⛔ Files ignored due to path filters (6)
  • .github/workflows/scripts/kit/package-lock.json is excluded by !**/package-lock.json
  • .github/workflows/scripts/mailchimp/package-lock.json is excluded by !**/package-lock.json
  • package-lock.json is excluded by !**/package-lock.json
  • public/img/docs/SNS-HTTP.png is excluded by !**/*.png
  • public/img/docs/SNS-SQS-Pub-Sub.png is excluded by !**/*.png
  • public/img/docs/SQS-Point-To-Point.png is excluded by !**/*.png
📒 Files selected for processing (107)
  • .github/workflows/add-good-first-issue-labels.yml
  • .github/workflows/automerge-for-humans-add-ready-to-merge-or-do-not-merge-label.yml
  • .github/workflows/automerge-for-humans-merging.yml
  • .github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml
  • .github/workflows/automerge-orphans.yml
  • .github/workflows/automerge.yml
  • .github/workflows/autoupdate.yml
  • .github/workflows/bounty-program-commands.yml
  • .github/workflows/help-command.yml
  • .github/workflows/issues-prs-notifications.yml
  • .github/workflows/lint-pr-title.yml
  • .github/workflows/notify-tsc-members-mention.yml
  • .github/workflows/please-take-a-look-command.yml
  • .github/workflows/release-announcements.yml
  • .github/workflows/scripts/kit/htmlContent.js
  • .github/workflows/scripts/kit/index.js
  • .github/workflows/scripts/kit/package.json
  • .github/workflows/scripts/mailchimp/htmlContent.js
  • .github/workflows/scripts/mailchimp/index.js
  • .github/workflows/scripts/mailchimp/package.json
  • .github/workflows/stale-issues-prs.yml
  • .github/workflows/transfer-issue.yml
  • .github/workflows/update-maintainers-trigger.yaml
  • .github/workflows/update-pr.yml
  • .github/workflows/welcome-first-time-contrib.yml
  • CODEOWNERS
  • README.md
  • components/docs/Visualizer.tsx
  • components/layout/DocsLayout.tsx
  • components/tools/ToolsDashboard.tsx
  • config/3.1.0.json
  • config/MAINTAINERS.json
  • config/TSC_BOARD_MEMBERS.json
  • config/all-tags.json
  • config/edit-page-config.json
  • config/mailchimp-config.json
  • config/meetings.json
  • config/newsroom_videos.json
  • config/tools-automated.json
  • config/tools.json
  • cypress/ambassadors.cy.js
  • cypress/dashboard.cy.js
  • cypress/events.cy.js
  • cypress/pages/BasePageTools.js
  • cypress/pages/ambassadors.js
  • cypress/pages/dashboard.js
  • cypress/pages/events.js
  • cypress/pages/homepage.js
  • cypress/pages/slack.js
  • cypress/pages/tscpage.js
  • cypress/slackworkspace.cy.js
  • cypress/tscpage.cy.js
  • markdown/blog/2026-february-summary.md
  • markdown/blog/2026-march-summary.md
  • markdown/blog/asyncapi-and-websocket.md
  • markdown/blog/board-nomination.md
  • markdown/blog/community-spotlight-hugo.md
  • markdown/blog/gravitee-sponsorship-2025.md
  • markdown/blog/publish-subscribe-semantics.md
  • markdown/blog/release-notes-3.1.0.md
  • markdown/docs/community/020-governance-and-policies/TSC_VOTING_OVERVIEW.md
  • markdown/docs/reference/bindings/2.x.x.md
  • markdown/docs/reference/bindings/3.0.0.md
  • markdown/docs/reference/bindings/_section.md
  • markdown/docs/reference/bindings/amqp.md
  • markdown/docs/reference/bindings/amqp1.md
  • markdown/docs/reference/bindings/anypointmq.md
  • markdown/docs/reference/bindings/googlepubsub.md
  • markdown/docs/reference/bindings/http.md
  • markdown/docs/reference/bindings/ibmmq.md
  • markdown/docs/reference/bindings/jms.md
  • markdown/docs/reference/bindings/kafka.md
  • markdown/docs/reference/bindings/mercure.md
  • markdown/docs/reference/bindings/mqtt.md
  • markdown/docs/reference/bindings/mqtt5.md
  • markdown/docs/reference/bindings/nats.md
  • markdown/docs/reference/bindings/pulsar.md
  • markdown/docs/reference/bindings/redis.md
  • markdown/docs/reference/bindings/ros2.md
  • markdown/docs/reference/bindings/scripts.md
  • markdown/docs/reference/bindings/solace.md
  • markdown/docs/reference/bindings/sqs.md
  • markdown/docs/reference/bindings/stomp.md
  • markdown/docs/reference/bindings/websockets.md
  • markdown/docs/reference/specification/v3.0.0-explorer.md
  • markdown/docs/reference/specification/v3.1.0-explorer.md
  • markdown/docs/tools/generator/api_components.md
  • netlify/functions/newsletter_subscription.ts
  • package.json
  • pages/community/ambassadors/index.tsx
  • pages/roadmap.tsx
  • public/img/diagrams/asyncapi-openapi.webp
  • public/img/diagrams/ecosystem.webp
  • public/img/diagrams/spec-first.webp
  • public/img/posts/2026-blog-banner/march-banner.webp
  • public/img/posts/2026-blog-banner/nominations.webp
  • public/img/posts/asyncapi-websocket.webp
  • public/img/posts/marketing-images/gravitee-community.webp
  • public/img/posts/marketing-images/hugo-spotlight.webp
  • public/img/posts/simple-chat-api.webp
  • tests/build-docs/addDocButtons.test.ts
  • tests/build-post-list.test.ts
  • tests/dashboard/build-dashboard.test.ts
  • tests/markdown/check-markdown.test.ts
  • tests/pages/finance.test.ts
  • tests/tools/combine-tools.test.ts
  • tests/tools/tools-object.test.ts
💤 Files with no reviewable changes (6)
  • config/mailchimp-config.json
  • markdown/blog/2026-february-summary.md
  • markdown/blog/release-notes-3.1.0.md
  • .github/workflows/scripts/mailchimp/package.json
  • .github/workflows/scripts/mailchimp/index.js
  • .github/workflows/transfer-issue.yml
✅ Files skipped from review due to trivial changes (18)
  • .github/workflows/scripts/kit/package.json
  • cypress/dashboard.cy.js
  • markdown/docs/reference/bindings/_section.md
  • markdown/docs/reference/specification/v3.1.0-explorer.md
  • markdown/blog/board-nomination.md
  • README.md
  • markdown/docs/reference/bindings/mercure.md
  • markdown/docs/reference/bindings/amqp1.md
  • config/newsroom_videos.json
  • tests/markdown/check-markdown.test.ts
  • markdown/docs/reference/bindings/stomp.md
  • pages/roadmap.tsx
  • config/edit-page-config.json
  • tests/tools/combine-tools.test.ts
  • config/meetings.json
  • markdown/blog/gravitee-sponsorship-2025.md
  • markdown/blog/asyncapi-and-websocket.md
  • markdown/docs/community/020-governance-and-policies/TSC_VOTING_OVERVIEW.md

Comment on lines +9 to +14
pull_request:
types:
- synchronize
- edited

permissions: {}

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

🧩 Analysis chain

🏁 Script executed:

cat -n .github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml

Repository: asyncapi/website

Length of output: 1642


🌐 Web query:

GitHub Actions pull_request fork PR permissions job-level write access

💡 Result:

1. Job-level `GITHUB_TOKEN` write access is controlled by the workflow/job `permissions` key. You can set permissions at the workflow level or within each job; when set at the job level, actions and run commands in that job that use `GITHUB_TOKEN` gain exactly the access you specify [1].

2. For `on: pull_request` from fork PRs, GitHub deliberately downgrades the token to read-only unless additional settings are enabled. GitHub's docs explain that for pull request events from forked repositories (except `pull_request_target`), write permissions are automatically adjusted to read-only when the repository/organization setting "Send write tokens to workflows from pull requests" isn't selected [1]. This is why the token can behave as read-only even if you declare write `permissions:` in the workflow, unless you also enable the relevant repository/organization setting [1][2].

3. `pull_request_target` is the exception: it keeps read/write token permissions. GitHub's workflow syntax docs state that when a workflow is triggered by `pull_request_target`, `GITHUB_TOKEN` is granted read/write repository permission (even for public fork PRs), assuming you specify permissions appropriately [1]. GitHub's fork-PR security model is that `pull_request_target` runs in the context of the base repository and is used to safely perform trusted metadata actions such as labeling and commenting [3][4].

4. You cannot grant job-level write access to untrusted code running from a fork via `on: pull_request` without enabling GitHub's fork-write setting. Practically: for fork-origin PRs using `on: pull_request`, GitHub reduces the `GITHUB_TOKEN` write scopes unless the repository/organization enables sending write tokens to fork PR workflows [1][2]. Even if you configure job-level `permissions:`, GitHub's fork-event adjustment downgrades write to read-only when the setting is not enabled [1].

What you should do (typical correct patterns):

A) If you only need to post metadata back to the PR (labels/comments/issues): prefer `on: pull_request_target` with narrowly scoped job permissions (e.g., `pull-requests: write`) and do not check out or execute untrusted fork code. This is the intended model for trusted metadata actions [3][4].

B) If you need write access for workflows triggered by `pull_request` from fork PRs: enable the repository/organization setting "Send write tokens to workflows from pull requests" (the exact setting name can vary slightly in the UI), which allows fork PR workflows to use `GITHUB_TOKEN` with write permission [2][1]. Then set job-level permissions to the minimum scopes needed; job-level `permissions` controls the token's effective access for that job [1].

Example (job-level scoped permissions; only effective for the event/setup described above):

  on: pull_request
  jobs:
    update:
      runs-on: ubuntu-latest
      permissions:
        contents: read
        pull-requests: write
      steps:
        # steps that use GITHUB_TOKEN to update the PR
        - run: echo "..."

Key takeaways (answer to the query):

- Job-level `permissions` determines what `GITHUB_TOKEN` can do in that job [1].
- For `pull_request` events from forks, GitHub forces `GITHUB_TOKEN` to read-only (write scopes are downgraded) unless you enable the "send write tokens to workflows from pull requests" setting [1][2].
- If you need guaranteed read/write for fork PRs without relying on that setting, use `pull_request_target` with strict least-privilege permissions and without executing fork code [1][3].

Citations:


Revert to pull_request_target to ensure label removal works on fork PRs.

With the current pull_request trigger, forked PRs receive a read-only GITHUB_TOKEN by default. The job-level permissions: pull-requests: write declaration on line 21 cannot override this GitHub-enforced downgrade. This causes the label removal operation on lines 32–36 to fail when fork contributors push new commits or edit the PR, defeating the safety control this workflow exists to enforce. pull_request_target is the documented event for safely performing labeling and commenting operations on fork PRs from the base repository context.

Suggested fix
 on:
-  pull_request:
+  pull_request_target:
     types:
       - synchronize
       - edited
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
-  pull_request:
+  pull_request_target:
     types:
       - synchronize
       - edited

 permissions: {}
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In
@.github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml
around lines 9 - 14, The workflow triggers must use pull_request_target instead
of pull_request so the job runs in the base repository context and can remove
labels on forked PRs; change the event key from "pull_request:" to
"pull_request_target:" (keeping the types ["synchronize","edited"]), and ensure
the job-level permissions include pull-requests: write so the label removal
steps (the lines performing label deletion) can execute successfully on fork
PRs.

Comment thread .github/workflows/help-command.yml

 on:
-  pull_request_target:
+  pull_request:
Copy link
Copy Markdown
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

⚠️ Potential issue | 🟠 Major | 🏗️ Heavy lift

Switching to pull_request will break commenting on fork PRs.

For workflows triggered by pull_request from a forked repository, GitHub silently downgrades any pull-requests: write declared in permissions: to read-only — this is the documented default and cannot be granted back via workflow YAML ("...if the workflow was triggered by a pull request event other than pull_request_target from a forked repository, and the Send write tokens to workflows from pull requests setting is not selected, the permissions are adjusted to change any write permissions to read only" / "you can use the permissions key to add and remove read permissions for forked repositories, but typically you can't grant write access"). pull_request_target, by contrast, "is granted read/write repository permission, even when it is triggered from a public fork".

Concretely, the marocchino/sticky-pull-request-comment "Comment on PR" step (Lines 32–44) and the "delete the comment" step (Lines 46–52) will fail with 403 on every external-contributor PR — i.e., the exact PRs this lint is most likely to fire on. The original amannn/action-semantic-pull-request step still works because it only reads PR data.

This workflow never checks out PR code, so the typical security motivation for moving off pull_request_target doesn't apply here. Recommend reverting the trigger (or, if the move is intentional, splitting the comment steps into a workflow_run-triggered workflow that executes in base-repo context):

🔧 Suggested revert
 on:
-  pull_request:
+  pull_request_target:
     types: [opened, reopened, synchronize, edited, ready_for_review]

Also applies to: 32-52

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/lint-pr-title.yml at line 7, The workflow trigger was
changed to pull_request which causes write permissions to be downgraded for
forked-PRs and makes the "Comment on PR" step
(marocchino/sticky-pull-request-comment) and the "delete the comment" steps fail
with 403; revert the trigger back to pull_request_target so those steps run with
read/write repo permissions (or alternatively move only the comment/delete steps
into a separate workflow triggered by workflow_run that runs in the base-repo
context), and keep the read-only lint checks (e.g.,
amannn/action-semantic-pull-request) in the original pull_request-triggered
workflow if desired.

Comment on lines +8 to +40
const sanitizeLinkAndTitle = (link, title) => {
  // Validate inputs to prevent injection attacks
    if (!link || typeof link !== 'string' || link.length > 2000) {
        return core.setFailed('Invalid link parameter');
    }
    if (!title || typeof title !== 'string' || title.length > 500) {
        return core.setFailed('Invalid title parameter');
    }

    let parsedLink;
    try {
        parsedLink = new URL(link);
    } catch (error) {
        return core.setFailed('Invalid link parameter');
    }

    if (parsedLink.protocol !== 'https:') {
        return core.setFailed('Link must use https protocol');
    }

    // Sanitize title by removing control characters and limiting length
    const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
    return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
};

module.exports = async (link, title) => {
    const KIT_BASE = 'https://api.kit.com/v4';
    const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);

    // Schedule 1 minute ahead
    const sendAt = new Date(Date.now() + 60 * 1000);

    const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

❓ Verification inconclusive

In @actions/core for GitHub Actions, what does core.setFailed(message) return, and does it throw an exception by default?


In @actions/core, core.setFailed(message) returns void (i.e., no value). It sets process.exitCode = 1 (failure) and logs the failure via core.error(message). [1] It does not throw an exception by default. The implementation only sets process.exitCode and calls error(message); it has no throw statement. [1] However, it only affects the action’s exit status when the process eventually exits. If you want the step/job to stop immediately, you typically still need to throw (or otherwise return/exit) in your action code after calling setFailed. This matches the documented behavior that it “sets a failing exit code” rather than terminating the process. [2][3]

Citations:


Script executed:

# Check if the file exists and read the relevant section
if [ -f ".github/workflows/scripts/kit/index.js" ]; then
  echo "=== File exists, reading lines 1-50 ==="
  head -50 ".github/workflows/scripts/kit/index.js" | cat -n
else
  echo "File not found at .github/workflows/scripts/kit/index.js"
  # Try to find it
  find . -name "index.js" -path "*kit*" 2>/dev/null
fi

Repository: asyncapi/website


Repository: asyncapi/website
Exit code: 0

stdout:

=== File exists, reading lines 1-50 ===
     1	/**
     2	 * This code is centrally managed in https://github.com/asyncapi/.github/
     3	 * Don't make changes to this file in this repo as they will be overwritten with changes made to the same file in above mentioned repo
     4	 */
     5	const core = require('@actions/core');
     6	const htmlContent = require('./htmlContent.js');
     7	
     8	const sanitizeLinkAndTitle = (link, title) => {
     9	  // Validate inputs to prevent injection attacks
    10	    if (!link || typeof link !== 'string' || link.length > 2000) {
    11	        return core.setFailed('Invalid link parameter');
    12	    }
    13	    if (!title || typeof title !== 'string' || title.length > 500) {
    14	        return core.setFailed('Invalid title parameter');
    15	    }
    16	
    17	    let parsedLink;
    18	    try {
    19	        parsedLink = new URL(link);
    20	    } catch (error) {
    21	        return core.setFailed('Invalid link parameter');
    22	    }
    23	
    24	    if (parsedLink.protocol !== 'https:') {
    25	        return core.setFailed('Link must use https protocol');
    26	    }
    27	
    28	    // Sanitize title by removing control characters and limiting length
    29	    const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
    30	    return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
    31	};
    32	
    33	module.exports = async (link, title) => {
    34	    const KIT_BASE = 'https://api.kit.com/v4';
    35	    const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
    36	
    37	    // Schedule 1 minute ahead
    38	    const sendAt = new Date(Date.now() + 60 * 1000);
    39	
    40	    const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
    41	
    42	    const res = await fetch(`${KIT_BASE}/broadcasts`, {
    43	        method: 'POST',
    44	        headers: {
    45	            'X-Kit-Api-Key': process.env.KIT_API_KEY,
    46	            'Content-Type': 'application/json'
    47	        },
    48	        body: JSON.stringify({
    49	            subject: `TSC attention required: ${sanitizedTitle}`,
    50	            preview_text: 'Check out the latest topic that TSC members have to be aware of',


Script executed:

# Search for other calls to sanitizeLinkAndTitle function
rg "sanitizeLinkAndTitle" ".github/workflows/scripts/kit/index.js"

# Also read the rest of the function to see the complete picture
echo "=== Reading lines 40-65 ==="
sed -n '40,65p' ".github/workflows/scripts/kit/index.js" | cat -n

Repository: asyncapi/website


Repository: asyncapi/website
Exit code: 0

stdout:

const sanitizeLinkAndTitle = (link, title) => {
    const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
=== Reading lines 40-65 ===
     1	    const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
     2	
     3	    const res = await fetch(`${KIT_BASE}/broadcasts`, {
     4	        method: 'POST',
     5	        headers: {
     6	            'X-Kit-Api-Key': process.env.KIT_API_KEY,
     7	            'Content-Type': 'application/json'
     8	        },
     9	        body: JSON.stringify({
    10	            subject: `TSC attention required: ${sanitizedTitle}`,
    11	            preview_text: 'Check out the latest topic that TSC members have to be aware of',
    12	            content: htmlContent(sanitizedLink, sanitizedTitle),
    13	            description: `TSC notification - ${new Date().toUTCString()}`,
    14	            public: false,
    15	            published_at: null,
    16	            send_at: sendAt.toISOString(),
    17	            subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
    18	        })
    19	    });
    20	
    21	    if (!res.ok) return core.setFailed(`Failed creating broadcast: ${await res.text()}`);
    22	    core.info(`Kit.com TSC broadcast scheduled for ${sendAt.toISOString()}`);
    23	};


Replace core.setFailed() returns with thrown errors.

The sanitizeLinkAndTitle() function returns undefined on validation failures (via core.setFailed()), but line 40 attempts to destructure the result. This causes a TypeError when validation fails, crashing the action instead of failing gracefully.

Wrap the destructuring call in try-catch, throw errors from the sanitizer, and call core.setFailed() only in the catch block.
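
The failure mode is easy to demonstrate in plain Node, with no Actions dependencies; `setFailedStub` and `validate` below are illustrative stand-ins for `core.setFailed` and the sanitizer, not names from the script:

```javascript
// A validator that "returns" a logging call actually returns undefined,
// and destructuring undefined throws a TypeError at the call site.
const setFailedStub = (message) => { console.error(message); }; // returns undefined, like core.setFailed

const validate = (link) => {
  if (typeof link !== 'string') {
    return setFailedStub('Invalid link parameter'); // undefined propagates to the caller
  }
  return { sanitizedLink: link };
};

let caught = null;
try {
  const { sanitizedLink } = validate(42); // TypeError: cannot destructure property of undefined
  console.log(sanitizedLink);
} catch (e) {
  caught = e;
}
console.log(caught instanceof TypeError); // true
```

Throwing from the validator and catching once in the exported function avoids this entirely.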

Proposed fix
 const sanitizeLinkAndTitle = (link, title) => {
     if (!link || typeof link !== 'string' || link.length > 2000) {
-        return core.setFailed('Invalid link parameter');
+        throw new Error('Invalid link parameter');
     }
     if (!title || typeof title !== 'string' || title.length > 500) {
-        return core.setFailed('Invalid title parameter');
+        throw new Error('Invalid title parameter');
     }

     let parsedLink;
     try {
         parsedLink = new URL(link);
     } catch (error) {
-        return core.setFailed('Invalid link parameter');
+        throw new Error('Invalid link parameter');
     }

     if (parsedLink.protocol !== 'https:') {
-        return core.setFailed('Link must use https protocol');
+        throw new Error('Link must use https protocol');
     }

     const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
     return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
 };

 module.exports = async (link, title) => {
+    try {
     const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
     
     const res = await fetch(`${KIT_BASE}/broadcasts`, {
         method: 'POST',
         headers: {
             'X-Kit-Api-Key': process.env.KIT_API_KEY,
             'Content-Type': 'application/json'
         },
         body: JSON.stringify({
             subject: `TSC attention required: ${sanitizedTitle}`,
             preview_text: 'Check out the latest topic that TSC members have to be aware of',
             content: htmlContent(sanitizedLink, sanitizedTitle),
             description: `TSC notification - ${new Date().toUTCString()}`,
             public: false,
             published_at: null,
             send_at: sendAt.toISOString(),
             subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
         })
     });

-    if (!res.ok) return core.setFailed(`Failed creating broadcast: ${await res.text()}`);
+    if (!res.ok) throw new Error(`Failed creating broadcast: ${await res.text()}`);
     core.info(`Kit.com TSC broadcast scheduled for ${sendAt.toISOString()}`);
+    } catch (error) {
+        core.setFailed(error.message);
+    }
 };
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/scripts/kit/index.js around lines 8 - 40, The sanitizer
sanitizeLinkAndTitle currently calls core.setFailed(...) and returns undefined
on validation errors, causing a TypeError when the caller destructures {
sanitizedLink, sanitizedTitle }; change sanitizeLinkAndTitle to throw new
Error(...) with the same messages instead of calling core.setFailed, then in the
module.exports caller wrap the call/destructuring of sanitizeLinkAndTitle(link,
title) in a try/catch where you call core.setFailed(error.message) (or
core.setFailed(String(error))) and return/exit; ensure references to
sanitizedLink and sanitizedTitle remain the same and that URL parsing/validation
logic is preserved.

Comment on lines +34 to +57
    const KIT_BASE = 'https://api.kit.com/v4';
    const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);

    // Schedule 1 minute ahead
    const sendAt = new Date(Date.now() + 60 * 1000);

    const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);

    const res = await fetch(`${KIT_BASE}/broadcasts`, {
        method: 'POST',
        headers: {
            'X-Kit-Api-Key': process.env.KIT_API_KEY,
            'Content-Type': 'application/json'
        },
        body: JSON.stringify({
            subject: `TSC attention required: ${sanitizedTitle}`,
            preview_text: 'Check out the latest topic that TSC members have to be aware of',
            content: htmlContent(sanitizedLink, sanitizedTitle),
            description: `TSC notification - ${new Date().toUTCString()}`,
            public: false,
            published_at: null,
            send_at: sendAt.toISOString(),
            subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
        })

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Validate required environment variables before request construction.

KIT_API_KEY and KIT_TSC_TAG_ID are not validated. Invalid/missing values produce opaque API failures later instead of a clear early failure.
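
The "opaque failure later" claim can be checked directly: an unset env var run through `Number()` yields `NaN`, which `JSON.stringify` silently turns into `null` in the request body. A minimal sketch, independent of the Kit API:

```javascript
// What Number(process.env.KIT_TSC_TAG_ID) yields when the variable is unset.
const TSC_TAG_ID = Number(undefined); // NaN

// An early, explicit check fails locally with a clear message...
console.log(Number.isInteger(TSC_TAG_ID) && TSC_TAG_ID > 0); // false

// ...whereas without it, NaN is serialized as null and only fails API-side.
console.log(JSON.stringify({ ids: [TSC_TAG_ID] })); // {"ids":[null]}
```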

Proposed fix
 module.exports = async (link, title) => {
     const KIT_BASE = 'https://api.kit.com/v4';
     const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
+    if (!process.env.KIT_API_KEY) {
+        return core.setFailed('Missing KIT_API_KEY');
+    }
+    if (!Number.isInteger(TSC_TAG_ID) || TSC_TAG_ID <= 0) {
+        return core.setFailed('KIT_TSC_TAG_ID must be a positive integer');
+    }
📝 Committable suggestion


Suggested change
     const KIT_BASE = 'https://api.kit.com/v4';
     const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
+    if (!process.env.KIT_API_KEY) {
+        return core.setFailed('Missing KIT_API_KEY');
+    }
+    if (!Number.isInteger(TSC_TAG_ID) || TSC_TAG_ID <= 0) {
+        return core.setFailed('KIT_TSC_TAG_ID must be a positive integer');
+    }

     // Schedule 1 minute ahead
     const sendAt = new Date(Date.now() + 60 * 1000);

     const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);

     const res = await fetch(`${KIT_BASE}/broadcasts`, {
         method: 'POST',
         headers: {
             'X-Kit-Api-Key': process.env.KIT_API_KEY,
             'Content-Type': 'application/json'
         },
         body: JSON.stringify({
             subject: `TSC attention required: ${sanitizedTitle}`,
             preview_text: 'Check out the latest topic that TSC members have to be aware of',
             content: htmlContent(sanitizedLink, sanitizedTitle),
             description: `TSC notification - ${new Date().toUTCString()}`,
             public: false,
             published_at: null,
             send_at: sendAt.toISOString(),
             subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
         })
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/scripts/kit/index.js around lines 34 - 57, Ensure KIT API
key and tag id are validated before building the request: check
process.env.KIT_API_KEY is a non-empty string and that
Number(process.env.KIT_TSC_TAG_ID) yields a valid number (not NaN) and assign it
to TSC_TAG_ID; if either is missing/invalid, throw or log a clear error and
abort before calling fetch (the code that constructs body using TSC_TAG_ID,
sanitizedLink/sanitizedTitle, sendAt and calls fetch to `${KIT_BASE}/broadcasts`
should only run after validation). Ensure the error message names the missing
variable (KIT_API_KEY or KIT_TSC_TAG_ID) so failures are immediately actionable.

Comment on lines +136 to +148
      headers:
        # JMS protocol specific message headers
        required:
          - JMSMessageID
      properties:
        JMSMessageID:
          name: JMSMessageID
          description: A unique message identifier. This may be set by your JMS Provider on your behalf.
          type: string
        JMSReplyTo:
          name: JMSReplyTo
          description: The queue or topic that the message sender expects replies to.
          type: string

⚠️ Potential issue | 🔴 Critical | ⚡ Quick win

Fix YAML indentation: properties must be nested under headers.

The specification states (line 121) that the headers field must be "A Schema object... of type object and have a properties key." However, in the example, properties is indented at the same level as headers, making it a sibling field in the jms binding object rather than part of the headers schema.

The correct structure should nest type, required, and properties under headers:

🐛 Proposed fix for YAML structure
     jms:
       headers:
         # JMS protocol specific message headers
+        type: object
         required:
           - JMSMessageID
-      properties:
-        JMSMessageID:
-          name: JMSMessageID
-          description: A unique message identifier. This may be set by your JMS Provider on your behalf.
-          type: string
-        JMSReplyTo:
-          name: JMSReplyTo
-          description: The queue or topic that the message sender expects replies to.
-          type: string
+        properties:
+          JMSMessageID:
+            description: A unique message identifier. This may be set by your JMS Provider on your behalf.
+            type: string
+          JMSReplyTo:
+            description: The queue or topic that the message sender expects replies to.
+            type: string

Note: Also removed redundant name fields from properties as they're already keyed by name.

📝 Committable suggestion


Suggested change
       headers:
         # JMS protocol specific message headers
+        type: object
         required:
           - JMSMessageID
-      properties:
-        JMSMessageID:
-          name: JMSMessageID
-          description: A unique message identifier. This may be set by your JMS Provider on your behalf.
-          type: string
-        JMSReplyTo:
-          name: JMSReplyTo
-          description: The queue or topic that the message sender expects replies to.
-          type: string
+        properties:
+          JMSMessageID:
+            description: A unique message identifier. This may be set by your JMS Provider on your behalf.
+            type: string
+          JMSReplyTo:
+            description: The queue or topic that the message sender expects replies to.
+            type: string
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/jms.md` around lines 136 - 148, The YAML
places properties as a sibling to headers; move the schema fields so headers
contains type, required and properties (i.e. nest type: object, required:
[JMSMessageID], and the properties map under headers) and remove redundant name
fields inside JMSMessageID and JMSReplyTo so the properties are keyed by
JMSMessageID/JMSReplyTo with description and type only; update the jms binding
example to reflect headers: { type: object, required: [...], properties: {
JMSMessageID: {...}, JMSReplyTo: {...} } }.

    ros2:
      role: subscriber
      node: /turtlesim
        qosPolicies:

⚠️ Potential issue | 🔴 Critical | ⚡ Quick win

Fix YAML indentation.

The qosPolicies field has incorrect indentation (8 spaces instead of 6), which will cause a YAML parsing error. It should align with role and node at the same indentation level.

🐛 Proposed fix
       role: subscriber
       node: /turtlesim
-        qosPolicies:
+      qosPolicies:
         history: unknown
         reliability: reliable
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/ros2.md` at line 99, The YAML snippet has
misaligned indentation for the field qosPolicies — it is indented two extra
spaces compared to peer keys like role and node, which will break parsing;
locate the qosPolicies key in the markdown example and reduce its indentation to
match role and node so all three are at the same level (align qosPolicies with
role and node).

Comment on lines +1 to +6
---
title: 'scripts'
weight: 10
---

The entire `scripts` directory is centrally managed in [.github](https://github.com/asyncapi/.github/) repository. Any changes in this folder should be done in central repository. No newline at end of file

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Description: Check the bindings directory structure to verify if "scripts" is a valid AsyncAPI binding
# and examine what other binding docs exist for context.

# List all files in the bindings directory to see the pattern
echo "=== Bindings directory structure ==="
fd . 'markdown/docs/reference/bindings' --type f --exec echo {}

echo -e "\n=== Check for AsyncAPI binding specifications ==="
# Check if there's any reference to "scripts" as a binding in AsyncAPI spec
rg -i "scripts.*binding" --type md -C2

echo -e "\n=== Sample content from other binding docs for comparison ==="
# Show content of a few other binding docs to compare structure
fd . 'markdown/docs/reference/bindings' --type f --max-results 3 --exec head -20 {}

Repository: asyncapi/website

Length of output: 1484


🏁 Script executed:

#!/bin/bash
# Examine metadata files to understand directory structure

echo "=== Content of _section.md ==="
cat -n markdown/docs/reference/bindings/_section.md

echo -e "\n=== Content of index.md ==="
cat -n markdown/docs/reference/bindings/index.md

echo -e "\n=== Content of scripts.md (full) ==="
cat -n markdown/docs/reference/bindings/scripts.md

echo -e "\n=== Check parent directory structure ==="
ls -la markdown/docs/reference/

Repository: asyncapi/website

Length of output: 1756


🏁 Script executed:

#!/bin/bash
# Search for references to scripts directory documentation
# and check the broader docs structure

echo "=== Search for references to scripts.md ==="
rg 'scripts\.md|scripts' markdown/docs/reference/ --type md | head -20

echo -e "\n=== Check broader docs directory structure ==="
fd -type d . 'markdown/docs' --max-depth 2

echo -e "\n=== Look for guides or infrastructure directories ==="
fd -type d '(guides|infrastructure|tools|development)' markdown/docs/

echo -e "\n=== Check if there's a .github directory documentation elsewhere ==="
rg '\.github' markdown/docs/ --type md -l

Repository: asyncapi/website

Length of output: 3618


Move this file out of the bindings directory.

This file documents a scripts directory managed in the .github repository and is infrastructure/tooling documentation, not an AsyncAPI protocol binding specification. It does not belong in markdown/docs/reference/bindings/ alongside binding specifications (HTTP, Kafka, MQTT, etc.).

The appropriate location would be markdown/docs/community/040-guides/ or a similar documentation section for contributor/infrastructure guides, where it can properly document development tooling rather than protocol bindings.

🧰 Tools
🪛 LanguageTool

[uncategorized] ~6-~6: The official name of this software platform is spelled with a capital “H”.
Context: ...ts` directory is centrally managed in .github ...

(GITHUB)

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/scripts.md` around lines 1 - 6, This file
wrongly lives under bindings; move the document describing the centrally-managed
"scripts" directory out of markdown/docs/reference/bindings/ into a
community/infrastructure section such as markdown/docs/community/040-guides/,
update the front-matter (title/weight) as appropriate, adjust any internal links
or sidebar/TOC references that point to the old path, and remove or replace this
bindings entry so bindings listing no longer references the "scripts" doc.

Comment on lines +103 to +105
      $ref: '#/channels/address'
      messages:
        - $ref: '#/channels/address/messages/personEvent'

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

Fix broken channel $ref paths in the first YAML example.

The example references #/channels/address/..., but the defined channel key is person. Copy-pasting this snippet will fail reference resolution.

Proposed fix
-      $ref: '#/channels/address'
+      $ref: '#/channels/person'
...
-      - $ref: '#/channels/address/messages/personEvent'
+      - $ref: '#/channels/person/messages/personEvent'

Also applies to: 125-126

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@markdown/docs/reference/bindings/solace.md` around lines 103 - 105, The YAML
examples use incorrect $ref paths pointing to '#/channels/address' and
'#/channels/address/messages/personEvent' but the channel key is 'person';
update those $ref values to '#/channels/person' and
'#/channels/person/messages/personEvent' respectively in the first example (and
the same two occurrences around the later example) so the references resolve to
the defined channel key 'person'.

Comment thread pages/community/ambassadors/index.tsx
@codxbrexx codxbrexx force-pushed the fix-finance-memory-leak branch from 9506e78 to 94e8b66 Compare May 5, 2026 21:33
codxbrexx and others added 2 commits May 6, 2026 13:30
- Guard GitHub, LinkedIn, and Twitter links with URL existence checks
- Only set image URL when githubUrl is available
- Extract addAdditionalUserInfo() to utils/ambassadors.ts for testability
- Add 15 unit tests covering all social link scenarios
- Add E2E tests for DOM rendering validation
- Update CI/CD workflow to include build verification and E2E tests

Fixes broken links when ambassador profiles lack social media URLs.

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 2

🧹 Nitpick comments (4)
utils/ambassadors.ts (1)

7-7: 💤 Low value

Add an explicit return type for the public utility.

addAdditionalUserInfo is exported and consumed by pages/community/ambassadors/index.tsx. Annotating the return type as Ambassador makes the contract self-documenting and prevents accidental drift if implementation details change.

♻️ Proposed change
-export function addAdditionalUserInfo(user: Ambassador) {
+export function addAdditionalUserInfo(user: Ambassador): Ambassador {
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@utils/ambassadors.ts` at line 7, The exported utility function
addAdditionalUserInfo lacks an explicit return type; update its signature to
explicitly return Ambassador (e.g., function addAdditionalUserInfo(user:
Ambassador): Ambassador) so callers like pages/community/ambassadors/index.tsx
have a stable, self-documenting contract and the compiler will catch accidental
return-type drift.
tests/pages/community/ambassadors.test.ts (1)

7-17: 💤 Low value

Optional: extract an ambassador factory to reduce boilerplate.

The same 9-field ambassador literal is repeated ~12 times. A small helper would make individual tests focus on the relevant fields and reduce drift if Ambassador gains a new required field.

♻️ Proposed change
+const makeAmbassador = (overrides: Partial<Ambassador> = {}): Ambassador => ({
+  name: 'Test User',
+  img: '',
+  bio: 'Test bio',
+  title: 'Developer',
+  github: '',
+  twitter: '',
+  linkedin: '',
+  company: 'Test Company',
+  contributions: [],
+  ...overrides
+});
+
 describe('addAdditionalUserInfo', () => {
   describe('GitHub URL handling', () => {
     it('should create githubUrl when github field exists', () => {
-      const ambassador: Ambassador = {
-        name: 'Test User',
-        img: '',
-        bio: 'Test bio',
-        title: 'Developer',
-        github: 'testuser',
-        twitter: '',
-        linkedin: '',
-        company: 'Test Company',
-        contributions: []
-      };
-
-      const result = addAdditionalUserInfo(ambassador);
+      const result = addAdditionalUserInfo(makeAmbassador({ github: 'testuser' }));

       expect(result.githubUrl).toBe('https://www.github.com/testuser');
     });
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@tests/pages/community/ambassadors.test.ts` around lines 7 - 17, The repeated
Ambassador object should be replaced by a small factory to reduce boilerplate:
add a helper function (e.g., createAmbassador or makeAmbassador) in
tests/pages/community/ambassadors.test.ts that returns a default Ambassador
object with the nine fields currently used (name, img, bio, title, github,
twitter, linkedin, company, contributions) and accept a partial overrides
parameter to override only fields needed per test; update existing usages of the
ambassador literal to call createAmbassador({ ...overrides }) so tests focus on
relevant fields and automatically adapt if Ambassador gains new required fields.
.github/workflows/if-nodejs-pr-testing.yml (1)

76-78: ⚡ Quick win

Consider restricting Build project to a single OS.

This new build step now runs on ubuntu-latest, macos-latest, and windows-latest for every PR. For a Next.js website that deploys to a single (Linux) target, the macOS and Windows builds add CI minutes without providing additional release-quality signal. Consider gating with matrix.os == 'ubuntu-latest' (matching the existing pattern for Run linter on Line 82 and the E2E step on Line 79).

♻️ Proposed change
-      - if: steps.packagejson.outputs.exists == 'true'
+      - if: steps.packagejson.outputs.exists == 'true' && matrix.os == 'ubuntu-latest'
         name: Build project
         run: npm run build --if-present
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/if-nodejs-pr-testing.yml around lines 76 - 78, The "Build
project" workflow step currently runs on all matrix OSes; restrict it to only
run on Ubuntu by adding the matrix OS guard to its conditional so it only
executes when matrix.os == 'ubuntu-latest' in addition to
steps.packagejson.outputs.exists == 'true' (update the step identified by its
name "Build project" and the existing condition `if:
steps.packagejson.outputs.exists == 'true'` to include the `matrix.os ==
'ubuntu-latest'` check).
cypress/ambassadors.cy.ts (1)

43-43: 💤 Low value

Use optional chaining (per SonarCloud).

Three sites — Lines 43, 59, and 165 — flagged by SonarCloud can be tightened with optional chaining for readability.

♻️ Proposed change
-          if (href && href.includes('github.com')) {
+          if (href?.includes('github.com')) {
-          if (href && href.includes('linkedin.com')) {
+          if (href?.includes('linkedin.com')) {
-            return href && href.includes('github.com') && href !== 'undefined';
+            return href?.includes('github.com') && href !== 'undefined';
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@cypress/ambassadors.cy.ts` at line 43, Replace the explicit truthy check plus
includes call (patterns like "href && href.includes('github.com')") with
optional chaining to simplify and harden the condition (e.g., use
"href?.includes('github.com')"); update all occurrences in this file where "href
&& href.includes(...)" appears (the three conditional checks) so they use
optional chaining instead of the logical-and null check.
ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 6b09f5d3-87a0-4bea-b3a9-8103696c424f

📥 Commits

Reviewing files that changed from the base of the PR and between 9506e78 and 464012d.

📒 Files selected for processing (7)
  • .github/workflows/if-nodejs-pr-testing.yml
  • config/tools.json
  • cypress/ambassadors.cy.ts
  • pages/community/ambassadors/index.tsx
  • tests/pages/community/ambassadors.test.ts
  • tsconfig.json
  • utils/ambassadors.ts

Comment on lines +79 to +81
- if: steps.packagejson.outputs.exists == 'true' && matrix.os == 'ubuntu-latest'
name: Run E2E tests (Cypress)
run: npm run test:e2e --if-present

⚠️ Potential issue | 🟠 Major | ⚡ Quick win

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify the test:e2e script and any server orchestration in package.json
fd -t f '^package\.json$' -d 2 -x cat {} \; | python3 -c "
import json, sys
pkg = json.load(sys.stdin)
scripts = pkg.get('scripts', {})
for k in ['test:e2e', 'cy:run', 'cypress:run', 'dev', 'start']:
    if k in scripts:
        print(f'{k}: {scripts[k]}')
print('---deps---')
deps = {**pkg.get('dependencies', {}), **pkg.get('devDependencies', {})}
for d in ['cypress', 'start-server-and-test', 'wait-on', 'concurrently']:
    if d in deps:
        print(f'{d}: {deps[d]}')
"

Repository: asyncapi/website

Length of output: 181


🏁 Script executed:

cat .github/workflows/if-nodejs-pr-testing.yml

Repository: asyncapi/website

Length of output: 7998


🏁 Script executed:

fd -t f 'cypress\.config\.(js|ts|mjs|cjs)$'

Repository: asyncapi/website

Length of output: 78


🏁 Script executed:

cat cypress.config.js

Repository: asyncapi/website

Length of output: 566


The test:e2e script does not orchestrate a server, causing E2E tests to fail.

The script is bare npx cypress run without server orchestration, and the workflow has no step to start a server. Cypress is configured to connect to http://127.0.0.1:3000, but nothing serves on that port when tests run. Use start-server-and-test or similar to wrap cypress run with a server launcher (likely the start script which uses serve@latest out), or integrate server startup into the test:e2e script itself.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In @.github/workflows/if-nodejs-pr-testing.yml around lines 79 - 81, The E2E job
runs npm run test:e2e (which currently just does npx cypress run) without
starting the app server, so Cypress cannot reach http://127.0.0.1:3000; update
the workflow or the test:e2e script to orchestrate the server startup (e.g., use
start-server-and-test to run the existing start script that uses serve@latest
out and then run cypress, or modify package.json test:e2e to start the server
and wait before invoking npx cypress run) so that npm run test:e2e brings up the
server on port 3000 before Cypress executes.
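A minimal sketch of the orchestration the reviewer suggests, using `start-server-and-test` to serve the static `out` directory before Cypress runs. The script names and the `serve out` command are illustrative, not the repo's actual `package.json`:

```json
{
  "scripts": {
    "start": "serve out",
    "test:e2e": "start-server-and-test start http://127.0.0.1:3000 \"cypress run\""
  }
}
```

With this in place, `npm run test:e2e` starts the server, waits until `http://127.0.0.1:3000` responds, runs Cypress, and shuts the server down afterwards, so the workflow step needs no separate server-startup step.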

Comment thread cypress/ambassadors.cy.ts
Comment on lines +137 to +141
cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
expect($detail).to.contain.text('');
// Should have name and country
cy.wrap($detail).find('div').should('have.length.at.least', 2);
});

⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Tautological / no-op assertions.

expect($detail).to.contain.text('') is always true (every element "contains" the empty string), so this assertion provides no coverage. Similarly, the GitHub edge-case test at Lines 168-171 wraps expect(hasValidLinks).to.be.true inside if (hasValidLinks), which means the assertion can never fail — if hasValidLinks is false the block is skipped, and if it's true the assertion is trivially satisfied.

🛡️ Proposed fix
-    it('should display ambassador name and country', () => {
-      cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
-        expect($detail).to.contain.text('');
-        // Should have name and country
-        cy.wrap($detail).find('div').should('have.length.at.least', 2);
-      });
-    });
+    it('should display ambassador name and country', () => {
+      cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
+        expect($detail.text().trim()).to.not.equal('');
+        expect($detail.find('div').length).to.be.at.least(2);
+      });
+    });
-          // At least some ambassadors should have GitHub
-          if (hasValidLinks) {
-            expect(hasValidLinks).to.be.true;
-          }
+          expect(hasValidLinks, 'at least one ambassador should expose a GitHub link').to.be.true;
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.

In `@cypress/ambassadors.cy.ts` around lines 137 - 141, Remove the tautological
assertions and make them meaningful: replace expect($detail).to.contain.text('')
inside the cy.get('[data-testid="Ambassadors-members-details"]').each callback
with a real check (e.g., assert the text is not empty or matches a name/country
pattern, or use cy.wrap($detail).invoke('text').should('not.be.empty') and
confirm the expected children via
cy.wrap($detail).find('div').should('have.length.at.least', 2)). Also remove the
conditional guard around expect(hasValidLinks).to.be.true so the test asserts
the boolean directly (i.e., compute hasValidLinks and then call
expect(hasValidLinks).to.be.true without wrapping it in if (hasValidLinks)),
ensuring the assertion can fail when links are invalid.
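The tautology is easy to verify outside Cypress: every string contains the empty string, so a `contain.text('')` assertion can never fail, whereas a trimmed-length check can.

```typescript
// String.prototype.includes('') is true for every string, even ''.
console.log('Alice, Germany'.includes('')); // true
console.log(''.includes(''));               // true

// A non-empty-after-trim check is a real assertion: it fails on '   '.
console.log('Alice, Germany'.trim().length > 0); // true
console.log('   '.trim().length > 0);            // false
```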

@sonarqubecloud

sonarqubecloud Bot commented May 6, 2026

Quality Gate failed

Failed conditions
15.4% Duplication on New Code (required ≤ 3%)

See analysis details on SonarQube Cloud


Labels

None yet

Projects

Status: To Be Triaged

Development

Successfully merging this pull request may close these issues.

[BUG]: Memory leak and layout shift in Finance page (pages/finance.tsx)

4 participants