Fix finance memory leak: remove resize listener leak and layout shift #5285
codxbrexx wants to merge 6 commits into
Conversation
✅ Deploy Preview for asyncapi-website ready! Built without sensitive environment variables.
To edit notification comments on pull requests, go to your Netlify project configuration.
We require all PRs to follow the Conventional Commits specification.
No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration: Path: .coderabbit.yaml | Review profile: CHILL | Plan: Pro | Run ID:
📒 Files selected for processing (2)
📝 Walkthrough

This PR removes client-side resize handling from the Finance page, adds a reusable ambassadors utility and conditional social-link rendering, introduces unit and E2E tests for ambassadors, updates tsconfig to exclude cypress, adjusts tool names in config/tools.json, and expands the Node.js PR testing workflow to run tests, build, and E2E steps.

Changes
- Finance Page Layout Fix
- Ambassadors Social Links Enhancement
- Tools Configuration Update
- CI Workflow Enhancement

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~25 minutes
🚥 Pre-merge checks | ✅ 4 | ❌ 2
❌ Failed checks (2 warnings)
✅ Passed checks (4 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Codecov Report
✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

@@ Coverage Diff @@
## master #5285 +/- ##
=========================================
Coverage 100.00% 100.00%
=========================================
Files 22 22
Lines 830 830
Branches 159 159
=========================================
Hits 830 830

☔ View full report in Codecov by Sentry.
⚡️ Lighthouse report for the changes in this PR:
Lighthouse ran on https://deploy-preview-5285--asyncapi-website.netlify.app/
@codxbrexx please pay attention to the comments by the bot and AI.
Actionable comments posted: 17
Note
Due to the large number of review comments, Critical, Major severity comments were prioritized as inline comments.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (4)
.github/workflows/bounty-program-commands.yml (2)
130-135: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
`removeLabel` payload uses the wrong type for `name`. Line 134 passes an array, but this endpoint expects a single label name string. This will cause label removal to fail at runtime.
Proposed fix
- github.rest.issues.removeLabel({
+ await github.rest.issues.removeLabel({
    issue_number: context.issue.number,
    owner: context.repo.owner,
    repo: context.repo.repo,
-   name: [BOUNTY_PROGRAM_LABELS[0].name]
+   name: BOUNTY_PROGRAM_LABELS[0].name
  })

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In @.github/workflows/bounty-program-commands.yml around lines 130 - 135, The removeLabel call is passing an array for the name parameter which expects a single string; update the github.rest.issues.removeLabel invocation (the removeLabel call that uses BOUNTY_PROGRAM_LABELS) to pass a string not an array by replacing name: [BOUNTY_PROGRAM_LABELS[0].name] with name: BOUNTY_PROGRAM_LABELS[0].name and optionally guard against undefined (e.g., compute const label = BOUNTY_PROGRAM_LABELS[0]?.name and only call removeLabel when label is a non-empty string).
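The removeLabel fix above has two ingredients: `name` as a single string, and an awaited call. A minimal sketch of both, in plain Node rather than the actual `actions/github-script` environment (the fake client, `makeFakeGithub`, and `removeBountyLabel` are invented stand-ins, not the workflow's real code):

```javascript
// Illustrative sketch only: a fake GitHub client stands in for actions/github-script.
const BOUNTY_PROGRAM_LABELS = [{ name: 'bounty' }]; // stand-in for the workflow constant

// Fake REST client: records a call only after a simulated network round-trip,
// which is exactly what an un-awaited real call might never get to do.
function makeFakeGithub(log) {
  return {
    rest: {
      issues: {
        removeLabel: async (params) => {
          await new Promise((resolve) => setTimeout(resolve, 10));
          log.push(params);
        },
      },
    },
  };
}

async function removeBountyLabel(github, context) {
  const label = BOUNTY_PROGRAM_LABELS[0] && BOUNTY_PROGRAM_LABELS[0].name;
  if (typeof label !== 'string' || label.length === 0) return; // guard against undefined
  // `name` is a single string (not an array), and the call is awaited so the
  // surrounding step cannot exit before the request completes.
  await github.rest.issues.removeLabel({
    issue_number: context.issue.number,
    owner: context.repo.owner,
    repo: context.repo.repo,
    name: label,
  });
}
```

Dropping the inner `await` would let `removeBountyLabel` resolve while the fake log is still empty, which is the flakiness the next comment describes for the real workflow.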
51-56: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
Add `await` to GitHub REST API calls to ensure requests complete before the step exits.

Without `await`, the workflow step can complete before these async calls finish, causing flaky command execution. The `actions/github-script` documentation explicitly requires you to `await` any async work in the script body. Three locations need fixes:

- Line 51: `github.rest.issues.createComment()`
- Line 95: `github.rest.issues.addLabels()`
- Line 130: `github.rest.issues.removeLabel()`

Proposed fix
- github.rest.issues.createComment({
+ await github.rest.issues.createComment({
    issue_number: context.issue.number,
    owner: context.repo.owner,
    repo: context.repo.repo,
    body: commentText
  })
...
- github.rest.issues.addLabels({
+ await github.rest.issues.addLabels({
    issue_number: context.issue.number,
    owner: context.repo.owner,
    repo: context.repo.repo,
    labels: [BOUNTY_PROGRAM_LABELS[0].name]
  })
...
- github.rest.issues.removeLabel({
+ await github.rest.issues.removeLabel({
    issue_number: context.issue.number,
    owner: context.repo.owner,
    repo: context.repo.repo,
    name: BOUNTY_PROGRAM_LABELS[0].name
  })

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In @.github/workflows/bounty-program-commands.yml around lines 51 - 56, The three GitHub REST API calls (github.rest.issues.createComment, github.rest.issues.addLabels, and github.rest.issues.removeLabel) are invoked without awaiting their returned promises; prepend each call with await so the actions/github-script body waits for the requests to finish before the step exits, i.e., change github.rest.issues.createComment(...), github.rest.issues.addLabels(...), and github.rest.issues.removeLabel(...) to await github.rest.issues.createComment(...), await github.rest.issues.addLabels(...), and await github.rest.issues.removeLabel(...).

components/tools/ToolsDashboard.tsx (1)
149-162: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
`[toolsList]` dependency causes the page to scroll back to the hash anchor on every filter change.

`toolsList` is recomputed each time any filter changes. With `[toolsList]` as the dependency, this effect fires on every filter interaction — so a user on `/tools#CLIs` who applies a language filter will have the page jerk-scroll back to `#CLIs` after each keystroke or toggle. Since `toolsList` is a `useMemo` (evaluated synchronously during render), it is already populated when this effect first runs. The original `[]` dependency is the correct choice — the effect only needs to fire once, at mount, to honour the initial URL hash. Based on learnings, if you intentionally keep `[]` (omitting `toolsList`), add an `eslint-disable` comment with a rationale note.

🐛 Proposed fix
- }, [toolsList]);
+ // eslint-disable-next-line react-hooks/exhaustive-deps: intentionally omitting toolsList — it is synchronously
+ // derived via useMemo and is already populated on first render; this effect only needs to run once at mount.
+ }, []);

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@components/tools/ToolsDashboard.tsx` around lines 149 - 162, The useEffect in ToolsDashboard currently depends on toolsList which is recomputed on every filter change causing repeated scroll-to-hash; change the effect dependency array to [] so it only runs once on mount and restore the original mount-only behavior, and if you intentionally omit toolsList add an inline eslint-disable-next-line react-hooks/exhaustive-deps comment above the useEffect with a brief rationale referencing that toolsList is a synchronous useMemo and the effect only needs to run on initial mount to honor the URL hash.

.github/workflows/scripts/mailchimp/htmlContent.js (1)
1-22: ⚠️ Potential issue | 🟠 Major | ⚡ Quick win
This centrally-managed file will be overwritten — remove or fix it upstream.

The header on this file explicitly states it is centrally managed in `asyncapi/.github` and any local changes will be overwritten. Adding `escapeHtml` here means the fix won't persist after the next sync. Since this PR already removed `.github/workflows/scripts/mailchimp/index.js`, the `mailchimp/htmlContent.js` file is now dead code with no callers. Either update the central repo first, or delete the orphaned mailchimp template entirely.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In @.github/workflows/scripts/mailchimp/htmlContent.js around lines 1 - 22, The file contains a centrally-managed escapeHtml function and a mailchimp template (escapeHtml and module.exports) that are now orphaned because .github/workflows/scripts/mailchimp/index.js was removed; either remove this dead mailchimp/htmlContent.js file from the repo or apply the permanent fix upstream in the central asyncapi/.github repo so the change persists—locate the escapeHtml function and the module.exports export in this file and delete the file locally (or mirror the same deletion/update in the central repository) so there are no unused, overwritten assets left behind.
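For context on what is being synced over: the comment concerns an `escapeHtml` helper added to the mailchimp template. A minimal sketch of such a helper, illustrative only (the real implementation belongs in the centrally managed asyncapi/.github repo; only the function name comes from the comment above):

```javascript
// Minimal sketch of an HTML-escaping helper of the kind discussed above.
// Illustrative only: the actual file is centrally managed in asyncapi/.github.
function escapeHtml(input) {
  return String(input)
    .replace(/&/g, '&amp;') // must run first so later entities are not double-escaped
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&#39;');
}
```

Escaping `&` before the other characters is the order-sensitive part; reversing the order would turn `<` into `&amp;lt;`.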
🟡 Minor comments (25)
markdown/blog/community-spotlight-hugo.md-20-20 (1)
20-20: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Hyphenate adjectival “open-source” for consistency. At Line 20 and Line 100, use `open-source` when used as an adjective (e.g., `open-source contributor`, `open-source values`).

Also applies to: 100-100
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/blog/community-spotlight-hugo.md` at line 20, Replace the unhyphenated adjectival phrases "open source contributor" and "open source values" with the hyphenated form "open-source contributor" and "open-source values" respectively; search for the literal strings "open source contributor" and "open source values" in the markdown (e.g., the sentence containing "Hugo Guerrero, a seasoned software developer, open source contributor, technical marketer...") and update them to use the hyphenated adjective form for consistency.

markdown/blog/publish-subscribe-semantics.md-57-57 (1)
57-57: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Use hyphenation for compound adjective in user-facing docs. At Line 57, change `event driven architecture` to `event-driven architecture` for standard technical writing consistency.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/blog/publish-subscribe-semantics.md` at line 57, Change the phrase "event driven architecture" to the hyphenated form "event-driven architecture" in the document (replace the exact token "event driven architecture" at the occurrence shown); update the user-facing sentence so it reads "In an event-driven architecture" to follow standard compound-adjective hyphenation rules.

markdown/blog/community-spotlight-hugo.md-84-84 (1)
84-84: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Markdown emphasis is inconsistent with surrounding interview quotes. Line 84 uses `**Hugo: ...**` while adjacent speaker quotes use `***Hugo:** ...*`. Align this line to the same pattern to avoid visual inconsistency in rendered output.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/blog/community-spotlight-hugo.md` at line 84, Replace the inconsistent emphasis on the line that currently reads "**Hugo: We were singing and dancing on the stage, so you can imagine.**" so it matches the surrounding speaker-quote pattern (use the same asterisk pattern used by adjacent lines, e.g. change to "***Hugo:** We were singing and dancing on the stage, so you can imagine.*") so the rendered emphasis is consistent with other interview quotes.

markdown/docs/reference/bindings/websockets.md-37-38 (1)
37-38: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Correct `Reference Object` link syntax to use the declared reference. Lines 37-38 should use reference-style links (`[referenceObject]`), otherwise the links are invalid and Line 58 remains unused.

Proposed fix
-<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
-<a name="operationBindingObjectHeaders"></a>`headers` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions of the HTTP headers to use when establishing the connection. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectHeaders"></a>`headers` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions of the HTTP headers to use when establishing the connection. This schema MUST be of type `object` and have a `properties` key.

Also applies to: 58-58
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/websockets.md` around lines 37 - 38, Update the two inline Reference Object links so they use the declared reference-style label instead of inline parentheses: in the `query` and `headers` rows replace the `(referenceObject)` inline link with the reference-style `[referenceObject]`, and make the same change for the other occurrence around the `operationBindingObject` section (the unused Line 58 reference). This ensures the declared `[referenceObject]` link label is used consistently and the reference at the bottom is actually resolved.

markdown/docs/reference/bindings/websockets.md-32-32 (1)
32-32: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix heading-level jump in section structure. Line 32 jumps from `##` to `#####`. Use `###` (or `####` if there is an intermediate heading) to keep hierarchy consistent.

Proposed fix
-##### Fixed Fields
+### Fixed Fields

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/websockets.md` at line 32, The heading "##### Fixed Fields" breaks the document hierarchy by jumping from a higher-level "##" heading; update the heading to "### Fixed Fields" (or "#### Fixed Fields" if there's an intermediate level) so it follows the parent "##" section; locate the "##### Fixed Fields" token in the markdown and replace it with the appropriate level (prefer "### Fixed Fields" to restore consistent structure).

markdown/docs/reference/bindings/solace.md-74-74 (1)
74-74: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Normalize heading style to ATX (remove trailing `##`). These headings use `atx_closed`, which conflicts with the configured MD003 style.

Proposed fix
-## Example with two destinations ##
+## Example with two destinations
...
-## Example with a wildcard subscription ##
+## Example with a wildcard subscription

Also applies to: 146-146
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/solace.md` at line 74, The level-2 heading "## Example with two destinations ##" uses an atx_closed style; update it to standard ATX by removing the trailing "##" so it reads "## Example with two destinations" and do the same for the other occurrence noted (line 146) to comply with the MD003 heading style.

markdown/docs/reference/bindings/solace.md-56-61 (1)
56-61: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Use descriptive link text instead of repeated “here.” These links are user-facing docs and currently fail MD059; replacing the generic text improves accessibility and scannability.
Proposed fix
-| `deliveryMode` | Enum | 'direct' or 'persistent'. This determines the quality of service for publishing messages as documented [here.](https://docs.solace.com/Get-Started/Core-Concepts-Message-Delivery-Modes.htm) Default is 'persistent'. |
+| `deliveryMode` | Enum | 'direct' or 'persistent'. This determines the quality of service for publishing messages as documented in the [Solace message delivery modes guide](https://docs.solace.com/Get-Started/Core-Concepts-Message-Delivery-Modes.htm). Default is 'persistent'. |
-| `queue.accessType` | Enum | 'exclusive' or 'nonexclusive'. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Endpoints.htm#Queues) Only applicable when destinationType is 'queue'. |
+| `queue.accessType` | Enum | 'exclusive' or 'nonexclusive'. See [Solace queue endpoint access types](https://docs.solace.com/Messaging/Guaranteed-Msg/Endpoints.htm#Queues). Only applicable when destinationType is 'queue'. |
-| `queue.maxMsgSpoolSize` | String | The maximum amount of message spool that the given queue may use. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Message-Spooling.htm#max-spool-usage) Only applicable when destinationType is 'queue'. |
+| `queue.maxMsgSpoolSize` | String | The maximum amount of message spool that the given queue may use. See [Solace max spool usage](https://docs.solace.com/Messaging/Guaranteed-Msg/Message-Spooling.htm#max-spool-usage). Only applicable when destinationType is 'queue'. |
-| `queue.maxTtl` | String | The maximum TTL to apply to messages to be spooled. This is documented [here.](https://docs.solace.com/Messaging/Guaranteed-Msg/Configuring-Queues.htm) Only applicable when destinationType is 'queue'. |
+| `queue.maxTtl` | String | The maximum TTL to apply to messages to be spooled. See [Solace queue TTL configuration](https://docs.solace.com/Messaging/Guaranteed-Msg/Configuring-Queues.htm). Only applicable when destinationType is 'queue'. |

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/solace.md` around lines 56 - 61, Replace the generic link text "here." with descriptive link text for each referenced resource to satisfy MD059: update the `deliveryMode` link to something like "Solace message delivery modes", the `queue.accessType` link to "Solace queue access types", the `queue.maxMsgSpoolSize` link to "Solace message spooling max spool usage", and the `queue.maxTtl` link to "Solace queue TTL configuration" (or similarly descriptive phrases) so each table cell uses meaningful, accessible link text instead of repeated "here." while keeping the existing URLs unchanged.

markdown/docs/reference/bindings/ibmmq.md-235-235 (1)
235-235: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Correct the spelling from "dependant" to "dependent".
The word "dependant" (noun meaning a person who relies on another) should be "dependent" (adjective meaning contingent or varying based on). The context describes values that vary by API, requiring the adjective form.
📝 Proposed fix
-<a name="messageBindingObjectExpiry"></a>`expiry` | integer | The recommended setting the client should use for the TTL (Time-To-Live) of the message. This is a period of time expressed in milliseconds and set by the application that puts the message. `expiry` values are API dependant e.g., MQI and JMS use different units of time and default values for *`unlimited`*. General information on IBM MQ message expiry can be found on this [page](https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_latest/com.ibm.mq.ref.dev.doc/q097490_.html) in the IBM MQ Knowledge Center. | OPTIONAL [*`unlimited`*] | `expiry` value MUST be either `zero` (*`unlimited`*) or greater than zero.
+<a name="messageBindingObjectExpiry"></a>`expiry` | integer | The recommended setting the client should use for the TTL (Time-To-Live) of the message. This is a period of time expressed in milliseconds and set by the application that puts the message. `expiry` values are API dependent e.g., MQI and JMS use different units of time and default values for *`unlimited`*. General information on IBM MQ message expiry can be found on this [page](https://www.ibm.com/support/knowledgecenter/en/SSFKSJ_latest/com.ibm.mq.ref.dev.doc/q097490_.html) in the IBM MQ Knowledge Center. | OPTIONAL [*`unlimited`*] | `expiry` value MUST be either `zero` (*`unlimited`*) or greater than zero.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/ibmmq.md` at line 235, Update the description text for the `expiry` messageBindingObject to use the correct adjective form "dependent" instead of "dependant"; locate the string "expiry values are API dependant e.g., MQI and JMS..." and change it to "expiry values are API dependent e.g., MQI and JMS..." so the `expiry` field documentation reads correctly.

markdown/docs/reference/bindings/googlepubsub.md-102-102 (1)
102-102: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix typo: "pertintent" → "pertinent".
📝 Proposed fix
-[PubsubMessage](https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage) details, alongside with pertintent
+[PubsubMessage](https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage) details, alongside with pertinent

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/googlepubsub.md` at line 102, Fix the typo "pertintent" to "pertinent" in the sentence that references [PubsubMessage] in markdown/docs/reference/bindings/googlepubsub.md so the phrase reads "PubsubMessage details, alongside with pertinent". Locate the line containing the [PubsubMessage] link and replace the misspelled word while preserving surrounding punctuation and formatting.

markdown/docs/reference/bindings/http.md-38-38 (1)
38-38: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix broken reference link. The link `[Reference Object](referenceObject)` is missing proper syntax. Use reference-style link format.

🔧 Proposed fix
-<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.
+<a name="operationBindingObjectQuery"></a>`query` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] | A Schema object containing the definitions for each query parameter. This schema MUST be of type `object` and have a `properties` key.

Apply the same fix to line 78.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/http.md` at line 38, The markdown link for "Reference Object" is using inline parenthesis syntax incorrectly; replace `(referenceObject)` with the reference-style link `[referenceObject]` by changing the text to `[Reference Object][referenceObject]` in the `query` line (and apply the same change to the analogous occurrence around line 78) so the reference-style link resolves correctly.

markdown/docs/reference/bindings/googlepubsub.md-119-119 (1)
119-119: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix broken markdown link syntax. Line 119 has malformed link syntax with duplicate "Schema" text: `[Schema]([Schema](https://...))`. This will not render correctly.

🔧 Proposed fix
-[Schema]([Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema)) Object with AsyncAPI.
+[Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema) Object with AsyncAPI.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/googlepubsub.md` at line 119, The markdown link on the line containing "[Schema]([Schema](https://...))" is malformed; replace the duplicate text with a single proper link by changing the fragment to a standard markdown link like "Schema" linked to the URL (i.e., convert "[Schema]([Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema))" into a single "[Schema](https://cloud.google.com/pubsub/docs/reference/rest/v1/projects.schemas#Schema)") so the AsyncAPI reference renders correctly.

markdown/docs/reference/bindings/mqtt5.md-29-29 (1)
29-29: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix broken reference link. The link syntax `[Reference Object](referenceObject)` is missing the proper reference format. It should use either a hash `#referenceObject` or a reference-style link `[referenceObject]`.

🔧 Proposed fix
-<a name="serverBindingObjectSessionExpiryInterval"></a>`sessionExpiryInterval` | [Schema Object][schemaObject] \| [Reference Object](referenceObject) \| integer | Session Expiry Interval in seconds or a Schema Object containing the definition of the interval.
+<a name="serverBindingObjectSessionExpiryInterval"></a>`sessionExpiryInterval` | [Schema Object][schemaObject] \| [Reference Object][referenceObject] \| integer | Session Expiry Interval in seconds or a Schema Object containing the definition of the interval.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/mqtt5.md` at line 29, The markdown link for the Reference Object is malformed in the sessionExpiryInterval line; update the link target for the "Reference Object" text (near the `sessionExpiryInterval` entry) to use the proper reference format — e.g., change `(referenceObject)` to `(#referenceObject)` or use the reference-style `[Reference Object][referenceObject]` so the "Reference Object" link resolves correctly to the referenceObject anchor.

markdown/docs/reference/bindings/kafka.md-25-29 (1)
25-29: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix table column count mismatch.
Line 29 has only 4 cells but the header defines 5 columns (Field Name, Type, Description, Applicability, Constraints). The missing "Constraints" column will cause incorrect table rendering.
🔧 Proposed fix
Field Name | Type | Description | Applicability [default] | Constraints
---|:---:|:---:|:---:|---
`schemaRegistryUrl` | string (url) | API URL for the Schema Registry used when producing Kafka messages (if a Schema Registry was used) | OPTIONAL | -
`schemaRegistryVendor` | string | The vendor of Schema Registry and Kafka serdes library that should be used (e.g. `apicurio`, `confluent`, `ibm`, or `karapace`) | OPTIONAL | MUST NOT be specified if `schemaRegistryUrl` is not specified
-<a name="serverBindingObjectBindingVersion"></a>`bindingVersion` | string | The version of this binding. | OPTIONAL [`latest`]
+<a name="serverBindingObjectBindingVersion"></a>`bindingVersion` | string | The version of this binding. | OPTIONAL [`latest`] | -

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/kafka.md` around lines 25 - 29, The table has five header columns but the `bindingVersion` row only has four cells, breaking rendering; update the `bindingVersion` row (the line with the `serverBindingObjectBindingVersion` anchor) to include the missing fifth "Constraints" cell (e.g., add `| -` or the appropriate constraint text) so all rows match the header columns (refer to `schemaRegistryUrl` and `schemaRegistryVendor` for examples of the five-column layout).

markdown/docs/reference/bindings/anypointmq.md-36-37 (1)
36-37: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix duplicate anchor IDs for `host` and `pathname` fields. Both lines define the same anchor ID `serverObjectUrlFieldValueAnypointMQ`, which will cause broken fragment navigation in the rendered documentation. Each field must have a unique anchor identifier.

🔗 Proposed fix to use unique anchor IDs
-<a name="serverObjectUrlFieldValueAnypointMQ"></a>`host` | e.g., `https://mq-us-east-1.anypoint.mulesoft.com` | **REQUIRED**. MUST be the endpoint HOST part of the URL of the Anypoint MQ Broker REST API. Valid examples are `https://mq-us-east-1.anypoint.mulesoft.com` and `https://mq-eu-central-1.eu1.anypoint.mulesoft.com` (and _not_ `https://mq-us-east-1.anypoint.mulesoft.com/api`, or `https://.../api/v1`).
-<a name="serverObjectUrlFieldValueAnypointMQ"></a>`pathname` | e.g., `/api` | **REQUIRED**. MUST be the endpoint path part of the URL of the Anypoint MQ Broker REST API _excluding_ the final major version indicator (e.g., `v1`). Valid examples are `/api` and (and _not_ `https://.../api/v1`).
+<a name="serverObjectHostFieldValueAnypointMQ"></a>`host` | e.g., `https://mq-us-east-1.anypoint.mulesoft.com` | **REQUIRED**. MUST be the endpoint HOST part of the URL of the Anypoint MQ Broker REST API. Valid examples are `https://mq-us-east-1.anypoint.mulesoft.com` and `https://mq-eu-central-1.eu1.anypoint.mulesoft.com` (and _not_ `https://mq-us-east-1.anypoint.mulesoft.com/api`, or `https://.../api/v1`).
+<a name="serverObjectPathnameFieldValueAnypointMQ"></a>`pathname` | e.g., `/api` | **REQUIRED**. MUST be the endpoint path part of the URL of the Anypoint MQ Broker REST API _excluding_ the final major version indicator (e.g., `v1`). Valid examples are `/api` and (and _not_ `https://.../api/v1`).

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/anypointmq.md` around lines 36 - 37, The two markdown anchor tags for the `host` and `pathname` field rows both use the same ID `serverObjectUrlFieldValueAnypointMQ`, causing duplicate fragment IDs; update one of the anchors to a unique ID (e.g., `serverObjectUrlFieldValueAnypointMQ-host` and `serverObjectUrlFieldValueAnypointMQ-pathname`) so each field (`host`, `pathname`) has its own distinct anchor, and ensure any internal links or references (if present) are updated to the new IDs.

markdown/docs/reference/bindings/anypointmq.md-63-63 (1)
63-63: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix grammar and typo in example introduction.
The sentence has a grammatical error (missing subject in "where does not apply") and a typo ("anyointmq" should be "anypointmq").
📝 Proposed fix
-The following example shows two channels where does not apply any bindings and the second one applying `anyointmq` bindings.
+The following example shows two channels where the first does not apply any bindings and the second one applies `anypointmq` bindings.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/anypointmq.md` at line 63, Fix the sentence in the example introduction: replace the incorrect and ungrammatical phrase "two channels where does not apply any bindings and the second one applying `anyointmq` bindings" with a correct version such as "The following example shows two channels: the first does not apply any bindings, and the second applies `anypointmq` bindings", and correct the typo `anyointmq` -> `anypointmq`.

markdown/docs/reference/bindings/ros2.md-250-250 (1)
250-250: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win
Fix typo.
"strucute" should be "structure".
📝 Proposed fix
-- There is one file (head-asyncapi.yaml) that references the different standard/custom packages. This packages contains the strucute of its messages.
+- There is one file (head-asyncapi.yaml) that references the different standard/custom packages. This packages contains the structure of its messages.

As per static analysis, this is a spelling error.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/ros2.md` at line 250, Correct the spelling typo in the documentation sentence that mentions head-asyncapi.yaml: change "strucute" to "structure" so the line reads that the packages contain the structure of their messages; update the sentence referencing head-asyncapi.yaml accordingly.

markdown/docs/reference/bindings/2.x.x.md-165-165 (1)

165-165: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.
The images at lines 165 and 236 are missing alt text, which is important for accessibility (screen readers) and when images fail to load.
♿ Proposed fix
-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

For line 236:

-[<img src="/img/docs/SNS-HTTP.png" height = "400" width="600"/>](SNS-HTTP.png)
+[<img src="/img/docs/SNS-HTTP.png" alt="SNS to HTTP publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-HTTP.png)

Also applies to: 236-236
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/2.x.x.md` at line 165, The two inline images are missing alt text; update the markup for the image at src "/img/docs/SNS-SQS-Pub-Sub.png" (line 165) and the other image at line 236 to include descriptive alt attributes (e.g., "SNS–SQS pub/sub architecture diagram" or similar) so screen readers and failed-load scenarios get meaningful text; ensure both the <img> tag(s) inside the link wrappers and any standalone markdown image syntax include the alt text.

markdown/docs/reference/bindings/sqs.md-126-126 (1)

126-126: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.
Images at lines 126 and 178 are missing alt text, which is important for accessibility (screen readers) and when images fail to load.
♿ Proposed fix
-[<img src="/img/docs/SQS-Point-To-Point.png" height = "400" width="600"/>](SQS-Point-To-Point.png)
+[<img src="/img/docs/SQS-Point-To-Point.png" alt="SQS point-to-point pattern diagram" height = "400" width="600"/>](SQS-Point-To-Point.png)

For line 178:

-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

Also applies to: 178-178
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/sqs.md` at line 126, The two image embeds (the img tag referencing SQS-Point-To-Point.png at line 126 and the second image at line 178) are missing alt text; update each img tag to include a descriptive alt attribute (e.g., alt="SQS point-to-point diagram" for SQS-Point-To-Point.png and an appropriate description for the second image) so screen readers and fallback rendering have meaningful text.

markdown/docs/reference/bindings/3.0.0.md-152-152 (1)

152-152: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix spelling error: "Identifer" → "Identifier".
Line 152 contains the same typo present in the 2.x.x bindings documentation: "Identifer Object" should be "Identifier Object".
🔤 Proposed fix
-We provide an Identifer Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).
+We provide an Identifier Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).

🤖 Prompt for AI Agents

Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/3.0.0.md` at line 152, Typo in the docs: change the phrase "Identifer Object" to "Identifier Object" in the markdown for the bindings documentation (the sentence that starts "We provide an Identifer Object...") so the word "Identifier" is spelled correctly wherever that exact phrase appears in the 3.0.0 bindings reference (mirroring the same fix needed from 2.x.x documentation).

markdown/docs/reference/bindings/2.x.x.md-152-152 (1)

152-152: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix spelling error: "Identifer" → "Identifier".
Line 152 contains a typo: "Identifer Object" should be "Identifier Object".
🔤 Proposed fix
-We provide an Identifer Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).
+We provide an Identifier Object to support providing the identifier of an externally defined endpoint for this SNS *publication* to target, or an endpoint on another binding against this Operation Object (via the name field).

🤖 Prompt for AI Agents

Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/2.x.x.md` at line 152, Replace the misspelled phrase "Identifer Object" with the correct "Identifier Object" in the markdown section that currently reads "We provide an Identifer Object to support providing the identifier..." so the sentence reads "We provide an Identifier Object to support providing the identifier..." (search for the exact phrase "Identifer Object" to locate the spot).

markdown/docs/reference/bindings/3.0.0.md-165-165 (1)

165-165: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Add alt text to images for accessibility.
The images at lines 165 and 251 are missing alt text, which is important for accessibility (screen readers) and when images fail to load. This is the same issue present in the 2.x.x bindings documentation.
♿ Proposed fix
-[<img src="/img/docs/SNS-SQS-Pub-Sub.png" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)
+[<img src="/img/docs/SNS-SQS-Pub-Sub.png" alt="SNS to SQS publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-SQS-Pub-Sub.png)

For line 251:

-[<img src="/img/docs/SNS-HTTP.png" height = "400" width="600"/>](SNS-HTTP.png)
+[<img src="/img/docs/SNS-HTTP.png" alt="SNS to HTTP publish-subscribe pattern diagram" height = "400" width="600"/>](SNS-HTTP.png)

Also applies to: 251-251
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/3.0.0.md` at line 165, Add descriptive alt text to the inline HTML <img> elements used in the markdown image links so screen readers and broken-image scenarios are supported: update the <img src="/img/docs/SNS-SQS-Pub-Sub.png" height="400" width="600"/> instance to include an alt attribute (for example alt="SNS–SQS pub/sub architecture diagram") and do the same for the other image occurrence referenced in the file (the second <img .../> instance around line 251) so both image tags include meaningful alt text.

markdown/docs/reference/bindings/sqs.md-54-54 (1)

54-54: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

Line 54 contains an empty link `[SNS Operation Binding Object]()` with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix
Since this document references SNS Operation Binding, the link should point to the appropriate SNS bindings page:
-| <a name="queueObjectName"></a>`name` | string | **Required.** The name of the queue. When an [SNS Operation Binding Object]() references an SQS queue by name, the identifier should be the one in this field.|
+| <a name="queueObjectName"></a>`name` | string | **Required.** The name of the queue. When an SNS Operation Binding Object references an SQS queue by name, the identifier should be the one in this field.|

Alternatively, add the proper link to the SNS bindings documentation (e.g., `../sns` or the full path).

🤖 Prompt for AI Agents

Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/sqs.md` at line 54, Replace the broken empty link `[SNS Operation Binding Object]()` on the `name` field line with a proper link to the SNS bindings documentation; update the markdown so the anchor text `[SNS Operation Binding Object]` points to the SNS bindings page (for example use a relative path like `../sns` or the correct docs path to the SNS bindings file) so the reference resolves correctly.

markdown/docs/reference/bindings/sqs.md-183-183 (1)

183-183: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.
Line 183 contains an empty link `[SNS Binding]()` with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix

-The producer files looks like this (see the [SNS Binding]() for more).
+The producer files looks like this (see the SNS Binding for more).

Or add the proper link to the SNS bindings documentation (e.g., `[SNS Binding](../sns)` or the appropriate path).

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/sqs.md` at line 183, Replace the empty markdown link "[SNS Binding]()" in the SQS bindings doc with the correct relative path to the SNS bindings page (e.g., change "[SNS Binding]()" to "[SNS Binding](../sns)" or the appropriate docs path) so the reference resolves; locate the literal "[SNS Binding]()" in the markdown and update it accordingly.

markdown/docs/reference/bindings/sqs.md-113-113 (1)

113-113: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix broken reference link.

Line 113 contains an empty link `[SNS Operation Object]()` with no URL. This should reference the SNS bindings documentation.

🔗 Proposed fix

-On an Operation Binding Object we support an array of Queue objects. Members of this array may be Queue Objects that define the *endpoint* field required by an [SNS Operation Object]() delivering by the SQS protocol or Queue Objects that define the Dead Letter Queue used by either the Redrive Policy of the SNS Subscription (see the SNS Binding Object) or the [Redrive Policy of the SQS Queue](`#redrive-policy`). The name of the Queue Object is used by an Identifier field on either the *endpoint* field of the SNS Operation Object of *deadLetterQueue* on the Redrive Policy to identify the required member of this array.
+On an Operation Binding Object we support an array of Queue objects. Members of this array may be Queue Objects that define the *endpoint* field required by an SNS Operation Object delivering by the SQS protocol or Queue Objects that define the Dead Letter Queue used by either the Redrive Policy of the SNS Subscription (see the SNS Binding Object) or the [Redrive Policy of the SQS Queue](`#redrive-policy`). The name of the Queue Object is used by an Identifier field on either the *endpoint* field of the SNS Operation Object of *deadLetterQueue* on the Redrive Policy to identify the required member of this array.

Or add the proper link to the SNS bindings documentation.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/sqs.md` at line 113, The link text "[SNS Operation Object]()" is broken; update the markdown in the Operation Binding Object description to point to the SNS bindings documentation by replacing the empty link with the correct relative or absolute URL to the SNS Operation Object docs (or the anchor for the SNS Binding Object), ensuring the sentence referencing Queue objects, the *endpoint* field of the SNS Operation Object, and the *deadLetterQueue* on the Redrive Policy correctly links to the SNS bindings page so readers can navigate to the SNS Operation Object details.

markdown/docs/reference/bindings/jms.md-37-37 (1)

37-37: ⚠️ Potential issue | 🟡 Minor | ⚡ Quick win

Fix typo in protocolVersion valid values list.
The valid values list ends with `3.0`, `3.1.`, where `3.1.` has a trailing period. This should be `3.1` (without the trailing period) to match the format of other version numbers in the list.

📝 Proposed fix

-<a name="serverObjectProtocolVersionFieldValueJMS"></a>`protocolVersion` | e.g., `3.1` | **OPTIONAL**, defaults to `3.1`. If present MUST be the version indicator of the JMS API. Valid values are `1.0`, `1.0.1`, `1.0.1a`, `1.0.2`, `1.0.2a`, `1.0.2b`, `1.1`, `2.0`, `2.0a`, `2.1`, or `3.0`, `3.1.`.
+<a name="serverObjectProtocolVersionFieldValueJMS"></a>`protocolVersion` | e.g., `3.1` | **OPTIONAL**, defaults to `3.1`. If present MUST be the version indicator of the JMS API. Valid values are `1.0`, `1.0.1`, `1.0.1a`, `1.0.2`, `1.0.2a`, `1.0.2b`, `1.1`, `2.0`, `2.0a`, `2.1`, `3.0`, or `3.1`.

🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@markdown/docs/reference/bindings/jms.md` at line 37, Update the valid-values list for protocolVersion in the serverObjectProtocolVersionFieldValueJMS entry: remove the stray trailing period from the last item so the list ends with `3.1` (not `3.1.`) to match the other version formats; edit the line containing protocolVersion / serverObjectProtocolVersionFieldValueJMS to replace `3.1.` with `3.1`.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: c7b31e87-1694-4b60-aa20-92e46201c896
⛔ Files ignored due to path filters (6)
- `.github/workflows/scripts/kit/package-lock.json` is excluded by `!**/package-lock.json`
- `.github/workflows/scripts/mailchimp/package-lock.json` is excluded by `!**/package-lock.json`
- `package-lock.json` is excluded by `!**/package-lock.json`
- `public/img/docs/SNS-HTTP.png` is excluded by `!**/*.png`
- `public/img/docs/SNS-SQS-Pub-Sub.png` is excluded by `!**/*.png`
- `public/img/docs/SQS-Point-To-Point.png` is excluded by `!**/*.png`
📒 Files selected for processing (107)
- .github/workflows/add-good-first-issue-labels.yml
- .github/workflows/automerge-for-humans-add-ready-to-merge-or-do-not-merge-label.yml
- .github/workflows/automerge-for-humans-merging.yml
- .github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml
- .github/workflows/automerge-orphans.yml
- .github/workflows/automerge.yml
- .github/workflows/autoupdate.yml
- .github/workflows/bounty-program-commands.yml
- .github/workflows/help-command.yml
- .github/workflows/issues-prs-notifications.yml
- .github/workflows/lint-pr-title.yml
- .github/workflows/notify-tsc-members-mention.yml
- .github/workflows/please-take-a-look-command.yml
- .github/workflows/release-announcements.yml
- .github/workflows/scripts/kit/htmlContent.js
- .github/workflows/scripts/kit/index.js
- .github/workflows/scripts/kit/package.json
- .github/workflows/scripts/mailchimp/htmlContent.js
- .github/workflows/scripts/mailchimp/index.js
- .github/workflows/scripts/mailchimp/package.json
- .github/workflows/stale-issues-prs.yml
- .github/workflows/transfer-issue.yml
- .github/workflows/update-maintainers-trigger.yaml
- .github/workflows/update-pr.yml
- .github/workflows/welcome-first-time-contrib.yml
- CODEOWNERS
- README.md
- components/docs/Visualizer.tsx
- components/layout/DocsLayout.tsx
- components/tools/ToolsDashboard.tsx
- config/3.1.0.json
- config/MAINTAINERS.json
- config/TSC_BOARD_MEMBERS.json
- config/all-tags.json
- config/edit-page-config.json
- config/mailchimp-config.json
- config/meetings.json
- config/newsroom_videos.json
- config/tools-automated.json
- config/tools.json
- cypress/ambassadors.cy.js
- cypress/dashboard.cy.js
- cypress/events.cy.js
- cypress/pages/BasePageTools.js
- cypress/pages/ambassadors.js
- cypress/pages/dashboard.js
- cypress/pages/events.js
- cypress/pages/homepage.js
- cypress/pages/slack.js
- cypress/pages/tscpage.js
- cypress/slackworkspace.cy.js
- cypress/tscpage.cy.js
- markdown/blog/2026-february-summary.md
- markdown/blog/2026-march-summary.md
- markdown/blog/asyncapi-and-websocket.md
- markdown/blog/board-nomination.md
- markdown/blog/community-spotlight-hugo.md
- markdown/blog/gravitee-sponsorship-2025.md
- markdown/blog/publish-subscribe-semantics.md
- markdown/blog/release-notes-3.1.0.md
- markdown/docs/community/020-governance-and-policies/TSC_VOTING_OVERVIEW.md
- markdown/docs/reference/bindings/2.x.x.md
- markdown/docs/reference/bindings/3.0.0.md
- markdown/docs/reference/bindings/_section.md
- markdown/docs/reference/bindings/amqp.md
- markdown/docs/reference/bindings/amqp1.md
- markdown/docs/reference/bindings/anypointmq.md
- markdown/docs/reference/bindings/googlepubsub.md
- markdown/docs/reference/bindings/http.md
- markdown/docs/reference/bindings/ibmmq.md
- markdown/docs/reference/bindings/jms.md
- markdown/docs/reference/bindings/kafka.md
- markdown/docs/reference/bindings/mercure.md
- markdown/docs/reference/bindings/mqtt.md
- markdown/docs/reference/bindings/mqtt5.md
- markdown/docs/reference/bindings/nats.md
- markdown/docs/reference/bindings/pulsar.md
- markdown/docs/reference/bindings/redis.md
- markdown/docs/reference/bindings/ros2.md
- markdown/docs/reference/bindings/scripts.md
- markdown/docs/reference/bindings/solace.md
- markdown/docs/reference/bindings/sqs.md
- markdown/docs/reference/bindings/stomp.md
- markdown/docs/reference/bindings/websockets.md
- markdown/docs/reference/specification/v3.0.0-explorer.md
- markdown/docs/reference/specification/v3.1.0-explorer.md
- markdown/docs/tools/generator/api_components.md
- netlify/functions/newsletter_subscription.ts
- package.json
- pages/community/ambassadors/index.tsx
- pages/roadmap.tsx
- public/img/diagrams/asyncapi-openapi.webp
- public/img/diagrams/ecosystem.webp
- public/img/diagrams/spec-first.webp
- public/img/posts/2026-blog-banner/march-banner.webp
- public/img/posts/2026-blog-banner/nominations.webp
- public/img/posts/asyncapi-websocket.webp
- public/img/posts/marketing-images/gravitee-community.webp
- public/img/posts/marketing-images/hugo-spotlight.webp
- public/img/posts/simple-chat-api.webp
- tests/build-docs/addDocButtons.test.ts
- tests/build-post-list.test.ts
- tests/dashboard/build-dashboard.test.ts
- tests/markdown/check-markdown.test.ts
- tests/pages/finance.test.ts
- tests/tools/combine-tools.test.ts
- tests/tools/tools-object.test.ts
💤 Files with no reviewable changes (6)
- config/mailchimp-config.json
- markdown/blog/2026-february-summary.md
- markdown/blog/release-notes-3.1.0.md
- .github/workflows/scripts/mailchimp/package.json
- .github/workflows/scripts/mailchimp/index.js
- .github/workflows/transfer-issue.yml
✅ Files skipped from review due to trivial changes (18)
- .github/workflows/scripts/kit/package.json
- cypress/dashboard.cy.js
- markdown/docs/reference/bindings/_section.md
- markdown/docs/reference/specification/v3.1.0-explorer.md
- markdown/blog/board-nomination.md
- README.md
- markdown/docs/reference/bindings/mercure.md
- markdown/docs/reference/bindings/amqp1.md
- config/newsroom_videos.json
- tests/markdown/check-markdown.test.ts
- markdown/docs/reference/bindings/stomp.md
- pages/roadmap.tsx
- config/edit-page-config.json
- tests/tools/combine-tools.test.ts
- config/meetings.json
- markdown/blog/gravitee-sponsorship-2025.md
- markdown/blog/asyncapi-and-websocket.md
- markdown/docs/community/020-governance-and-policies/TSC_VOTING_OVERVIEW.md
pull_request:
  types:
    - synchronize
    - edited

permissions: {}
🧩 Analysis chain
🏁 Script executed:
cat -n .github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml

Repository: asyncapi/website
Length of output: 1642
🌐 Web query:
GitHub Actions pull_request fork PR permissions job-level write access
💡 Result:
1) Job-level `GITHUB_TOKEN` write access is controlled by the workflow/job `permissions` key.
   - You can set `permissions` at the workflow level or within each job; when set at the job level, actions and run commands in that job that use `GITHUB_TOKEN` gain exactly the access you specify [1].
2) For `on: pull_request` from fork PRs, GitHub deliberately downgrades the token to read-only unless additional settings are enabled.
   - GitHub's docs explain that for pull request events from forked repositories (except `pull_request_target`), write permissions are automatically adjusted to read-only when the repo/organization setting to "Send write tokens to workflows from pull requests" isn't selected [1].
   - This is why you can see behavior like "`GITHUB_TOKEN` is read" even if you write `permissions:` in the workflow, unless you also enable the relevant repository/organization setting [1][2].
3) `pull_request_target` is the exception: it keeps read/write token permissions.
   - GitHub's workflow syntax docs state that when a workflow is triggered by `pull_request_target`, the `GITHUB_TOKEN` is granted read/write repository permission (even for public fork PRs), assuming you specify permissions appropriately [1].
   - GitHub's fork PR security model is that `pull_request_target` runs in the context of the base repository and is used to safely perform trusted metadata actions (like labeling/commenting) [3][4].
4) You cannot "grant job-level write access" to untrusted code running from a fork via `on: pull_request` without enabling GitHub's fork-write setting.
   - Practically: for fork-origin PRs using `on: pull_request`, GitHub will reduce the `GITHUB_TOKEN` write scopes unless the repo/organization enables sending write tokens to fork PR workflows [1][2].
   - Even if you configure job-level `permissions:`, GitHub's fork-event adjustment can override/convert write to read-only when the setting is not enabled [1].

What you should do (typical correct patterns)

A) If you only need to post metadata back to the PR (labels/comments/issues):
   - Prefer `on: pull_request_target` with narrowly-scoped job permissions (e.g., `pull-requests: write`) and do not check out or execute untrusted fork code. This is the intended model for "trusted metadata actions" [3][4].

B) If you need write access for workflows triggered by `pull_request` from fork PRs:
   - Enable the repository/organization setting "Send write tokens to workflows from pull requests" (the exact setting name can vary slightly in UI), which allows fork PR workflows to use `GITHUB_TOKEN` with write permission [2][1].
   - Then set job-level `permissions` to the minimum scopes needed (job-level `permissions` overrides/controls the token's effective access for that job) [1].

Example (job-level scoped permissions; only effective for the event/setup described above):

on: pull_request
jobs:
  update:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      pull-requests: write
    steps:
      # steps that use GITHUB_TOKEN to update PR work
      - run: echo "..."

Key takeaways (answer to the query)
- Job-level `permissions` determines what `GITHUB_TOKEN` can do in that job [1].
- But for `pull_request` events from forks, GitHub often forces `GITHUB_TOKEN` to read-only (write scopes are downgraded) unless you enable the "send write tokens to workflows from pull requests" setting [1][2].
- If you need guaranteed read/write for fork PRs without relying on that setting, use `pull_request_target` (with strict least-privilege permissions and without executing fork code) [1][3].
Citations:
- 1: https://docs.github.com/actions/using-workflows/workflow-syntax-for-github-actions
- 2: https://stackoverflow.com/questions/75655770/github-actions-requiring-secrets-on-a-fork-origin-pr
- 3: https://github.blog/news-insights/product-news/github-actions-improvements-for-fork-and-pull-request-workflows/
- 4: https://stackoverflow.com/questions/75744849/permission-to-repo-denied-to-github-actions-for-pr-coming-from-a-fork-only
Revert to pull_request_target to ensure label removal works on fork PRs.
With the current pull_request trigger, forked PRs receive a read-only GITHUB_TOKEN by default. The job-level permissions: pull-requests: write declaration on line 21 cannot override this GitHub-enforced downgrade. This causes the label removal operation on lines 32–36 to fail when fork contributors push new commits or edit the PR, defeating the safety control this workflow exists to enforce. pull_request_target is the documented event for safely performing labeling and commenting operations on fork PRs from the base repository context.
Suggested fix
on:
- pull_request:
+ pull_request_target:
types:
- synchronize
- edited

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
pull_request_target:
  types:
    - synchronize
    - edited

permissions: {}
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In
@.github/workflows/automerge-for-humans-remove-ready-to-merge-label-on-edit.yml
around lines 9 - 14, The workflow triggers must use pull_request_target instead
of pull_request so the job runs in the base repository context and can remove
labels on forked PRs; change the event key from "pull_request:" to
"pull_request_target:" (keeping the types ["synchronize","edited"]), and ensure
the job-level permissions include pull-requests: write so the label removal
steps (the lines performing label deletion) can execute successfully on fork
PRs.
on:
-  pull_request_target:
+  pull_request:
Switching to pull_request will break commenting on fork PRs.
For workflows triggered by pull_request from a forked repository, GitHub silently downgrades any pull-requests: write declared in permissions: to read-only — this is the documented default and cannot be granted back via workflow YAML ("...if the workflow was triggered by a pull request event other than pull_request_target from a forked repository, and the Send write tokens to workflows from pull requests setting is not selected, the permissions are adjusted to change any write permissions to read only" / "you can use the permissions key to add and remove read permissions for forked repositories, but typically you can't grant write access"). pull_request_target, by contrast, "is granted read/write repository permission, even when it is triggered from a public fork".
Concretely, the marocchino/sticky-pull-request-comment "Comment on PR" step (Lines 32–44) and the "delete the comment" step (Lines 46–52) will fail with 403 on every external-contributor PR — i.e., the exact PRs this lint is most likely to fire on. The original amannn/action-semantic-pull-request step still works because it only reads PR data.
This workflow never checks out PR code, so the typical security motivation for moving off pull_request_target doesn't apply here. Recommend reverting the trigger (or, if the move is intentional, splitting the comment steps into a workflow_run-triggered workflow that executes in base-repo context):
🔧 Suggested revert
on:
- pull_request:
+ pull_request_target:
types: [opened, reopened, synchronize, edited, ready_for_review]

🌐 Web query: GitHub Actions pull_request from fork pull-requests write permission downgrade
Also applies to: 32-52
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In @.github/workflows/lint-pr-title.yml at line 7, The workflow trigger was
changed to pull_request which causes write permissions to be downgraded for
forked-PRs and makes the "Comment on PR" step
(marocchino/sticky-pull-request-comment) and the "delete the comment" steps fail
with 403; revert the trigger back to pull_request_target so those steps run with
read/write repo permissions (or alternatively move only the comment/delete steps
into a separate workflow triggered by workflow_run that runs in the base-repo
context), and keep the read-only lint checks (e.g.,
amannn/action-semantic-pull-request) in the original pull_request-triggered
workflow if desired.
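The `workflow_run` split mentioned above can be sketched roughly as follows. This is a hedged sketch only: the workflow name, artifact handling, and step bodies are illustrative assumptions, not code from this PR.

```yaml
# Hypothetical companion workflow. It fires after the read-only
# pull_request-triggered lint workflow completes, and it runs in the
# base-repository context, so GITHUB_TOKEN keeps the write scope
# needed to post or delete the sticky comment on fork PRs.
name: Comment on lint result
on:
  workflow_run:
    workflows: ["Lint PR title"]   # assumed name of the lint workflow
    types: [completed]
permissions:
  pull-requests: write
jobs:
  comment:
    runs-on: ubuntu-latest
    steps:
      # The lint workflow would export its verdict (and the PR number)
      # as an artifact; this job downloads it and comments accordingly.
      - run: echo "post or delete the sticky comment here"
```

The design point is that untrusted fork code never runs in this second workflow, so granting it `pull-requests: write` stays within GitHub's trusted-metadata model.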
const sanitizeLinkAndTitle = (link, title) => {
  // Validate inputs to prevent injection attacks
  if (!link || typeof link !== 'string' || link.length > 2000) {
    return core.setFailed('Invalid link parameter');
  }
  if (!title || typeof title !== 'string' || title.length > 500) {
    return core.setFailed('Invalid title parameter');
  }

  let parsedLink;
  try {
    parsedLink = new URL(link);
  } catch (error) {
    return core.setFailed('Invalid link parameter');
  }

  if (parsedLink.protocol !== 'https:') {
    return core.setFailed('Link must use https protocol');
  }

  // Sanitize title by removing control characters and limiting length
  const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
  return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
};

module.exports = async (link, title) => {
  const KIT_BASE = 'https://api.kit.com/v4';
  const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);

  // Schedule 1 minute ahead
  const sendAt = new Date(Date.now() + 60 * 1000);

  const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
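As a quick sanity check on the title-sanitization line shown above, the control-character strip and length cap behave as follows (a standalone sketch, not part of the PR's code):

```javascript
// Mirrors the title-sanitization step from sanitizeLinkAndTitle:
// strip ASCII control characters, then cap the length at 250.
const sanitizeTitle = (title) =>
  title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);

console.log(sanitizeTitle('Hello\x00 World\x1F!'));  // → "Hello World!"
console.log(sanitizeTitle('x'.repeat(300)).length);  // → 250
```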
❓ Verification inconclusive
In @actions/core for GitHub Actions, what does core.setFailed(message) return, and does it throw an exception by default?
In @actions/core, core.setFailed(message) returns void (i.e., no value). It sets process.exitCode = 1 (failure) and logs the failure via core.error(message). [1] It does not throw an exception by default. The implementation only sets process.exitCode and calls error(message); it has no throw statement. [1] However, it only affects the action’s exit status when the process eventually exits. If you want the step/job to stop immediately, you typically still need to throw (or otherwise return/exit) in your action code after calling setFailed. This matches the documented behavior that it “sets a failing exit code” rather than terminating the process. [2][3]
Citations:
- 1: https://github.com/actions/toolkit/blob/main/packages/core/src/core.ts
- 2: core.setFailed and core.error won't fail the workflow actions/toolkit#1385
- 3: https://docs.github.com/actions/creating-actions/setting-exit-codes-for-actions
Script executed:
# Check if the file exists and read the relevant section
if [ -f ".github/workflows/scripts/kit/index.js" ]; then
echo "=== File exists, reading lines 1-50 ==="
head -50 ".github/workflows/scripts/kit/index.js" | cat -n
else
echo "File not found at .github/workflows/scripts/kit/index.js"
# Try to find it
find . -name "index.js" -path "*kit*" 2>/dev/null
fi
Repository: asyncapi/website
Exit code: 0
stdout:
=== File exists, reading lines 1-50 ===
1 /**
2 * This code is centrally managed in https://github.com/asyncapi/.github/
3 * Don't make changes to this file in this repo as they will be overwritten with changes made to the same file in above mentioned repo
4 */
5 const core = require('@actions/core');
6 const htmlContent = require('./htmlContent.js');
7
8 const sanitizeLinkAndTitle = (link, title) => {
9 // Validate inputs to prevent injection attacks
10 if (!link || typeof link !== 'string' || link.length > 2000) {
11 return core.setFailed('Invalid link parameter');
12 }
13 if (!title || typeof title !== 'string' || title.length > 500) {
14 return core.setFailed('Invalid title parameter');
15 }
16
17 let parsedLink;
18 try {
19 parsedLink = new URL(link);
20 } catch (error) {
21 return core.setFailed('Invalid link parameter');
22 }
23
24 if (parsedLink.protocol !== 'https:') {
25 return core.setFailed('Link must use https protocol');
26 }
27
28 // Sanitize title by removing control characters and limiting length
29 const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
30 return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
31 };
32
33 module.exports = async (link, title) => {
34 const KIT_BASE = 'https://api.kit.com/v4';
35 const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
36
37 // Schedule 1 minute ahead
38 const sendAt = new Date(Date.now() + 60 * 1000);
39
40 const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
41
42 const res = await fetch(`${KIT_BASE}/broadcasts`, {
43 method: 'POST',
44 headers: {
45 'X-Kit-Api-Key': process.env.KIT_API_KEY,
46 'Content-Type': 'application/json'
47 },
48 body: JSON.stringify({
49 subject: `TSC attention required: ${sanitizedTitle}`,
50 preview_text: 'Check out the latest topic that TSC members have to be aware of',
Script executed:
# Search for other calls to sanitizeLinkAndTitle function
rg "sanitizeLinkAndTitle" ".github/workflows/scripts/kit/index.js"
# Also read the rest of the function to see the complete picture
echo "=== Reading lines 40-65 ==="
sed -n '40,65p' ".github/workflows/scripts/kit/index.js" | cat -n
Repository: asyncapi/website
Exit code: 0
stdout:
const sanitizeLinkAndTitle = (link, title) => {
const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
=== Reading lines 40-65 ===
1 const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
2
3 const res = await fetch(`${KIT_BASE}/broadcasts`, {
4 method: 'POST',
5 headers: {
6 'X-Kit-Api-Key': process.env.KIT_API_KEY,
7 'Content-Type': 'application/json'
8 },
9 body: JSON.stringify({
10 subject: `TSC attention required: ${sanitizedTitle}`,
11 preview_text: 'Check out the latest topic that TSC members have to be aware of',
12 content: htmlContent(sanitizedLink, sanitizedTitle),
13 description: `TSC notification - ${new Date().toUTCString()}`,
14 public: false,
15 published_at: null,
16 send_at: sendAt.toISOString(),
17 subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
18 })
19 });
20
21 if (!res.ok) return core.setFailed(`Failed creating broadcast: ${await res.text()}`);
22 core.info(`Kit.com TSC broadcast scheduled for ${sendAt.toISOString()}`);
23 };
Replace core.setFailed() returns with thrown errors.
The sanitizeLinkAndTitle() function returns undefined on validation failures (via core.setFailed()), but line 40 attempts to destructure the result. This causes a TypeError when validation fails, crashing the action instead of failing gracefully.
Wrap the destructuring call in try-catch, throw errors from the sanitizer, and call core.setFailed() only in the catch block.
Proposed fix
const sanitizeLinkAndTitle = (link, title) => {
if (!link || typeof link !== 'string' || link.length > 2000) {
- return core.setFailed('Invalid link parameter');
+ throw new Error('Invalid link parameter');
}
if (!title || typeof title !== 'string' || title.length > 500) {
- return core.setFailed('Invalid title parameter');
+ throw new Error('Invalid title parameter');
}
let parsedLink;
try {
parsedLink = new URL(link);
} catch (error) {
- return core.setFailed('Invalid link parameter');
+ throw new Error('Invalid link parameter');
}
if (parsedLink.protocol !== 'https:') {
- return core.setFailed('Link must use https protocol');
+ throw new Error('Link must use https protocol');
}
const sanitizedTitle = title.replace(/[\x00-\x1F\x7F]/g, '').substring(0, 250);
return { sanitizedLink: parsedLink.toString(), sanitizedTitle };
};
module.exports = async (link, title) => {
+ try {
const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
const res = await fetch(`${KIT_BASE}/broadcasts`, {
method: 'POST',
headers: {
'X-Kit-Api-Key': process.env.KIT_API_KEY,
'Content-Type': 'application/json'
},
body: JSON.stringify({
subject: `TSC attention required: ${sanitizedTitle}`,
preview_text: 'Check out the latest topic that TSC members have to be aware of',
content: htmlContent(sanitizedLink, sanitizedTitle),
description: `TSC notification - ${new Date().toUTCString()}`,
public: false,
published_at: null,
send_at: sendAt.toISOString(),
subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
})
});
- if (!res.ok) return core.setFailed(`Failed creating broadcast: ${await res.text()}`);
+ if (!res.ok) throw new Error(`Failed creating broadcast: ${await res.text()}`);
core.info(`Kit.com TSC broadcast scheduled for ${sendAt.toISOString()}`);
+ } catch (error) {
+ core.setFailed(error.message);
+ }
};
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In @.github/workflows/scripts/kit/index.js around lines 8 - 40, The sanitizer
sanitizeLinkAndTitle currently calls core.setFailed(...) and returns undefined
on validation errors, causing a TypeError when the caller destructures {
sanitizedLink, sanitizedTitle }; change sanitizeLinkAndTitle to throw new
Error(...) with the same messages instead of calling core.setFailed, then in the
module.exports caller wrap the call/destructuring of sanitizeLinkAndTitle(link,
title) in a try/catch where you call core.setFailed(error.message) (or
core.setFailed(String(error))) and return/exit; ensure references to
sanitizedLink and sanitizedTitle remain the same and that URL parsing/validation
logic is preserved.
const KIT_BASE = 'https://api.kit.com/v4';
const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);

// Schedule 1 minute ahead
const sendAt = new Date(Date.now() + 60 * 1000);

const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);

const res = await fetch(`${KIT_BASE}/broadcasts`, {
  method: 'POST',
  headers: {
    'X-Kit-Api-Key': process.env.KIT_API_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    subject: `TSC attention required: ${sanitizedTitle}`,
    preview_text: 'Check out the latest topic that TSC members have to be aware of',
    content: htmlContent(sanitizedLink, sanitizedTitle),
    description: `TSC notification - ${new Date().toUTCString()}`,
    public: false,
    published_at: null,
    send_at: sendAt.toISOString(),
    subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
  })
Validate required environment variables before request construction.
KIT_API_KEY and KIT_TSC_TAG_ID are not validated. Invalid/missing values produce opaque API failures later instead of a clear early failure.
Proposed fix
module.exports = async (link, title) => {
const KIT_BASE = 'https://api.kit.com/v4';
const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
+ if (!process.env.KIT_API_KEY) {
+ return core.setFailed('Missing KIT_API_KEY');
+ }
+ if (!Number.isInteger(TSC_TAG_ID) || TSC_TAG_ID <= 0) {
+ return core.setFailed('KIT_TSC_TAG_ID must be a positive integer');
+ }📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Before:

const KIT_BASE = 'https://api.kit.com/v4';
const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
// Schedule 1 minute ahead
const sendAt = new Date(Date.now() + 60 * 1000);
const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
const res = await fetch(`${KIT_BASE}/broadcasts`, {
  method: 'POST',
  headers: {
    'X-Kit-Api-Key': process.env.KIT_API_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    subject: `TSC attention required: ${sanitizedTitle}`,
    preview_text: 'Check out the latest topic that TSC members have to be aware of',
    content: htmlContent(sanitizedLink, sanitizedTitle),
    description: `TSC notification - ${new Date().toUTCString()}`,
    public: false,
    published_at: null,
    send_at: sendAt.toISOString(),
    subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
  })

After (suggested):

const KIT_BASE = 'https://api.kit.com/v4';
const TSC_TAG_ID = Number(process.env.KIT_TSC_TAG_ID);
if (!process.env.KIT_API_KEY) {
  return core.setFailed('Missing KIT_API_KEY');
}
if (!Number.isInteger(TSC_TAG_ID) || TSC_TAG_ID <= 0) {
  return core.setFailed('KIT_TSC_TAG_ID must be a positive integer');
}
// Schedule 1 minute ahead
const sendAt = new Date(Date.now() + 60 * 1000);
const { sanitizedLink, sanitizedTitle } = sanitizeLinkAndTitle(link, title);
const res = await fetch(`${KIT_BASE}/broadcasts`, {
  method: 'POST',
  headers: {
    'X-Kit-Api-Key': process.env.KIT_API_KEY,
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({
    subject: `TSC attention required: ${sanitizedTitle}`,
    preview_text: 'Check out the latest topic that TSC members have to be aware of',
    content: htmlContent(sanitizedLink, sanitizedTitle),
    description: `TSC notification - ${new Date().toUTCString()}`,
    public: false,
    published_at: null,
    send_at: sendAt.toISOString(),
    subscriber_filter: [{ all: [{ type: 'tag', ids: [TSC_TAG_ID] }] }]
  })
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In @.github/workflows/scripts/kit/index.js around lines 34 - 57, Ensure KIT API
key and tag id are validated before building the request: check
process.env.KIT_API_KEY is a non-empty string and that
Number(process.env.KIT_TSC_TAG_ID) yields a valid number (not NaN) and assign it
to TSC_TAG_ID; if either is missing/invalid, throw or log a clear error and
abort before calling fetch (the code that constructs body using TSC_TAG_ID,
sanitizedLink/sanitizedTitle, sendAt and calls fetch to `${KIT_BASE}/broadcasts`
should only run after validation). Ensure the error message names the missing
variable (KIT_API_KEY or KIT_TSC_TAG_ID) so failures are immediately actionable.
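A quick standalone illustration of why `Number(...)` coercion alone is not enough for env vars (plain Node.js, no project code involved):

```javascript
// Number(...) on typical env-var states: missing, empty, valid, malformed.
const envValues = [undefined, '', '42', 'abc'];
const coerced = envValues.map((v) => Number(v));
console.log(coerced); // [ NaN, 0, 42, NaN ]

// Number.isInteger(NaN) is false, and 0 fails the > 0 check, so the guard in
// the proposed fix rejects every state except a real positive integer id.
const accepted = coerced.map((n) => Number.isInteger(n) && n > 0);
console.log(accepted); // [ false, false, true, false ]
```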
headers:
  # JMS protocol specific message headers
  required:
    - JMSMessageID
properties:
  JMSMessageID:
    name: JMSMessageID
    description: A unique message identifier. This may be set by your JMS Provider on your behalf.
    type: string
  JMSReplyTo:
    name: JMSReplyTo
    description: The queue or topic that the message sender expects replies to.
    type: string
Fix YAML indentation: properties must be nested under headers.
The specification states (line 121) that the headers field must be "A Schema object... of type object and have a properties key." However, in the example, properties is indented at the same level as headers, making it a sibling field in the jms binding object rather than part of the headers schema.
The correct structure should nest type, required, and properties under headers:
🐛 Proposed fix for YAML structure
jms:
headers:
# JMS protocol specific message headers
+ type: object
required:
- JMSMessageID
- properties:
- JMSMessageID:
- name: JMSMessageID
- description: A unique message identifier. This may be set by your JMS Provider on your behalf.
- type: string
- JMSReplyTo:
- name: JMSReplyTo
- description: The queue or topic that the message sender expects replies to.
- type: string
+ properties:
+ JMSMessageID:
+ description: A unique message identifier. This may be set by your JMS Provider on your behalf.
+ type: string
+ JMSReplyTo:
+ description: The queue or topic that the message sender expects replies to.
+      type: string
Note: Also removed redundant name fields from properties as they're already keyed by name.
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
Before:

headers:
  # JMS protocol specific message headers
  required:
    - JMSMessageID
properties:
  JMSMessageID:
    name: JMSMessageID
    description: A unique message identifier. This may be set by your JMS Provider on your behalf.
    type: string
  JMSReplyTo:
    name: JMSReplyTo
    description: The queue or topic that the message sender expects replies to.
    type: string

After (suggested):

headers:
  # JMS protocol specific message headers
  type: object
  required:
    - JMSMessageID
  properties:
    JMSMessageID:
      description: A unique message identifier. This may be set by your JMS Provider on your behalf.
      type: string
    JMSReplyTo:
      description: The queue or topic that the message sender expects replies to.
      type: string
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@markdown/docs/reference/bindings/jms.md` around lines 136 - 148, The YAML
places properties as a sibling to headers; move the schema fields so headers
contains type, required and properties (i.e. nest type: object, required:
[JMSMessageID], and the properties map under headers) and remove redundant name
fields inside JMSMessageID and JMSReplyTo so the properties are keyed by
JMSMessageID/JMSReplyTo with description and type only; update the jms binding
example to reflect headers: { type: object, required: [...], properties: {
JMSMessageID: {...}, JMSReplyTo: {...} } }.
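The structural difference behind this fix is easy to check outside YAML. A sketch of the corrected shape as a plain JS object (field values abbreviated, not the full binding):

```javascript
// Corrected jms binding: the schema fields live *inside* headers.
const jmsBinding = {
  headers: {
    type: 'object',
    required: ['JMSMessageID'],
    properties: {
      JMSMessageID: { type: 'string' },
      JMSReplyTo: { type: 'string' }
    }
  }
};

// `properties` is reachable through headers, as the spec's "Schema Object of
// type object with a properties key" wording requires, and there is no stray
// sibling field on the binding itself.
console.log(Object.keys(jmsBinding.headers)); // [ 'type', 'required', 'properties' ]
console.log('properties' in jmsBinding); // false
```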
ros2:
  role: subscriber
  node: /turtlesim
    qosPolicies:
Fix YAML indentation.
The qosPolicies field has incorrect indentation (8 spaces instead of 6), which will cause a YAML parsing error. It should align with role and node at the same indentation level.
🐛 Proposed fix
role: subscriber
node: /turtlesim
- qosPolicies:
+ qosPolicies:
history: unknown
      reliability: reliable
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@markdown/docs/reference/bindings/ros2.md` at line 99, The YAML snippet has
misaligned indentation for the field qosPolicies — it is indented two extra
spaces compared to peer keys like role and node, which will break parsing;
locate the qosPolicies key in the markdown example and reduce its indentation to
match role and node so all three are at the same level (align qosPolicies with
role and node).
---
title: 'scripts'
weight: 10
---

The entire `scripts` directory is centrally managed in [.github](https://github.com/asyncapi/.github/) repository. Any changes in this folder should be done in central repository.
(No newline at end of file)
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Description: Check the bindings directory structure to verify if "scripts" is a valid AsyncAPI binding
# and examine what other binding docs exist for context.
# List all files in the bindings directory to see the pattern
echo "=== Bindings directory structure ==="
fd . 'markdown/docs/reference/bindings' --type f --exec echo {}
echo -e "\n=== Check for AsyncAPI binding specifications ==="
# Check if there's any reference to "scripts" as a binding in AsyncAPI spec
rg -i "scripts.*binding" --type md -C2
echo -e "\n=== Sample content from other binding docs for comparison ==="
# Show content of a few other binding docs to compare structure
fd . 'markdown/docs/reference/bindings' --type f --max-results 3 --exec head -20 {}
Repository: asyncapi/website
Length of output: 1484
🏁 Script executed:
#!/bin/bash
# Examine metadata files to understand directory structure
echo "=== Content of _section.md ==="
cat -n markdown/docs/reference/bindings/_section.md
echo -e "\n=== Content of index.md ==="
cat -n markdown/docs/reference/bindings/index.md
echo -e "\n=== Content of scripts.md (full) ==="
cat -n markdown/docs/reference/bindings/scripts.md
echo -e "\n=== Check parent directory structure ==="
ls -la markdown/docs/reference/
Repository: asyncapi/website
Length of output: 1756
🏁 Script executed:
#!/bin/bash
# Search for references to scripts directory documentation
# and check the broader docs structure
echo "=== Search for references to scripts.md ==="
rg 'scripts\.md|scripts' markdown/docs/reference/ --type md | head -20
echo -e "\n=== Check broader docs directory structure ==="
fd -type d . 'markdown/docs' --max-depth 2
echo -e "\n=== Look for guides or infrastructure directories ==="
fd -type d '(guides|infrastructure|tools|development)' markdown/docs/
echo -e "\n=== Check if there's a .github directory documentation elsewhere ==="
rg '\.github' markdown/docs/ --type md -l
Repository: asyncapi/website
Length of output: 3618
Move this file out of the bindings directory.
This file documents a scripts directory managed in the .github repository and is infrastructure/tooling documentation, not an AsyncAPI protocol binding specification. It does not belong in markdown/docs/reference/bindings/ alongside binding specifications (HTTP, Kafka, MQTT, etc.).
The appropriate location would be markdown/docs/community/040-guides/ or a similar documentation section for contributor/infrastructure guides, where it can properly document development tooling rather than protocol bindings.
🧰 Tools
🪛 LanguageTool
[uncategorized] ~6-~6: The official name of this software platform is spelled with a capital “H”.
Context: ...ts` directory is centrally managed in .github ...
(GITHUB)
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@markdown/docs/reference/bindings/scripts.md` around lines 1 - 6, This file
wrongly lives under bindings; move the document describing the centrally-managed
"scripts" directory out of markdown/docs/reference/bindings/ into a
community/infrastructure section such as markdown/docs/community/040-guides/,
update the front-matter (title/weight) as appropriate, adjust any internal links
or sidebar/TOC references that point to the old path, and remove or replace this
bindings entry so bindings listing no longer references the "scripts" doc.
$ref: '#/channels/address'
messages:
  - $ref: '#/channels/address/messages/personEvent'
Fix broken channel $ref paths in the first YAML example.
The example references #/channels/address/..., but the defined channel key is person. Copy-pasting this snippet will fail reference resolution.
Proposed fix
- $ref: '#/channels/address'
+ $ref: '#/channels/person'
...
- - $ref: '#/channels/address/messages/personEvent'
+ - $ref: '#/channels/person/messages/personEvent'Also applies to: 125-126
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@markdown/docs/reference/bindings/solace.md` around lines 103 - 105, The YAML
examples use incorrect $ref paths pointing to '#/channels/address' and
'#/channels/address/messages/personEvent' but the channel key is 'person';
update those $ref values to '#/channels/person' and
'#/channels/person/messages/personEvent' respectively in the first example (and
the same two occurrences around the later example) so the references resolve to
the defined channel key 'person'.
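A toy JSON-pointer walker (a simplified sketch, not the AsyncAPI parser's actual resolver) shows why the old ref comes back empty:

```javascript
// Minimal document shape matching the example's channel layout.
const doc = {
  channels: {
    person: { messages: { personEvent: { name: 'personEvent' } } }
  }
};

// Toy resolver for local '#/...' refs: strip the prefix, walk each segment.
function resolveRef(root, ref) {
  return ref
    .replace(/^#\//, '')
    .split('/')
    .reduce((node, segment) => (node ? node[segment] : undefined), root);
}

console.log(resolveRef(doc, '#/channels/address')); // undefined -- broken ref
console.log(resolveRef(doc, '#/channels/person/messages/personEvent')); // { name: 'personEvent' }
```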
Force-pushed from 9506e78 to 94e8b66
- Guard GitHub, LinkedIn, and Twitter links with URL existence checks - Only set image URL when githubUrl is available - Extract addAdditionalUserInfo() to utils/ambassadors.ts for testability - Add 15 unit tests covering all social link scenarios - Add E2E tests for DOM rendering validation - Update CI/CD workflow to include build verification and E2E tests Fixes broken links when ambassador profiles lack social media URLs.
Actionable comments posted: 2
🧹 Nitpick comments (4)
utils/ambassadors.ts (1)
7-7: 💤 Low value
Add an explicit return type for the public utility.
addAdditionalUserInfo is exported and consumed by pages/community/ambassadors/index.tsx. Annotating the return type as Ambassador makes the contract self-documenting and prevents accidental drift if implementation details change.
♻️ Proposed change
-export function addAdditionalUserInfo(user: Ambassador) {
+export function addAdditionalUserInfo(user: Ambassador): Ambassador {
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@utils/ambassadors.ts` at line 7, The exported utility function addAdditionalUserInfo lacks an explicit return type; update its signature to explicitly return Ambassador (e.g., function addAdditionalUserInfo(user: Ambassador): Ambassador) so callers like pages/community/ambassadors/index.tsx have a stable, self-documenting contract and the compiler will catch accidental return-type drift.tests/pages/community/ambassadors.test.ts (1)
7-17: 💤 Low value
Optional: extract an ambassador factory to reduce boilerplate.
The same 9-field ambassador literal is repeated ~12 times. A small helper would make individual tests focus on the relevant fields and reduce drift if Ambassador gains a new required field.
♻️ Proposed change
+const makeAmbassador = (overrides: Partial<Ambassador> = {}): Ambassador => ({
+  name: 'Test User',
+  img: '',
+  bio: 'Test bio',
+  title: 'Developer',
+  github: '',
+  twitter: '',
+  linkedin: '',
+  company: 'Test Company',
+  contributions: [],
+  ...overrides
+});
+
 describe('addAdditionalUserInfo', () => {
   describe('GitHub URL handling', () => {
     it('should create githubUrl when github field exists', () => {
-      const ambassador: Ambassador = {
-        name: 'Test User',
-        img: '',
-        bio: 'Test bio',
-        title: 'Developer',
-        github: 'testuser',
-        twitter: '',
-        linkedin: '',
-        company: 'Test Company',
-        contributions: []
-      };
-
-      const result = addAdditionalUserInfo(ambassador);
+      const result = addAdditionalUserInfo(makeAmbassador({ github: 'testuser' }));
       expect(result.githubUrl).toBe('https://www.github.com/testuser');
     });
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@tests/pages/community/ambassadors.test.ts` around lines 7 - 17, The repeated Ambassador object should be replaced by a small factory to reduce boilerplate: add a helper function (e.g., createAmbassador or makeAmbassador) in tests/pages/community/ambassadors.test.ts that returns a default Ambassador object with the nine fields currently used (name, img, bio, title, github, twitter, linkedin, company, contributions) and accept a partial overrides parameter to override only fields needed per test; update existing usages of the ambassador literal to call createAmbassador({ ...overrides }) so tests focus on relevant fields and automatically adapt if Ambassador gains new required fields..github/workflows/if-nodejs-pr-testing.yml (1)
76-78: ⚡ Quick win
Consider restricting Build project to a single OS.
ubuntu-latest,macos-latest, andwindows-latestfor every PR. For a Next.js website that deploys to a single (Linux) target, the macOS and Windows builds add CI minutes without providing additional release-quality signal. Consider gating withmatrix.os == 'ubuntu-latest'(matching the existing pattern forRun linteron Line 82 and the E2E step on Line 79).♻️ Proposed change
- - if: steps.packagejson.outputs.exists == 'true' + - if: steps.packagejson.outputs.exists == 'true' && matrix.os == 'ubuntu-latest' name: Build project run: npm run build --if-present🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In @.github/workflows/if-nodejs-pr-testing.yml around lines 76 - 78, The "Build project" workflow step currently runs on all matrix OSes; restrict it to only run on Ubuntu by adding the matrix OS guard to its conditional so it only executes when matrix.os == 'ubuntu-latest' in addition to steps.packagejson.outputs.exists == 'true' (update the step identified by its name "Build project" and the existing condition `if: steps.packagejson.outputs.exists == 'true'` to include the `matrix.os == 'ubuntu-latest'` check).cypress/ambassadors.cy.ts (1)
43-43: 💤 Low valueUse optional chaining (per SonarCloud).
Three sites — Lines 43, 59, and 165 — flagged by SonarCloud can be tightened with optional chaining for readability.
♻️ Proposed change
- if (href && href.includes('github.com')) {
+ if (href?.includes('github.com')) {

- if (href && href.includes('linkedin.com')) {
+ if (href?.includes('linkedin.com')) {

- return href && href.includes('github.com') && href !== 'undefined';
+ return href?.includes('github.com') && href !== 'undefined';
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the rest with a brief reason, keep changes minimal, and validate. In `@cypress/ambassadors.cy.ts` at line 43, Replace the explicit truthy check plus includes call (patterns like "href && href.includes('github.com')") with optional chaining to simplify and harden the condition (e.g., use "href?.includes('github.com')"); update all occurrences in this file where "href && href.includes(...)" appears (the three conditional checks) so they use optional chaining instead of the logical-and null check.
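Both forms behave identically on nullish input, which a standalone sketch makes concrete:

```javascript
// Optional chaining short-circuits to undefined on null/undefined receivers,
// so Boolean(...) yields the same truth table as the explicit && guard.
const hrefs = ['https://github.com/asyncapi', 'https://example.com', null, undefined];
const withGuard = hrefs.map((href) => Boolean(href && href.includes('github.com')));
const withChaining = hrefs.map((href) => Boolean(href?.includes('github.com')));
console.log(withGuard);    // [ true, false, false, false ]
console.log(withChaining); // [ true, false, false, false ]
```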
🤖 Prompt for all review comments with AI agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
Inline comments:
In @.github/workflows/if-nodejs-pr-testing.yml:
- Around line 79-81: The E2E job runs npm run test:e2e (which currently just
does npx cypress run) without starting the app server, so Cypress cannot reach
http://127.0.0.1:3000; update the workflow or the test:e2e script to orchestrate
the server startup (e.g., use start-server-and-test to run the existing start
script that uses serve@latest out and then run cypress, or modify package.json
test:e2e to start the server and wait before invoking npx cypress run) so that
npm run test:e2e brings up the server on port 3000 before Cypress executes.
In `@cypress/ambassadors.cy.ts`:
- Around line 137-141: Remove the tautological assertions and make them
meaningful: replace expect($detail).to.contain.text('') inside the
cy.get('[data-testid="Ambassadors-members-details"]').each callback with a real
check (e.g., assert the text is not empty or matches a name/country pattern, or
use cy.wrap($detail).invoke('text').should('not.be.empty') and confirm the
expected children via
cy.wrap($detail).find('div').should('have.length.at.least', 2)). Also remove the
conditional guard around expect(hasValidLinks).to.be.true so the test asserts
the boolean directly (i.e., compute hasValidLinks and then call
expect(hasValidLinks).to.be.true without wrapping it in if (hasValidLinks)),
ensuring the assertion can fail when links are invalid.
---
Nitpick comments:
In @.github/workflows/if-nodejs-pr-testing.yml:
- Around line 76-78: The "Build project" workflow step currently runs on all
matrix OSes; restrict it to only run on Ubuntu by adding the matrix OS guard to
its conditional so it only executes when matrix.os == 'ubuntu-latest' in
addition to steps.packagejson.outputs.exists == 'true' (update the step
identified by its name "Build project" and the existing condition `if:
steps.packagejson.outputs.exists == 'true'` to include the `matrix.os ==
'ubuntu-latest'` check).
In `@cypress/ambassadors.cy.ts`:
- Line 43: Replace the explicit truthy check plus includes call (patterns like
"href && href.includes('github.com')") with optional chaining to simplify and
harden the condition (e.g., use "href?.includes('github.com')"); update all
occurrences in this file where "href && href.includes(...)" appears (the three
conditional checks) so they use optional chaining instead of the logical-and
null check.
In `@tests/pages/community/ambassadors.test.ts`:
- Around line 7-17: The repeated Ambassador object should be replaced by a small
factory to reduce boilerplate: add a helper function (e.g., createAmbassador or
makeAmbassador) in tests/pages/community/ambassadors.test.ts that returns a
default Ambassador object with the nine fields currently used (name, img, bio,
title, github, twitter, linkedin, company, contributions) and accept a partial
overrides parameter to override only fields needed per test; update existing
usages of the ambassador literal to call createAmbassador({ ...overrides }) so
tests focus on relevant fields and automatically adapt if Ambassador gains new
required fields.
In `@utils/ambassadors.ts`:
- Line 7: The exported utility function addAdditionalUserInfo lacks an explicit
return type; update its signature to explicitly return Ambassador (e.g.,
function addAdditionalUserInfo(user: Ambassador): Ambassador) so callers like
pages/community/ambassadors/index.tsx have a stable, self-documenting contract
and the compiler will catch accidental return-type drift.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: CHILL
Plan: Pro
Run ID: 6b09f5d3-87a0-4bea-b3a9-8103696c424f
📒 Files selected for processing (7)
.github/workflows/if-nodejs-pr-testing.yml
config/tools.json
cypress/ambassadors.cy.ts
pages/community/ambassadors/index.tsx
tests/pages/community/ambassadors.test.ts
tsconfig.json
utils/ambassadors.ts
- if: steps.packagejson.outputs.exists == 'true' && matrix.os == 'ubuntu-latest'
  name: Run E2E tests (Cypress)
  run: npm run test:e2e --if-present
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Verify the test:e2e script and any server orchestration in package.json
fd -t f '^package\.json$' -d 2 -x cat {} \; | python3 -c "
import json, sys
pkg = json.load(sys.stdin)
scripts = pkg.get('scripts', {})
for k in ['test:e2e', 'cy:run', 'cypress:run', 'dev', 'start']:
if k in scripts:
print(f'{k}: {scripts[k]}')
print('---deps---')
deps = {**pkg.get('dependencies', {}), **pkg.get('devDependencies', {})}
for d in ['cypress', 'start-server-and-test', 'wait-on', 'concurrently']:
if d in deps:
print(f'{d}: {deps[d]}')
"Repository: asyncapi/website
Length of output: 181
🏁 Script executed:
cat .github/workflows/if-nodejs-pr-testing.yml
Repository: asyncapi/website
Length of output: 7998
🏁 Script executed:
fd -t f 'cypress\.config\.(js|ts|mjs|cjs)$'
Repository: asyncapi/website
Length of output: 78
🏁 Script executed:
cat cypress.config.js
Repository: asyncapi/website
Length of output: 566
The test:e2e script does not orchestrate a server, causing E2E tests to fail.
The script is bare npx cypress run without server orchestration, and the workflow has no step to start a server. Cypress is configured to connect to http://127.0.0.1:3000, but nothing serves on that port when tests run. Use start-server-and-test or similar to wrap cypress run with a server launcher (likely the start script which uses serve@latest out), or integrate server startup into the test:e2e script itself.
🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In @.github/workflows/if-nodejs-pr-testing.yml around lines 79 - 81, The E2E job
runs npm run test:e2e (which currently just does npx cypress run) without
starting the app server, so Cypress cannot reach http://127.0.0.1:3000; update
the workflow or the test:e2e script to orchestrate the server startup (e.g., use
start-server-and-test to run the existing start script that uses serve@latest
out and then run cypress, or modify package.json test:e2e to start the server
and wait before invoking npx cypress run) so that npm run test:e2e brings up the
server on port 3000 before Cypress executes.
```ts
cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
  expect($detail).to.contain.text('');
  // Should have name and country
  cy.wrap($detail).find('div').should('have.length.at.least', 2);
});
```
Tautological / no-op assertions.
expect($detail).to.contain.text('') is always true (every element "contains" the empty string), so this assertion provides no coverage. Similarly, the GitHub edge-case test at Lines 168-171 wraps expect(hasValidLinks).to.be.true inside if (hasValidLinks), which means the assertion can never fail — if hasValidLinks is false the block is skipped, and if it's true the assertion is trivially satisfied.
🛡️ Proposed fix

```diff
-  it('should display ambassador name and country', () => {
-    cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
-      expect($detail).to.contain.text('');
-      // Should have name and country
-      cy.wrap($detail).find('div').should('have.length.at.least', 2);
-    });
-  });
+  it('should display ambassador name and country', () => {
+    cy.get('[data-testid="Ambassadors-members-details"]').each(($detail) => {
+      expect($detail.text().trim()).to.not.equal('');
+      expect($detail.find('div').length).to.be.at.least(2);
+    });
+  });
```

```diff
-      // At least some ambassadors should have GitHub
-      if (hasValidLinks) {
-        expect(hasValidLinks).to.be.true;
-      }
+      expect(hasValidLinks, 'at least one ambassador should expose a GitHub link').to.be.true;
```

🤖 Prompt for AI Agents
+ expect(hasValidLinks, 'at least one ambassador should expose a GitHub link').to.be.true;🤖 Prompt for AI Agents
Verify each finding against current code. Fix only still-valid issues, skip the
rest with a brief reason, keep changes minimal, and validate.
In `@cypress/ambassadors.cy.ts` around lines 137 - 141, Remove the tautological
assertions and make them meaningful: replace expect($detail).to.contain.text('')
inside the cy.get('[data-testid="Ambassadors-members-details"]').each callback
with a real check (e.g., assert the text is not empty or matches a name/country
pattern, or use cy.wrap($detail).invoke('text').should('not.be.empty') and
confirm the expected children via
cy.wrap($detail).find('div').should('have.length.at.least', 2)). Also remove the
conditional guard around expect(hasValidLinks).to.be.true so the test asserts
the boolean directly (i.e., compute hasValidLinks and then call
expect(hasValidLinks).to.be.true without wrapping it in if (hasValidLinks)),
ensuring the assertion can fail when links are invalid.





Description
This fixes the Finance page memory leak by removing the page-level resize listener that was not being cleaned up correctly. It also replaces the JavaScript width check with responsive CSS so the page no longer shifts layout after load.
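The leak pattern being removed can be sketched as follows. This is an illustrative example, not the actual Finance page code: `FakeWindow` stands in for the browser `window` so the sketch runs outside a browser, and the "mount" functions model what a React effect does on mount/unmount.

```typescript
// Illustrative sketch of a resize-listener leak, not the Finance page source.
// FakeWindow stands in for the browser `window` so this runs in Node.
type Handler = () => void;

class FakeWindow {
  private handlers: Handler[] = [];

  addEventListener(_type: 'resize', handler: Handler): void {
    this.handlers.push(handler);
  }

  removeEventListener(_type: 'resize', handler: Handler): void {
    this.handlers = this.handlers.filter((h) => h !== handler);
  }

  get listenerCount(): number {
    return this.handlers.length;
  }
}

// Leaky pattern: each "mount" registers a handler that is never removed.
function mountLeaky(win: FakeWindow): void {
  win.addEventListener('resize', () => {});
}

// Fixed pattern: the mount returns a cleanup (what a useEffect would
// return) that removes the exact same handler reference it registered.
function mountFixed(win: FakeWindow): () => void {
  const onResize = () => {};
  win.addEventListener('resize', onResize);
  return () => win.removeEventListener('resize', onResize);
}

const leaky = new FakeWindow();
for (let i = 0; i < 3; i++) mountLeaky(leaky);
console.log(leaky.listenerCount); // 3 — handlers accumulate across mounts

const fixed = new FakeWindow();
for (let i = 0; i < 3; i++) mountFixed(fixed)(); // mount, then run cleanup
console.log(fixed.listenerCount); // 0 — each cleanup removed its handler
```

Replacing the width check with responsive CSS sidesteps the listener entirely, which is why the page also stops shifting layout after hydration.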
Related issue(s)
Resolves #5114
Summary by CodeRabbit
New Features
Refactor
Tests
Chores