Merged
19 changes: 14 additions & 5 deletions osidb/mixins.py
@@ -72,12 +72,14 @@ def save(self, *args, auto_timestamps=True, **kwargs):
# updated_dt should never change from the DB version
# otherwise assume that there was a conflicting parallel change
if db_self is not None and db_self.updated_dt != self.updated_dt:
raise DataInconsistencyException(
"Save operation based on an outdated model instance: "
f"Updated datetime in the request {self.updated_dt} "
f"differes from the DB {db_self.updated_dt}. "
"You need to refresh."
actual_event_user = self._get_actual_user()

message = (
    f"Save operation based on an outdated model instance by "
    f"{actual_event_user}: updated_dt {self.updated_dt} "
    f"differs from the DB value {db_self.updated_dt}; "
    "check the history for details."
)
raise DataInconsistencyException(f"{message} You need to refresh.")

# auto-set updated_dt as now on any change
# cut off the microseconds to allow mid-air
@@ -86,6 +88,13 @@ def save(self, *args, auto_timestamps=True, **kwargs):

super().save(*args, **kwargs)

def _get_actual_user(self):
ctx = getattr(
pghistory.runtime._tracker, "value", None
) # active context for this thread, if any
actual_user = (ctx.metadata.get("user") if ctx else None) or "unknown"
return actual_user

Comment on lines +91 to +97
Contributor
The user is already logged in the access logs (check /var/log/(prod|stage|uat)-access.log on Splunk).

Contributor Author

I'm getting the user so people can tell from the error why their instance is outdated without having to go to the history or the logs.

Contributor

I'm not sure I follow the logic of adding the user here. If the message is for the person making the change, would _get_actual_user not just return that person?

Contributor Author

Yes, that's true. This was a leftover from the first change, where I took the last change's user from the history to explain who or what changed the model.

@costaconrado now that the history is indexed, is it safe to take the last change from history?

If not, it's true that it doesn't really make sense.


class TrackingMixinManager(models.Manager):
"""
58 changes: 47 additions & 11 deletions osidb/sync_manager.py
@@ -1,7 +1,9 @@
import json
from contextlib import contextmanager
from datetime import timedelta
from typing import Optional, Type
from typing import Any, Iterator, Optional, Type

import pghistory
from celery.exceptions import Ignore
from celery.utils.log import get_task_logger
from django.conf import settings
@@ -15,6 +17,22 @@
logger = get_task_logger(__name__)


@contextmanager
def pghistory_context(
action: str,
celery_task_id: str,
*,
source: str = "celery",
user: str = "celery_task",
**extra_context: Any,
) -> Iterator[None]:
ctx = {"source": source, "user": user, "action": action, **extra_context}
if celery_task_id:
ctx["celery_task_id"] = celery_task_id
with pghistory.context(**ctx):
yield
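A self-contained sketch of the same merging logic (a plain list stands in for the real `pghistory.context` store, and `celery_task_id` is typed `Optional[str]` since callers may legitimately pass `None`) shows how the task id is attached only when the task is bound:

```python
from contextlib import contextmanager
from typing import Any, Iterator, Optional

captured = []  # stand-in for pghistory's context store


@contextmanager
def pghistory_context(
    action: str,
    celery_task_id: Optional[str],
    *,
    source: str = "celery",
    user: str = "celery_task",
    **extra_context: Any,
) -> Iterator[None]:
    # Base metadata; the celery task id is merged in only when present.
    ctx = {"source": source, "user": user, "action": action, **extra_context}
    if celery_task_id:
        ctx["celery_task_id"] = celery_task_id
    captured.append(ctx)  # the real helper enters pghistory.context(**ctx)
    yield


with pghistory_context("jira_task_sync", None):
    pass
with pghistory_context("jira_task_sync", "abc-123", flaw="FLAW-1"):
    pass
```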
Comment on lines +20 to +33

⚠️ Potential issue | 🟡 Minor

celery_task_id should be typed as Optional[str].

Every call site passes getattr(getattr(task, "request", None), "id", None), which can legitimately return None, and the body already handles that case with if celery_task_id:. Declaring the parameter as str misrepresents the contract and will trip type checkers/IDE warnings on every caller.

Proposed fix
 @contextmanager
 def pghistory_context(
     action: str,
-    celery_task_id: str,
+    celery_task_id: Optional[str],
     *,
     source: str = "celery",
     user: str = "celery_task",
     **extra_context: Any,
 ) -> Iterator[None]:
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@osidb/sync_manager.py` around lines 20 - 33, The pghistory_context signature
declares celery_task_id as str but callers pass None; update the parameter type
to Optional[str] (and import Optional if needed) so it becomes celery_task_id:
Optional[str], keeping the existing conditional handling (if celery_task_id:)
and no other logic changes; reference the pghistory_context function to locate
the change.
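The `getattr(getattr(task, "request", None), "id", None)` chain used at every call site degrades to `None` at each missing link, which is why `Optional[str]` is the honest type. A quick illustration with hypothetical stand-in task classes:

```python
from typing import Optional


class Request:
    def __init__(self, id: Optional[str]):
        self.id = id


class BoundTask:
    # Mimics a celery task with a bound request carrying a task id.
    def __init__(self, task_id: str):
        self.request = Request(task_id)


class UnboundTask:
    # Mimics a task object with no .request attribute at all.
    pass


def task_id_of(task: object) -> Optional[str]:
    # Same pattern as the call sites: each getattr tolerates a missing link.
    return getattr(getattr(task, "request", None), "id", None)


print(task_id_of(BoundTask("abc-123")))  # -> abc-123
print(task_id_of(UnboundTask()))         # -> None
print(task_id_of(None))                  # -> None
```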



class SyncManager(models.Model):
"""
Model to handle synchronization of some OSIDB data with external system like Bugzilla
@@ -486,7 +504,7 @@ def __str__(self):
return result

@staticmethod
def link_tracker_with_affects(tracker_id):
def link_tracker_with_affects(tracker_id, task="UNKNOWN"):
# Code adapted from collectors.bzimport.convertors.BugzillaTrackerConvertor.affects

from osidb.models import Affect, Flaw, Tracker
@@ -561,9 +579,13 @@ def link_tracker_with_affects(tracker_id):
affects = list(set(affects))

with transaction.atomic():
tracker.affects.clear()
tracker.affects.add(*affects)
tracker.save(raise_validation_error=False, auto_timestamps=False)
with pghistory_context(
action="link_tracker_with_affects",
celery_task_id=getattr(getattr(task, "request", None), "id", None),
):
tracker.affects.clear()
tracker.affects.add(*affects)
tracker.save(raise_validation_error=False, auto_timestamps=False)

return affects, failed_flaws, failed_affects

@@ -580,7 +602,9 @@ def sync_task(task, tracker_id, **kwargs):
collector = collectors.BugzillaTrackerCollector()
try:
collector.sync_tracker(tracker_id)
result = BZTrackerDownloadManager.link_tracker_with_affects(tracker_id)
result = BZTrackerDownloadManager.link_tracker_with_affects(
tracker_id, task
)
# Handle link failures
affects, failed_flaws, failed_affects = result
if failed_flaws:
@@ -819,8 +843,12 @@ def sync_task(task, flaw_id, **kwargs):
set_user_acls(settings.ALL_GROUPS)

try:
flaw = Flaw.objects.get(uuid=flaw_id)
flaw._create_or_update_task()
with pghistory_context(
action="jira_task_sync",
celery_task_id=getattr(getattr(task, "request", None), "id", None),
):
flaw = Flaw.objects.get(uuid=flaw_id)
flaw._create_or_update_task()

except Exception as e:
JiraTaskSyncManager.failed(flaw_id, e)
@@ -869,8 +897,12 @@ def sync_task(task, flaw_id, **kwargs):
set_user_acls(settings.ALL_GROUPS)

try:
flaw = Flaw.objects.get(uuid=flaw_id)
flaw._transition_task()
with pghistory_context(
action="jira_task_transition",
celery_task_id=getattr(getattr(task, "request", None), "id", None),
):
flaw = Flaw.objects.get(uuid=flaw_id)
flaw._transition_task()

except Exception as e:
JiraTaskTransitionManager.failed(flaw_id, e)
@@ -990,7 +1022,11 @@ def sync_task(task, tracker_id, **kwargs):
tracker_data = JiraQuerier().get_issue(tracker_id)
tracker = JiraTrackerConvertor(tracker_data).tracker
if tracker:
tracker.save()
with pghistory_context(
action="jira_tracker_download",
celery_task_id=getattr(getattr(task, "request", None), "id", None),
):
tracker.save()

# Link this Jira tracker to OSIDB Affects as part of the sync process.
result = JiraTrackerDownloadManager.link_tracker_with_affects(tracker_id)
Jincxz marked this conversation as resolved.
Comment on lines 1024 to 1032
coderabbitai Bot, Apr 22, 2026

⚠️ Potential issue | 🟠 Major

Jira tracker path loses pghistory attribution for the affects re-link — inconsistent with the BZ flow.

For the BZ flow, BZTrackerDownloadManager.link_tracker_with_affects itself opens a pghistory_context(action="link_tracker_with_affects", celery_task_id=...) around the clear()/add()/save() (lines 581‑588), and sync_task forwards task into it (lines 605‑607). For the Jira flow:

  1. JiraTrackerDownloadManager.link_tracker_with_affects (lines 1004‑1007) performs the same tracker.affects.clear()/add()/save() with no pghistory_context wrapping at all.
  2. Here, link_tracker_with_affects(tracker_id) is called outside the pghistory_context block at line 1032 and is not passed task.

Net effect: every Jira-driven tracker re-link writes history entries without source=celery / action / celery_task_id, which is exactly the attribution gap this PR is meant to close. Please mirror the BZ approach — accept task in JiraTrackerDownloadManager.link_tracker_with_affects and wrap its atomic block in pghistory_context, then forward task from sync_task.

Proposed fix
     @staticmethod
-    def link_tracker_with_affects(tracker_id):
+    def link_tracker_with_affects(tracker_id, task=None):
         # Code adapted from collectors.jiraffe.convertors.JiraTrackerConvertor.affects
         ...
         with transaction.atomic():
-            tracker.affects.clear()
-            tracker.affects.add(*affects)
-            tracker.save(raise_validation_error=False, auto_timestamps=False)
+            with pghistory_context(
+                action="link_tracker_with_affects",
+                celery_task_id=getattr(getattr(task, "request", None), "id", None),
+            ):
+                tracker.affects.clear()
+                tracker.affects.add(*affects)
+                tracker.save(raise_validation_error=False, auto_timestamps=False)

And in sync_task:

-            # Link this Jira tracker to OSIDB Affects as part of the sync process.
-            result = JiraTrackerDownloadManager.link_tracker_with_affects(tracker_id)
+            # Link this Jira tracker to OSIDB Affects as part of the sync process.
+            result = JiraTrackerDownloadManager.link_tracker_with_affects(
+                tracker_id, task
+            )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@osidb/sync_manager.py` around lines 1024 - 1032, The Jira flow loses
pghistory attribution because
JiraTrackerDownloadManager.link_tracker_with_affects does not accept the task
nor wrap the affects clear/add/save in pghistory_context; update
JiraTrackerDownloadManager.link_tracker_with_affects to accept a task parameter,
wrap the tracker.affects.clear()/add()/save() block with
pghistory_context(action="link_tracker_with_affects",
celery_task_id=getattr(getattr(task, "request", None), "id", None)), and then
modify the caller in sync_task to pass the current task through when calling
JiraTrackerDownloadManager.link_tracker_with_affects(tracker_id, task) so
history entries include source/action/celery_task_id.

Contributor Author

It was left like this so that the context is only used when an actual change to the tracker happens.

The linking always happens, so it would only create noise.


@roduran-dev, understood — the deliberate choice to limit pghistory_context to the conditional tracker.save() block (where an actual tracker change occurs) and exclude the always-running link_tracker_with_affects call makes sense. Annotating routine/unconditional operations would reduce the signal-to-noise ratio of the history context. I'll withdraw the suggestion.

