30 commits
52aae5b
Disable GarbageCollector task initially
manning-ncsa Apr 27, 2026
a219a74
Implement function argument support in blast_admin.py
manning-ncsa Apr 27, 2026
23b768c
Set status to failed for unhandled exceptions in HostMatch
manning-ncsa Apr 27, 2026
352bc1b
Simplify HostMatch run function
manning-ncsa Apr 27, 2026
5cd6aa4
Avoid creating duplicate Host objects when host matching
manning-ncsa Apr 27, 2026
795deff
Host name should not start with an underscore
manning-ncsa Apr 27, 2026
adaead8
Merge branch 'main' into 400-host-matching
manning-ncsa Apr 27, 2026
81897a3
Code linting
manning-ncsa Apr 27, 2026
7f10ce2
Handle importing archive file with no host match
manning-ncsa Apr 27, 2026
068face
Use the object ID if Prost returns an empty name
manning-ncsa Apr 27, 2026
c5fda89
tasks_initialized is a character field not boolean
manning-ncsa Apr 28, 2026
a765648
Simplify import_transient_list function
manning-ncsa Apr 28, 2026
d2acf8e
Simplify task initialization in transient_workflow function
manning-ncsa Apr 28, 2026
6a22d45
Remove obsolete TODO note
manning-ncsa Apr 28, 2026
0ad4578
Remove obsolete InitializeTransientTasks periodic task
manning-ncsa Apr 28, 2026
249dd09
Remove redundant initialize_all_tasks_status in reprocess_transient
manning-ncsa Apr 28, 2026
a791d70
Improve logging in transient_workflow function
manning-ncsa Apr 28, 2026
f655abe
Do not delete Transient when associated User or Host are deleted
manning-ncsa Apr 29, 2026
c857c7b
Code linting. Remove unused CatalogManager.
manning-ncsa Apr 29, 2026
5fe4992
Enforce Host name rules with validator
manning-ncsa Apr 29, 2026
cf8d9fe
Revert Host name rules
manning-ncsa Apr 29, 2026
65b7a5a
Add fields to Host: object_id, catalog_name, catalog_release
manning-ncsa Apr 29, 2026
52f19d7
Revise Transient name validator function
manning-ncsa Apr 29, 2026
4038d6d
Remove unused TransientImportForm
manning-ncsa Apr 29, 2026
90b1111
Do not require Host.name to be unique
manning-ncsa Apr 29, 2026
34b2078
Refactor host matching task to
manning-ncsa Apr 29, 2026
bc2a014
Update changelog
manning-ncsa Apr 29, 2026
1761020
Merge branch 'main' into 400-host-matching
manning-ncsa Apr 29, 2026
a7ad98a
Update changelog
manning-ncsa Apr 29, 2026
09c8264
Bump to v1.11.0
manning-ncsa Apr 29, 2026
44 changes: 39 additions & 5 deletions CHANGELOG.md
@@ -13,23 +13,57 @@ Types of changes:
- `Fixed`: for any bug fixes.
- `Security`: in case of vulnerabilities.

## [1.11.0]

### Added

- Added three new fields to the `Host` model (`object_id`, `catalog_name`, `catalog_release`) to capture the
source catalog information provided by the host matcher.
- Added support for input arguments to custom functions invoked by the `blast_admin` Django custom
management command. See `blast_admin.py` module docstring for details.

### Changed

- The `HostMatch._run_process()` function was altered to treat any unhandled exception as a failure with a
corresponding "failed" status message. Previously, an unhandled exception would result in a misleading
"no host match" status.
- Additional logic was added to the host matching task to avoid creating duplicate `Host` objects: The host
information returned by Prost is compared against existing Host objects by cone search and by catalog information.
- Updated the host information card displayed on result pages to include the new catalog information.
- Removed redundant call to `initialize_all_tasks_status()` in `reprocess_transient()`.

### Removed

- Removed the obsolete `InitializeTransientTasks` periodic task. Workflows for new transients ingested
from TNS are now triggered upon discovery. Thus, all pathways for adding new transients now automatically
initialize and trigger workflows immediately, eliminating the need for this periodic task.
- Removed unused `TransientImportForm`.
- Removed unused `CatalogManager`.

### Fixed

- Fixed a bug in the dataset archive importer to support importing transients lacking host information.

## [1.10.0]

### Deprecated

- The `log_age` columns (`log_age_16`, `log_age_50`, `log_age_84`) in the `SEDFittingResult` model are deprecated. They have been replaced by
corresponding `age` columns and will be removed in a future release.
- The `log_age` columns (`log_age_16`, `log_age_50`, `log_age_84`) in the `SEDFittingResult` model are
deprecated. They have been replaced by corresponding `age` columns and will be removed in a future release.

### Changed

- Updated documentation to reflect how the `log_age` columns in the `SEDFittingResult` model will be deprecated and replaced by the `age` columns.
- Updated documentation to reflect how the `log_age` columns in the `SEDFittingResult` model will be
deprecated and replaced by the `age` columns.
- Updated units in documentation for above columns from log years to gigayears.
- Prospector results will feed the age info to the `age` columns as well.
- Updated archive file import algorithm to populate the `age` values from `log_age` values if the `age` keys are missing.
- Updated archive file import algorithm to populate the `age` values from `log_age` values if the `age`
keys are missing.

### Fixed

- Added `age` columns to the `SEDFittingResult` model to accurately reflect how the age is in gigayears and not in `log_10` years as implied by the `log_age` columns.
- Added `age` columns to the `SEDFittingResult` model to accurately reflect how the age is in gigayears
and not in `log_10` years as implied by the `log_age` columns.

## [1.9.5]

2 changes: 1 addition & 1 deletion app/app/settings.py
@@ -4,7 +4,7 @@
######################################################################
# Blast application config
#
APP_VERSION = '1.10.0'
APP_VERSION = '1.11.0'
# Data paths
DUSTMAPS_DATA_ROOT = os.environ.get("DUSTMAPS_DATA_ROOT", "/data/dustmaps")
CUTOUT_ROOT = os.environ.get("CUTOUT_ROOT", "/data/cutout_cdn")
1 change: 0 additions & 1 deletion app/host/debug_utils.py
@@ -56,7 +56,6 @@ def rerun_failed_task(task_register):
GlobalHostSEDFitting(task_register.transient.name),
LocalHostSEDFitting(task_register.transient.name),
TNSDataIngestion(),
InitializeTransientTasks(),
IngestMissedTNSTransients(),
]

4 changes: 0 additions & 4 deletions app/host/forms.py
@@ -30,10 +30,6 @@ def __init__(self, *args, **kwargs):
)


class TransientImportForm(forms.Form):
file = forms.FileField()


class TransientUploadForm(forms.Form):
tns_names = forms.CharField(
widget=forms.Textarea(attrs={
85 changes: 43 additions & 42 deletions app/host/host_utils.py
@@ -1030,48 +1030,49 @@ def process_transient_dataset(dataset):
# TODO: How concerned should we be about duplicates? Should we perform a cone search instead of assuming
# perfect coordinate matching? Should the redshift values be updated from the imported data if they are
# missing?
host_name = dataset['host']['fields']['name']
ra_deg = dataset['host']['fields']['ra_deg']
dec_deg = dataset['host']['fields']['dec_deg']
cone_search = (Q(ra_deg__gte=ra_deg - ARCSEC_RA_IN_DEG)
& Q(ra_deg__lte=ra_deg + ARCSEC_RA_IN_DEG)
& Q(dec_deg__gte=dec_deg - ARCSEC_DEC_IN_DEG)
& Q(dec_deg__lte=dec_deg + ARCSEC_DEC_IN_DEG))
proximate_hosts = Host.objects.filter(cone_search)
if proximate_hosts:
logger.info(f'''{len(proximate_hosts)} existing hosts were found within an arcsecond of '''
f'''importing host "{host_name}".''')
host = None
# If there is an existing proximate host for an unnamed host, claim this is the same host
if not host_name and proximate_hosts:
host = proximate_hosts[0]
elif host_name:
# Find existing hosts with the same name
host_search = Host.objects.filter(name__exact=host_name)
if host_search:
# If the host name matches, require that the position overlaps
proximity_search = host_search.filter(cone_search)
# Consider the import a failure if there is an inconsistent host definition
if not proximity_search:
record_import_error(transient_name,
f'[{transient_name}] Host with matching name "{host_name}" '
f'exists, but it is in a different location.')
return
# If the name and location match, claim this is the same host
host = proximity_search[0]
# If no host match was found, create a new Host object
if not host:
host = Host.objects.create(
ra_deg=dataset['host']['fields']['ra_deg'],
dec_deg=dataset['host']['fields']['dec_deg'],
name=dataset['host']['fields']['name'],
redshift=dataset['host']['fields']['redshift'],
redshift_err=dataset['host']['fields']['redshift_err'],
photometric_redshift=dataset['host']['fields']['photometric_redshift'],
photometric_redshift_err=dataset['host']['fields']['photometric_redshift_err'],
milkyway_dust_reddening=dataset['host']['fields']['milkyway_dust_reddening'],
software_version=dataset['host']['fields']['software_version'],
)
if dataset['host']:
host_name = dataset['host']['fields']['name']
ra_deg = dataset['host']['fields']['ra_deg']
dec_deg = dataset['host']['fields']['dec_deg']
cone_search = (Q(ra_deg__gte=ra_deg - ARCSEC_RA_IN_DEG)
& Q(ra_deg__lte=ra_deg + ARCSEC_RA_IN_DEG)
& Q(dec_deg__gte=dec_deg - ARCSEC_DEC_IN_DEG)
& Q(dec_deg__lte=dec_deg + ARCSEC_DEC_IN_DEG))
proximate_hosts = Host.objects.filter(cone_search)
if proximate_hosts:
logger.info(f'''{len(proximate_hosts)} existing hosts were found within an arcsecond of '''
f'''importing host "{host_name}".''')
# If there is an existing proximate host for an unnamed host, claim this is the same host
if not host_name and proximate_hosts:
host = proximate_hosts[0]
elif host_name:
# Find existing hosts with the same name
host_search = Host.objects.filter(name__exact=host_name)
if host_search:
# If the host name matches, require that the position overlaps
proximity_search = host_search.filter(cone_search)
# Consider the import a failure if there is an inconsistent host definition
if not proximity_search:
record_import_error(transient_name,
f'[{transient_name}] Host with matching name "{host_name}" '
f'exists, but it is in a different location.')
return
# If the name and location match, claim this is the same host
host = proximity_search[0]
# If no host match was found, create a new Host object
if not host:
host = Host.objects.create(
ra_deg=dataset['host']['fields']['ra_deg'],
dec_deg=dataset['host']['fields']['dec_deg'],
name=dataset['host']['fields']['name'],
redshift=dataset['host']['fields']['redshift'],
redshift_err=dataset['host']['fields']['redshift_err'],
photometric_redshift=dataset['host']['fields']['photometric_redshift'],
photometric_redshift_err=dataset['host']['fields']['photometric_redshift_err'],
milkyway_dust_reddening=dataset['host']['fields']['milkyway_dust_reddening'],
software_version=dataset['host']['fields']['software_version'],
)
# Verify that the Cutout objects do not exist (by name).
for cutout in dataset['cutouts']:
cutout_name = cutout['fields']['name']
@@ -1268,7 +1269,7 @@ def process_transient_dataset(dataset):
tr_obj.save()
# Calculate workflow progress and mark tasks as initialized so retriggering works.
transient.progress, transient.processing_status = get_processing_status_and_progress(transient)
transient.tasks_initialized = True
transient.tasks_initialized = "True"
transient.save()
# Record successful database import
imported_transient_names.append(transient.name)
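For reference, the proximate-host lookup above boils down to a ±1 arcsecond bounding-box query. The following is a minimal standalone sketch of that query; the helper name and the constant values are illustrative assumptions (the real `ARCSEC_RA_IN_DEG`/`ARCSEC_DEC_IN_DEG` constants are defined elsewhere in `host_utils.py`).

```python
# Illustrative sketch only -- not part of this diff.
from django.db.models import Q
from host.models import Host

ARCSEC_RA_IN_DEG = 1.0 / 3600.0   # assumed values; the actual constants live in host_utils.py
ARCSEC_DEC_IN_DEG = 1.0 / 3600.0

def find_proximate_hosts(ra_deg, dec_deg):
    """Return Host objects within roughly one arcsecond of (ra_deg, dec_deg)."""
    cone_search = (Q(ra_deg__gte=ra_deg - ARCSEC_RA_IN_DEG)
                   & Q(ra_deg__lte=ra_deg + ARCSEC_RA_IN_DEG)
                   & Q(dec_deg__gte=dec_deg - ARCSEC_DEC_IN_DEG)
                   & Q(dec_deg__lte=dec_deg + ARCSEC_DEC_IN_DEG))
    return Host.objects.filter(cone_search)
```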
45 changes: 35 additions & 10 deletions app/host/management/commands/blast_admin.py
@@ -5,12 +5,15 @@

The usage syntax is as follows:

python manage.py blast_admin [func_name]
python manage.py blast_admin [func_name] --input_args '{"arg1": "val1", "arg2": "val2"}'

where "func_name()" is defined in either "util.py" (public code) or "local_util.py"
(local scratch ignored by Git).
(local scratch ignored by Git), and --input_args is a JSON-formatted string containing either
a list of scalar values to pass as positional arguments to "func_name" or a dictionary to be
passed as keyword arguments.
"""
from django.core.management.base import BaseCommand
from django.core.management.base import BaseCommand, CommandError
import json
from host.log import get_logger
logger = get_logger(__name__)
from .util import * # noqa: F401,F403
@@ -25,13 +28,35 @@ class Command(BaseCommand):
help = "Run scratch function"

def add_arguments(self, parser):
parser.add_argument("func_name", type=str)
parser.add_argument('func_name', type=str,
help="Fully-qualified function name to call, e.g. 'myapp.utils.process_item'")
parser.add_argument('--input_args', type=str, default='[]',
help=(
"JSON string representing the function arguments. "
"Use a list for positional args or an object for keyword args. "
))

def handle(self, *args, **options):
func = options['func_name']

# Parse the JSON argument payload.
# Accept either a list (positional args) or dict (keyword args).
raw_args = options['input_args']
try:
eval(f'''{options['func_name']}()''')
except NameError as err:
logger.error(err)
if not scratch_module_exists:
logger.error('''Custom functions must be defined in a local '''
'''"app/host/management/commands/local_util.py" module.''')
parsed = json.loads(raw_args)
except json.JSONDecodeError as err:
raise CommandError(f"Could not parse JSON args: {err}")

if isinstance(parsed, list):
call_args = parsed
call_kwargs = {}
elif isinstance(parsed, dict):
# Decide: if keys are numeric-string and user intended positional args, they should pass a list.
call_args = []
call_kwargs = parsed
else:
# Single scalar -> pass as single positional arg
call_args = [parsed]
call_kwargs = {}

eval(f'''{func}(*{call_args}, **{call_kwargs})''')
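A brief usage sketch for the reworked `blast_admin` command; the `greet` function and its module below are hypothetical, intended only to illustrate how `--input_args` is interpreted (JSON list → positional args, JSON object → keyword args, scalar → single positional arg).

```python
# app/host/management/commands/local_util.py -- hypothetical scratch function for illustration.
def greet(name, shout=False):
    """Toy function showing how JSON --input_args map onto Python arguments."""
    message = f"Hello, {name}!"
    print(message.upper() if shout else message)

# Example invocations (assuming the function above exists):
#   python manage.py blast_admin greet --input_args '["Blast"]'                        # positional args
#   python manage.py blast_admin greet --input_args '{"name": "Blast", "shout": true}' # keyword args
#   python manage.py blast_admin greet --input_args '"Blast"'                          # single scalar arg
```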
6 changes: 0 additions & 6 deletions app/host/managers.py
@@ -8,7 +8,6 @@
logger = get_logger(__name__)



class TransientManager(models.Manager):
def get_by_natural_key(self, name):
return self.get(name=name)
@@ -29,11 +28,6 @@ def get_by_natural_key(self, name):
return self.get(name=name)


class CatalogManager(models.Manager):
def get_by_natural_key(self, name):
return self.get(name=name)


class FilterManager(models.Manager):
def get_by_natural_key(self, name):
return self.get(name=name)
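For context, these manager classes exist to support natural-key lookups during fixture (de)serialization; a minimal usage sketch, where the transient name is an illustrative assumption:

```python
# Illustrative only: fetch a Transient by its natural key (the name) rather than its primary key.
from host.models import Transient

transient = Transient.objects.get_by_natural_key("2026abc")  # hypothetical transient name
```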
42 changes: 42 additions & 0 deletions app/host/migrations/0049_host_info.py
@@ -0,0 +1,42 @@
# Generated by Django 5.1.14 on 2026-04-29 18:17

import django.db.models.deletion
from django.conf import settings
from django.db import migrations, models


class Migration(migrations.Migration):

dependencies = [
('host', '0048_sedfittingresult_age_16_sedfittingresult_age_50_and_more'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
]

operations = [
migrations.AddField(
model_name='host',
name='catalog_name',
field=models.CharField(max_length=100, null=True),
),
migrations.AddField(
model_name='host',
name='catalog_release',
field=models.CharField(max_length=100, null=True),
),
migrations.AddField(
model_name='host',
name='object_id',
field=models.CharField(blank=True, max_length=100, null=True),
),
migrations.AlterField(
model_name='transient',
name='added_by',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL,
to=settings.AUTH_USER_MODEL),
),
migrations.AlterField(
model_name='transient',
name='host',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='host.host')
),
]
15 changes: 9 additions & 6 deletions app/host/models.py
@@ -16,7 +16,6 @@
import re

from .managers import ApertureManager
from .managers import CatalogManager
from .managers import CutoutManager
from .managers import FilterManager
from .managers import HostManager
@@ -92,6 +91,9 @@ class Host(SkyObject):
photometric_redshift = models.FloatField(null=True, blank=True)
photometric_redshift_err = models.FloatField(null=True, blank=True)
milkyway_dust_reddening = models.FloatField(null=True, blank=True)
object_id = models.CharField(max_length=100, blank=True, null=True)
catalog_name = models.CharField(max_length=100, blank=False, null=True)
catalog_release = models.CharField(max_length=100, blank=False, null=True)
objects = HostManager()
software_version = models.CharField(max_length=50, blank=True, null=True)

@@ -138,24 +140,25 @@ def validate_name(name):
raise ValidationError(f'''Invalid transient identifier: "{name}" must begin and end with alphanumeric '''
'''characters, and may include underscores and hyphens. '''
'''Spaces are not allowed.''')
if name.find('--') > -1 or name.find('__') > -1:
raise ValidationError(f'''Invalid transient identifier: "{name}" may not contain consecutive '''
'''underscores or hyphens.''')
for nonconsecutive_char in '-_':
if name.find(nonconsecutive_char * 2) > -1:
raise ValidationError(f'''Invalid transient identifier: "{name}" may not contain consecutive '''
f'''"{nonconsecutive_char}" characters.''')

name = models.CharField(max_length=64, unique=True, validators=[validate_name])
display_name = models.CharField(null=True, blank=True)
tns_id = models.IntegerField()
tns_prefix = models.CharField(max_length=20)
public_timestamp = models.DateTimeField(null=True, blank=True)
host = models.ForeignKey(Host, on_delete=models.CASCADE, null=True, blank=True)
host = models.ForeignKey(Host, on_delete=models.SET_NULL, null=True, blank=True)
objects = TransientManager()
tasks_initialized = models.CharField(max_length=20, default="False")
redshift = models.FloatField(null=True, blank=True)
spectroscopic_class = models.CharField(max_length=20, null=True, blank=True)
photometric_class = models.CharField(max_length=20, null=True, blank=True)
milkyway_dust_reddening = models.FloatField(null=True, blank=True)
processing_status = models.CharField(max_length=20, default="processing")
added_by = models.ForeignKey(User, null=True, blank=True, on_delete=models.CASCADE)
added_by = models.ForeignKey(User, null=True, blank=True, on_delete=models.SET_NULL)
progress = models.IntegerField(default=0)
software_version = models.CharField(max_length=50, blank=True, null=True)

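The body of `validate_name` is only partially visible in this hunk; below is a minimal sketch of a validator implementing the rules stated in the error messages (must begin and end with alphanumeric characters, underscores and hyphens allowed but never consecutively, no spaces). The regular expression and example names are assumptions, not the actual implementation.

```python
import re

from django.core.exceptions import ValidationError

def validate_name_sketch(name):
    """Illustrative stand-in for Transient.validate_name; the real regex is not shown in the diff."""
    if not re.fullmatch(r'[A-Za-z0-9](?:[A-Za-z0-9_-]*[A-Za-z0-9])?', name):
        raise ValidationError(f'Invalid transient identifier: "{name}" must begin and end with alphanumeric '
                              'characters, and may include underscores and hyphens. Spaces are not allowed.')
    for nonconsecutive_char in '-_':
        if name.find(nonconsecutive_char * 2) > -1:
            raise ValidationError(f'Invalid transient identifier: "{name}" may not contain consecutive '
                                  f'"{nonconsecutive_char}" characters.')

validate_name_sketch("2026abc")       # passes
validate_name_sketch("SN_2026-abc")   # passes
# validate_name_sketch("_2026abc")    # would raise ValidationError (leading underscore)
# validate_name_sketch("2026--abc")   # would raise ValidationError (consecutive hyphens)
```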