Ongoing Training Runs

Research Program for Follow-Up

In the rest of this report I summarize the follow-on research that I think is worth pursuing to keep pushing the uniGradICON family of models forward.

Things that will just work:

Add more data:

Anatomix synthetic data. I have already done this for the current batch of experiments since I anticipate that it is the most important addition. The rest are candidates for new students.

SynthMorph synthetic data. Unlike the Anatomix synthetic data, this comes with deformations built in.

AutoPet 2025 longitudinal CT

Architecture and loss improvements:

Spacing-aware diffusion regularization (a sketch follows this list).

Data loader sharding: currently there is a full copy of the dataset in RAM for each GPU, which is wasteful. Fixing this will allow a bigger dataset (a sketch follows this list).

Train longer

Train using the segmentations that ship with several of the datasets in the uniCARL composite training dataset (a sketch follows this list).
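
As a rough illustration of the regularization item, here is a minimal PyTorch sketch of a spacing-aware diffusion penalty: finite differences of the displacement field are divided by the voxel spacing along each axis, so the smoothness term is measured in physical units even on anisotropic volumes. The function name and the (B, 3, D, H, W) layout are my assumptions, not part of the existing uniGradICON code.

```python
import torch

def spacing_aware_diffusion_loss(displacement: torch.Tensor,
                                 spacing: tuple[float, float, float]) -> torch.Tensor:
    """First-order (diffusion) smoothness penalty on a dense displacement field.

    displacement: (B, 3, D, H, W) displacement field in physical units.
    spacing: voxel spacing (s_z, s_y, s_x) in the same physical units.
    """
    loss = displacement.new_zeros(())
    # Axes 2, 3, 4 correspond to D, H, W.
    for axis, s in zip((2, 3, 4), spacing):
        # Forward difference along this axis, converted to a physical-space derivative.
        diff = displacement.diff(dim=axis) / s
        loss = loss + (diff ** 2).mean()
    return loss
```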
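
For the data loader sharding item, a minimal sketch assuming a DistributedDataParallel setup where each process currently pre-loads the full dataset into RAM. The class and load_fn below are hypothetical; the idea is simply that each rank keeps only its own slice of the cases, cutting per-GPU RAM use by roughly the world size.

```python
import torch.distributed as dist
from torch.utils.data import Dataset

class ShardedInMemoryDataset(Dataset):
    """Keep only this rank's shard of the cases in RAM instead of the full dataset."""

    def __init__(self, case_paths, load_fn):
        rank = dist.get_rank() if dist.is_initialized() else 0
        world_size = dist.get_world_size() if dist.is_initialized() else 1
        # Each rank takes every world_size-th case, starting at its own rank.
        self.cases = [load_fn(p) for p in case_paths[rank::world_size]]

    def __len__(self):
        return len(self.cases)

    def __getitem__(self, idx):
        return self.cases[idx]
```

Because the shards are already disjoint across ranks, this would replace the usual DistributedSampler rather than be combined with it.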
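
For the segmentation item, one plausible way to use the shipped label maps is as an auxiliary overlap term: warp the moving segmentation with the predicted transform and score it against the fixed segmentation with a soft Dice loss. The sketch below assumes a grid_sample-style sampling grid and is not the existing uniGradICON loss.

```python
import torch
import torch.nn.functional as F

def warped_dice_loss(moving_seg: torch.Tensor,
                     fixed_seg: torch.Tensor,
                     warp_grid: torch.Tensor,
                     num_labels: int,
                     eps: float = 1e-6) -> torch.Tensor:
    """Soft Dice between fixed labels and moving labels warped by the predicted transform.

    moving_seg, fixed_seg: (B, 1, D, H, W) integer label maps.
    warp_grid: (B, D, H, W, 3) sampling grid in [-1, 1], as used by grid_sample.
    """
    # One-hot encode so each label channel can be warped and compared independently.
    moving_1h = F.one_hot(moving_seg.squeeze(1).long(), num_labels).permute(0, 4, 1, 2, 3).float()
    fixed_1h = F.one_hot(fixed_seg.squeeze(1).long(), num_labels).permute(0, 4, 1, 2, 3).float()

    # Warp the one-hot moving labels with the same transform applied to the images.
    warped = F.grid_sample(moving_1h, warp_grid, mode='bilinear', align_corners=True)

    dims = (2, 3, 4)
    intersection = (warped * fixed_1h).sum(dim=dims)
    denom = warped.sum(dim=dims) + fixed_1h.sum(dim=dims)
    dice = (2 * intersection + eps) / (denom + eps)
    return 1 - dice.mean()
```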

Things that might work (research topics that could become their own papers):

Try training the architectures that won the LUMIR 2024 challenge on the uniGradICON or uniCARL dataset.

This paper is a great resource to this end:

https://www.arxiv.org/abs/2505.24160

Add hard data: the COMULIS-CLEM and ReMIND2Reg datasets from the Learn2Reg challenge.
