Abstract
Affine registration is crucial in medical image analysis but faces challenges when matching sparse features, such as retinal vessels and filamentous collagen fibers in second-harmonic generation (SHG) and bright-field (BF) images. End-to-end learning-based approaches struggle because these sparse features provide only minimal effective gradients during loss back-propagation, while descriptor-matching methods, though helpful, lack a fidelity loss and leave the matching process open-loop. To address these issues, we propose Neural Affine Optimization (NeOn), which implicitly approximates discrete optimization with a few neural network layers, combined with a sampling-regression layer to handle affine transformations. NeOn allows iterative refinement under a fidelity loss and provides a flexible transition between a purely affine configuration and a linearly weighted blend of the affine and deformation fields.
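As a rough illustration of the affine–deformable blend mentioned above, the PyTorch sketch below linearly mixes an affine flow with a free-form displacement field before warping. The function name, the `alpha` weighting, and the grid construction are our own simplifications for illustration, not the paper's implementation.

```python
# Minimal sketch of blending an affine flow with a free-form displacement
# field (illustrative only; NeOn's actual layers differ).
import torch
import torch.nn.functional as F

def blended_warp(moving, theta, deform, alpha=1.0):
    """Warp `moving` (N,C,H,W) with alpha * affine_flow + (1 - alpha) * deform.

    theta:  (N, 2, 3) affine matrices in normalized coordinates.
    deform: (N, H, W, 2) dense displacement field in normalized coordinates.
    alpha:  1.0 -> purely affine; smaller values admit more deformation.
    """
    n, c, h, w = moving.shape
    # Identity sampling grid in [-1, 1] normalized coordinates.
    eye = torch.eye(2, 3).unsqueeze(0).repeat(n, 1, 1).to(moving)
    identity = F.affine_grid(eye, (n, c, h, w), align_corners=False)
    affine_grid = F.affine_grid(theta, (n, c, h, w), align_corners=False)
    affine_flow = affine_grid - identity              # affine displacement
    flow = alpha * affine_flow + (1.0 - alpha) * deform
    return F.grid_sample(moving, identity + flow, align_corners=False)
```

With `alpha=1` this reduces to a standard affine warp; lowering `alpha` smoothly hands control over to the dense field.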
Demo: Deformation Change Across Iterations
Example 1: Retinal Vessel Alignment
Moving image: deformation change across iterations.
Fixed Image.
Example 2: SHG-BF Image Alignment
Moving image: deformation change across iterations.
Fixed Image.
Problem Statement
Visual illustration of the issue of limited gradient backflow. The comparison shows retinal vessel and SHG images with their corresponding horizontal translation differences, highlighting the areas where the loss gradient can be effectively back-propagated.
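To make the limited-gradient issue concrete, the toy snippet below (our own illustration, not taken from the paper) shows that with a thin, vessel-like structure only a tiny fraction of pixels receives any gradient from a plain intensity MSE loss.

```python
# Toy illustration: with sparse structures, most pixels contribute no gradient,
# so a small translation error produces almost no learning signal.
import torch

h = w = 256
fixed = torch.zeros(1, 1, h, w)
fixed[..., :, 100:102] = 1.0           # a thin vertical "vessel"
moving = torch.zeros(1, 1, h, w)
moving[..., :, 104:106] = 1.0          # same vessel, shifted by 4 pixels

moving.requires_grad_(True)
loss = ((moving - fixed) ** 2).mean()  # plain intensity MSE
loss.backward()

nonzero = (moving.grad.abs() > 0).float().mean()
print(f"fraction of pixels with nonzero gradient: {nonzero:.4f}")  # ~0.0156
```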
Method
The proposed Neural Affine Optimization (NeOn) framework. Our method combines neural network layers with a sampling-regression approach for handling affine transformations.
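The caption above mentions a sampling-regression step for affine transformations. The sketch below shows one generic way such a step can be realized, namely sampling grid points from a displacement field and regressing a 2-D affine matrix by least squares; this is our interpretation for illustration, not the released layer.

```python
# Generic sampling-regression sketch: fit an affine transform to a dense
# displacement field by least squares (illustrative, not NeOn's exact layer).
import torch

def regress_affine(flow, num_samples=512):
    """flow: (H, W, 2) displacement field in pixel units, channels assumed (dx, dy).
    Returns a (2, 3) affine matrix A such that A @ [x, y, 1]^T ~ [x, y] + flow."""
    h, w, _ = flow.shape
    idx = torch.randint(0, h * w, (num_samples,))
    ys, xs = idx // w, idx % w
    src = torch.stack([xs.float(), ys.float(), torch.ones(num_samples)], dim=1)  # (N, 3)
    dst = src[:, :2] + flow[ys, xs]                                              # (N, 2)
    # Least-squares solution of src @ A^T = dst.
    a_t = torch.linalg.lstsq(src, dst).solution                                  # (3, 2)
    return a_t.T                                                                 # (2, 3)
```

Because the fit is a plain least-squares solve over sampled points, the resulting affine is a simple function of the displacement field, the kind of step an iterative, fidelity-driven refinement can wrap around.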
Datasets
FIRE Dataset
Mono-modal retinal vessel dataset with 39 subjects and 134 image pairs (2912×2912 pixels, 45° FOV)
CF-FA Dataset
Multi-modal diabetic retinal dataset with 59 subjects, combining Color Fundus and Fluorescein Angiography images (720×576 pixels)
SHG-BF Dataset
Learn2Reg Challenge 2024 Task 3 dataset with 10 image pairs, combining Bright Field microscopy with Second Harmonic Generation imaging
Results
Qualitative comparison of registration results on the FIRE dataset, demonstrating NeOn's superior performance in handling sparse features.
For the quantitative comparison, please refer to the original paper.
Implementation
FIRE Dataset
- Feature Extraction:
- Use LWNet (GitHub) to extract vessel features
- Store extracted features in 'FIRE_cnn' folder
- Run Optimization:
python test_neon_fire.py --ori_size '(2912,2912)' --img_size '(1024,1024)' temp=0.001 ks=1 alpha=1
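Note that for FIRE the transform is estimated at `img_size` (1024×1024) while the images are `ori_size` (2912×2912); if you work with a 2×3 affine in pixel coordinates, it has to be rescaled before being applied at full resolution. The helper below sketches that standard bookkeeping; it is an assumption about the workflow, not repository code.

```python
# Hedged sketch: rescale a 2x3 affine matrix expressed in pixel coordinates
# from the processing resolution (img_size) to the original resolution (ori_size).
import numpy as np

def rescale_affine(theta_px, img_size, ori_size):
    """theta_px: (2, 3) affine in pixel coords at img_size = (H, W), acting on [x, y, 1].
    Returns the equivalent affine at ori_size = (H0, W0)."""
    sy, sx = ori_size[0] / img_size[0], ori_size[1] / img_size[1]
    scale_up = np.diag([sx, sy])            # img_size pixels -> ori_size pixels
    scale_down = np.diag([1 / sx, 1 / sy])  # ori_size pixels -> img_size pixels
    a, t = theta_px[:, :2], theta_px[:, 2:]
    return np.hstack([scale_up @ a @ scale_down, scale_up @ t])
```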
Parameters:
- ori_size: Original image dimensions
- img_size: Input image size for processing
- temp: Temperature parameter (see the sketch after this list)
- ks: Kernel size
- alpha: Blending parameter
CF-FA Dataset
- Feature Extraction:
- For fundus images: Use LWNet (GitHub)
- For FA images: Use DeepVesselSeg4FA (GitHub)
- Store all features in 'CFFA_cnn' folder
- Run Optimization:
python test_neon_cffa.py --ori_size '(576,720)' --img_size '(576,720)' temp=0.001 ks=1 alpha=1
SHG-BF Dataset
- Feature Extraction Training:
- We provide contrastive-learning-based COMIR feature extraction with XFeat pre-alignment (a generic contrastive-loss sketch follows the commands below)
python train_shgbf.py -m tiramisuAndXfeatComplex111Msk1Ps128 -bs 1 --gpu_id 0 \
ti_pretrained=1 enable_grad_xfeat=1 xf_pretrained=1 --is_msk 1 --patch_size 128
- Run Optimization:
python test_neon_shgbf.py -m tiramisuAndXfeatComplex -bs 1 --gpu_id 0 --load_ckpt none --is_first_half 1
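As referenced above, the sketch below gives a generic contrastive (InfoNCE-style) objective for paired multimodal patches, in the spirit of CoMIR-like training; the function, batch layout, and temperature value are placeholders, not the released training code.

```python
# Generic contrastive (InfoNCE-style) loss for paired multimodal features.
# Illustrative placeholder, not the repository's training objective.
import torch
import torch.nn.functional as F

def info_nce(feat_a, feat_b, temperature=0.07):
    """feat_a, feat_b: (N, C) features from co-registered SHG / BF patches.
    Matching rows are treated as positives, all other rows as negatives."""
    a = F.normalize(feat_a, dim=1)
    b = F.normalize(feat_b, dim=1)
    logits = a @ b.t() / temperature          # (N, N) scaled cosine similarities
    targets = torch.arange(a.size(0), device=a.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))
```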
Citation
@article{zhang2024neural,
  title={Neural Affine Optimization for Image Registration},
  author={Zhang, Hang and Wang, Jiacheng and Chen, Xiang and Hu, Renjiu and Liu, Min and
          Wang, Yaonan and Wang, Rongguang and Duan, Jinming and Codella, Noel},
  journal={IEEE Transactions on Medical Imaging},
  year={2024}
}

@article{wang2024fidelity,
  title={Fidelity-Imposed Displacement Editing for the Learn2Reg 2024 SHG-BF Challenge},
  author={Wang, Jiacheng and Chen, Xiang and Hu, Renjiu and Wang, Rongguang and Liu, Min and
          Wang, Yaonan and Wang, Jiazheng and Li, Hao and Zhang, Hang},
  journal={arXiv preprint arXiv:2410.20812},
  year={2024}
}