Projects
Low-Field to High-Field MRI Super-Resolution with Task-Adaptive Transformers
Spring 2026 | Independent Neuroinformatics Project | GitHub
- Independently designed a full medical imaging pipeline to enhance 64 mT MRI scans into 3 T-like images using a transformer-based AMIR architecture
- Built preprocessing and augmentation pipelines, including slice-to-volume reconstruction, spatial resampling, and synthetic low-field generation from the public IXI dataset
- Trained a 22M-parameter transformer on ~200 paired subjects, reaching a mean test PSNR of 18.64 dB (max 43.21 dB) and a mean SSIM of 0.544 on the test set
Stack: Python, PyTorch, Nibabel, NumPy, SciPy, CUDA
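The synthetic low-field generation step mentioned above can be sketched as a simple degradation pipeline: blur, downsample, add noise, then resample back so high-/low-field pairs stay voxel-aligned. Function name and degradation parameters here are illustrative assumptions, not the project's actual code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def synthesize_low_field(volume: np.ndarray, blur_sigma: float = 1.5,
                         downsample: float = 0.25, noise_std: float = 0.02) -> np.ndarray:
    """Degrade a high-field volume into a synthetic low-field counterpart."""
    low = gaussian_filter(volume, sigma=blur_sigma)        # lose fine anatomical detail
    low = zoom(low, downsample, order=1)                   # move to a coarser voxel grid
    rng = np.random.default_rng(0)
    low = low + rng.normal(0.0, noise_std, low.shape)      # add scanner-like noise
    # resample back to the original grid so the pair is voxel-aligned for training
    factors = [t / s for t, s in zip(volume.shape, low.shape)]
    return zoom(low, factors, order=1)
```

Keeping the degraded volume on the original grid simplifies paired training, since input and target slices index identically.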
Chinchilla-Optimal Transformer Pre-training for Music
Fall 2025 | Independent ML Systems Project | GitHub
- Independently designed and trained decoder-only Transformer models (NanoGPT) on the Lakh MIDI Dataset
- Optimized training throughput on NVIDIA H100 GPUs using BFloat16 mixed precision, Flash Attention, and torch.compile
- Achieved a test perplexity of 2.20 with 100% syntactically valid output
Stack: Python, PyTorch, CUDA, Flash Attention
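The "Chinchilla-optimal" sizing in the project title reduces to two widely used rules of thumb: roughly 20 training tokens per model parameter, and roughly 6 FLOPs per parameter per token. A minimal sketch (the function names are illustrative):

```python
def chinchilla_optimal_tokens(n_params: int, tokens_per_param: float = 20.0) -> int:
    """Chinchilla rule of thumb: train on ~20 tokens per parameter."""
    return int(n_params * tokens_per_param)

def training_flops(n_params: int, n_tokens: int) -> float:
    """Standard approximation: ~6 FLOPs per parameter per training token."""
    return 6.0 * n_params * n_tokens

# e.g. a 22M-parameter model would want ~440M training tokens under this heuristic
```

Given a fixed compute budget C, these two relations pin down both the model size and the dataset size before training starts.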
Toy Load Balancer with Consistent Hashing
2024 | Systems Engineering Project | GitHub
- Built a custom load balancer implementing consistent hashing to distribute traffic across dynamic server nodes
- Containerized the entire architecture (API Gateway, Nodes, Analytics) using Docker Compose for easy deployment
- Implemented a management API to dynamically add/remove servers and visualize request rebalancing in real time
Stack: Docker, Python, Node.js, Shell
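The core of the load balancer above is the consistent-hashing ring: each server owns many points ("virtual nodes") on a hash ring, and a key routes to the first node clockwise from its hash, so adding or removing a server only remaps the keys adjacent to its points. A minimal sketch (class and method names are illustrative, not the project's API):

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Hash ring with virtual nodes; a key maps to the next node clockwise."""

    def __init__(self, replicas: int = 100):
        self.replicas = replicas   # virtual nodes per server, for smoother balance
        self.ring = []             # sorted virtual-node hashes
        self.owners = {}           # virtual-node hash -> server name

    def _hash(self, key: str) -> int:
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def add_node(self, node: str) -> None:
        for i in range(self.replicas):
            h = self._hash(f"{node}#{i}")
            bisect.insort(self.ring, h)
            self.owners[h] = node

    def remove_node(self, node: str) -> None:
        for i in range(self.replicas):
            h = self._hash(f"{node}#{i}")
            self.ring.remove(h)
            del self.owners[h]

    def get_node(self, key: str) -> str:
        # first virtual node clockwise from the key's hash (wraps around)
        idx = bisect.bisect(self.ring, self._hash(key)) % len(self.ring)
        return self.owners[self.ring[idx]]
```

The payoff is the rebalancing property the management API visualizes: removing one server leaves every key that was owned by a surviving server untouched.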