(1)
Domain-Adaptive Pretraining of Transformer-Based Language Models on Medical Texts: A High-Performance Computing Experiment. COMPUTE 2025, 5 (2), 1-9. https://doi.org/10.24018/compute.2025.5.2.149.