[1] “Domain-Adaptive Pretraining of Transformer-Based Language Models on Medical Texts: A High-Performance Computing Experiment,” COMPUTE, vol. 5, no. 2, pp. 1–9, Apr. 2025, doi: 10.24018/compute.2025.5.2.149.