LLM Distributed Training Engineer (5k+ GPUs)

Periodic Labs

Menlo Park, CA, United States

Full Time

Expires On: 02/14/2026

Periodic Labs, an AI and science laboratory based in Menlo Park, California, is seeking an experienced engineer to develop and optimize large-scale distributed LLM training systems running on clusters of 5,000+ GPUs. The role involves partnering with researchers to support AI-driven scientific research and contributing to open-source training frameworks. Ideal candidates have deep expertise in GPU clusters, parallel training techniques, and distributed training frameworks. Join a rapidly growing team dedicated to scientific innovation.
