Meta Software Engineer, Systems ML - Model Optimization (PhD) in Burlingame, California

Summary:

Meta is looking for software engineers to play a pivotal role in enhancing our AI inference infrastructure. As a member of our team, you will improve the latency and power consumption of our AI models and build user-facing APIs for our ML engineers. Your expertise will help us push the limits of efficient model inference. The position requires a combination of expertise in machine learning and software engineering.

Required Skills:

Software Engineer, Systems ML - Model Optimization (PhD) Responsibilities:

  1. Fine-tune, quantize, and deploy ML models on-device across phones and AR/VR devices.

  2. Optimize models for latency and power consumption.

  3. Enable efficient inference on GPUs.

  4. Build tooling to develop and deploy efficient models for inference.

  5. Partner with teams across Meta Reality Labs to optimize key inference workloads.

Minimum Qualifications:

  1. Currently has, or is in the process of obtaining, a PhD in Computer Science, Computer Engineering, or an equivalent field. Degree must be completed prior to joining Meta.

  2. Specialized experience in the following machine learning/deep learning domains: model quantization, compression, on-device inference, GPU inference, PyTorch.

  3. Currently has, or is in the process of obtaining a Bachelor's degree in Computer Science, Computer Engineering, relevant technical field, or equivalent practical experience. Degree must be completed prior to joining Meta.

  4. Must obtain work authorization in the country of employment at the time of hire and maintain ongoing work authorization during employment.

Preferred Qualifications:

  1. Proven record of training, fine-tuning, and optimizing models.

  2. 3+ years of experience accelerating deep learning models for on-device inference.

  3. Experience optimizing machine learning model inference on NVIDIA GPUs.

  4. Familiarity with on-device inference platforms (ARM, Qualcomm DSP).

  5. Experience with CUDA/Triton.

Public Compensation:

$56.25/hour to $173,000/year + bonus + equity + benefits

Industry: Internet

Equal Opportunity:

Meta is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender, gender identity, gender expression, transgender status, sexual stereotypes, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics. We also consider qualified applicants with criminal histories, consistent with applicable federal, state and local law. Meta participates in the E-Verify program in certain locations, as required by law. Please note that Meta may leverage artificial intelligence and machine learning technologies in connection with applications for employment.

Meta is committed to providing reasonable accommodations for candidates with disabilities in our recruiting process. If you need any assistance or accommodations due to a disability, please let us know at accommodations-ext@fb.com.