competitive salary
Remote
Engineering, Information Technology, Acceleration / Incubation / Innovation
English
about the company
People.ai delivers the industry's leading revenue operations and intelligence (ROI) platform. Using patented AI technology, it transforms business activity such as email, meetings, and contacts into account and opportunity management solutions that increase sales rep productivity, accelerate revenue growth, and maximize marketing ROI. Companies such as AppDynamics, DataRobot, Ivanti, Okta, and Zoom rely on People.ai to unlock growth.
diversity statement
"At People.ai, we believe that people enrich the world around them in countless ways."
your area of responsibility
Design and implement highly available large language model infrastructure using open-source models such as Llama 2, Mistral, and Mixtral.
Design and implement an LLM-focused data processing pipeline that transforms and analyzes business communications at a rate of tens of millions of emails per month.
Implement elements of a scalable and cost-effective retrieval-augmented generation (RAG) architecture for business communications and systems of record.
Optimize the performance, throughput, and cost of large language model inference at scale using model quantization, batch inference, and model selection.
Develop fine-tuned large language models, covering data collection, reward model design, fine-tuning using SFT, DPO, PPO, and SPIN, and evaluation.
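To give candidates a feel for the RAG responsibility above, here is a minimal, dependency-free sketch of the retrieval step. A toy bag-of-words similarity stands in for a real embedding model, and all data and names are invented for illustration; a production pipeline would use neural embeddings, a vector store, and an actual LLM call.

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": bag-of-words term counts. A real system would use
    # a neural embedding model; this stand-in keeps the sketch runnable.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(count * b[term] for term, count in a.items())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve(query, corpus, k=2):
    # Rank documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(corpus, key=lambda doc: cosine(q, embed(doc)), reverse=True)
    return ranked[:k]

def build_prompt(query, corpus):
    # Retrieved context is prepended so the LLM can ground its answer.
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Illustrative "business communications" corpus (invented examples).
emails = [
    "Renewal discussion with Okta scheduled for next Tuesday",
    "Lunch menu for Friday office party",
    "Okta opportunity: pricing questions on the renewal quote",
]
prompt = build_prompt("What is the status of the Okta renewal?", emails)
```

The same retrieve-then-prompt shape carries over when the toy pieces are replaced: the embedding function becomes a model call, the sorted scan becomes an approximate nearest-neighbor lookup, and the prompt is sent to a hosted or self-served LLM.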
your profile
An experienced (5+ years) engineer who loves writing code.
Experience building and maintaining distributed data processing systems.
Someone who understands algorithms and is not afraid of occasional math.
Someone who strives to understand data by studying examples, developing and confirming hypotheses, and staying vigilant about results that are either unexpected or too convenient.
Experience working in a major cloud computing environment (AWS, Azure, or GCP).
Experience/interest in Large Language Models or Generative AI is a plus.
Proficiency in Python; PySpark is a plus.
the benefits
Discover them on our website!