AI Engineer, Business Operations
Job Locations: US-NJ-Paramus
ID: 2026-1947
Category: Information Technology
Type: Regular Full-Time
Overview
The AI Engineer, Business Operations will build the AI-powered services that form the backbone of our decision-intelligence platform. In this role, you will take AI models developed by AI Scientists and transform them into scalable, production-ready applications by designing inference pipelines, APIs, and supporting data flows. You will work closely with Data Engineers to integrate model pipelines with the broader data ecosystem and collaborate with business operations and commercial teams to convert manual, step-driven workflows into AI-native services. This includes building reliable batch and real-time inference systems that generate measurable impact across business operations, not limited to any specific domain. This is a high-impact role for engineers who enjoy turning research into products, hardening systems for real-world use, and building the engineering layer that enables AI to operate at scale. While not required, an interest in or exposure to MLOps practices is strongly preferred.
Responsibilities
- Productionize AI/ML models into scalable services (e.g., APIs, batch inference, streaming inference) with strong standards for reliability and performance.
- Collaborate with AI Scientists to convert research prototypes into production-ready components (feature computation, preprocessing, post-processing, evaluation loops).
- Integrate models with data pipelines built by Data Engineers and ensure seamless end-to-end flow from raw data to AI-driven output.
- Build and maintain inference pipelines using Python and orchestration frameworks (e.g., Airflow), supporting deployment across cloud and on-prem environments.
- Implement CI/CD, containerization, and automated testing to ensure safe, repeatable, and automated model deployments.
- Establish monitoring and observability for models and services (system metrics, data drift, performance regression, alerting).
- Partner with Business Operations and Commercial stakeholders to transform manual workflows into AI-enabled services that improve operational decision-making.
- Optimize end-to-end model serving latency, throughput, and cost using packaging strategies, scaling policies, caching, and parallelization.
- Contribute to documentation, reusable templates, and engineering best practices to accelerate AI adoption across the organization.
Qualifications
- Education: Bachelor's degree or higher in Computer Science, Engineering, or related technical field.
- Experience: 3+ years of software engineering experience, including building or deploying AI systems in production environments.
- Skills:
- Strong proficiency in Python for services, pipelines, and ML tooling.
- Experience deploying AI models in production across on-prem or cloud environments (AWS or Azure).
- Experience with big-data and orchestration frameworks (e.g., Spark, Airflow) for scalable pipelines.
- Strong understanding of software engineering best practices including CI/CD, containerization (Docker, Kubernetes), automated testing, and version control.
- Experience with model optimization techniques such as ONNX / ONNX Runtime, model quantization, or other performance-oriented inference tooling.
Strongly Preferred
- Interest in or exposure to MLOps concepts (model registries, feature stores, experiment tracking, automated retraining, monitoring).
- Master's degree or higher in a relevant field.
- Experience in regulated industries (e.g., biopharma, healthcare, or finance).
- A portfolio of launched AI/ML projects or contributions to production AI systems.
- Proficiency in SQL and familiarity with modern data warehouses such as Snowflake.
Who Thrives in This Role
- Engineers who enjoy transforming research into resilient, user-facing products.
- Builders who balance rapid iteration with production-grade engineering standards.
- Collaborators who can partner with business teams to convert manual workflows into scalable AI services.
- Pragmatic problem-solvers who can operate autonomously and drive impact in ambiguous, cross-functional settings.