QA Test Lead / Lead QA Engineer (AI-ML)
At Fluor, we are proud to design and build projects and careers. We are committed to fostering a welcoming and collaborative work environment that encourages big-picture thinking, brings out the best in our employees, and helps us develop innovative solutions that contribute to building a better world together. If this sounds like a culture you would like to work in, you’re invited to apply for this role.
Job Description
Role Overview
We are seeking a Lead QA Engineer / QA Test Lead with 7–10 years of experience to drive quality strategy and testing execution for AI/ML and data products. This role partners closely with data engineers, ML engineers, product owners, and DevOps/MLOps to ensure reliable, secure, and high-performing solutions across data pipelines, the ML model lifecycle, APIs, and user-facing analytics applications. You will lead test planning, automation, and governance while mentoring QA engineers and establishing best practices aligned with enterprise quality and delivery standards. The ideal candidate combines strong hands-on testing expertise with leadership experience in modern Agile delivery.
Key Responsibilities
- Lead end-to-end QA strategy for AI/ML and data initiatives, including data validation, model verification, API testing, and UI testing as applicable.
- Define and own test plans, test scenarios, acceptance criteria, and quality gates for features across ETL/ELT pipelines, analytics layers, and ML services.
- Design, build, and maintain scalable test automation frameworks (unit, integration, regression, and end-to-end) integrated into CI/CD pipelines.
- Validate data quality (completeness, accuracy, timeliness, lineage) using profiling, reconciliation, and automated checks across sources, transformations, and consumption layers.
- Test ML systems including model inputs/outputs, feature pipelines, performance metrics, drift/decay signals, and reproducibility across environments.
- Execute non-functional testing: performance, load, reliability, scalability, and resiliency testing for data services and ML inference endpoints.
- Ensure compliance and risk mitigation by incorporating testing for security, access controls, privacy, auditability, and secure SDLC controls in collaboration with security teams.
- Drive defect lifecycle management: triage, prioritization, root-cause analysis, and corrective/preventive actions with engineering teams.
- Mentor and guide QA engineers; establish standards for documentation, reporting dashboards, and continuous quality improvement.
- Collaborate in Agile ceremonies (backlog refinement, sprint planning, demos, retrospectives) and support release readiness decisions.
- Maintain quality metrics (test coverage, defect leakage, automation ROI, cycle time) and communicate risks and status to stakeholders.
- Support production validation (smoke checks, monitoring-based verification) and contribute to incident learnings to prevent recurrence.
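The data-validation and reconciliation work described above can be illustrated with a minimal sketch of the kind of automated check this role would design and run in CI. All data, field names, and the `reconcile` helper below are hypothetical, invented purely for illustration:

```python
# Minimal sketch of an automated source-to-target reconciliation check,
# as might run in a CI/CD quality gate. Rows are inlined for illustration;
# in practice they would come from source extracts and the loaded target.

def reconcile(source_rows, target_rows, key, amount_field):
    """Compare a source extract against the loaded target on row count,
    key completeness, and a summed measure (a basic accuracy check)."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "missing_keys": sorted(src_keys - tgt_keys),       # completeness
        "unexpected_keys": sorted(tgt_keys - src_keys),
        "sum_match": (
            sum(r[amount_field] for r in source_rows)
            == sum(r[amount_field] for r in target_rows)   # accuracy
        ),
    }

source = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 5.5}]
target = [{"id": 1, "amt": 10.0}, {"id": 2, "amt": 5.5}]

result = reconcile(source, target, key="id", amount_field="amt")
assert result["row_count_match"] and result["sum_match"]
assert not result["missing_keys"] and not result["unexpected_keys"]
print("reconciliation passed")
```

In practice a check like this would be wrapped in a test framework (e.g., PyTest) and wired into the pipeline so that a failed reconciliation blocks the release.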
Basic Job Requirements
- 7–10 years of overall QA/testing experience, with at least 2 years in a lead role (Test Lead / QA Lead / Lead QA Engineer).
- Bachelor’s degree in Computer Science / Information Technology.
- Strong experience testing data platforms (ETL/ELT, data lakes/warehouses) and APIs/microservices.
- Hands-on expertise with test automation using modern frameworks (e.g., Selenium/Playwright/PyTest).
- Strong SQL skills and experience validating complex datasets, transformations, and reconciliations.
- Experience with CI/CD and integrating automated tests into pipelines (e.g., Azure DevOps).
- Working knowledge of AI/ML concepts and ability to test model workflows (training, inference, monitoring) and data/feature pipelines.
- Excellent communication and stakeholder management skills; ability to translate quality risks into actionable plans.
- Proven ability to work in Agile/Scrum delivery models.
- Certifications (nice to have): ISTQB, Certified Agile Tester
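As one example of the model-monitoring knowledge called for above, a common drift signal is the population stability index (PSI). The sketch below is a toy, self-contained version; the binning scheme, epsilon, and threshold are illustrative assumptions, not a production implementation:

```python
import math

def population_stability_index(expected, actual, bins=4):
    """Toy PSI over equal-width bins -- one common signal for detecting
    drift in a model input between a baseline and a current window."""
    lo = min(expected + actual)
    hi = max(expected + actual)
    width = (hi - lo) / bins or 1.0  # guard against a degenerate range

    def bin_fracs(values):
        counts = [0] * bins
        for v in values:
            idx = min(int((v - lo) / width), bins - 1)
            counts[idx] += 1
        # Small epsilon avoids log(0) for empty bins
        return [max(c / len(values), 1e-6) for c in counts]

    e, a = bin_fracs(expected), bin_fracs(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

baseline = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
current = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
psi = population_stability_index(baseline, current)
assert psi < 0.1  # identical distributions: no meaningful drift
print(f"PSI = {psi:.4f}")
```

A monitoring-based verification suite might run a check like this on each scoring window and flag the model for review when the PSI crosses an agreed threshold.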
Other Job Requirements
Preferred Qualifications
- Experience testing MLOps workflows and tooling (e.g., MLflow, model registry, feature stores, automated retraining pipelines).
- Exposure to cloud data ecosystems (Azure) and services (e.g., Databricks, Synapse).
- Knowledge of data governance concepts (lineage, catalog, access control) and testing for compliance requirements.
- Experience with performance testing tools (e.g., JMeter) and observability (e.g., logs/metrics/traces).
- Familiarity with containerization and orchestration (Docker, Kubernetes) and service virtualization/mocking.
- Domain exposure to engineering/project delivery environments (EPC, construction, energy/chemicals, manufacturing) is a plus.
To be considered, candidates must be authorized to work in the country where the position is located.
We are an equal opportunity employer. All qualified individuals will receive consideration for employment without regard to race, color, age, sex, sexual orientation, gender identity, religion, national origin, disability, veteran status, genetic information, or any other criteria protected by governing law.