ML Data Engineer #978695 Job at Dexian, Seffner, FL

  • Dexian
  • Seffner, FL

Job Description

Job Title: Data Engineer – AI/ML Pipelines

Work Model: Hybrid (on-site 3 days a week)

Location: Seffner, FL

Position Summary

The Data Engineer – AI/ML Pipelines plays a key role in building and optimizing the data infrastructure that powers enterprise analytics and machine learning initiatives. This position focuses on developing robust, scalable, and intelligent data pipelines—from ingestion through feature engineering to model deployment and monitoring.

The ideal candidate has hands-on experience supporting end-to-end ML workflows, integrating operational data from Warehouse Management Systems (WMS) and ERP platforms, and enabling real-time predictive systems. This is a highly collaborative role, working across Data Science, ML Engineering, and Operations to ensure that models are fed with clean, reliable, and production-ready data.

Key Responsibilities

ML-Focused Data Engineering

  • Build and maintain data pipelines optimized for machine learning workflows and real-time model deployment.
  • Partner with data scientists to prepare, version, and monitor feature sets for retraining and evaluation.
  • Design and implement feature stores, data validation layers, and model input pipelines that ensure scalability and reproducibility (a brief sketch follows this list).
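
For illustration only, a minimal sketch of the kind of feature-preparation and validation step described above, assuming a pandas-based pipeline and a hypothetical WMS pick-event extract; the column names, schema, and aggregations are invented for the example.

    import pandas as pd

    def build_pick_features(wms_events: pd.DataFrame) -> pd.DataFrame:
        """Aggregate raw WMS pick events into per-order, model-ready features (hypothetical schema)."""
        # Validate the extract before feature engineering: required columns and non-null keys.
        required = {"order_id", "picked_at", "sku", "quantity"}
        missing = required - set(wms_events.columns)
        if missing:
            raise ValueError(f"WMS extract is missing columns: {missing}")

        events = wms_events.dropna(subset=["order_id", "picked_at"]).copy()
        events["picked_at"] = pd.to_datetime(events["picked_at"])

        # Simple per-order features: distinct lines, total units, and pick duration in minutes.
        features = events.groupby("order_id").agg(
            line_count=("sku", "nunique"),
            total_units=("quantity", "sum"),
            pick_minutes=("picked_at", lambda s: (s.max() - s.min()).total_seconds() / 60),
        ).reset_index()
        return features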

Data Integration from WMS & Operational Systems

  • Ingest, normalize, and enrich data from WMS, ERP, and telemetry platforms.
  • Model operational data to support predictive analytics and AI-driven warehouse automation use cases.
  • Develop integrations that provide high-quality, structured data to data science and business teams.

Pipeline Automation & Orchestration

  • Design, orchestrate, and automate modular pipelines using tools such as Azure Data Factory, Airflow, or Databricks Workflows (see the orchestration sketch after this list).
  • Ensure pipeline reliability, scalability, and monitoring for both batch and streaming use cases.
  • Implement CI/CD practices for data pipelines supporting ML deployment.
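
The posting names Azure Data Factory, Airflow, and Databricks Workflows as candidate orchestrators; as one possible pattern, here is a minimal Airflow 2.x sketch of a batch feature-refresh DAG. The DAG name, schedule, and task bodies are placeholders, not a description of the team's actual pipelines.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_wms():
        ...  # placeholder: pull the latest WMS/ERP extract

    def build_features():
        ...  # placeholder: run transformations and publish features

    with DAG(
        dag_id="wms_feature_refresh",        # hypothetical pipeline name
        start_date=datetime(2025, 1, 1),
        schedule="@hourly",                  # Airflow 2.4+ keyword; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_wms", python_callable=extract_wms)
        features = PythonOperator(task_id="build_features", python_callable=build_features)
        extract >> features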

Data Governance & Quality

  • Establish robust data quality frameworks, anomaly detection, and reconciliation checks (illustrated in the sketch after this list).
  • Maintain strong data lineage, versioning, and metadata management to ensure reproducibility and compliance.
  • Contribute to the organization’s broader data governance and MLOps standards.
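
As a concrete example of the reconciliation checks mentioned above, a minimal sketch assuming both the source system and the target table can be loaded as DataFrames; the key column and tolerance are illustrative.

    import pandas as pd

    def reconcile_row_counts(source: pd.DataFrame, target: pd.DataFrame,
                             key: str = "order_id", tolerance: float = 0.0) -> None:
        """Fail loudly if the target drops or duplicates keys relative to the source."""
        src_keys = source[key].nunique()
        tgt_keys = target[key].nunique()
        drift = abs(src_keys - tgt_keys) / max(src_keys, 1)
        if drift > tolerance:
            raise AssertionError(
                f"Reconciliation failed on {key}: source={src_keys}, target={tgt_keys}, drift={drift:.2%}"
            )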

Cross-Functional Collaboration

  • Collaborate closely with Data Scientists, ML Engineers, Software Engineers, and Operations teams to translate modeling requirements into technical solutions.
  • Serve as the technical liaison between data engineering and business users for ML-related data needs.

Documentation & Mentorship

  • Document data flows, feature transformations, and ML pipeline logic in a reproducible, team-friendly format.
  • Mentor junior data engineers and analysts on ML data architecture and best practices.

Required Qualifications

Technical Skills

  • Proven experience designing and maintaining ML-focused data pipelines and supporting model lifecycle workflows.
  • Proficient in Python, SQL, and data transformation tools such as dbt, Spark, or Delta Lake.
  • Strong understanding of cloud-based data platforms (Azure, Databricks) and data orchestration frameworks.
  • Familiarity with ML pipeline tools such as MLflow, TFX, or Kubeflow (a minimal MLflow tracking sketch follows this list).
  • Hands-on experience working with Warehouse Management Systems (WMS) or other operational logistics data.
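
For the MLflow familiarity listed above, a minimal experiment-tracking sketch; the experiment name, parameters, and metric values are invented, and the logged artifact is assumed to exist locally.

    import mlflow

    mlflow.set_experiment("wms_eta_model")            # hypothetical experiment
    with mlflow.start_run():
        mlflow.log_param("feature_set_version", "v3")
        mlflow.log_metric("validation_rmse", 4.2)
        mlflow.log_artifact("feature_schema.json")    # assumes this file exists in the working directory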

Experience

  • 5+ years in data engineering, with at least 2 years supporting AI/ML systems.
  • Proven track record building and maintaining production-grade pipelines in cloud environments.
  • Demonstrated collaboration with data scientists and experience turning analytical models into operational data products.

Education

  • Bachelor’s degree in Computer Science, Data Science, Engineering, or related field (Master’s preferred).
  • Relevant certifications are a plus (e.g., Azure AI Engineer, Databricks ML Associate, Google Professional Data Engineer).

Preferred Qualifications

  • Experience with real-time data ingestion technologies (Kafka, Kinesis, Event Hubs); a brief consumer sketch follows this list.
  • Exposure to MLOps best practices and CI/CD for ML and data pipelines.
  • Industry experience in logistics, warehouse automation, or supply chain analytics.
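
As an illustration of the real-time ingestion technologies listed above, a minimal consumer sketch using the kafka-python client; the topic name, broker address, and payload fields are assumptions rather than details of the actual environment.

    import json

    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "wms-pick-events",                        # hypothetical topic
        bootstrap_servers="localhost:9092",       # placeholder broker address
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="latest",
    )

    for message in consumer:
        event = message.value
        # Downstream: validate, enrich, and hand the event to the streaming feature pipeline.
        print(event.get("order_id"), event.get("picked_at"))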

