Principal Solutions Architect & Data Engineer

Timothy Pomaville

I bridge the plant floor and the data platform.

25 years in manufacturing operations across chemicals, automotive, and oil and gas. 6 years building data and software infrastructure that works in industrial environments.


I'm a multidisciplinary engineer with roots in Chemical Engineering and a career that's run through automotive, chemical, and aerospace manufacturing. That operational background isn't incidental. It shapes how I build data systems. I understand the processes the data describes, not just the pipelines that move it. I know what it means when a sensor goes offline, or a batch job fails, because I've been in the plant when it happens. This perspective drives me to build data platforms that aren't just technically robust, but operationally resilient and aligned with the realities of manufacturing environments.

Over the past six years I've moved deeper into platform and data engineering, designing and operating Kubernetes-native infrastructure for data-intensive workloads. My stack spans GitOps with ArgoCD, secrets management with Vault, SSO with Keycloak, and a full ML/data platform: Spark, Airflow, MLflow, Trino, JupyterHub, Metabase, deployed on self-hosted Kubernetes and secured end-to-end. I care about the layer that lets teams move fast without breaking things: reproducible deployments, automated security scanning, experiment tracking, and self-service tooling that doesn't require a platform ticket to use.

Most of my recent work sits at the intersection of manufacturing operations and modern data engineering: the space where OT historians, process data, and ML pipelines finally start talking to each other. I run Morning Star Engineering, a consultancy focused on that problem, and I'm always open to interesting work in the same vein.

My Take On AI

AI makes me faster. It doesn't make me smarter. That part is still on me. The real work in data/software engineering has never been writing code. It's understanding what the data represents, why a process behaves the way it does, and what the system actually needs. That requires domain knowledge and engineering judgment built over years, not tokens. I use AI as a tool, the same way I'd use any other. The thinking, the architecture decisions, the trade-offs: those are still mine. And in manufacturing and other domains, where bad assumptions don't just create bugs but can affect safety, quality, and yield, that distinction matters.

TECHNICAL SKILLS

Organized by domain

Data Engineering

Python · PySpark · SQL · Apache Spark · Delta Lake · Databricks · Apache Kafka · Apache Airflow · dbt · Great Expectations · Unity Catalog

Infrastructure & DevOps

Kubernetes · Docker · Terraform · Ansible · GitLab CI/CD · GitHub Actions · ArgoCD · Linux

Industrial / OT

OSI PI · Aspen IP.21 · Seeq · MES Systems · Process Control · SCADA

APIs & Backend

FastAPI · Flask · RESTful API Design

Cloud

AWS · Azure · GCP

Analytics & ML

MLflow · Grafana · Prometheus · Metabase · Machine Learning

EDUCATION & CERTIFICATIONS

Education & Professional Certifications

Bachelor of Science -- Chemical Engineering

University of Michigan, Ann Arbor, 2011

GPA: 3.0 / 4.0

Master of Science -- Software Engineering

California State University, Fullerton, 2022-2024

GPA: 3.9 / 4.0


MCS, Computer Science -- Artificial Intelligence Track

University of Illinois Urbana-Champaign

Expected: 2026

In Progress

Databricks Certified Data Engineer Associate

2025

Completed


Databricks Certified Data Engineer Professional

In Progress

Advanced Analytics Engineer -- Seeq

2024

Completed


Machine Learning Specialization

Stanford / DeepLearning.AI

2024

Completed


Six Sigma Green Belt Project Leader

2015

Completed

EXPERIENCE

Years of professional experience across disciplines

  • Manufacturing operations: 25 years
  • Data engineering: 6 years
  • Software engineering: 6 years
  • Platform engineering: 6 years

PROJECTS & WORK

Professional engagements and personal builds

Professional Work

Flare Gas Optimization

Dow Chemical, Plaquemine / 2015

Chronic flare gas losses with no reliable instrumentation or data pipeline to quantify or control them. Mined a decade of OSI PI historian data to establish baselines, instrumented an inline gas chromatograph, and wrote steam-to-hydrocarbon ratio control logic directly into the process control system.

Delivered a measurable, controllable process improvement. Bridged raw plant instrumentation to operational decision-making. Earned Six Sigma Green Belt credential for the project.

OSI PI · Process Control Systems · Data Pipeline Engineering · Six Sigma
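The ratio-control idea behind this project can be sketched in a few lines. This is an illustrative Python sketch only: the real logic ran inside the plant's process control system, and the setpoint, limits, and gain values here are hypothetical.

```python
# Illustrative sketch of steam-to-hydrocarbon ratio control.
# Target ratio and steam limits are hypothetical values -- the actual
# control logic was written into the DCS, not in Python.

def steam_to_hc_ratio(steam_flow: float, hc_flow: float) -> float:
    """Current steam-to-hydrocarbon ratio; guards against divide-by-zero."""
    return steam_flow / hc_flow if hc_flow > 0 else float("inf")

def steam_setpoint(hc_flow: float, target_ratio: float = 0.8,
                   min_steam: float = 0.0, max_steam: float = 50.0) -> float:
    """Ratio control: steam demand tracks hydrocarbon flow, clamped to limits."""
    demand = target_ratio * hc_flow
    return max(min_steam, min(max_steam, demand))
```

The clamping step matters in practice: a ratio controller that blindly tracks a bad flow reading can drive steam demand to unsafe values.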

IFF Data Platform and Kubernetes Migration

IFF (International Flavors and Fragrances)

Architected and led the migration of the analytics team data platform from a legacy environment to a containerized Kubernetes infrastructure. Built Python/SQL ETL and ELT pipelines and Flask/FastAPI data services supporting multiple ML and analytics teams.

Infrastructure migration contributed to cost savings in the millions. Administered global OSI PI historian supporting process and quality data across manufacturing sites.

Python · Kubernetes · Docker · FastAPI · Flask · OSI PI · GitLab CI/CD · SQL

Enterprise Seeq Integration Framework

IT Vizion

Designed a modular, template-driven Python framework for enterprise-scale industrial data integration with Seeq. Architected for reuse and extensibility across multi-site deployments. Established SOLID design principles as the architectural foundation.

Framework reduced per-deployment implementation time and created a repeatable pattern for enterprise clients across multiple manufacturing verticals.

Python · Seeq · OSI PI · FastAPI · Flask
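A template-driven framework like this typically centers on an abstract base connector: the workflow is fixed once, and each site or source supplies only its own extract/transform steps. The sketch below is a hypothetical reduction of that pattern; the class and method names are invented for illustration and are not Seeq's or IT Vizion's actual API.

```python
# Hypothetical sketch of a template-driven connector framework.
# All names here are illustrative, not the real framework's API.
from abc import ABC, abstractmethod

class SourceConnector(ABC):
    """Template method pattern: sync() fixes the workflow, subclasses
    supply source-specific steps (open for extension, closed for change)."""

    @abstractmethod
    def extract(self) -> list[dict]: ...

    def transform(self, rows: list[dict]) -> list[dict]:
        return rows  # default pass-through; override per source

    def sync(self) -> list[dict]:
        return self.transform(self.extract())

class CsvHistorianConnector(SourceConnector):
    """Example site-specific connector reusing the shared template."""
    def __init__(self, rows: list[dict]):
        self.rows = rows

    def extract(self) -> list[dict]:
        return self.rows

    def transform(self, rows: list[dict]) -> list[dict]:
        # Normalize tag names to a canonical form for downstream loading.
        return [{**r, "tag": r["tag"].upper()} for r in rows]
```

Keeping the orchestration in the base class is what makes multi-site reuse cheap: a new deployment implements two methods instead of re-implementing the pipeline.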

DCS Migration Requirements Engineering

DuPont / 2019 to 2020

Served as requirements lead for a plant-wide DCS migration. Authored and managed the full system requirements baseline from top-level process specifications down to component-level control logic across engineering and operations stakeholders.

Delivered a complete, traceable requirements baseline for a complex multi-system migration, maintaining formal change control through the full project lifecycle.

DCS · Requirements Engineering · Systems Engineering · Process Control

Industrial Data Observatory (Homelab)

Full medallion-architecture data platform running on Proxmox (Lenovo P52) with Kafka, Spark, Delta Lake, Unity Catalog OSS, Airflow, Metabase, Keycloak, Vault, Prometheus, and Grafana. Synthetic manufacturing sensor data demonstrates OT/IT integration patterns.

End-to-end production-grade infrastructure I can architect and operate solo. Used as a development environment for exploring new patterns before recommending them to clients.

Apache Kafka · Apache Spark · Delta Lake · Kubernetes · Docker · Terraform · Airflow · Grafana · Prometheus · Keycloak · Vault
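The synthetic sensor feed that drives a platform like this can be sketched in plain Python. Tag names and value ranges below are hypothetical; in the homelab, equivalent events would be produced to Kafka and landed raw as the bronze layer before silver/gold refinement.

```python
# Sketch of a synthetic manufacturing sensor event stream (hypothetical tags).
import json
import random
from datetime import datetime, timezone

TAGS = {"TI-101": (80.0, 120.0), "PI-204": (1.0, 5.0)}  # tag -> (lo, hi)

def sensor_event(tag: str, rng: random.Random) -> dict:
    """One historian-style reading, including an occasional bad-quality flag."""
    lo, hi = TAGS[tag]
    return {
        "tag": tag,
        "value": round(rng.uniform(lo, hi), 3),
        "ts": datetime.now(timezone.utc).isoformat(),
        "quality": "GOOD" if rng.random() > 0.02 else "BAD",  # ~2% bad reads
    }

def event_batch(n: int, seed: int = 42) -> list[str]:
    """Serialize n events as JSON strings, ready for a Kafka producer."""
    rng = random.Random(seed)
    tags = list(TAGS)
    return [json.dumps(sensor_event(rng.choice(tags), rng)) for _ in range(n)]
```

Emitting a `quality` flag alongside each value mirrors real OT historians, which is what makes the synthetic feed useful for exercising bad-data handling downstream.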

Open Source & Demos

Real-Time Streaming Data Platform

End-to-end reference architecture for manufacturing-adjacent data pipelines. Events flow from Kafka through Flink into PostgreSQL, transformed by dbt, and surfaced in Metabase dashboards.

Apache Kafka · Apache Flink · PostgreSQL · dbt · Apache Airflow · Trino · Metabase
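The Flink stage in a pipeline like this is, at its core, windowed aggregation. As a rough illustration, here is the same idea in plain Python: a tumbling-window average per tag. The field names (`tag`, `ts`, `value`) are assumptions for the sketch, not the project's actual schema.

```python
# Plain-Python illustration of what the Flink stage does conceptually:
# group events into fixed (tumbling) time windows and average per tag.
from collections import defaultdict

def tumbling_window_avg(events: list[dict], window_s: int = 60) -> dict:
    """Average `value` per (tag, window-start) bucket; `ts` is epoch seconds."""
    sums = defaultdict(lambda: [0.0, 0])
    for e in events:
        window_start = int(e["ts"]) // window_s * window_s
        bucket = (e["tag"], window_start)
        sums[bucket][0] += e["value"]
        sums[bucket][1] += 1
    return {k: total / count for k, (total, count) in sums.items()}
```

In the real pipeline Flink handles the parts this sketch ignores: event-time watermarks, late data, and state that survives restarts.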

Platform Console

Next.js internal developer portal integrating Keycloak SSO, embedded Grafana dashboards, and direct links to JupyterHub, Airflow, MLflow, and Metabase. Single pane of glass over a full data platform.

Next.js · Keycloak SSO · Kubernetes · ArgoCD · Vault · Grafana Embed

MLflow Experiment Tracking and Model Registry

Self-hosted MLflow on Kubernetes backed by PostgreSQL and MinIO. Provides a Databricks-compatible experiment tracking and model registry layer for JupyterHub workloads.

MLflow · PostgreSQL · MinIO · Kubernetes · JupyterHub · ArgoCD

CASE STUDIES

Enterprise platform engineering projects and architectural decisions

Enterprise Data Platform Migration

Project Overview

Built a production-grade, self-hosted data platform on Kubernetes to eliminate dependence on managed cloud services for data-intensive workloads, delivering Databricks-class capabilities (experiment tracking, distributed compute, orchestrated pipelines) at a fraction of the cost, with full control over data residency and security posture.

Architecture
  • Multi-node Kubernetes cluster with GitOps via ArgoCD App of Apps
  • Keycloak SSO with Google Workspace as upstream IdP, single login across all platform tools
  • Vault + External Secrets Operator for zero-secret-in-repo secret management
  • Prometheus + Grafana for real-time cluster and application monitoring
  • Data processing stack: Spark, Kafka, Flink, Airflow, Trino, JupyterHub, MLflow
  • 12TB PostgreSQL instance with MinIO object storage for large-scale data processing
  • dbt + Metabase for data transformation and self-service BI
  • Unity Catalog for data governance and access control
  • DevSecOps pipeline with SAST, dependency scanning, container signing, and DAST before every production deployment
Key Achievements
  • Zero-downtime migration from Docker Compose to production Kubernetes
  • Unified SSO across six platform applications via Keycloak + Google Workspace
  • Full ML lifecycle from JupyterHub notebook to MLflow-registered model artifact
  • Automated security gates: no image reaches production without passing Trivy, Grype, ZAP, and Nuclei scans
  • Scalable, cost-controlled architecture supporting 12TB+ data without a managed cloud dependency
Business Impact

Replaced a fragile, manually managed Docker Compose environment with a fully declarative, GitOps-driven platform that any engineer can onboard to in minutes. The result is a reproducible, auditable data infrastructure that supports manufacturing-adjacent analytics workloads with enterprise security controls typically found only in cloud-managed offerings.

LIVE PLATFORMS

Web properties and platforms I have built and operate

Platform Console

Internal developer portal for the Morning Star Engineering data platform. Single sign-on via Keycloak, embedded Grafana dashboards, and direct links to every platform tool — giving engineers one place to start their day.

Next.js · Keycloak SSO · Kubernetes · ArgoCD · Vault · Grafana Embed

morningstareng.com

Morning Star Engineering company website.

ERP Portal

Morning Star Engineering ERP portal for internal operations management.

BOOKS READ BY YEAR

Technical and professional reading, by year

  • Programming in Scala / Martin Odersky (2025)
  • Introduction to Algorithms / CLRS (2025)
  • Software Maintenance: Concepts and Practice / Penny Grubb and Armstrong A. Takang (2024)
  • Measuring the Software Process / Florac and Carleton (2023)
  • Software Architecture in Practice / Bass, Clements, and Kazman (2023)
  • Continuous Delivery / Jez Humble and David Farley (2023)
  • Practical Software Testing / Ilene Burnstein (2023)
  • Agile Project Management / Jim Highsmith (2023)
  • CMMI for Development / Mary Beth Chrissis, Mike Konrad, and Sandy Shrum (2023)
  • Software Requirements / Karl Wiegers and Joy Beatty (2023)
  • Managing the Software Process / Watts Humphrey (2022)

Reading target: 4 or more technical books per year.

CONTACT

Two ways to work together

Consulting Inquiries

Industrial data infrastructure, OT/IT integration, platform architecture, and manufacturing analytics through Morning Star Engineering.

morningstareng.com/contact

Engineering Roles

Staff and principal-level data engineering, platform engineering, and solutions architecture roles in industrial or technical environments.

tim@morningstareng.com

PORTFOLIO

  • This is my Seeq add-on project!
  • This is me!
  • This is me and my wife a week before our wedding!