Industry · Applied ML · Predictive ML · MLOps & Deployment

Over-injecting expensive additives is what you do when you can't measure the right dose.

A multinational mining & metallurgy group was buying quality with excess additive cost - we built the AI that bought it with precision instead.

Nickel refining at industrial scale runs on complex chemistry. Sulfurization and oxygenation steps require expensive additives, and the optimal dose depends on the variability of every batch coming out of the oven. The operating reality was that batch variability outran the team's ability to dose precisely - so additives were systematically over-injected as an insurance policy against quality failure. Profitable quality, but bought at a structural cost. The group had years of process data sitting in historians; what it didn't have was a reliable, data-driven way to translate batch characteristics into the right additive range. The question: could AI close that loop without putting quality at risk?

  1. Curate the training data, ruthlessly.

    Historical batches were split into those that met quality constraints and those that didn't. Within the quality-compliant set, abnormal additive quantities - the over-insurance doses - were removed using business-driven thresholds. The judgment call: model what good looks like, not what typically happens. Cleaning the training set was the highest-leverage decision in the project.

  2. Build one model per chemical step.

    Sulfurization and oxygenation have different chemistries, different feature relationships, and different optimization surfaces. We built dedicated predictive models for each, taking the full set of oven-output features as input and recommending optimal additive ranges as output.

  3. Anchor the predictions to quality, not just cost.

    The system was never optimizing additive cost alone - it was optimizing additive cost subject to maintained quality. Training was scoped exclusively to quality-compliant batches precisely so no recommendation could degrade output quality. Operators got a tight range, not a single number - preserving their judgment at the dosing station.
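The curation step above (step 1) amounts to a two-stage filter: keep only quality-compliant batches, then strip out the over-insured doses. A minimal sketch in pandas, where the column names (`quality`, `additive_dose`) and the quantile-based cutoff are illustrative assumptions, not the group's actual schema or thresholds:

```python
import pandas as pd

def curate_training_set(batches: pd.DataFrame,
                        quality_min: float,
                        dose_hi_quantile: float = 0.95) -> pd.DataFrame:
    """Keep quality-compliant batches, then drop over-insured doses so
    the model learns what a good, efficiently dosed batch looks like.

    Column names and the quantile cutoff are illustrative assumptions.
    """
    # 1. Keep only batches that met the quality constraint.
    compliant = batches[batches["quality"] >= quality_min]
    # 2. Within the compliant set, remove abnormally high additive doses
    #    (the "insurance" over-injections) against a business-style cutoff.
    cutoff = compliant["additive_dose"].quantile(dose_hi_quantile)
    return compliant[compliant["additive_dose"] <= cutoff]
```

Because the second filter runs only inside the compliant set, a batch that needed a genuinely high dose to hit quality is judged against its compliant peers, not against failures.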

A 15% reduction in additive cost across both refining steps, with no compromise to nickel quality. The profitability gain compounded across continuous production. The project was recognized with the BFM Business Grand Prize for Digital Acceleration in Industry 4.0 - independent validation that the approach was best-in-class for industrial AI in 2019. The unlock: a repeatable AI framework for optimizing other chemical processes across the group's operations.

In process industries, AI doesn't replace the operator - it gives them a better dial. Build the recommendation as a range, train only on the good batches, and the cost savings come with the quality preserved.
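One way to turn "recommend a range, not a number" into a model is a pair of quantile regressors per chemical step: the low and high quantiles of the dose distribution bound the recommendation. A sketch under stated assumptions - the source does not specify the model family, features, or quantiles, so the gradient-boosted quantile regression, the 10th/90th percentile bounds, and the feature layout here are all illustrative:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

class DoseRangeModel:
    """Per-step dose recommender: two quantile regressors bound a range
    rather than predicting a single number, preserving operator judgment
    at the dosing station. Model family and quantiles are assumptions."""

    def __init__(self, q_lo: float = 0.1, q_hi: float = 0.9):
        self.lo = GradientBoostingRegressor(loss="quantile", alpha=q_lo)
        self.hi = GradientBoostingRegressor(loss="quantile", alpha=q_hi)

    def fit(self, oven_features: np.ndarray, doses: np.ndarray):
        # Train only on curated, quality-compliant batches, so the
        # learned range cannot encode quality-degrading doses.
        self.lo.fit(oven_features, doses)
        self.hi.fit(oven_features, doses)
        return self

    def recommend(self, oven_features: np.ndarray) -> np.ndarray:
        # Return an (n, 2) array of [low, high] dose bounds per batch.
        lo = self.lo.predict(oven_features)
        hi = self.hi.predict(oven_features)
        return np.column_stack([np.minimum(lo, hi), np.maximum(lo, hi)])

# One dedicated model per chemical step, as in the case study.
models = {"sulfurization": DoseRangeModel(), "oxygenation": DoseRangeModel()}
```

The quality constraint lives in the training data, not the loss function: because every training batch already met quality, any dose inside the recommended range was historically compatible with compliant output.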

Buying quality with cost margin because the measurement isn't tight enough? We help industrial operators turn historical process data into recommendation loops that tighten over time - without putting output at risk.
