What transformers are available?
Feature-engine, a Python library for feature engineering
2 min read
Published Oct 3 2025
Feature-engine divides its functionality into several “families” of transformers. Here’s a high-level view of what’s supported and what each part does:
| Category | Purpose / Description | Examples of Transformers |
| --- | --- | --- |
| Missing Data Imputation | Replace missing (NaN) values in numerical or categorical features | MeanMedianImputer, CategoricalImputer, AddMissingIndicator, DropMissingData |
| Categorical Encoding | Convert categorical (string / object) variables into numeric representations | OneHotEncoder, OrdinalEncoder, CountFrequencyEncoder, MeanEncoder, RareLabelEncoder |
| Discretisation / Binning | Convert continuous numerical variables into discrete bins or intervals | EqualFrequencyDiscretiser, EqualWidthDiscretiser, DecisionTreeDiscretiser |
| Outlier Handling / Capping / Trimming | Identify and control extreme values (outliers) | Winsorizer, ArbitraryOutlierCapper, OutlierTrimmer |
| Variable Transformation | Apply mathematical transformations to numerical features to stabilise variance, reduce skewness, etc. | LogTransformer, ReciprocalTransformer, PowerTransformer, BoxCoxTransformer, YeoJohnsonTransformer |
| Feature Creation / Generation | Combine or derive new features from existing ones | MathFeatures, RelativeFeatures, CyclicalFeatures |
| Datetime / Time-series Features | Extract or generate useful attributes from datetime or temporal data | DatetimeFeatures, DatetimeSubtraction |
| Time Series / Windowing / Lags | Create lag features, rolling windows and expanding windows for forecasting and time-series ML | LagFeatures, WindowFeatures, ExpandingWindowFeatures |
| Feature Selection / Dropping / Filtering | Drop or select variables based on statistical properties, model performance, correlation, etc. | DropConstantFeatures, DropDuplicateFeatures, DropCorrelatedFeatures, SelectByShuffling, RecursiveFeatureElimination |
| Preprocessing / Matching / Wrapping | Utilities to ensure consistency in variable names and categories, or to wrap scikit-learn transformers | MatchCategories, MatchVariables, SklearnTransformerWrapper |
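Whatever the family, every transformer exposes the same scikit-learn-style `fit`/`transform` interface and operates directly on pandas DataFrames. Here is a minimal sketch, using a small hypothetical DataFrame made up for this example, that combines a transformer from the imputation family (`MeanMedianImputer`) with one from the encoding family (`OneHotEncoder`):

```python
import numpy as np
import pandas as pd
from feature_engine.imputation import MeanMedianImputer
from feature_engine.encoding import OneHotEncoder

# Hypothetical toy data: a numerical column with NaNs and a categorical column.
df = pd.DataFrame({
    "age": [25, np.nan, 40, 31, np.nan],
    "city": ["London", "Paris", "London", "Madrid", "Paris"],
})

# Missing data imputation: fill NaNs in "age" with its median.
imputer = MeanMedianImputer(imputation_method="median", variables=["age"])
df = imputer.fit_transform(df)

# Categorical encoding: expand "city" into one binary column per category.
encoder = OneHotEncoder(variables=["city"])
df = encoder.fit_transform(df)

print(df)
```

Because the transformers follow the scikit-learn API, they can also be chained inside a scikit-learn `Pipeline` alongside a model.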
For the full list, please look at the documentation here. In the next few sections, we will walk through examples of the most common transformers.