Enhancing image feature extraction to boost image classification accuracy has been a significant research focus at the MCL lab. Initially, PixelHop++ was developed to extract image features efficiently and perform accurate image classification. Subsequently, the Least-Squares Normal Transform (LNT) was introduced to further enrich these features, improving classification results with PixelHop++ on standard image databases such as MNIST and FMNIST. Although these methods achieve strong performance, further refinement is desirable to push accuracy higher.

To address this, we propose a pipeline of four distinct experimental setups that apply different pooling strategies, absolute-maximum pooling and variance pooling, at hops 1 and 2. In each setup, LNT features are extracted from hops 1, 3, and 4. At hop 1, pooling (absolute-maximum or variance) generates 10 LNT features per channel, for 250 features in total. At hop 3, the (N, 3, 3, Feature) tensor is transformed to produce 90 LNT features. At hop 4, 10 additional LNT features are obtained after DFT-based feature selection. These 350 LNT features from hops 1, 3, and 4 are then concatenated with the selected hop-4 features. Finally, the features aggregated across all four experimental setups are combined, and a 10-class classifier trained on this comprehensive feature set demonstrates an improvement in classification performance.
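The two pooling strategies can be sketched as follows. This is a minimal NumPy illustration, not the lab's implementation: it assumes channel-last (N, H, W, C) tensors and non-overlapping 2x2 windows, where absolute-maximum pooling keeps the value with the largest magnitude (preserving sign) and variance pooling keeps each window's variance.

```python
import numpy as np

def _to_windows(x, win):
    """Reshape (N, H, W, C) into (N, H//win, W//win, win*win, C) windows."""
    N, H, W, C = x.shape
    x = x[:, :H - H % win, :W - W % win, :]  # crop so H, W divide evenly
    x = x.reshape(N, H // win, win, W // win, win, C).transpose(0, 1, 3, 2, 4, 5)
    return x.reshape(N, H // win, W // win, win * win, C)

def abs_max_pool(x, win=2):
    """Keep the signed value with the largest magnitude in each window."""
    w = _to_windows(x, win)
    idx = np.abs(w).argmax(axis=3)
    return np.take_along_axis(w, idx[:, :, :, None, :], axis=3).squeeze(axis=3)

def var_pool(x, win=2):
    """Keep the variance of each window."""
    return _to_windows(x, win).var(axis=3)
```

Either pooled output would then feed the LNT stage; the 250-, 90-, and 10-dimensional feature groups from hops 1, 3, and 4 can simply be concatenated along the feature axis (e.g. `np.concatenate([f1, f3, f4], axis=1)`) before classification.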
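For the hop-4 selection step, assuming DFT here refers to the Discriminant Feature Test used in related MCL work, the idea can be sketched as follows: each feature dimension is scored by scanning candidate split points and measuring the weighted entropy of the resulting label partition, and the lowest-scoring (most discriminant) dimensions are kept. The function names and the number of candidate thresholds below are illustrative choices, not the original implementation.

```python
import numpy as np

def _entropy(y):
    """Shannon entropy of an integer label array."""
    p = np.bincount(y) / len(y)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def dft_score(feature, labels, n_candidates=32):
    """Minimum weighted entropy over candidate thresholds (lower = more discriminant)."""
    best = np.inf
    for t in np.linspace(feature.min(), feature.max(), n_candidates + 2)[1:-1]:
        mask = feature <= t
        left, right = labels[mask], labels[~mask]
        if len(left) == 0 or len(right) == 0:
            continue
        w = len(left) / len(labels)
        best = min(best, w * _entropy(left) + (1 - w) * _entropy(right))
    return best

def select_top_k(X, labels, k=10):
    """Indices of the k most discriminant columns of X, e.g. k = 10 at hop 4."""
    scores = np.array([dft_score(X[:, d], labels) for d in range(X.shape[1])])
    return np.argsort(scores)[:k]
```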