Tuesday, 6 May 2025

The Motif Recognition Paper: Traditional Indian Textiles Classification using Deep Feature Fusion with Curvelet Transforms: Varshney et al.

🔍 Objective

The paper proposes a hybrid feature fusion approach to classify traditional Indian textile patterns using:

  • Pre-trained CNN models (InceptionResNetV2 and VGG16)

  • Curvelet Transform (to capture curved edges better than wavelets)


📚 Dataset

  • Total images: 1046

  • Textile styles: Batik, Chikankari, Ikat, Kalamkari, Kashida, Madhubani, Warli

  • Image size: 1200×1200 pixels

  • Train/Test split: 90% / 10%
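
As a rough sketch, a 90/10 split of the 1,046 images could look like the following. This is illustrative only: the labels are random placeholders, and whether the paper stratifies the split by textile style is an assumption.

```python
import numpy as np
from sklearn.model_selection import train_test_split

styles = ["Batik", "Chikankari", "Ikat", "Kalamkari",
          "Kashida", "Madhubani", "Warli"]

# Placeholder data: 1046 dummy feature vectors with random style labels
# (the per-class counts are an assumption, not from the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(1046, 8))
y = rng.choice(styles, size=1046)

# 90% train / 10% test, stratified so each style keeps its proportion
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.10, stratify=y, random_state=0
)
print(len(X_train), len(X_test))
```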


🔧 Methodology

  1. Preprocessing: CLAHE (Contrast Limited Adaptive Histogram Equalization) and resizing to 299×299 (InceptionResNetV2) or 224×224 (VGG16).

  2. Feature Extraction:

    • CNN-based: InceptionResNetV2 and VGG16

    • Handcrafted: Curvelet Transform (4 and 5 scale levels)

  3. Fusion: Combine CNN and Curvelet features.

  4. Classification: using three classifiers

    • XGBoost (XGB)

    • Gradient Boosting

    • Logistic Regression

  5. Interpretability: Grad-CAM heatmaps to visualize feature focus areas.
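
The fusion and classification steps (3 and 4) can be sketched as follows. This is a minimal illustration, not the paper's code: the CNN and curvelet features are replaced by random placeholders of assumed dimensionality (1536 for InceptionResNetV2's global-pooled output, 256 for flattened curvelet coefficient statistics), and "combine" is read as simple concatenation, which is one common choice.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 200  # placeholder sample count

# Stand-ins for the two feature streams (dimensions are assumptions):
cnn_feats = rng.normal(size=(n, 1536))   # e.g. InceptionResNetV2 pooled features
curv_feats = rng.normal(size=(n, 256))   # e.g. flattened curvelet statistics

# Fusion by concatenation along the feature axis
fused = np.concatenate([cnn_feats, curv_feats], axis=1)
y = rng.integers(0, 7, size=n)           # 7 textile classes

# Logistic Regression is one of the three classifiers the paper compares;
# scaling first since the two feature streams may have different ranges.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(fused, y)
print(fused.shape)
```

XGBoost or Gradient Boosting would slot into the same pipeline in place of Logistic Regression.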


📊 Key Results

  • Best combination: Curvelet + InceptionResNetV2 without preprocessing

    • Accuracy: 97.15%

    • Precision: 98.24%

    • Recall/F1: 97.15%

    • Specificity: 99.52%

  • VGG16 + Curvelet + Preprocessing achieved 91.43% accuracy.
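
All the reported metrics (precision, recall, F1, specificity) can be derived from a multiclass confusion matrix. A small sketch with a toy 3-class matrix; the paper's exact averaging scheme (macro vs. weighted) is not stated in this summary.

```python
import numpy as np

def per_class_metrics(cm):
    """Precision, recall, F1, and specificity per class from a confusion
    matrix (rows = true class, columns = predicted class)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp
    fn = cm.sum(axis=1) - tp
    tn = cm.sum() - tp - fp - fn
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    specificity = tn / (tn + fp)
    return precision, recall, f1, specificity

# Toy 3-class example (not the paper's data)
cm = np.array([[8, 1, 1],
               [0, 9, 1],
               [1, 0, 9]])
p, r, f1, spec = per_class_metrics(cm)
print(np.round(r, 3))  # per-class recall
```

Averaging the per-class values then yields single summary numbers like those reported above.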


🧠 Insights

  • The curvelet transform adds rotational robustness and captures curved motifs better than wavelets.

  • InceptionResNetV2 fused with curvelet features performs significantly better than either the CNN or the curvelet features alone.

  • Preprocessing has a mixed impact depending on the model used.


🔮 Conclusion

The fusion approach significantly boosts classification accuracy for Indian textile patterns. The method has potential applications in:

  • Automated textile cataloging

  • Design inspiration systems

  • Cultural preservation via AI

Please refer to this link

