Deep learning has revolutionized fields like computer vision and NLP, but on tabular data, gradient boosting methods still hold strong. The past few years have seen innovative transformer-based methods for tabular data, such as TabNet, SAINT, Hopular, and TabPFN. Let's find out whether these methods are gradient boosting killers or overhyped!
– What makes gradient boosting such a strong competitor to deep learning on tabular data in particular
– Which concrete innovations the transformer-based models bring
– Practical considerations for choosing a classification or regression algorithm for tabular data