RT Journal Article
T1 Redundancy Is Not Necessarily Detrimental in Classification Problems
A1 Grillo, Sebastián Alberto
A1 Vázquez Noguera, José Luis
A1 Mello Román, Julio César
A1 García Torres, Miguel
A1 Facon, Jacques
A1 Pinto Roa, Diego Pedro
A1 Salgueiro, Luis Fernando
A1 Bareiro Paniagua, Laura Raquel
A1 Leguizamón Correa, Deysi Natalia
A2 Universidad Americana (PY)
A2 Universidad Nacional de Concepción (PY)
AB In feature selection, redundancy is one of the major concerns, since the removal of redundancy in data is connected with dimensionality reduction. Despite the evidence of such a connection, few works present theoretical studies regarding redundancy. In this work, we analyze the effect of redundant features on the performance of classification models. The contribution of this work can be summarized as follows: (i) we develop a theoretical framework to analyze feature construction and selection, (ii) we show that certain properly defined features are redundant but make the data linearly separable, and (iii) we propose a formal criterion to validate feature construction methods. The experimental results suggest that a large number of redundant features can reduce the classification error. They also imply that it is not enough to analyze features solely with criteria that measure the amount of information such features provide.
YR 2021
FD 2021
LK http://hdl.handle.net/20.500.14066/3777
UL http://hdl.handle.net/20.500.14066/3777
LA eng
NO CONACYT - Consejo Nacional de Ciencia y Tecnología
DS MINDS@UW
RD 03-Nov-2024