A. Antos and I. Kontoyiannis, Convergence properties of functional estimates for discrete distributions, Random Struct. Algor., vol. 19, pp. 163-193, 2001.

G. Brown, A. Pocock, M. Zhao, and M. Luján, Conditional Likelihood Maximisation: A Unifying Framework for Information Theoretic Feature Selection, J. Mach. Learn. Res., vol. 13, pp. 27-66, 2012.

T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006.

L. M. Ghiringhelli, J. Vybiral, S. V. Levchenko, et al., Big data of materials science: Critical role of the descriptor, Phys. Rev. Lett., vol. 114, p. 105503, 2015.

C. Giannella and E. L. Robertson, On approximation measures for functional dependencies, Inform. Syst., vol. 29, pp. 483-507, 2004.

B. R. Goldsmith, M. Boley, J. Vreeken, M. Scheffler, and L. M. Ghiringhelli, Uncovering structure-property relationships of materials by subgroup discovery, New J. Phys., vol. 19, p. 013031, 2017.

I. Guyon and A. Elisseeff, An Introduction to Variable and Feature Selection, J. Mach. Learn. Res., vol. 3, pp. 1157-1182, 2003.

J. Han, J. Pei, Y. Yin, and R. Mao, Mining Frequent Patterns without Candidate Generation: A Frequent-Pattern Tree Approach, Data Min. Knowl. Discov., vol. 8, pp. 53-87, 2004.

Y. Huhtala, J. Kärkkäinen, P. Porkka, and H. Toivonen, TANE: An efficient algorithm for discovering functional and approximate dependencies, Comput. J., vol. 42, pp. 100-111, 1999.

P. Mandros, M. Boley, and J. Vreeken, Discovering Reliable Approximate Functional Dependencies, KDD, ACM, pp. 355-363, 2017.

P. Mandros, M. Boley, and J. Vreeken, Discovering Reliable Dependencies from Data: Hardness and Improved Algorithms, ICDM, pp. 317-326, 2018.

N. X. Vinh, J. Chan, and J. Bailey, Reconsidering Mutual Information Based Feature Selection: A Statistical Significance View, AAAI, pp. 2092-2098, 2014.

T. Papenbrock, J. Ehrlich, J. Marten, T. Neubert, J.-P. Rudolph, et al., Functional dependency discovery: An experimental evaluation of seven algorithms, Proc. VLDB Endow., vol. 8, pp. 1082-1093, 2015.

F. Pennerath, An Efficient Algorithm for Computing Entropic Measures of Feature Subsets, ECML-PKDD, vol. 11052, pp. 483-499, 2018. URL: https://hal.archives-ouvertes.fr/hal-01897734

S. Romano, N. X. Vinh, J. Bailey, and K. Verspoor, A Framework to Adjust Dependency Measure Estimates for Chance, SDM, SIAM, pp. 423-431, 2016.

M. S. Roulston, Estimating the errors on measured entropy and mutual information, Physica D, vol. 125, issue 3, pp. 285-294, 1999.

S. Schober, Some worst-case bounds for Bayesian estimators of discrete distributions, ISIT, IEEE, pp. 2194-2198, 2013.

T. Schürmann and P. Grassberger, Entropy estimation of symbol sequences, Chaos, vol. 6, pp. 414-427, 1996.

J. Suzuki, A construction of Bayesian networks from databases based on an MDL principle, UAI, pp. 266-273, 1993.

J. Suzuki, Mutual Information Estimation: Independence Detection and Consistency, ISIT, IEEE, pp. 2514-2518, 2019.

I. Tsamardinos, C. F. Aliferis, and A. Statnikov, Algorithms for Large Scale Markov Blanket Discovery, FLAIRS, pp. 376-380, 2003.