Feature-to-feature regression for a two-step conditional independence test
The algorithms for causal discovery and more broadly for learning the structure of graphical models require well calibrated and consistent conditional independence (CI) tests. We revisit the CI tests which are based on two-step procedures and involve regression with subsequent (unconditional) independence test (RESIT) on regression residuals and investigate the assumptions under which these tests operate. In particular, we demonstrate that when going beyond simple functional relationships with additive noise, such tests can lead to an inflated number of false discoveries. We study the relationship of these tests with those based on dependence measures using reproducing kernel Hilbert spaces (RKHS) and propose an extension of RESIT which uses RKHS-valued regression. The resulting test inherits the simple two-step testing procedure of RESIT, while giving correct Type I control and competitive power. When used as a component of the PC algorithm, the proposed test is more robust to the case where hidden variables induce a switching behaviour in the associations present in the data.
Main Authors: Zhang, Q; Filippi, S; Flaxman, S; Sejdinovic, D
Format: Conference item
Language: English
Published: Association for Uncertainty in Artificial Intelligence, 2017
Institution: University of Oxford
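The abstract describes a two-step residual-based procedure: regress X on Z and Y on Z, then apply an unconditional independence test to the two residual vectors. The snippet below is a minimal sketch of that idea, not the authors' implementation: the kernel ridge regressions, Gaussian kernels with a median-heuristic bandwidth, and the permutation-based HSIC statistic are illustrative assumptions. As the abstract points out, such a plain residual test need not control Type I error beyond additive-noise relationships, which is the gap the paper's RKHS-valued (feature-to-feature) regression is designed to close.

```python
# Minimal sketch of a two-step RESIT-style conditional independence test
# (illustrative choices only; not the paper's exact method).
import numpy as np
from sklearn.kernel_ridge import KernelRidge


def _gaussian_gram(a, bandwidth):
    """Gaussian-kernel Gram matrix for a one-dimensional sample."""
    sq_dists = (a[:, None] - a[None, :]) ** 2
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))


def _median_bandwidth(a):
    """Median heuristic for the kernel bandwidth."""
    dists = np.abs(a[:, None] - a[None, :])
    positive = dists[dists > 0]
    return np.median(positive) if positive.size else 1.0


def _hsic(rx, ry):
    """Biased HSIC estimate of dependence between two residual vectors."""
    n = len(rx)
    K = _gaussian_gram(rx, _median_bandwidth(rx))
    L = _gaussian_gram(ry, _median_bandwidth(ry))
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2


def resit_ci_test(x, y, z, n_perm=200, seed=0):
    """Permutation p-value for H0: X independent of Y given Z."""
    rng = np.random.default_rng(seed)
    z = z.reshape(-1, 1)
    # Step 1: nonparametric regressions of X and Y on Z; keep the residuals.
    rx = x - KernelRidge(kernel="rbf", alpha=1e-3).fit(z, x).predict(z)
    ry = y - KernelRidge(kernel="rbf", alpha=1e-3).fit(z, y).predict(z)
    # Step 2: unconditional independence test on the residuals.
    stat = _hsic(rx, ry)
    null = [_hsic(rx, rng.permutation(ry)) for _ in range(n_perm)]
    return (1 + np.sum(np.array(null) >= stat)) / (1 + n_perm)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 300
    z = rng.normal(size=n)
    x = np.sin(z) + 0.3 * rng.normal(size=n)   # X depends only on Z
    y = np.cos(z) + 0.3 * rng.normal(size=n)   # Y depends only on Z
    print("p-value under the null X ind. Y given Z:", resit_ci_test(x, y, z))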