NIPS2009_0331_slide

Toward Provably Correct Feature Selection in Arbitrary Domains (Dimitris Margaritis)

- Optimal feature selection for X
  - Find the Markov boundary (minimal Markov blanket) of X
  - Pr(X | all vars) = Pr(X | boundary of X)
- Many algorithms exist for "normal" probability distributions
  - They don't work for corner cases (e.g., parity functions in discrete domains); such cases contain higher-order interactions (involving more variables) and are hard to detect
- We give two algorithms (one deterministic, one randomized) and a basic theorem for provably correct, approximate algorithms that handle such cases as well; a parameter called the algorithm's "margin"

[Figures: two example distributions over a target variable X, its boundary variables, and 4 remaining domain variables. In the first, all singletons, pairs, triplets, and the single quad are independent; in the second, all singletons and pairs are independent, but independence is unknown for any triplet or for the quad.]
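The parity corner case can be made concrete with a small exact computation (an illustrative sketch, not code from the slides): let the target X be the XOR of three uniform bits. Every singleton and every pair of bits is marginally independent of X, so any method that tests only low-order interactions would discard all three variables, even though together they form the Markov boundary of X.

```python
# Illustrative sketch (assumed example, not from the paper): a parity
# distribution where dependence on the target X appears only at the
# highest interaction order.
from itertools import product, combinations
from math import log2

# Joint distribution over 3 uniform bits; target X is their parity (XOR).
points = [(bits, bits[0] ^ bits[1] ^ bits[2]) for bits in product([0, 1], repeat=3)]
p = 1 / len(points)  # each bit assignment is equally likely

def mutual_information(idx):
    """I(X; bits[idx]) in bits, computed exactly from the joint distribution."""
    joint, px, ps = {}, {}, {}
    for bits, x in points:
        s = tuple(bits[i] for i in idx)
        joint[(s, x)] = joint.get((s, x), 0) + p
        px[x] = px.get(x, 0) + p
        ps[s] = ps.get(s, 0) + p
    return sum(q * log2(q / (ps[s] * px[x])) for (s, x), q in joint.items())

for k in (1, 2, 3):
    mis = [mutual_information(c) for c in combinations(range(3), k)]
    print(k, [round(m, 6) for m in mis])
# prints:
# 1 [0.0, 0.0, 0.0]
# 2 [0.0, 0.0, 0.0]
# 3 [1.0]
```

Singletons and pairs carry zero information about X; only the full triplet does. An independence-based selector that stops at pairwise tests would wrongly report an empty Markov boundary, which is exactly why these distributions are hard corner cases.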
