
Related Work

[The following text is adapted from Machine Learning by Function Decomposition (Zupan et al. 1997); the full version of the paper is available for download.]

The decomposition approach to machine learning was pioneered by A. Samuel. He proposed a method based on a signature table system (Samuel, 1967) and successfully used it as an evaluation mechanism for his checkers-playing programs. The approach was later improved by Biermann et al. (1982). Neither method, however, addressed the problem of deriving the structure of concepts.

A similar approach had been defined even earlier within the area of switching circuit design. Ashenhurst (1952) reported a unified theory of decomposition of switching functions. His decomposition method was essentially the same as that later used by Biermann et al., except that it was applied to decompose the truth table of a specific Boolean function, which was then realized with standard binary gates. Most other related work of that period is reported and reprinted by Curtis (1962).
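The core test behind Ashenhurst's simple decomposition can be illustrated with a short sketch (an illustrative reconstruction, not code from the original work): to decide whether a truth table f(x0, x1, x2) can be rewritten as g(h(x0, x1), x2), one counts the distinct columns of the partition matrix whose columns are indexed by the bound set {x0, x1} and whose rows are indexed by the free set {x2}. If the column multiplicity is at most 2, a single binary intermediate function h suffices. The function and variable names below are hypothetical, chosen only for the example.

```python
from itertools import product

def column_multiplicity(f, bound_vars, free_vars, n):
    """Count distinct columns of the partition matrix of f:
    one column per assignment to the bound variables, one row
    per assignment to the free variables."""
    columns = set()
    for bound in product((0, 1), repeat=len(bound_vars)):
        col = []
        for free in product((0, 1), repeat=len(free_vars)):
            args = [0] * n
            for v, b in zip(bound_vars, bound):
                args[v] = b
            for v, b in zip(free_vars, free):
                args[v] = b
            col.append(f(*args))
        columns.add(tuple(col))
    return len(columns)

# Example: f(x0, x1, x2) = (x0 AND x1) OR x2.
f = lambda x0, x1, x2: (x0 & x1) | x2

# Bound set {x0, x1}, free set {x2}: multiplicity <= 2 means a simple
# (Ashenhurst) decomposition f = g(h(x0, x1), x2) with binary h exists.
mu = column_multiplicity(f, bound_vars=(0, 1), free_vars=(2,), n=3)
print(mu)  # 2 -> decomposable for this partition
```

For contrast, the three-input majority function has column multiplicity 3 for the same partition, so it admits no simple decomposition with a single binary h over the bound set {x0, x1}.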

More recently, the Ashenhurst-Curtis approach was substantially improved by the research groups of M. A. Perkowski, T. Luba, and T. D. Ross. Perkowski et al. (1995) report on a decomposition approach for incompletely specified switching functions. Luba (1995) proposes a method for the decomposition of multi-valued switching functions in which each multi-valued variable is encoded by a set of Boolean variables; the authors also identify the potential usefulness of function decomposition for machine learning. Goldman (1995) evaluates FLASH, a Boolean function decomposer, on a set of eight-attribute binary functions and shows its robustness in comparison with the C4.5 decision tree inducer.

Not much work has been done to extend the algorithm to continuous functions and noisy domains. The only previous work that we are aware of is that of Ross et al. (1994), who discuss the possible use of the method for functions with real-valued inputs and outputs, but do not propose any algorithm for general use.

Feature discovery has been investigated at large within constructive induction (Michalski, 1986). Perhaps closest to the function decomposition method are the constructive induction systems that use a set of existing attributes and a set of predefined constructive operators to derive new attributes (Pfahringer, 1994; Ragavan and Rendell, 1993).

Within machine learning, there are other approaches that are based on problem decomposition, but where the problem is decomposed by the expert rather than discovered by a machine. A well-known example is structured induction (a term introduced by Donald Michie), applied by Shapiro (1987). His approach is based on a manual decomposition of the problem and an expert-assisted selection of examples to construct rules for the concepts in the hierarchy. In comparison with standard decision tree induction techniques, structured induction exhibits about the same classification accuracy, but with increased transparency and lower complexity of the developed models. Michie (1995) emphasized the important role of structured induction and listed several real problems that were solved in this way.

Concept hierarchies have also been used by DEX (Bohanec, 1990), a multi-attribute decision-support expert system shell. There, a tree-like structure of variables is defined by a domain expert. DEX has been successfully applied to more than 50 realistic decision-making problems.

The function decomposition method and HINT, both presented on these pages, borrow from three different research areas: they share their motivation with structured induction and the structured approach to decision support, while their core is based on Ashenhurst-Curtis function decomposition. In comparison with related work, our early work on function decomposition was original in the following respects: a new method for handling multi-valued attributes and classes, improved decomposition heuristics, an emphasis on the generalization effects of decomposition, strong attention to the discovery of meaningful concept hierarchies, and experimental evaluation on machine learning problems. More recently, we have focused on noise handling and the use of function decomposition in constructive induction, and have studied the effects of human intervention on the quality of the resulting concept hierarchies.


Ashenhurst, R. L.: 1952, The decomposition of switching functions, Technical report, Bell Laboratories BL-1 (11), 541-602.

Biermann, A. W., Fairfield, J. and Beres, T.: 1982, Signature table systems and learning, IEEE Trans. Syst. Man Cybern. 12(5), 635-648.

Curtis, H. A.: 1962, A New Approach to the Design of Switching Functions, Van Nostrand, Princeton, N. J.

Luba, T.: 1995, Decomposition of multiple-valued functions, Proc. 25th Intl. Symposium on Multiple-Valued Logic, Bloomington, Indiana, 256-261.

Michalski, R. S.: 1986, Understanding the nature of learning: issues and research directions, in R. Michalski, J. Carbonell and T. Mitchell (eds), Machine Learning: An Artificial Intelligence Approach, Kaufmann, Los Altos, CA, 3-25.

Michie, D.: 1995, Problem decomposition and the learning of skills, in N. Lavrac and S. Wrobel (eds), Machine Learning: ECML-95, Lecture Notes in Artificial Intelligence 912, Springer-Verlag, 17-31.

Perkowski, M. A. et al.: 1995, Unified approach to functional decompositions of switching functions, Technical report, Warsaw University of Technology and Eindhoven University of Technology.

Pfahringer, B.: 1994, Controlling constructive induction in CiPF, in F. Bergadano and L. De Raedt (eds), Machine Learning: ECML-94, Springer-Verlag, 242-256.

Ragavan, H. and Rendell, L.: 1993, Lookahead feature construction for learning hard concepts, Proc. Tenth International Machine Learning Conference, Morgan Kaufmann, 252-259.

Ross, T. D. et al.: 1994, On the decomposition of real-valued functions, 3rd International Workshop on Post-Binary VLSI Systems.

Samuel, A.: 1967, Some studies in machine learning using the game of checkers II: Recent progress, IBM J. Res. Develop. 11, 601-617.

Shapiro, A. D.: 1987, Structured induction in expert systems, Turing Institute Press in association with Addison-Wesley Publishing Company.