This paper introduces the theory of bi-decomposition of Boolean functions. This approach exploits the functional properties of a Boolean function to find an associated multilevel circuit representation with a very short delay, built from simple two-input gates. The machine learning process is based on the Boolean Differential Calculus and aims to detect the profitable functional properties available in the Boolean function. For clear understanding, the bi-decomposition of completely specified Boolean functions is introduced first. Significantly better chances of success arise for the bi-decomposition of incompletely specified Boolean functions, which is discussed second. The inclusion of the weak bi-decomposition allows the completeness of the suggested decomposition method to be proved. The basic task for machine learning consists of determining the decomposition type and the dedicated sets of variables. Building on this knowledge, a complete recursive design algorithm is suggested. Experimental results over the MCNC benchmarks show that bi-decomposition outperforms SIS and other BDD-based decomposition methods in terms of area and delay of the resulting circuits, with comparable CPU time. Switching from the ON-set/OFF-set model of Boolean function lattices to their upper- and lower-bound model opens a new view of bi-decomposition. This new form of the bi-decomposition theory makes a comprehensible generalization of bi-decomposition to multi-valued functions possible.
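To make the core idea concrete, the following sketch checks one bi-decomposition type, the OR bi-decomposition f(a, b, c) = g(a, c) OR h(b, c), for a completely specified function given as a truth table. It uses the standard observation that the largest admissible components are g(a, c) = AND over all b of f and h(b, c) = AND over all a of f, so the decomposition exists iff their OR reproduces f. The function name and the dict-based truth-table representation are illustrative assumptions, not the paper's implementation, which works with the Boolean Differential Calculus rather than explicit tables.

```python
from itertools import product

def or_bi_decomposable(f, n_a, n_b, n_c):
    """Check whether a completely specified Boolean function f(a, b, c)
    has an OR bi-decomposition f = g(a, c) | h(b, c), where a is the
    dedicated set of n_a variables, b the dedicated set of n_b variables,
    and c the common set of n_c variables.

    f maps keys (a, b, c) -- each a bit-tuple -- to 0/1.
    """
    A = list(product((0, 1), repeat=n_a))
    B = list(product((0, 1), repeat=n_b))
    C = list(product((0, 1), repeat=n_c))
    # Largest candidate components: g is 1 only where f is 1 for every b,
    # and symmetrically for h; any valid g', h' must lie below these.
    g = {(a, c): all(f[(a, b, c)] for b in B) for a in A for c in C}
    h = {(b, c): all(f[(a, b, c)] for a in A) for b in B for c in C}
    # The decomposition exists iff g | h already covers f everywhere.
    ok = all(f[(a, b, c)] == (g[(a, c)] or h[(b, c)])
             for a in A for b in B for c in C)
    return ok, g, h
```

For example, f(a, b) = a OR b is trivially OR-bi-decomposable with dedicated sets {a} and {b}, while f(a, b) = a XOR b is not, since no functions below f can cover both ON-set points without also covering (1, 1).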