% Simple illustration of constructing tree-based models within Aleph
% To run do the following:
%       a. consult this file (it loads library(aleph) itself)
%       b. ?- induce_tree(Program).

% Model trees are constructed by specifying a predicate that
% will be used for model construction for the examples in a leaf.
% The user has to provide a definition for this predicate that
% is able to: (a) construct the model; and (b) predict using
% the model constructed. The trick used is the same as that
% for lazy evaluation.

% Learning a model tree
% The function to be learnt is:
%       y = f(x) =  x + 1   (x =< 0)
%                = -x + 1   (x > 0)
% That is:
%
%                |
%               /|\
%              / | \
%             /  |  \
%       --------------------------
%                0

% Without the extra example f(-0.25,0.75), what Aleph actually learns
% from the data given is:
%       y = f(x) =  x + 1   (x =< -0.5)
%                = -x + 1   (x > -0.5)
% adding more examples rectifies this: see the positive examples below.
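
% As a rough sketch (not verbatim Aleph output), the intended model tree
% corresponds to clauses of the form below, where each leaf carries the
% parameters [Slope,Intercept,Sd] fitted by predict/3 as defined in the
% background knowledge:
%
%   f(X,Y) :- lteq(X,0.0), predict(X,Y,[1.0,1.0,0.0]).
%   f(X,Y) :- predict(X,Y,[-1.0,1.0,0.0]).
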
:- use_module(library(aleph)).
:- if(current_predicate(use_rendering/1)).
:- use_rendering(prolog).
:- endif.
:- aleph.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% specify tree type

:- aleph_set(tree_type,model).
:- aleph_set(evalfn,mse).
:- aleph_set(minpos,2).       % minimum examples in leaf for splitting
:- aleph_set(mingain,0.01).   % toy example needs this to be low
:- aleph_set(dependent,2).    % second argument of f/2 is to be predicted
:- aleph_set(verbosity,10).
%:- aleph_set(mingain,-1e10).

% specify predicate definition to use for model construction
:- model(predict/3).

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Mode declarations

:- modeh(1,f(+x,-y)).
:- modeb(1,lteq(+x,#threshold)).
:- modeb(1,predict(+x,-y,#params)).

:- determination(f/2,lteq/2).
:- determination(f/2,predict/3).

:- begin_bg.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Type definitions

threshold(-0.5).
threshold(0.0).
threshold(0.5).

params([_Slope,_Constant,_Sd]).

list([_|_]).

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Background

lteq(X,Y):-
	var(Y), !,
	X = Y.
lteq(X,Y):-
	number(X), number(Y),
	X =< Y.

% definition for model construction (parameter estimation)
predict(X,Y,[M,C,Sd]):-
	list(X),
	list(Y), !,
	l_regress1(Y,X,M,C,Sd).
% definition for prediction
predict(X,Y,[M,C,_]):-
	number(X), var(Y), !,
	Y is M*X + C.
% definition for model checking
predict(X,Y,[M,C,Sd]):-
	number(Y), number(X), !,
	Y1 is M*X + C,
	Diff is Y - Y1,
	abs_val(Diff,ADiff),
	ADiff =< 3*Sd.

% very simple univariate linear regression:
% fits an exact line through the first two points, so Sd is 0.0
l_regress1([YVals|_],[XVals|_],M,C,0.0):-
	YVals = [Y1,Y2|_],
	XVals = [X1,X2|_],
	M is (Y2-Y1)/(X2-X1),
	C is Y1 - M*X1.

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Constraints

% remove redundant checks for =<
% prune((_:-Body)):-
% 	has_literal(lteq(X,Y),Body,Left),
% 	has_literal(lteq(X1,Y1),Left,_),
% 	X == X1,
% 	Y1 =< Y.
%
% has_literal(L,(L,L1),L1).
% has_literal(L,(_,L1),Left):-
% 	!,
% 	has_literal(L,L1,Left).
% has_literal(L,L,true).

%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Utilities

abs_val(X,Y):- X < 0, !, Y is -X.
abs_val(X,X):- X >= 0.

:- end_bg.
:- begin_in_pos.

f(-1.0,0.0).
f(-0.5,0.5).
f(-0.25,0.75).	% adding this results in the correct theory
f(0.0,1.0).
f(0.5,0.5).
f(1.0,0.0).

:- end_in_pos.
:- begin_in_neg.
:- end_in_neg.
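
% Hand-constructed illustrations of the three predict/3 clauses above.
% These queries are not issued by Aleph in exactly this form; during tree
% construction Aleph binds the arguments to lists of example values,
% following the lazy-evaluation convention mentioned at the top of the file.
%
%   ?- predict([[-1.0,-0.5]], [[0.0,0.5]], Params).
%   Params = [1.0, 1.0, 0.0].     % model construction: fit y = x + 1
%
%   ?- predict(-0.25, Y, [1.0, 1.0, 0.0]).
%   Y = 0.75.                     % prediction with the fitted parameters
%
%   ?- predict(0.5, 0.5, [-1.0, 1.0, 0.0]).
%   true.                         % model checking: known x,y pair fits the model
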
/** <examples>
?- induce_tree(Program).
*/