The GALVANO tools
This document describes a few real-world applications of our optimisation techniques, the genetic 
algorithm ("GA") and simulated annealing:
- risk prediction ("credit scoring") using "scorecards" ("Galvano Score")
- automatic production of rule bases for decision-making ("Galvano Rule Base")
- neural network production and improvement ("GalvaNeurones")
- other, less applied examples, for demonstration.

A disk containing the software described here is included.
You need a PC-compatible computer with a graphics screen (VGA or better) and if 
possible a mouse.
Copy file GADEMO.EXE onto your hard disk (using xcopy or copy). Then type GADEMO 
and this file will decompress itself, producing the programs. You may then erase file 
GADEMO.EXE, as it is no longer needed.
You can reproduce the software and this accompanying documentation freely.
Scorecards for credit scoring - Galvano Score (GASCD.EXE)
We have built scorecard-production software using the GA: "GALVANO Score".
 
A scorecard is used by banking establishments to assess the risk associated with a 
potential borrower. The borrower's personal description is matched against a list of 
variable modalities (married/unmarried/widower) or value ranges (salary or age from 
so much to so much) and their corresponding "weightings". For example, prospective 
borrowers earn 30 points if they are unmarried, 60 if they are married, etc. The loan 
is granted or denied according to the sum total of points over all variables of the 
customer's description.
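The point-totalling mechanism can be sketched as follows. All variable names, weightings and the cut-off here are invented for illustration; the actual tables produced by GALVANO Score will differ.

```python
# Hypothetical scorecard: modality weightings for qualitative variables,
# (low, high, points) ranges for quantitative ones. All values invented.
SCORECARD = {
    "marital_status": {"married": 60, "unmarried": 30, "widower": 45},
    "salary": [(0, 1000, 10), (1000, 2500, 35), (2500, float("inf"), 55)],
    "age": [(18, 25, 5), (25, 40, 30), (40, 120, 40)],
}
CUT_OFF = 100  # loan granted if the point total reaches this threshold

def score(applicant):
    """Sum the points earned over all variables of the description."""
    total = SCORECARD["marital_status"][applicant["marital_status"]]
    for var in ("salary", "age"):
        for low, high, points in SCORECARD[var]:
            if low <= applicant[var] < high:
                total += points
                break
    return total

applicant = {"marital_status": "married", "salary": 1800, "age": 35}
print(score(applicant), score(applicant) >= CUT_OFF)
```

The decision is simply a comparison of the total against the cut-off, which is what makes scorecards so easy for a bank clerk to apply by hand.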
Weightings and, when necessary, the value range limits are optimised from random 
starting values using the genetic algorithm. The fitness value is, in this case, the rate 
of correctly predicted customers (from 500 to 2 000) over a set of past borrowers 
whose fate is known. The present example concerns loans for the purchase of used 
vehicles.
As is the rule with the GA, fitness computations take up 99 % of the time.
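The optimisation loop can be sketched as below. This is a minimal, generic GA (truncation selection plus mutation) on a toy two-variable data set; the encoding, operators and parameters of the real GALVANO Score are not published here and everything in this sketch is invented for illustration.

```python
# Toy GA: evolve scorecard weightings so that the point total separates
# good from bad past borrowers. Data and parameters are invented.
import random

random.seed(1)

# Toy past borrowers: two binary features, label 1 = good risk.
EXAMPLES = [((1, 0), 1), ((0, 1), 0), ((1, 1), 1), ((0, 0), 0)] * 5
CUT_OFF = 50

def fitness(weights):
    """Rate of correctly predicted customers (this is ~99 % of the work)."""
    hits = 0
    for features, label in EXAMPLES:
        total = sum(w for w, f in zip(weights, features) if f)
        hits += int((total >= CUT_OFF) == bool(label))
    return hits / len(EXAMPLES)

def evolve(pop_size=30, generations=50):
    # Start from random weightings, as the text describes.
    pop = [[random.randint(0, 100) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = [[w + random.randint(-10, 10) for w in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]  # mutation
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Note that almost all the running time is spent inside fitness(), which loops over every past borrower for every candidate scorecard; this is why fitness computations dominate, as stated above.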


A few hours of evolution are enough to obtain better scorecards (more correct 
predictions, higher robustness) than those painstakingly obtained in a few weeks by 
highly paid statisticians. The process is faster if one reduces the number of learning 
examples, but this has negative consequences for robustness (as is the case with all 
modeling techniques).
To start this demonstration, type
GASCD (then <Return>)
An initial population of 2000 randomly generated scorecards is created, and a few of 
these are shown. The best element of the population is always displayed as the top 
leftmost one.
Click on "Find" or press <F1> to start the evolution. The population starts improving: 
the red ("Bad risk") and green ("Good risk") bars are better and better separated. 
Evolution goes on until you press <Escape>. Hit <F5> or click on "Exit" to end the 
demonstration.
The best scorecard produced is stored in text file scorinSC.txt, and you can 
inspect it (with type scorinsc.txt or edit scorinsc.txt at the DOS 
prompt). 
It is a classical scorecard, with the weighting associated with each modality for 
qualitative variables, and the range limits and each corresponding weighting for 
quantitative (continuous) ones. Option /log lets you score using logistic functions.
The same problem can also be handled by a rule base: see demo GABRD2.EXE.
Evolving rule bases: Galvano Rule Base (GABRD.EXE, GABRD2.EXE)
Another tool from the Galvano series, "Galvano Rule Base", "grows" decision rules 
from a set of examples. 
 
This software has applications in credit scoring and Stock Exchange trading. Besides 
producing rule bases themselves, its other use is variable selection (option /SigVar). 
This option counts the number of times a variable was used for making a decision, 
over all examples and the best few hundred rule bases. The hierarchy of most-quoted 
variables indicates which of them are the most relevant.
It is then possible to process these most meaningful variables with a neural network. 
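The counting idea behind /SigVar can be sketched as follows. The rule representation, data and thresholds are all invented for illustration; only the principle (credit the variable whose test decided each example, over the best rule bases) comes from the text.

```python
# Sketch of /SigVar-style variable selection: count how often each
# variable's test fires when the best rule bases classify the examples.
from collections import Counter

# A rule is (variable, threshold, decision); a rule base is an ordered list
# of rules, tried in order. All structures here are invented.
BEST_RULE_BASES = [
    [("salary", 1500, "good"), ("age", 30, "bad")],
    [("salary", 1000, "good"), ("seniority", 2, "bad")],
]
EXAMPLES = [{"salary": 1800, "age": 25, "seniority": 1},
            {"salary": 900, "age": 40, "seniority": 5}]

def first_firing_variable(rule_base, example):
    """Return the variable whose rule made the decision, if any."""
    for variable, threshold, _decision in rule_base:
        if example[variable] >= threshold:
            return variable
    return None  # no rule fired: default decision, no variable credited

usage = Counter()
for rule_base in BEST_RULE_BASES:
    for example in EXAMPLES:
        v = first_firing_variable(rule_base, example)
        if v is not None:
            usage[v] += 1

print(usage.most_common())  # most-quoted variables first
```

Sorting the counter yields the hierarchy of most-quoted variables described above, which can then be fed to a neural network.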
The GABRD example deals with a toy problem, the exclusive-OR: points belong to one 
class or the other according to their position on a 4-square checkerboard.
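The checkerboard class function is simply the exclusive-OR of the two coordinates' halves. The coordinate scale below is an assumption made for illustration:

```python
# 4-square checkerboard on [0, 2) x [0, 2): diagonally opposite
# quadrants share a class, i.e. class = XOR of the half-tests.
def xor_class(x, y):
    return int(x >= 1) ^ int(y >= 1)

print(xor_class(0.5, 0.5), xor_class(0.5, 1.5), xor_class(1.5, 1.5))
```

This problem is not linearly separable, which is exactly why it is a classic test for rule-base and neural-network learners.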
The GABRD2 example is another treatment of the used-vehicle credit scoring problem 
described earlier.
Neural network evolution: GalvaNeurones (GARND.EXE)
 
The GalvaNeurones software builds neural networks using the GA and/or classical 
backpropagation. This is especially useful when the learning data are few and 
backpropagation alone is unsuitable and gives poor results; there is little or no 
improvement when data sets are large and the problem is simple.
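The "pure evolution" mode can be sketched as below: the GA searches directly over the network's weights, with no gradients at all. The network size, mutation scheme and data set are invented for illustration; GalvaNeurones' actual internals are not published here.

```python
# Evolving the weights of a tiny 2-2-1 tanh network on the XOR problem,
# using selection and Gaussian mutation only (no backpropagation).
import math
import random

random.seed(0)
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    # w holds 9 weights: 2x(2 inputs + bias) hidden, (2 + bias) output.
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

def mse(w):
    return sum((forward(w, x) - t) ** 2 for x, t in XOR) / len(XOR)

pop = [[random.uniform(-2, 2) for _ in range(9)] for _ in range(40)]
for _ in range(300):
    pop.sort(key=mse)
    survivors = pop[:20]                       # keep the fitter half
    pop = survivors + [[g + random.gauss(0, 0.3) for g in random.choice(survivors)]
                       for _ in range(20)]     # mutated offspring
best = min(pop, key=mse)
print(round(mse(best), 3))
```

A hybrid mode would interleave such generations with a few backpropagation steps on the best individuals, which is the kind of combination the /ARC option controls.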
Demo GARND, based on GalvaNeurones, lets you perform "pure evolution" or add 
backpropagation to the GA, according to the value of option /ARC. Type GARND ? 
for details.
The Galvano Tools - Applications of the Genetic Algorithm


