






PGSL is a direct search algorithm that uses global sampling to find the minimum of a user-defined objective function. Its strength lies in handling black-box objective functions and constraints: gradients are not needed, and no special properties of the objective function (such as convexity) are required. PGSL is best suited to continuous variables with nonlinear objective functions and nonlinear constraints. Good results are also obtained when variables are discrete but reasonably ordered; completely unordered discrete variables can be treated by repeating PGSL runs many times. Tests indicate that PGSL performs better than other stochastic search algorithms as the size of the problem grows beyond about 20 variables. It can handle thousands of variables, although, of course, with no guarantee of finding the global optimum.
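To illustrate the general idea of gradient-free direct search by global sampling, here is a minimal sketch: it draws random samples inside a box, keeps the best point found, and progressively focuses the search region around it. This is a simplified toy, not PGSL's actual probabilistic scheme; the function name and all parameters below are hypothetical.

```python
import random

def sample_search(objective, bounds, iterations=60, samples=25, shrink=0.9, seed=0):
    """Toy global-sampling direct search (illustration only, not the PGSL algorithm).

    objective : callable taking a list of floats, returning a float to minimise
    bounds    : list of (low, high) pairs, one per variable
    Repeatedly samples uniformly inside a box, keeps the best point,
    and shrinks the box around it -- no gradients required.
    """
    rng = random.Random(seed)
    box = [list(b) for b in bounds]          # current search box, one [lo, hi] per variable
    best_x, best_f = None, float("inf")
    for _ in range(iterations):
        for _ in range(samples):
            x = [rng.uniform(lo, hi) for lo, hi in box]
            f = objective(x)
            if f < best_f:
                best_x, best_f = x, f
        # focus: shrink each interval around the best point, staying inside the original bounds
        for i, (lo0, hi0) in enumerate(bounds):
            half = (box[i][1] - box[i][0]) * shrink / 2
            box[i][0] = max(lo0, best_x[i] - half)
            box[i][1] = min(hi0, best_x[i] + half)
    return best_x, best_f
```

For example, minimising a simple sphere function with `sample_search(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3)` drives the best value close to zero without ever touching a derivative. The real PGSL refines this idea considerably, using nested sampling, probability-updating, and focusing cycles over a probability density function rather than a plain shrinking box.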
For many engineering optimisation problems, we do not need the exact global optimum; a good enough solution is what matters, and PGSL is an attractive technique in such situations. It can be used to solve black-box optimisation problems where the objective function cannot be written as a neat mathematical expression: evaluating it might require calling an external program, and it may be impossible to compute derivatives or other mathematical characteristics. I developed PGSL in 1998 while working at EPFL. Since then it has been used by many people for various tasks, including design, control, parameter identification, diagnosis, and solving inverse problems. Over the years, I have developed interfaces to PGSL in many languages, for example MATLAB, Java, and VBA. To download these versions and for other information, please see the links below.