Zaborski and Grzesiak [8,9] applied artificial neural networks (ANNs) to dystocia detection in Polish Holstein-Friesian Black-and-White cattle, and Zaborski et al. [10] used boosted classification trees for the same purpose. Morrison et al. [11,12], Basarab et al. [7], Arthur et al. [13], and Johnson [14] applied linear discriminant function analysis to dystocia prediction in beef heifers, whereas Piwczynski et al. [15] used decision trees to analyze factors affecting dystocia in dairy cows.

Linear discriminant function analysis (DFA) was performed to predict and classify each specimen into its respective population based on its morphometric features. The principle of multivariate linear discriminant analysis is that the measured variables are treated as independent variables and sex as the dependent variable. Specifically, the method projects high-dimensional points onto a low-dimensional space and uses univariate analysis of variance to establish a linear discriminant function according to the criteria of maximum between-class distance and minimum within-class distance.
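The between-class/within-class criterion can be sketched for the two-class case with a minimal, dependency-free Fisher discriminant. The morphometric measurements, feature names, and class labels below are hypothetical, chosen only to illustrate the computation w = Sw⁻¹(m₁ − m₀):

```python
# Two-class Fisher linear discriminant on two features (a sketch with
# hypothetical data, not the study's actual measurements).
# The direction w = Sw^-1 (m1 - m0) maximizes between-class distance
# while minimizing within-class scatter.

def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def within_scatter(rows, m):
    # 2x2 within-class scatter matrix, summed over specimens
    s = [[0.0, 0.0], [0.0, 0.0]]
    for r in rows:
        d = [r[0] - m[0], r[1] - m[1]]
        for i in range(2):
            for j in range(2):
                s[i][j] += d[i] * d[j]
    return s

def fisher_direction(class0, class1):
    m0, m1 = mean(class0), mean(class1)
    s0, s1 = within_scatter(class0, m0), within_scatter(class1, m1)
    sw = [[s0[i][j] + s1[i][j] for j in range(2)] for i in range(2)]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    inv = [[sw[1][1] / det, -sw[0][1] / det],
           [-sw[1][0] / det, sw[0][0] / det]]
    diff = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * diff[0] + inv[0][1] * diff[1],
         inv[1][0] * diff[0] + inv[1][1] * diff[1]]
    # threshold at the projected midpoint of the two class means
    t = 0.5 * (w[0] * (m0[0] + m1[0]) + w[1] * (m0[1] + m1[1]))
    return w, t

# hypothetical measurements: (body length, skull width) for two sexes
females = [(10.1, 4.0), (10.4, 4.2), (9.8, 3.9), (10.0, 4.1)]
males = [(12.0, 5.1), (12.3, 5.0), (11.8, 5.2), (12.1, 4.9)]
w, t = fisher_direction(females, males)
score = lambda x: w[0] * x[0] + w[1] * x[1]
print("male" if score((12.2, 5.0)) > t else "female")
```

A new specimen is projected onto w and assigned to whichever side of the threshold it falls on; with more than two features the same formula applies with a larger scatter matrix.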

In this study, a Bayesian stepwise discriminant model was established, and a corresponding linear discriminant function was built. The coefficients of the linear discriminant function cannot be interpreted reliably, but the fitted (classified) values are unaffected.

The discriminant analysis method most commonly adopted by researchers to detect financial fraud is the linear discriminant function.

The Fisher linear discriminant function is efficient for developing classifiers for coffee seeds submitted to the LERCAFE test; however, additional tests are still required with other discriminant functions and other quantification methods.

The paper presents some useful results on the probability of misclassification when the classical linear discriminant function is replaced by a linear median discriminant function.

A linear classifier is based on a linear discriminant function f(x) = wᵀx + b: an input x is assigned to the positive class when f(x) > 0 and to the negative class otherwise.
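The decision rule above can be written in a few lines; the weight vector and bias here are hand-picked for illustration, not fitted to any data:

```python
# Minimal linear classifier: the discriminant f(x) = w.x + b assigns
# x to the positive class when f(x) > 0, negative class otherwise.

def f(x, w, b):
    # linear discriminant function: dot product plus bias
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def classify(x, w, b):
    return 1 if f(x, w, b) > 0 else -1

w = [2.0, -1.0]   # weight vector (assumed, not fitted)
b = -0.5          # bias term
print(classify([1.0, 0.5], w, b))   # f = 2.0 - 0.5 - 0.5 = 1.0 -> 1
```

The hyperplane f(x) = 0 is the decision boundary; any method that fits w and b (Fisher discriminant, logistic regression, linear SVM) yields a classifier of this form.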

The linear discriminant function provided the following prediction model for the year 2002.

Based on the VC-dimension theory of statistical learning and the structural risk minimization principle, the support vector machine method [12] maps the practical problem to a high-dimensional feature space through a nonlinear transformation and realizes a nonlinear discriminant function in the original space by constructing a linear discriminant function in the higher-dimensional space.
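The underlying idea can be sketched without a full SVM solver: XOR-patterned data is not linearly separable in its original 2-D space, but after a nonlinear map φ(x) = (x₁, x₂, x₁x₂) a linear discriminant in the 3-D feature space separates it. The map and the weights below are hand-picked for illustration, not learned by an SVM:

```python
# Sketch of the idea behind kernel methods (not an SVM implementation):
# a linear discriminant in a nonlinearly mapped feature space acts as
# a nonlinear discriminant in the original input space.

def phi(x):
    # hypothetical nonlinear transformation into a higher-dim space
    return (x[0], x[1], x[0] * x[1])

def discriminant(x, w=(1.0, 1.0, -3.0), b=-0.5):
    # linear in the mapped space; seen from the original space this is
    # the nonlinear function x1 + x2 - 3*x1*x2 - 0.5
    z = phi(x)
    return sum(wi * zi for wi, zi in zip(w, z)) + b

# XOR labels: no single line in the (x1, x2) plane separates these
xor_data = [((0, 0), -1), ((1, 1), -1), ((0, 1), 1), ((1, 0), 1)]
for x, label in xor_data:
    predicted = 1 if discriminant(x) > 0 else -1
    print(x, predicted == label)   # True for all four points
```

A real SVM chooses the separating hyperplane with maximum margin and, via a kernel, never computes φ(x) explicitly; this sketch only shows why a linear function in the mapped space suffices.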