NAG Library Function Document

nag_rand_kfold_xyw (g05pvc)

 Contents

    1  Purpose
    2  Specification
    3  Description
    4  References
    5  Arguments
    6  Error Indicators and Warnings
    7  Accuracy
    8  Further Comments
    9  Example

1  Purpose

nag_rand_kfold_xyw (g05pvc) generates training and validation datasets suitable for use in cross-validation or jack-knifing.

2  Specification

#include <nag.h>
#include <nagg05.h>
void  nag_rand_kfold_xyw (Integer k, Integer fold, Integer n, Integer m, Nag_DataByObsOrVar sordx, double x[], Integer pdx, double y[], double w[], Integer *nt, Integer state[], NagError *fail)

3  Description

Let Xo denote a matrix of n observations on m variables and yo and wo each denote a vector of length n. For example, Xo might represent a matrix of independent variables, yo the dependent variable and wo the associated weights in a weighted regression.
nag_rand_kfold_xyw (g05pvc) generates a series of training datasets, each denoted by the (matrix, vector, vector) triplet (Xt, yt, wt) of nt observations, and corresponding validation datasets, denoted (Xv, yv, wv), of nv observations. These training and validation datasets are generated as follows.
Each of the original n observations is randomly assigned to one of K equally sized groups or folds. For the kth sample the validation dataset consists of those observations in group k and the training dataset consists of all those observations not in group k. Therefore at most K samples can be generated.
If n is not divisible by K then the observations are assigned to groups as evenly as possible; therefore any group will be at most one observation larger or smaller than any other group.
When using K=n the resulting datasets are suitable for leave-one-out cross-validation, or the training dataset on its own for jack-knifing. When using K<n the resulting datasets are suitable for K-fold cross-validation. Datasets suitable for reversed cross-validation can be obtained by switching the training and validation datasets, i.e., use the kth group as the training dataset and the rest of the data as the validation dataset.
One of the initialization functions nag_rand_init_repeatable (g05kfc) (for a repeatable sequence if computed sequentially) or nag_rand_init_nonrepeatable (g05kgc) (for a non-repeatable sequence) must be called prior to the first call to nag_rand_kfold_xyw (g05pvc).

4  References

None.

5  Arguments

1:     k – Integer                                        Input
On entry: K, the number of folds.
Constraint: 2 ≤ k ≤ n.
2:     fold – Integer                                     Input
On entry: the number of the fold to return as the validation dataset.
On the first call to nag_rand_kfold_xyw (g05pvc) fold should be set to 1 and then incremented by one at each subsequent call until all K sets of training and validation datasets have been produced. See Section 8 for more details on how a different calling sequence can be used.
Constraint: 1 ≤ fold ≤ k.
3:     n – Integer                                        Input
On entry: n, the number of observations.
Constraint: n ≥ 1.
4:     m – Integer                                        Input
On entry: m, the number of variables.
Constraint: m ≥ 1.
5:     sordx – Nag_DataByObsOrVar                         Input
On entry: determines how variables are stored in x.
Constraint: sordx=Nag_DataByVar or Nag_DataByObs.
6:     x[dim] – double                                    Input/Output
Note: the dimension, dim, of the array x must be at least
  • pdx×m when sordx=Nag_DataByVar;
  • pdx×n when sordx=Nag_DataByObs.
The way the data is stored in x is defined by sordx.
If sordx=Nag_DataByVar, x[(j-1)×pdx+i-1] contains the ith observation for the jth variable, for i=1,2,…,n and j=1,2,…,m.
If sordx=Nag_DataByObs, x[(i-1)×pdx+j-1] contains the ith observation for the jth variable, for i=1,2,…,n and j=1,2,…,m.
On entry: if fold=1, x must hold Xo, the values of X for the original dataset, otherwise, x must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of X for the training and validation datasets, with Xt held in observations 1 to nt and Xv in observations nt+1 to n.
7:     pdx – Integer                                      Input
On entry: the stride separating row elements in the two-dimensional data stored in the array x.
Constraints:
  • if sordx=Nag_DataByObs, pdx ≥ m;
  • otherwise pdx ≥ n.
8:     y[n] – double                                      Input/Output
If the original dataset does not include yo then y must be set to NULL.
On entry: if fold1, y must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of y for the training and validation datasets, with yt held in elements 1 to nt and yv in elements nt+1 to n.
9:     w[n] – double                                      Input/Output
If the original dataset does not include wo then w must be set to NULL.
On entry: if fold1, w must not be changed since the last call to nag_rand_kfold_xyw (g05pvc).
On exit: values of w for the training and validation datasets, with wt held in elements 1 to nt and wv in elements nt+1 to n.
10:   nt – Integer *                                      Output
On exit: nt, the number of observations in the training dataset.
11:   state[dim] – Integer                                Communication Array
Note: the dimension, dim, of this array is dictated by the requirements of associated functions that must have been previously called. This array MUST be the same array passed as argument state in the previous call to nag_rand_init_repeatable (g05kfc) or nag_rand_init_nonrepeatable (g05kgc).
On entry: contains information on the selected base generator and its current state.
On exit: contains updated information on the state of the generator.
12:   fail – NagError *                                   Input/Output
The NAG error argument (see Section 3.7 in How to Use the NAG Library and its Documentation).

6  Error Indicators and Warnings

NE_ALLOC_FAIL
Dynamic memory allocation failed.
See Section 2.3.1.2 in How to Use the NAG Library and its Documentation for further information.
NE_ARRAY_SIZE
On entry, pdx=value and m=value.
Constraint: if sordx=Nag_DataByObs, pdx ≥ m.
On entry, pdx=value and n=value.
Constraint: if sordx=Nag_DataByVar, pdx ≥ n.
NE_BAD_PARAM
On entry, argument value had an illegal value.
NE_INT
On entry, m=value.
Constraint: m ≥ 1.
On entry, n=value.
Constraint: n ≥ 1.
NE_INT_2
On entry, fold=value and k=value.
Constraint: 1 ≤ fold ≤ k.
On entry, k=value and n=value.
Constraint: 2 ≤ k ≤ n.
NE_INTERNAL_ERROR
An internal error has occurred in this function. Check the function call and any array sizes. If the call is correct then please contact NAG for assistance.
See Section 2.7.6 in How to Use the NAG Library and its Documentation for further information.
NE_INVALID_STATE
On entry, state vector has been corrupted or not initialized.
NE_NO_LICENCE
Your licence key may have expired or may not have been installed correctly.
See Section 2.7.5 in How to Use the NAG Library and its Documentation for further information.
NW_POTENTIAL_PROBLEM
More than 50% of the data did not move when the data was shuffled: value of the value observations stayed put.

7  Accuracy

Not applicable.

8  Further Comments

nag_rand_kfold_xyw (g05pvc) will be computationally more efficient if each observation in x is contiguous, that is, when sordx=Nag_DataByObs.
Because of the way nag_rand_kfold_xyw (g05pvc) stores the data you should usually generate the K training and validation datasets in order, i.e., set fold=1 on the first call and increment it by one at each subsequent call. However, there are times when a different calling sequence would be beneficial, for example, when performing different cross-validation analyses on different threads. This is possible as long as each sequence of calls works from a copy of the data returned by the initial call made with fold=1.
For example, if you have three threads, you would call nag_rand_kfold_xyw (g05pvc) once with fold=1. You would then copy the x returned (along with y and w, if supplied) onto each thread and generate the remaining k-1 sets of data by splitting them between the threads, e.g., the first thread runs with fold=2,…,L1, the second with fold=L1+1,…,L2 and the third with fold=L2+1,…,k.

9  Example

This example uses nag_rand_kfold_xyw (g05pvc) to facilitate K-fold cross-validation.
A set of simulated data is split into 5 training and validation datasets. nag_glm_binomial (g02gbc) is used to fit a logistic regression model to each training dataset and then nag_glm_predict (g02gpc) is used to predict the response for the observations in the validation dataset.
The counts of true and false positives and negatives, along with the sensitivity and specificity, are then reported.

9.1  Program Text

Program Text (g05pvce.c)

9.2  Program Data

Program Data (g05pvce.d)

9.3  Program Results

Program Results (g05pvce.r)

© The Numerical Algorithms Group Ltd, Oxford, UK. 2017