ee.ConfusionMatrix.kappa

  • The ConfusionMatrix.kappa() method computes the Kappa statistic for a confusion matrix.

  • The method is called on a ConfusionMatrix object and returns a Float.

  • The Kappa statistic measures the agreement between observed and predicted classifications, corrected for the agreement expected by chance.

  • The examples below show how to calculate the Kappa statistic in both JavaScript and Python.

Computes the Kappa statistic for the confusion matrix.

Usage                      Returns
ConfusionMatrix.kappa()    Float

Argument                   Type
this: confusionMatrix      ConfusionMatrix

Examples

Code Editor (JavaScript)

// Construct a confusion matrix from an array (rows are actual values,
// columns are predicted values). We construct a confusion matrix here for
// brevity and clear visualization; in most applications the confusion matrix
// will be generated from ee.Classifier.confusionMatrix.
var array = ee.Array([[32, 0, 0,  0,  1, 0],
                      [ 0, 5, 0,  0,  1, 0],
                      [ 0, 0, 1,  3,  0, 0],
                      [ 0, 1, 4, 26,  8, 0],
                      [ 0, 0, 0,  7, 15, 0],
                      [ 0, 0, 0,  1,  0, 5]]);
var confusionMatrix = ee.ConfusionMatrix(array);
print("Constructed confusion matrix", confusionMatrix);

// Calculate overall accuracy.
print("Overall accuracy", confusionMatrix.accuracy());

// Calculate consumer's accuracy, also known as user's accuracy or
// specificity and the complement of commission error (1 − commission error).
print("Consumer's accuracy", confusionMatrix.consumersAccuracy());

// Calculate producer's accuracy, also known as sensitivity and the
// complement of omission error (1 − omission error).
print("Producer's accuracy", confusionMatrix.producersAccuracy());

// Calculate kappa statistic.
print('Kappa statistic', confusionMatrix.kappa());

Python setup

See the Python Environment page for information on the Python API and using geemap for interactive development.

import ee
import geemap.core as geemap

Colab (Python)

# Construct a confusion matrix from an array (rows are actual values,
# columns are predicted values). We construct a confusion matrix here for
# brevity and clear visualization; in most applications the confusion matrix
# will be generated from ee.Classifier.confusionMatrix.
array = ee.Array([[32, 0, 0,  0,  1, 0],
                  [ 0, 5, 0,  0,  1, 0],
                  [ 0, 0, 1,  3,  0, 0],
                  [ 0, 1, 4, 26,  8, 0],
                  [ 0, 0, 0,  7, 15, 0],
                  [ 0, 0, 0,  1,  0, 5]])
confusion_matrix = ee.ConfusionMatrix(array)
display("Constructed confusion matrix:", confusion_matrix)

# Calculate overall accuracy.
display("Overall accuracy:", confusion_matrix.accuracy())

# Calculate consumer's accuracy, also known as user's accuracy or
# specificity and the complement of commission error (1 − commission error).
display("Consumer's accuracy:", confusion_matrix.consumersAccuracy())

# Calculate producer's accuracy, also known as sensitivity and the
# complement of omission error (1 − omission error).
display("Producer's accuracy:", confusion_matrix.producersAccuracy())

# Calculate kappa statistic.
display("Kappa statistic:", confusion_matrix.kappa())
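The value returned by kappa() can be cross-checked outside Earth Engine by applying the statistic's definition, κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement (overall accuracy) and p_e is the agreement expected by chance, directly to the same matrix. A minimal sketch using NumPy (NumPy is an assumption here, not part of the Earth Engine API):

```python
import numpy as np

# Same confusion matrix as above (rows: actual values, columns: predicted values).
cm = np.array([[32, 0, 0,  0,  1, 0],
               [ 0, 5, 0,  0,  1, 0],
               [ 0, 0, 1,  3,  0, 0],
               [ 0, 1, 4, 26,  8, 0],
               [ 0, 0, 0,  7, 15, 0],
               [ 0, 0, 0,  1,  0, 5]])

n = cm.sum()
# Observed agreement p_o: fraction of samples on the diagonal (overall accuracy).
p_o = np.trace(cm) / n
# Chance agreement p_e: dot product of row and column marginals, normalized.
p_e = (cm.sum(axis=1) @ cm.sum(axis=0)) / n**2
kappa = (p_o - p_e) / (1 - p_e)
print(round(kappa, 4))
```

This also makes explicit why Kappa is lower than overall accuracy: the chance-agreement term p_e is subtracted from both numerator and denominator.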