[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2023-10-06 UTC."],[[["Computes the Kappa statistic, a measure of agreement between predicted and actual classifications, for a given confusion matrix."],["The `kappa()` function takes a `confusionMatrix` object as input and returns the Kappa statistic as a float."],["This function is part of the Earth Engine API and can be used to evaluate the performance of classification models."],["Usage examples are provided in JavaScript and Python for calculating the Kappa statistic from a confusion matrix."]]],["The `ConfusionMatrix.kappa()` method computes the Kappa statistic, returning a float value. This method operates on a confusion matrix, which is typically generated from a classifier. The provided examples demonstrate constructing a confusion matrix from an array, then utilizing `kappa()` to calculate the Kappa statistic. They also showcase related accuracy metrics like overall, consumer's, and producer's accuracy.\n"]]