AI-generated Key Takeaways

- Array.bitCount() calculates the number of 1s in the binary representation of each element in an array.
- It operates on the 64-bit two's complement binary representation, supporting both positive and negative integers (see the sketch after this list).
- The function returns an array with the same dimensions as the input, where each element is the bit count of the corresponding input element.
- The method works for arrays of various integer data types, including int8, int16, int32, and int64.
- It handles empty arrays, single-element arrays, multi-dimensional arrays, and arrays with both positive and negative numbers.
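The rule "count the one-bits of the 64-bit two's complement representation" can be reproduced locally with plain Python. The sketch below is illustrative only and makes no Earth Engine calls; the helper name bit_count_64 is hypothetical, not part of the API.

```python
# Minimal local sketch (plain Python, no Earth Engine call) of the rule
# described above: each value is interpreted as a 64-bit two's complement
# integer, and the one-bits of that representation are counted.
def bit_count_64(value: int) -> int:
    """Count one-bits in the 64-bit two's complement representation of value."""
    # Masking with 2**64 - 1 maps negative values to their two's complement form.
    return bin(value & 0xFFFFFFFFFFFFFFFF).count("1")

# These mirror the documented examples below:
assert bit_count_64(0) == 0
assert bit_count_64(3) == 2
assert bit_count_64(0xFFFF) == 16
assert bit_count_64(-1) == 64  # all 64 bits set
assert bit_count_64(-2) == 63  # all bits set except the lowest
```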
Usage | Returns
---|---
Array.bitCount() | Array

Argument | Type | Details
---|---|---
this: input | Array | The input array.
Examples
Code Editor (JavaScript)
```javascript
print(ee.Array([], ee.PixelType.int8()).bitCount());  // []
print(ee.Array([0]).bitCount());  // [0]
print(ee.Array([1]).bitCount());  // [1]
print(ee.Array([2]).bitCount());  // [1]
print(ee.Array([3]).bitCount());  // [2]
print(ee.Array([0xFFFF]).bitCount());  // [16]
print(ee.Array([1, 2, 3]).bitCount());  // [1,1,2]
print(ee.Array([[0, 1], [6, 13]]).bitCount());  // [[0,1],[2,3]]

// https://en.wikipedia.org/wiki/Two's_complement signed values.
print(ee.Array([-1]).bitCount());  // [64]
print(ee.Array([-1], ee.PixelType.int8()).bitCount());  // [64]
print(ee.Array([-2]).bitCount());  // [63]
```
Colab (Python)

```python
import ee
import geemap.core as geemap

display(ee.Array([], ee.PixelType.int8()).bitCount())  # []
display(ee.Array([0]).bitCount())  # [0]
display(ee.Array([1]).bitCount())  # [1]
display(ee.Array([2]).bitCount())  # [1]
display(ee.Array([3]).bitCount())  # [2]
display(ee.Array([0xFFFF]).bitCount())  # [16]
display(ee.Array([1, 2, 3]).bitCount())  # [1, 1, 2]
display(ee.Array([[0, 1], [6, 13]]).bitCount())  # [[0, 1], [2, 3]]

# https://en.wikipedia.org/wiki/Two's_complement signed values.
display(ee.Array([-1]).bitCount())  # [64]
display(ee.Array([-1], ee.PixelType.int8()).bitCount())  # [64]
display(ee.Array([-2]).bitCount())  # [63]
```
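The commented values in both snippets are what the server computes and what the Code Editor or notebook would print. To pull the results into the client as plain Python lists, the standard getInfo() call can be used; this is a minimal sketch that assumes an authenticated and already initialized Earth Engine session.

```python
import ee

ee.Initialize()  # assumes authentication has already been completed

# bitCount() builds a server-side expression; getInfo() evaluates it and
# returns the result as a nested Python list.
print(ee.Array([1, 2, 3]).bitCount().getInfo())          # [1, 1, 2]
print(ee.Array([[0, 1], [6, 13]]).bitCount().getInfo())  # [[0, 1], [2, 3]]
```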