NickName: jjepsuomi Ask DateTime: 2014-02-14T19:26:25

Counting the number of non-NaN elements in a numpy ndarray in Python

I need to count the number of non-NaN elements in a numpy ndarray. How would one do this efficiently in Python? Here is my simple code for achieving this:

    import numpy as np

    def numberOfNonNans(data):
        count = 0
        for i in data:
            if not np.isnan(i):
                count += 1
        return count

Is there a built-in function for this in numpy? Efficiency is important because I'm doing big-data analysis. Thanks for any help!

Copyright Notice: Content Author: 「jjepsuomi」, reproduced under the CC 4.0 BY-SA copyright license with a link to the original source and this disclaimer. Link to original article: https://stackoverflow.com/questions/21778118/counting-the-number-of-non-nan-elements-in-a-numpy-ndarray-in-python

Answers

M4rtini 2014-02-14T11:29:54

    np.count_nonzero(~np.isnan(data))

`~` inverts the boolean matrix returned from `np.isnan`.

`np.count_nonzero` counts values that are not 0/False. `.sum()` should give the same result, but `count_nonzero` may read more clearly.

Testing speed:

    In [23]: data = np.random.random((10000,10000))

    In [24]: data[[np.random.random_integers(0,10000, 100)],:][:, [np.random.random_integers(0,99, 100)]] = np.nan

    In [25]: %timeit data.size - np.count_nonzero(np.isnan(data))
    1 loops, best of 3: 309 ms per loop

    In [26]: %timeit np.count_nonzero(~np.isnan(data))
    1 loops, best of 3: 345 ms per loop

    In [27]: %timeit data.size - np.isnan(data).sum()
    1 loops, best of 3: 339 ms per loop

`data.size - np.count_nonzero(np.isnan(data))` seems to be the fastest here, by a small margin. Other data might give different relative speed results.
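Editor's note: a minimal, runnable sketch of the accepted approach on a small array with a known NaN layout (the sample values here are illustrative, not from the answers above):

```python
import numpy as np

# 2x3 array with 2 NaN entries, so 4 non-NaN elements
data = np.array([[1.0, np.nan, 3.0],
                 [np.nan, 5.0, 6.0]])

# Invert the boolean NaN mask and count the True entries
non_nan = np.count_nonzero(~np.isnan(data))

# Equivalent: subtract the NaN count from the total number of elements
non_nan_alt = data.size - np.isnan(data).sum()

print(non_nan)                  # 4
print(non_nan == non_nan_alt)   # True
```

All three expressions benchmarked above compute the same count; they differ only in how many temporary arrays they materialize and in which reduction does the counting.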
", G M 2017-05-03T09:24:13 Quick-to-write alternative\nEven though is not the fastest choice, if performance is not an issue you can use:\nsum(~np.isnan(data)).\nPerformance:\nIn [7]: %timeit data.size - np.count_nonzero(np.isnan(data))\n10 loops, best of 3: 67.5 ms per loop\n\nIn [8]: %timeit sum(~np.isnan(data))\n10 loops, best of 3: 154 ms per loop\n\nIn [9]: %timeit np.sum(~np.isnan(data))\n10 loops, best of 3: 140 ms per loop\n", Darren Weber 2019-03-20T03:04:27 To determine if the array is sparse, it may help to get a proportion of nan values\n\nnp.isnan(ndarr).sum() / ndarr.size\n\n\nIf that proportion exceeds a threshold, then use a sparse array, e.g.\n- https://sparse.pydata.org/en/latest/", Manuel 2017-02-15T00:38:44 An alternative, but a bit slower alternative is to do it over indexing.\n\nnp.isnan(data)[np.isnan(data) == False].size\n\nIn [30]: %timeit np.isnan(data)[np.isnan(data) == False].size\n1 loops, best of 3: 498 ms per loop \n\n\nThe double use of np.isnan(data) and the == operator might be a bit overkill and so I posted the answer only for completeness. ",