
algogeeks@googlegroups.com
[algogeeks] total number of one bits [EMAIL PROTECTED] Fri Oct 19 12:07:17 2007
Hi all: I met a problem in Programming Pearls, as follows:

Q: Given a very long sequence (say, billions or trillions) of bytes, how would you efficiently count the total number of one bits?

A: The first approach is to count the number of one bits in each input unit (perhaps an 8-bit character or perhaps a 32-bit integer), and sum them. To find the number of one bits in a 16-bit integer, one could look at each bit in order, iterate through only the bits that are on, or perform a lookup in a precomputed table of counts. What effect will the cache size have on your choice of unit? The second approach is to count the number of occurrences of each input unit in the input, and then at the end sum, over all unit values, that occurrence count multiplied by the number of one bits in that unit.

My question is, I am puzzled about the solution. What is the difference between the two approaches? In the second approach, what quantities are multiplied? Thanks!

~~~~~~~~~ You received this message because you are subscribed to the Google Groups "Algorithm Geeks" group. To post to this group, send email to [EMAIL PROTECTED] To unsubscribe from this group, send email to [EMAIL PROTECTED] For more options, visit this group at http://groups.google.com/group/algogeeks ~~~~~~~~~
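A small Python sketch (not from the original post; all names here are my own) may make the two approaches concrete. Approach 1 looks up the bit count of every input byte; approach 2 first builds a frequency histogram of the byte values and then multiplies each value's frequency by its bit count, so the multiplication in question is (occurrences of a unit) x (one bits in that unit).

```python
# Popcount of each possible 8-bit unit, precomputed once.
TABLE = [bin(b).count("1") for b in range(256)]

def count_ones_lookup(data: bytes) -> int:
    """Approach 1: one table lookup per input byte, summed."""
    return sum(TABLE[b] for b in data)

def count_ones_histogram(data: bytes) -> int:
    """Approach 2: histogram the bytes, then one multiply per
    distinct byte value (at most 256), not per input byte."""
    freq = [0] * 256
    for b in data:
        freq[b] += 1
    return sum(freq[v] * TABLE[v] for v in range(256))

data = bytes([0b10110010, 0xFF, 0x00, 0b10110010])
assert count_ones_lookup(data) == count_ones_histogram(data) == 16
```

Both give the same total; the histogram version defers the (weighted) summation to a final pass over the 256 possible values, which can matter when the input is billions of bytes long.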