On Wed, Feb 11, 2009 at 11:07, kevin liu <lwtbenben@gmail.com> wrote:
snip
>> But this will still take O( n * m ) if all the elements of @nwarray0 are in
>> @nwarray1.
>>
>> If the elements of @nwarray1 are sorted then you could do a binary search on
>> it and reduce your worst case to O( n * log m ).
>
> But how could this be? I have got the best algorithm from Rob, but I
> don't know why a binary search would be O( n * log m ).
> Could you please help to explain? Thank you in advance.
snip

You must have the n; there is no other way (which is why even the hash
version has an amortized cost of O(n)), because you must examine all n
items in the first array. In the naive case (searching the whole second
array) you multiply n by m because, for every item in the first array,
you must search up to m items in the second array. Changing the naive
case to use a binary search reduces the number of elements examined in
the second array from m to ln m / ln 2, or, as it is normally written,
log m. This is because the longest search path through a (balanced)
binary tree of m items is log m (so for 128 items the longest path is 7,
for 256 it is 8, for 512 it is 9, etc.).

--
Chas. Owens
wonkden.net
The most important skill a programmer can have is the ability to read.
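
A minimal Perl sketch of the O( n * log m ) approach described above. The
sample data is a hypothetical stand-in, since the real contents of
@nwarray0 and @nwarray1 are not shown in the thread, and it assumes the
elements compare numerically.

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Hypothetical stand-ins for the arrays from the original thread.
    my @nwarray0 = (3, 7, 42, 100);
    my @nwarray1 = sort { $a <=> $b } (1 .. 50);   # must be sorted for binary search

    # Classic iterative binary search: returns true if $target is in @$sorted.
    # Each call inspects at most log2(m) elements, where m is scalar @$sorted.
    sub binary_search {
        my ($sorted, $target) = @_;
        my ($lo, $hi) = (0, $#$sorted);
        while ($lo <= $hi) {
            my $mid = int( ($lo + $hi) / 2 );
            if    ($sorted->[$mid] < $target) { $lo = $mid + 1 }
            elsif ($sorted->[$mid] > $target) { $hi = $mid - 1 }
            else                              { return 1 }
        }
        return 0;
    }

    # n items in the first array, each looked up in at most log2(m) steps,
    # giving O( n * log m ) overall instead of the naive O( n * m ).
    for my $item (@nwarray0) {
        print "$item is ", (binary_search(\@nwarray1, $item) ? "" : "not "),
              "in \@nwarray1\n";
    }

The hash version mentioned above would instead build a lookup hash from the
second array once and then test each element of the first array with
exists(), trading the log m search for a constant-time lookup.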