Back in the deep mists of time, every programmer worth their salt knew that division was an expensive operation. If they were lucky, the number they had to divide by was a power of two, in which case a single shift would do the job of the divide. Back in the day, compilers did not automatically substitute a shift for division by a power of two, and if you were reading source code, you could tell that the programmer knew their stuff if they wrote "x>>1" instead of "x/2".
Then along came optimizing compilers that were smart enough to know what you were trying to do, and for unsigned integers they would automatically substitute a shift for the divide. Soon after, people stopped writing the shift and just wrote the divide, trusting the compiler to make the substitution and, in the process, making the code easier to read.
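A minimal sketch of the equivalence the compiler relies on, assuming unsigned 32-bit values (the function names are just for illustration); with optimization on, both of these should compile down to the same single shift instruction:

#include <stdint.h>

/* The readable way: an optimizing compiler turns this into a single shift. */
uint32_t div_by_8(uint32_t x) { return x / 8; }

/* The old-school hand-optimized way: 8 is 2 to the power 3, so shift right by 3. */
uint32_t div_by_8_shift(uint32_t x) { return x >> 3; }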
A newer generation of programmers came along, noticed nothing 'unoptimized' about division by constant powers of two, took it for granted, and remained completely unaware of this trick. One consequence is that when they pick an arbitrary number to divide by and it isn't a power of two, the compiler can no longer make the aforementioned substitution. It used to be that 'arbitrary' constants were habitually chosen to be powers of two, but not anymore. In fact, if books with the potential to be turned into video games were written with this optimization in mind, we might see some top-level title changes. For example, instead of calling it 50 Shades of Grey, E.L. James should have called it Sixty-Four Shades of Grey, and anyone who tried to compile it would end up with a somewhat faster game.
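To put the joke in code: a small sketch, assuming unsigned values and a typical modern optimizing compiler (the function names are invented for illustration). Dividing by 64 collapses to one shift; dividing by 50 cannot, and compilers generally fall back to a multiply by a precomputed 'magic' reciprocal plus a shift, which is still more work than the single shift.

#include <stdint.h>

/* 64 is a power of two, so this becomes a single shift right by 6. */
uint32_t grey_shade_64(uint32_t x) { return x / 64; }

/* 50 is not a power of two; no single shift computes this, so the compiler
   has to do something more elaborate. */
uint32_t grey_shade_50(uint32_t x) { return x / 50; }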
Also, if we take it for granted that the compiler will do this for integer division, we may forget that it only works so neatly for unsigned numbers. Signed division in C rounds toward zero, while an arithmetic shift rounds toward negative infinity, so when dividing signed numbers we may get the compiler producing code effectively equivalent to this
( x >= 0 ? x>>1 : (x+1)>>1 )
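To make that concrete, here is a minimal sketch, assuming 32-bit integers and the usual arithmetic right shift for signed values (the function names are invented for illustration). The signed version mirrors the branchless bias-and-shift sequence compilers typically emit for x/2; declaring the value unsigned lets the compiler go straight back to the bare shift.

#include <stdint.h>

/* Signed: a bare shift would turn -5/2 into -3 instead of -2, so the sign
   bit is added as a bias before shifting. */
int32_t half_signed(int32_t x) {
    return (x + (int32_t)((uint32_t)x >> 31)) >> 1;
}

/* Unsigned: no rounding mismatch, so a single shift is all it takes. */
uint32_t half_unsigned(uint32_t x) {
    return x >> 1;
}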