The entropy in a hash must be less than the entropy in the data, or it isn't a hash. By the pigeonhole principle, that means collisions must exist: there are more possible inputs than outputs. A good hash minimizes how often you hit one in practice, but there is always a risk.
When writing a program that depends on a hash, I find it useful to gut the hash function: if I'm using SHA-256, I zero out every byte of the digest except one, so collisions happen constantly and I can actually test how the code handles them. It is amazing how many bugs I've found in protocol implementations by doing this with hashes and block ciphers.
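A minimal Python sketch of the idea (the function name, which byte is kept, and the test loop are my own choices for illustration, not from any library):

```python
import hashlib
import os

def gutted_sha256(data: bytes) -> bytes:
    """SHA-256 with all but the first digest byte zeroed. TEST USE ONLY.

    Only 256 distinct digests remain, so collisions show up almost
    immediately and the collision-handling paths actually get exercised.
    """
    digest = hashlib.sha256(data).digest()
    return digest[:1] + bytes(len(digest) - 1)

# With 256 buckets, random inputs collide within a few dozen tries.
seen = {}
for _ in range(1000):
    msg = os.urandom(16)
    d = gutted_sha256(msg)
    if d in seen and seen[d] != msg:
        print("collision:", seen[d].hex(), "vs", msg.hex())
        break
    seen[d] = msg
```

Swap the gutted function back to the real one once the collision paths are verified.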
A coder also needs to balance performance against the function of the code if it's CPU bound. Many web pages now spend more than half their load time doing the TLS handshake. If you decide you want to go beyond what the CPU supports, you can also find your code runs very slowly. Say you want to run something like a hypothetical AES-1024. The hardware only supports 256 bits, so you take roughly a 10x penalty for falling back to software, plus you have to deal with 4 times more bits, so there is no way the new code ends up less than about 40 times slower.

Sometimes it is just better to use a much faster, weaker hash for some parts and a slower, better hash for data integrity. An example would be something like rsync or torrents, where there are lots of little blocks: a very fast hash per block is helpful, while a better hash can be used for sets of blocks. You cannot count on the speed of a hash for security either. A cheap Bitcoin USB miner can do hashes 31,000 times faster than my workstation.
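A rough sketch of that layered scheme in Python (the block size and the choice of Adler-32 as the cheap per-block checksum are my assumptions; rsync's actual rolling checksum is a related but different function):

```python
import hashlib
import zlib

BLOCK_SIZE = 4096  # arbitrary choice for this sketch

def index_blocks(data: bytes):
    """Cheap 32-bit checksum per block, plus one strong hash overall.

    The weak checksums are only a first-pass filter for finding
    matching blocks; the SHA-256 over the whole payload is what you
    actually trust for integrity.
    """
    weak = [
        zlib.adler32(data[i:i + BLOCK_SIZE])
        for i in range(0, len(data), BLOCK_SIZE)
    ]
    strong = hashlib.sha256(data).hexdigest()
    return weak, strong
```

The design point is that the weak hash only has to be fast and cheap to compare; any block it flags as a match still gets confirmed by the strong hash before you rely on it.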