Sometimes, the cheapest and most efficient LED bulbs sit toward the blue end of the spectrum, especially in applications where color temperature doesn't matter much, like a flashlight.
In that case, it's not so much the color temperature as it is the spectrum. The color temperature tells you what temperature of blackbody radiation your light source most closely resembles, but it doesn't tell you how closely it resembles it. Colors look right to our eyes under light with a distribution similar to blackbody radiation, i.e. a wide, smooth distribution of wavelengths. If the distribution has sharp spikes, things can look the wrong color compared to what you'd expect.

This is most obvious with LED lights that use a mix of pure red, green, and blue to simulate other colors: you can get something that looks white if you look directly at the lights, but nothing they shine on looks right. That color shift is what CRI (color rendering index) is supposed to measure.
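To make the "wide, smooth distribution" point concrete, here's a small sketch (in Python, with standard physical constants) that evaluates Planck's law for a 2700 K blackbody across the visible range. The function name and the 2700 K choice are just illustrative; the point is that radiance changes gradually from one wavelength to the next, with no narrow spikes like those of a pure-RGB source.

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
K_B = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_nm: float, temp_k: float) -> float:
    """Spectral radiance of a blackbody (W * sr^-1 * m^-3) at a given
    wavelength (nm) and temperature (K), via Planck's law."""
    lam = wavelength_nm * 1e-9  # convert nm -> m
    return (2 * H * C**2 / lam**5) / math.expm1(H * C / (lam * K_B * temp_k))

# Sample a 2700 K ("warm white") blackbody across the visible range.
# The values rise smoothly toward the red/infrared end -- a broad,
# continuous curve, not a few isolated peaks.
spectrum = {nm: planck_radiance(nm, 2700) for nm in range(380, 781, 50)}
```

A 2700 K blackbody actually peaks in the infrared (around 1070 nm by Wien's displacement law), which is why incandescent bulbs waste so much energy as heat, but within the visible band the curve is smooth and monotonic, and that smoothness is what makes colors render faithfully.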
Lights must have a CRI of at least 80 to qualify for Energy Star, which means most household lights are now fairly decent. Cheaper lights and ones not intended for general illumination may trade CRI for efficiency, which is probably what you're noticing in the light from flashlights. High-CRI (90+) lights are available, but they're usually somewhat more expensive and less efficient.