I think the disconnect here depends on the context.
I've worked on web UIs where the amount of data is tiny.
I've also worked on AAA games where you have 16ms to process one game tick.
And I've worked at internet behemoths on backend code with massive datasets.
In the first case, you don't need to worry much about big-O. The data is so tiny that you could implement an exponential algorithm and still get reasonable runtime behavior.
But in the latter two cases, you absolutely have to consider it at every turn. For instance, in a game with tens or hundreds of thousands of entities, you simply cannot run algorithms that do pairwise comparisons over all of them; that's O(n^2) time, and your 16ms budget is gone long before you finish. You have to think very carefully about avoiding inefficient algorithms, and big-O is almost always going to matter more than whatever constant you're multiplying it by. The usual fix is to organize the entities so you only ever compare the ones that could plausibly interact, along the lines of the sketch below.

On massive backend services, the whole approach used to gain scale is organizing data so that efficient algorithms can be used. You don't just slog all your data through three nested for loops.
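To make the game case concrete, here's a minimal sketch of the kind of spatial bucketing I mean: drop entities into grid cells sized to the interaction radius, then only compare each entity against its own cell and the eight neighboring cells. The Entity struct and nearbyPairs function are made up for illustration and assume simple 2D entities with a position, but the idea is the same one any engine's broadphase uses.

```cpp
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <utility>
#include <vector>

// Hypothetical entity type: just a 2D position, for the purpose of the sketch.
struct Entity {
    float x, y;
};

// Find all pairs of entities within `radius` of each other.
// Instead of the O(n^2) all-pairs loop, bucket entities into grid cells of
// size `radius`, then compare each entity only against entities in its own
// cell and the 8 neighboring cells. For roughly evenly spread entities this
// is close to O(n) work per call.
std::vector<std::pair<int, int>> nearbyPairs(const std::vector<Entity>& entities,
                                             float radius) {
    auto cellOf = [radius](float v) {
        return static_cast<int>(std::floor(v / radius));
    };
    auto key = [](int cx, int cy) {
        // Pack the two cell coordinates into one 64-bit map key.
        return (static_cast<uint64_t>(static_cast<uint32_t>(cx)) << 32) |
               static_cast<uint32_t>(cy);
    };

    // Pass 1: bucket entity indices by grid cell.
    std::unordered_map<uint64_t, std::vector<int>> grid;
    for (int i = 0; i < static_cast<int>(entities.size()); ++i) {
        grid[key(cellOf(entities[i].x), cellOf(entities[i].y))].push_back(i);
    }

    // Pass 2: only test candidates from the 3x3 block of cells around each entity.
    std::vector<std::pair<int, int>> pairs;
    for (int i = 0; i < static_cast<int>(entities.size()); ++i) {
        int cx = cellOf(entities[i].x);
        int cy = cellOf(entities[i].y);
        for (int dx = -1; dx <= 1; ++dx) {
            for (int dy = -1; dy <= 1; ++dy) {
                auto it = grid.find(key(cx + dx, cy + dy));
                if (it == grid.end()) continue;
                for (int j : it->second) {
                    if (j <= i) continue;  // skip self and duplicate pairs
                    float ddx = entities[i].x - entities[j].x;
                    float ddy = entities[i].y - entities[j].y;
                    if (ddx * ddx + ddy * ddy <= radius * radius) {
                        pairs.emplace_back(i, j);
                    }
                }
            }
        }
    }
    return pairs;
}
```

Same inputs, same answers as the naive double loop, but the work per tick scales with the number of entities that are actually near each other instead of with every possible pair. That's the difference between fitting in the frame budget and not.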