If MidJourney and Photoshop are both tools, then so is a tool to download copyrighted films (which is clearly not respecting copyrights).
Copyright law already distinguishes between exact copies, derivative works, and fair use, all delineated by fuzzy boundaries. So it's contextual, judged case by case. For MidJourney to comply with copyright law, they probably need to put up guardrails like GPT5 already has: GPT5 will outright refuse to draw Superman, while MidJourney happily complies. If guardrails let something slip through, then maybe there should be a DMCA take-down mechanism.
Soon we will have AI call screening that answers for us, interacts with the caller, and decides whether to handle the call directly, disconnect, or forward it to the user. At what point do we just have AI talking to AI, peddling AI services to AI agents? Will the two AIs fall into a generation loop, and we find recordings of them repeating a word or phrase at each other indefinitely? Or will the human suddenly have a 117 quadrillion dollar charge declined on their credit card because their AI agent agreed to buy one petaseat of licensing?
Suffice it to say: "What could possibly go wrong?"
It's super-trite, but true: technology can be used for good or bad.
I love the productivity gains and breadth of instructional knowledge AI has given me.
I hate that when I'm on Facebook I have to spend half my time blocking groups that churn out AI summaries of classic TV shows and characters (shows I'm otherwise a big fan of and follow).
Matter will be damaged in direct proportion to its value.