Ok, as the post above stated, it takes ~10 watts / m^2 to illuminate. Raster scanning does not fix the problem: if you only have a signal in a given place for, say, 1/1000th of the time, then the signal needs to be 1000 times stronger to be noticed by the naked eye.
I'd dispute that. If the signal is, say, on for one second, then off for the next, does that make it less visible? You've just halved the average power requirement, and I'd argue the flashing makes it more visible if anything. I think you'd only need a very brief flash to make it visible, particularly if that flash was repeating periodically, so the average power requirement would fall correspondingly.
Say for the sake of argument that this was just 1 square mile of coastline. That is around 2.6 million square metres, so for just one square mile of coastline you need roughly 26 megawatts. This is roughly the power consumption of a small town. Good luck with that.
I had in mind a thin line, 20km long, but only 1 metre wide - one end of the strip at the beach, the other end inland (exact distance depending on gradient of coastline at that point). So that's 20,000 square metres, and I believe that 200kW is achievable with (really big) solar panels.
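For anyone who wants to check the arithmetic in both scenarios, here's a quick sketch. It assumes the ~10 W/m^2 illumination figure quoted earlier in the thread; the function name and duty-cycle parameter are just mine, not from any of the posts above.

```python
# Rough power arithmetic for the two scenarios discussed above.
# Assumes the ~10 W/m^2 figure from earlier in the thread.
ILLUMINATION_W_PER_M2 = 10

def required_power_watts(area_m2, duty_cycle=1.0):
    """Average power to illuminate `area_m2`, scaled by the fraction
    of time the light is actually on (duty cycle)."""
    return area_m2 * ILLUMINATION_W_PER_M2 * duty_cycle

# Scenario 1: one square mile of coastline (1 mile = 1609.344 m)
square_mile_m2 = 1609.344 ** 2           # ~2.59 million m^2
print(required_power_watts(square_mile_m2) / 1e6)   # ~25.9 (megawatts)

# Scenario 2: a strip 20 km long and 1 m wide
strip_m2 = 20_000 * 1                    # 20,000 m^2
print(required_power_watts(strip_m2) / 1e3)         # 200.0 (kilowatts)

# Flashing at a 50% duty cycle halves the average power requirement:
print(required_power_watts(strip_m2, duty_cycle=0.5) / 1e3)  # 100.0 (kilowatts)
```

The duty-cycle line is the crux of the disagreement: it captures the claim that a periodically flashing signal cuts the *average* power in proportion to its on-time, which is the opposite of the "1000 times stronger" argument above.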