I'll try to keep this short. I am a graduate Physics research student, so I have a lot of first-hand experience here.
First, you're right. Get a laptop that runs Linux well. Others have covered this thoroughly already, so I won't repeat them. Second, definitely get one with the best nVidia graphics you can afford. If Quadro is an option, choose it, hands down.
I've seen people try to do physics and chemistry research in Mac OS or in Windows. It's a pain in the ass (but possible), and it's really not worth the trouble... just use Linux. Worst case, even running Linux in a virtual machine is better than being that one person who spends half their time figuring out how to do XYZ in Windows, because the instructions will all be written for Linux systems. Also, in physics research you'll probably be writing code that will eventually run on a supercomputer (or, in our terms: a high-performance cluster), so you might as well be running something as similar as possible to the cluster nodes.
Regarding graphics cards, nVidia Quadro is where you want to be (and try to get a good one, if you can afford it). I prefer AMD. I don't *like* nVidia. Unfortunately, being productive doesn't mean getting to use what I *like*. Everybody uses CUDA, which is an nVidia technology, so if you want to be able to test CUDA code, you're going to need an nVidia graphics card. There are different versions/levels of CUDA support; the technical term is "compute capability". You want the most recent one you can get, and I think these come to the Quadro cards before they come to the consumer lines. The Quadro cards also have other features that make developing CUDA code easier, although I forget exactly what they are. I think they're related to debugging. Consumer GeForce cards DO support CUDA, but still try to get Quadro if you can. By the way, recent "GPU equipped" supercomputers usually have nVidia hardware, too. I really hope AMD steps up their game soon, but the fact is, nVidia owns the high-performance GPU computing market right now.
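If you want to check what compute capability a given card actually has, the CUDA runtime can tell you directly. Here's a rough sketch of a device-query program (assumes you have the CUDA toolkit installed so `nvcc` is on your path; the file name is just something I made up):

```
// devquery.cu: list each CUDA device and its compute capability.
// Build with: nvcc devquery.cu -o devquery
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        std::printf("No CUDA-capable device found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, compute capability %d.%d, %zu MiB of memory\n",
                    i, prop.name, prop.major, prop.minor,
                    prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}
```

(The CUDA samples also ship a more thorough `deviceQuery` program that does essentially the same thing.)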
For background info: I personally do computational biophysics research. Yes, I have supercomputers at my disposal, but no, I'm not comfortable using them to test early versions of my code. The on-site supercomputer is CPU-only. I have a workstation that I use for development, which has a quad-core Xeon and an nVidia Tesla card in it (Teslas aren't available in laptops, otherwise I'd recommend that instead). Yes, I reach the computational limits of my workstation's CPU and GPU. That's not hard in computational research. Other types of research will also make heavy use of the processor and GPU... the difference is that you might wait a few minutes, while a computational researcher waits 80 hours for their results.

My laptop is an 8-year-old 17-inch MacBook Pro. Its nVidia GeForce 8600M GT supports CUDA, but not a recent enough compute capability to test code that will run on my workstation or the remote supercomputers. I mainly use the laptop to connect remotely (ssh) to my workstation, which only works well because all of my work is command-line anyway. Speaking of remote supercomputers, I just got a grant that will let me use Oak Ridge National Laboratory's supercomputer, "Titan". You can look it up, but it's got an nVidia Tesla in every one of its thousands of nodes (maybe tens of thousands? I forget). My advisor and I are hoping to get access to Oak Ridge's brand-new "Summit" supercomputer, which will also be running lots of nVidia GPUs. Even if you're not doing computational research or using supercomputers, most research packages support CUDA for GPU acceleration, so it's a good idea to have anyway.
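To give a sense of what "testing early versions locally" means in practice: before anything goes near a cluster, it's small runs of things like this vector-add smoke test (purely illustrative, not anyone's actual research code; the file name is arbitrary):

```
// smoke_test.cu: the kind of tiny kernel you run locally before touching a cluster.
// Build with: nvcc smoke_test.cu -o smoke_test
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Add two vectors element-by-element on the GPU.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // about a million elements
    const size_t bytes = n * sizeof(float);

    float* ha = (float*)std::malloc(bytes);
    float* hb = (float*)std::malloc(bytes);
    float* hc = (float*)std::malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc((void**)&da, bytes);
    cudaMalloc((void**)&db, bytes);
    cudaMalloc((void**)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    // A launch error here is exactly what you'd see if the binary was built for a
    // newer compute capability than the local GPU supports.
    cudaError_t err = cudaGetLastError();
    if (err != cudaSuccess) {
        std::printf("Kernel launch failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    std::printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    std::free(ha); std::free(hb); std::free(hc);
    return 0;
}
```

Once something like that works on the local GPU, the same code (recompiled for the right architecture) moves to the workstation or the cluster.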
Point is: Linux + nVidia Quadro. As for brands? Who knows. My workstation is a Dell. My laptop is a Mac. I bought a Mac way-back-when because I knew it would be a "common" hardware configuration (since there's less variety in Macs). Common hardware means more attention from Linux developers, which in turn means you're more likely to have success running Linux. My next laptop will not be a Mac though, because Linux compatibility has come a long way in 8 years, and I kinda want an ARM processor. My workstation is there when I need to do development or heavy-lifting, and I intend to upgrade that too whenever AMD finally updates their server chipsets.
Extra thoughts: I've had major quality-control issues with HP from when I used to work in IT, with lots of problems on brand-new laptops straight out of the box. I've also had terrible experiences with Acer's customer service. Nothing was wrong with the laptop itself; I needed detailed specifications for a laptop (not mine) that were omitted from the user manual. They wouldn't talk to me unless the laptop was under warranty or I was ordering a replacement part; the only other option was their toll number. I ended up pretending to order a replacement part, asking for its specifications, then hanging up. As for Dell, their business sector is separate from their consumer sector. With the business side I have lots of experience and no complaints (which is a good thing); they met expectations, but didn't go above and beyond much. On the consumer side, replacement parts are generally cheap and easy to find, likely because of Dell's popularity. Most of my repairs were on Dells, but I don't know whether that's because most of the people I knew had Dells or because Dells are unreliable. I have never personally owned a Dell, and my (university-owned) workstation is the first I've ever had to myself for daily use.
That wasn't short. Sorry.
Good luck!