I'm sorry, but this is a stupid idea. You are going to go to great lengths to detect things the patient could just tell you. You certainly aren't going to ask a three-hundred-pound person to stand on your tricorder to take their weight. You'll just ask them.
Medical diagnosis is an algorithm -- it's a software problem. How you get the data is not that important, nor does it require much technology: a thermometer, a scale, a stethoscope, and a blood-pressure cuff are all well-established, inexpensive tools that work. What matters is what you do with that data. In my not-so-humble opinion, there are two things we need to work on: (1) an algorithm for diagnosing visual data -- pictures of ears, nose, throat, and skin -- and (2) an algorithm for parsing a patient's narrative about their ailment against their medical history, family medical history, and recent dietary/athletic/sleep/sex/environmental/social triggers.
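To make the second idea concrete, here's a deliberately toy sketch of what "parsing narrative against history and triggers" might look like at its crudest: score candidate conditions by overlap between the patient's own words, their history, and each condition's profile. The condition names, keywords, and weights are invented placeholders, not real clinical data -- a real system would need NLP and a vetted knowledge base, which is exactly why this is the hard software problem.

```python
# Toy diagnosis-ranking sketch. All condition profiles below are
# made-up illustrations, not medical knowledge.

CONDITIONS = {
    "migraine": {
        "symptoms": {"headache", "nausea", "light"},
        "triggers": {"sleep", "stress", "wine"},
    },
    "food poisoning": {
        "symptoms": {"nausea", "vomiting", "cramps"},
        "triggers": {"restaurant", "leftovers"},
    },
}

def score(narrative: str, history: set[str]) -> list[tuple[str, float]]:
    """Rank conditions by keyword overlap with the patient's narrative
    and known history/trigger factors."""
    words = set(narrative.lower().split())
    ranked = []
    for name, profile in CONDITIONS.items():
        s = 2.0 * len(words & profile["symptoms"])      # symptoms weigh double
        s += 1.0 * len(words & profile["triggers"])     # triggers mentioned in narrative
        s += 1.0 * len(history & profile["triggers"])   # triggers from recorded history
        ranked.append((name, s))
    return sorted(ranked, key=lambda t: -t[1])

print(score("bad headache and nausea after red wine and poor sleep", {"stress"}))
# → [('migraine', 7.0), ('food poisoning', 2.0)]
```

Even this trivial version shows where the real work is: the value isn't in the sensor, it's in the knowledge base and the matching logic.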
Rather than a Swiss Army knife of instruments, we need an iPhone with Siri hooked up to something like IBM's Watson, configured for medical diagnosis. Unless, that is, you know how to fit an MRI, an X-ray machine, a centrifuge, and a DNA lab into a five-pound box.