Anyone who thinks a machine with 16G of RAM is a _commodity PC_ should see a doctor.
The most important question for scaling is whether you can keep the base requirements within _commodity_ limits as the data set grows. If 16G of RAM is sufficient _now_, what happens when the data set doubles? Can you just add more 16G nodes, or will you have to ask for 32G or 64G nodes? Can you cap the amount of data each node works with? There is a HUGE difference between adding four 16G nodes and a single 64G one. And is memory the only requirement?
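To make the scale-out vs. scale-up trade-off concrete, here's a rough back-of-the-envelope sketch. All numbers (48G of hot data, 25% RAM overhead for the OS and the engine) are made-up assumptions for illustration, not anyone's actual requirements:

```python
import math

def nodes_needed(dataset_gb, node_ram_gb, overhead=0.25):
    """Nodes required if each node must hold its shard in RAM,
    reserving `overhead` of its RAM for the OS and the engine."""
    usable = node_ram_gb * (1 - overhead)
    return math.ceil(dataset_gb / usable)

# Today: 48G of hot data on 16G nodes (12G usable each).
print(nodes_needed(48, 16))   # -> 4 nodes

# Data doubles to 96G: scale out on commodity 16G boxes...
print(nodes_needed(96, 16))   # -> 8 nodes

# ...or scale up to 64G boxes (48G usable each).
print(nodes_needed(96, 64))   # -> 2 nodes
```

The point: if the engine can shard freely, doubling the data just doubles the node count at commodity prices; if it can't, you're forced into ever-bigger (and disproportionately pricier) single machines.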