A "gigabit" is a perfectly valid measure of data (it's an eighth of a gigabyte), and the pronunciation "jigga" for that particular SI prefix is obscure but not incorrect.
It typically shows up in data transfer rates (gigabits per second), but it can measure other things as well.
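The arithmetic behind the "eighth of a gigabyte" point is easy to sanity-check. Here's a minimal sketch, assuming decimal SI units on both sides (1 gigabit = 10^9 bits, 1 gigabyte = 10^9 bytes); the function name is just illustrative:

```python
BITS_PER_BYTE = 8

def gigabits_to_gigabytes(gigabits: float) -> float:
    """A gigabit is one eighth of a gigabyte (decimal SI units)."""
    return gigabits / BITS_PER_BYTE

# A "gigabit" connection (1 Gbit/s) moves 0.125 GB per second:
print(gigabits_to_gigabytes(1))  # 0.125

# So an 8 GB download over a 1 Gbit/s link takes about
# 8 * 8 = 64 seconds, not 8:
print(8 * BITS_PER_BYTE / 1)  # 64.0
```

This is also why ISPs advertise in gigabits: the number is eight times bigger than the gigabytes-per-second figure most people intuitively expect.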
3D has been used to describe wireframe rendering, polygon rendering, parallax, and about a dozen different technologies.
Every time display technology improves, someone slaps the "3D" label on it. The term is so overloaded as to be meaningless.
I still remember Sony saying the PS3 was going to be 4D with a straight face. That's the sort of ludicrous irony that makes the situation bearable to me.