As an actual researcher, let me state that your post has little to no bearing on reality. That is, open-access journals do not prevent an individual or group of individuals from artificially inflating various publication metrics. Moreover, agencies look at much more than those metrics, e.g., research output, research impact, past publication venues, and the number of students who are supported and are expected to graduate under a grant, when deciding how to dole out funding.
All that's missing is some mention of hosts files.
In an age where you can patent a rectangle, is it really about innovation anymore?
I just wanted to notify you, informally, that you've infringed upon my patent that details a process for complaining about patents. I'll make sure that my lawyers send you the appropriate notice paperwork by the end of next week.
Actual research is a wholly unintended side effect of academia. Only naive fools even attempt real research and inevitably fail.
Come tomorrow, I guess I should stop by the Department Chair's office and let him know that he should revoke my endowed scholar position, to say nothing of the positions of my colleagues, as we're all apparently fools.
Mac is also not very stable with heavy applications like photoshop, after effects, 3dsmax, etc.
I chuckled heartily over this, especially considering that Autodesk hasn't released a native 3DS Max binary for OS X.
Good, their work is best done by private contractors anyway.
Private entities rarely, if ever, focus a majority of their efforts on pure research, unlike the national labs. Funding pure research, which is one of the few actions that the US Government at least does halfway correctly, is ultimately essential if we are to advance the state of the art and thus create new fields and products that are ripe for commercialization.
It's also important to remember, as in any major discipline, that mathematics has numerous components, some of which aren't readily applicable to many real-world problems; as such, it could take a fair amount of time to train someone so that they would be able to make a worthwhile contribution.
As one example, I have a friend and colleague who focused entirely on abstract algebraic topics for his research and enrolled in an inordinate number of analysis, topology, and algebra classes whilst eschewing ones deemed more practical, like those dealing with differential equations, optimization, numerical analysis, and applied probability. Despite graduating from an Ivy League institution, not to mention being incredibly smart, he has yet to find employment, as most of his knowledge does not translate well to solutions for any of the burgeoning fields, such as data analysis, computer vision, or robotics/autonomous systems. Consequently, in order to even consider a position out in industry, he's looking at spending the next two years diving into a sea of applied math.
I'm normally not one for coarse language and insults, but, given that the atypical neurogenic tic disorder that the individual suffers from can lead to both life-threatening asphyxia and tachycardia, I would have to say that you are a massively apathetic twat. I hope that you never become afflicted by any debilitating condition, let alone wind up in a similar situation and encounter someone insouciant who denies you access to medicine or necessary sustenance, as I doubt you'd have the fortitude to stand up to your ilk.
Fortunately, your pococurante attitude served some purpose beyond broadcasting your own inadequacies: it spurred me to pledge several thousand dollars for this guy's legal fund.
As you noted, the project was fun to undertake, even though it was only a sub-component of a much larger endeavor. I may yet revisit it and submit an extension as a nice stand-alone article.
To answer your questions, though, I relied on a pool of around seventy subjects, equally distributed across genders and with a tri-modal distribution for age, many of whom were nudists who had heard about the data collection through some friends of mine. I also had a couple of adventurous fellow students and peers sign up to contribute; even my girlfriend at the time had no qualms about being filmed.
In any event, while there was some inherent selection bias in who I chose, mainly because I needed footage of as many different body types as I could capture, so as to allow the underlying model to generalize well, I do admit to being elated whenever people with certain body types were incredibly eager to help. Granted, I did, at the later stages, have to turn some people away, since I was spending too much time acquiring data.
For the experiments themselves, people had multiple options for what to wear during the various training phases, aside from the different changes of loose- and tight-fitting clothes that I'd ask them to bring and don. I did my best to provide unisex body suits of different sizes, which supplied more than sufficient constraints when coupled with manually derived measurements of quantities such as chest circumference, stomach circumference, and so forth. Others opted to strip down to their undergarments, and a fair number, surprisingly many of them women, wore nothing at all.
Regardless of what they wore or didn't wear, each subject executed a series of actions, such as walking, sitting down, standing up, skipping, and climbing, which I recorded using six pairs of stereo vision cameras. I had hoped to use Vicon cameras for the ground truth, but the professor who had them in her lab was aghast at my intended application and barred me from borrowing them, even though they hadn't been turned on in a year or so.
What you proposed isn't that far-fetched, as I ended up having to contrive and implement the equivalent of this, i.e., passive, automated estimation of body shape under clothing, either from a single image or from multiple video frames, for some work I did in action recognition that required a fairly accurate representation of the person's proportions. Others, e.g., A. O. Balan and M. J. Black, "The naked truth: Estimating body shape under clothing," in Proceedings of the European Conference on Computer Vision (ECCV), 2008, pp. 15–29, have come up with solutions too.
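To give a flavor of the estimation problem, here's a minimal toy sketch, not the method from my work or from the cited paper. It leans on one key observation: clothing only ever adds to a silhouette, so the tightest measurement seen across frames bounds the body, and a low-dimensional linear shape basis regularizes the fit. All numbers and the basis itself are made up for illustration.

```python
import numpy as np

# Toy illustration: estimate body widths that lie inside every observed,
# clothed silhouette, then regularize with a low-dimensional linear shape
# basis. The basis and all measurements below are fabricated.

rng = np.random.default_rng(0)

# Hypothetical shape model: mean widths (cm) at 5 body heights + 2 modes.
mean_shape = np.array([30.0, 40.0, 35.0, 45.0, 20.0])
basis = np.array([[ 1.0,  2.0,  1.5,  2.5,  0.5],
                  [-0.5,  1.0, -1.0,  1.5,  0.2]])   # 2 x 5

true_coeffs = np.array([1.2, -0.8])
true_widths = mean_shape + true_coeffs @ basis

# Simulate 10 video frames: clothing only *adds* to the silhouette width.
frames = true_widths + rng.uniform(0.0, 6.0, size=(10, 5))

# Clothing-aware observation: the tightest silhouette seen per measurement.
observed = frames.min(axis=0)

# Least-squares projection of the residual onto the shape basis.
coeffs, *_ = np.linalg.lstsq(basis.T, observed - mean_shape, rcond=None)
estimate = mean_shape + coeffs @ basis

print(np.abs(estimate - true_widths).max())  # residual error, in cm
```

With enough frames and enough pose variety, the per-measurement minimum hugs the true body closely, which is roughly why multi-frame estimates beat single-image ones.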
How long ago? How common were computers?
Unless you are even older then [sic] me, I call bullshit.
I matriculated when I was fourteen, about two years before the turn of the second millennium, and finished both degrees before I turned twenty. Despite starting early, I was far from the youngest graduate, as one of my peers managed to complete an S.B./M.Eng. CS by the time he was sixteen.
In any event, in both my situation and his, not to mention those of others I have encountered, we all had little prior experience dealing with electronics and computers, yet plenty of natural aptitude for and budding interest in the subject. In my case, I was, and still am, fascinated by the possibility of furthering statistical machine vision, and I managed to find the perfect adviser to not only spur my creativity, but also put up with my astounding initial ignorance. In his, he wanted to advance computer graphics and wound up submitting some excellent, now heavily cited papers to SIGGRAPH and Eurographics.
It's hard to believe that he could really be that oblivious to how the real world works.
There are more than a handful of people who grow up in affluence or are sheltered for most of their lives from the denizens of seedy places who might prey on others. Ergo, they have little to draw upon, whether previous experience or tales from their associates, to guide them in such matters.
I know that, in my case, it was not immediately apparent that I was a potential drug-mule target when I was accosted, late one evening, by a buxom, beautiful, crying woman in Ybor City. The only factors that ultimately saved me from helping her were that (i) I had never been anywhere near the Central/South Florida area and hence was lost looking for a sushi restaurant at which I was to meet some fellow research conference attendees and (ii) I was incredibly late due to having canvassed the area on foot several times without finding the restaurant.
If your [sic] going into college and you haven't coded anything yet, give up on CS or EE. You can likely do both, but you will never be really good. You don't love it enough. You better be open to being a better coder though.
For EE I'd raise the bar some more. If you don't already know how to use basic bench equipment don't go into EE.
What a crock of shit. I hadn't programmed or played around with circuits before heading to university, yet I managed to leave, the first time around, with an S.B./S.M. in EECS, sole or first authorship on more than ten top-tier journal papers, a handful of patents, and more than enough money to retire on from having worked at and propped up a start-up company.
For those who might come across HornWumpus' comment, do not, even for a brief moment, feel discouraged. Anyone, regardless of his or her background, can go into EE or CS and make fantastic contributions to either field. All that ultimately matters is finding the right environment to nurture your innate talents, the tenacity to see your ideas come to fruition, and the willingness to learn, even if it takes more than one try.
A 4.0 from MIT might help when securing an interview for Google but most places are more concerned about your ability to reliably deliver. [...]
As an aside, MIT has a 5.0 scale, not a 4.0 one: http://web.mit.edu/registrar/gpacalc.html
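For anyone unfamiliar with it, the GPA there is a units-weighted average on that 5.0 scale. A quick sketch, assuming the usual A=5 through F=0 mapping and one-decimal rounding (see the linked registrar page for the authoritative rules):

```python
# Units-weighted GPA on a 5.0 scale (A=5, B=4, C=3, D=2, F=0).
# Illustrative only; the registrar's calculator is the real reference.
GRADE_POINTS = {"A": 5.0, "B": 4.0, "C": 3.0, "D": 2.0, "F": 0.0}

def gpa(courses):
    """courses: list of (letter_grade, units) tuples."""
    total_units = sum(units for _, units in courses)
    weighted = sum(GRADE_POINTS[grade] * units for grade, units in courses)
    return round(weighted / total_units, 1)

print(gpa([("A", 12), ("B", 12), ("A", 9)]))  # → 4.6
```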
Developers/publishers need to fight back against pre-owned, as game retailers really started to take the piss, and it's really been hurting the people who make the games. [...] This directly hurts publishers and developers, who need the new sales and make no revenue from pre-owned. Publishers have been way to slow and scared to respond, they should have clamped down much earlier.
By this logic, you should be all for contractors demanding and receiving a percentage of the sale price for any building they constructed, car companies forbidding the use of any second-hand vehicle, and all other sorts of wonderful nonsense.