
Comment: This is what I do now, too. (Score 1) 177

I state up front that I work on my own terms. I have talent to offer and can solve problems that others often can't, but I place a premium on flexibility and on my own health and family. I am incredibly productive, more than many other employees, but I do not offer *maximum productivity*, i.e. "as much as I am humanly able to produce." Even if it seems that I have more to offer (i.e. I leave at 6:30 when everyone else is still working and Skyping me at 11:30 pm, I travel only a couple of times per year and decline to travel 20 times per year, etc.), I am not willing to give this "more" to the organization—it is for my family and my own personal growth.

And both of the phrases I used are things I've been told—"We have doubts about how serious you are; we're interested in someone that's more serious about their career" and "We don't doubt that you're highly skilled and productive, your resume and recommendations are stellar, but we're in a competitive industry and we need highly competitive people, and we're not sure you've got that competitive fire in your belly—that you're really going to be one hundred percent invested in the company and its growth."

I have two friends that have been on the serial startup carousel as founders. Both burned out and moved in other directions because they felt it was impossible to actually have a life, be a human being, and get growth and operating capital support from investors. Each startup became their entire lives each time until positive exit, and at some point each said, "I'm not doing this again, I'm losing my own sense of identity and my family."

And if you take that kind of statement out into the public sphere, I'd bet that what others would say is, "Well, they weren't really made to be entrepreneurs, then; they were destined to burn out because it's not the lifestyle for them."

Which is precisely my point—and it sounds like you've seen it, too—there's a prevailing "wisdom" that "real" career builders or "real" entrepreneurs are a particular "type"—the type that gives every . last . drop . of . blood to the company. The rest? They're just not "cut out for it"—they should "do something else."

Of course, if you're not "cut out" for the job market or for entrepreneurship, it's not quite clear what "else" you ought to be doing to earn a living. There are only so many jobs at nonprofits and in government agencies.

It would be better if society were to take a step back and assume the opposite—that everyone is basically loyal, driven, and productive, but that in general, a healthy person cannot exist without healthy hours, life balance, and relationships. If someone is the "type" to be working from 4:00 am until midnight every day of the week, and double that on holidays to pick up the slack, they are probably in need of counseling or personal development rather than a raise and a promotion. But I suppose that's not how the market works.

Comment: Unhealthy society. Not just in business or tech. (Score 5, Insightful) 177

This isn't just about startups, this is across U.S. society—there is zero work-life balance.

Sure, every other company proclaims how great they are WRT work-life balance, but it's pure bullshit.

During hiring (for employees) and/or funding (for startups), if you give any evidence that you will ever put anything before the company (family, health, whatever, it doesn't matter) in ANY way, or ever draw a line in the sand about hours/commitment at ANY number, you are totally noncompetitive/nonfundable (they won't use these words) and won't be hired/funded. If there is any evidence in your CV, online persona, or history that you have ever done any of these things, you won't be hired/funded.

Even after employment/funding, you have to keep this up. Sure, you may be asked (or even pressed) to "slow down," but it's superficial. The moment you do, positive evaluations/promotions/funding dries up; there is a perception that you're "not serious," "not committed," "not a good risk," or simply "not as capable/investment-worthy" as those *other* supermen/women that work 100+ hours a week (at least) and always put work first.

Yes, they want you to take a break, take care of yourself, and balance your life. But hey, if someone else delivers more value or growth more quickly... Well, they'd be nuts not to go with them instead, and hope you stay healthy in the meantime, all the best.

So, in the interest of your self/family/relationships you try to build a career that precisely demands that in order to keep it, you destroy your self/family/relationships. Depression is easy to fall into when your life will fall apart no matter what you do.

Comment: MORE importantly, in this case the modeling was (Score 4, Insightful) 193

done precisely in order to encourage behavior that would change the inputs to the model.

Nobody looks at cigarettes today and says, "Gosh, nobody smokes anyway and death rates are coming down, there was no need for all that worry, smoke away!"

The whole point of the data in that case (and in this one) was to encourage the world to change behavior (i.e. alter the inputs) to ensure that the modeled outcome didn't occur.

To peer at the originally modeled outcome after the fact and say that it was "wrong" makes no sense.

When we tell a kid, "finish high school or you're going to suffer!" and then they finish high school and don't suffer a decade down the road, we don't say "well, you didn't suffer after all, guess there was no point in you finishing high school!"

That would be silly. As is the idea that the modeling was wrong after the modeling itself led to a change in the behavior being modeled.
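
For the sake of concreteness, here's a toy sketch of that logic in Python (all of the numbers are invented for illustration and have nothing to do with the actual tobacco data):

# A projection is published, people react to it, the input it assumed changes,
# and the projected outcome never materializes. Invented numbers throughout.

def projected_deaths(smoking_rate, population=1_000_000, risk=0.002):
    # deaths attributable to smoking at a given smoking rate
    return population * smoking_rate * risk

assumed_rate = 0.40    # the rate the model assumed would persist
print("Projected if nothing changes:", projected_deaths(assumed_rate))

# The projection itself scares people into quitting; the input changes.
actual_rate = 0.15
print("Observed after behavior changed:", projected_deaths(actual_rate))

# Comparing the two and calling the model "wrong" ignores that the model's
# whole purpose was to change the input it was fed.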

Comment: To clarify, (Score 1) 364

what I mean is that more and more, there is a perception that for the *real* superstars in research, the job is coming up with wild ideas. The two criteria that a great idea must have are that it must be:

1) Radical, and
2) Internally logically and mathematically consistent

So long as these two criteria are met, the rewards seem to roll in. It's a version of postmodernist discourse/textual analysis; it's not about reference to anything outside the text, it's about the beautifully labyrinthine and provocative way in which the text (hypothesis, theory, etc.) is self-referential and self-consistent.

I'm not saying that PhDs and PhD students should be lab monkeys. I'm saying that more and more there's a perception that if you do have to touch a lab, ever, or if you do actually lower yourself to the point of carrying out "empirical research," it means that you're a couldn't-cut-it rather than a "superstar," i.e. a tier B or tier C scholar that has to operate in the realm of meat and matter, rather than in the realm of ideas.

Real "scholars" speculate beautifully about far out stuff in tremendously clever ways—or so goes the thinking, more and more. And I suspect the reason is because that is what is marketable from the attention and media standpoint. That's where the strong "branding" potential, for individual and for institution, lies. If you can get 50 people in a conerence session to yell at you for being an idiot misconstruing all of science and another 25 to call you the second coming of Albert Einstein (or some other prominent thinker), that's 75 more loud and attentive attendees than the guy down the hall with the methodological field innovation got for his session.

And if you can get covered in some national dead tree rag as a "controversial" scholar, your department tends to be thrilled and to put you front-and-center in their marketing materials.

Comment: Not just physics. I see this in a number of fields (Score 4, Interesting) 364

in academics.

The problem is that there is a huge oversupply of Ph.D. and Ph.D. candidate labor for the number of positions available in pure academic research and at academic institutions.

People that offer incremental improvements or work—just lab stuff, just more data, just duplication research, or slight variations to tease out empirical nuances—are a dime a dozen and struggle to differentiate themselves. Real science is often workman-like and laborious.

On the other hand, if young Ph.D. candidates and people weaving their way through the identity-building process that is a Ph.D. focus on conceptual innovation and performances—ideas, narratives, radical departures—then they are seen as doing something "new" and "innovative" (which somehow has become what science is about in popular discourse, and which creeps into academic discourse), and something that sells better in the presses and to the public when the monographs come out, enabling "crossover" works and coverage that is much more lucrative than straight empirical work that gets buried in the journals or small print runs. They also draw more attendees at the conferences, and by virtue of interviewing and appearing more, get more coverage for the institution that hires them, often doing more to drive prestige and enrollments.

I think market forces play into this in a significant way.

Comment: Not entirely. That's what a market does sometimes. (Score 1) 216

But we also have voting, and the voting public chooses leadership, often in part on the basis of precisely the use of policy to incentivize behavior with government funds. Tax breaks for specific kinds of behavior being the most common of these.

This gives the public two ways to encourage people to do/build/invent/fix things: one for individual choices, and one for collective goods, presumably (though people don't often think in these ways) to avoid tragedy-of-the-commons situations.

You were never asked per se, but if you're complaining that Musk is being supported by your tax dollars, then you are a member of said voting public. In such cases, you may or may not have a beef, but we're beyond complaining about Elon Musk there and into the basic tensions of democratic governance, which really aren't about whether Musk is a good guy or a bad guy, but about whether you like collective goods or not, and if so, what system you prefer for choosing them.

That's political science, political philosophy, and/or the fringe of sociology, but has little to do (once again) with this particular article.

Comment: Seems a pointless point to me. What am I missing? (Score 5, Insightful) 216

The reason we use government funding to incentivize things is because we as a public want people to do/build/invent/fix those things and are willing to pay for that to happen.

So Elon Musk comes along and says he will and then he does. And then we pay him what, as a public, we planned to pay (via those incentives) to whoever did them.

Seems like everything is going according to plan, for all involved, and that we're lucky enough to have found something of a one-stop-shop for incentivized work that few others are willing to take on, but that seems to really move the needle on tech progress for something other than consumer electronics gadgets.

Win/win all around. Smells like right-wing paranoia and demagoguery in here to me.

Comment: High-priced is one key. (Score 1) 469

Sure, the hardware was pretty cool, but the prices were just really, really high.

I remember a call to Sun where I was asking about a SunOS license for a used 3/80 I was thinking of buying from the department, and it was going to run me like $3,000 for the OS or $5,000 with development environment or something along those lines, and it was quoted to me as the *academic* price, and no matter which media delivery I wanted I would have had to buy additional hardware to read it, and so on. I mean, that's $5,000 to $8,000+ in today's cash for an already old, low-end Unix system at the time. I remember pleading with the person on the other end of the line to help me brainstorm and find other options, as I was a CS student and needed to be able to do my homework at home and all of that, but of course, they just felt like I was tying up the line—I looked sort of ridiculous from their perspective.

And then they told me that it was really too old to be useful, that I wanted an IPC or an IPX, I forget which, and the prices were well into five digits, again *academic* price for a bare bones configuration. And here I am a broke CS undergrad already struggling to pay $4k/year in tuition to a state school. It was a total non-starter.

Meanwhile, the first purpose-specific Linux box that I assembled (as my Linux excitement grew and I knew I wanted to do a dedicated build) was a 386/40 with 8MB RAM and about 1GB ESDI storage, along with a Tseng ET4000 VGA card. It was all used gear, again bought in surplus channels, but the thing was that it got me beyond what the 3/80 would have provided in terms of performance, and was perfectly serviceable at the time as an X+development box, and it cost me a total of like $200. That was just plausible.

I built a sync converter circuit to connect an old Tektronix 19" fixed-sync color monitor to the VGA port and felt like I had a real, honest-to-god Unix workstation for $300, with a very competent development environment and Emacs, NCSA Mosaic, etc., rather than having had to spend 10x that much for less in the end.

People talk about *BSD, but the driver support on *BSD wasn't as good even a few years later when I looked at it. Linux supported crap hardware in addition to great hardware, which sounds bad until you realize it means that any broke student could scrounge around in the boardbucket and put together a fairly decent Linux system for peanuts.

Comment: To each his own (servers). (Score 1) 469

I didn't have access to that tier of internet service at home at the time—I had UUCP and a Fido tosser, both of which dialed out over a modem once a day for a multi-megabyte sync.

Sun 3/4, HP 9000, and some DEC boxes were all in use in the CS labs at my school in that period, though they were getting old. So I had ftp access there, but there was no removable storage on those machines, no direct way to access them from my dial-up, and my account quota was pretty low on the NFS server, just enough to hold a bunch of C code for class and the binaries it produced, so space was pretty precious. For those reasons, it never occurred to me to ftp or gopher around at school looking for software to take home. Plus lab access was limited and you really wanted to spend your time there doing your homework problems and getting them to compile and run, not dinking around looking for freeware.

Minix I read about on Usenet and tracked down some binaries, IIRC. But it wasn't impressive. Every now and then I'd hear something about *BSD, but it really wasn't clear how to get my hands on it, and nobody could really tell me. The people "in the know" as far as I knew were in the CS department and they were using the commercial unices and knew those pretty well.

In the Fido and Usenet groups I was in, Linux was the thing that turned up often and easily. Maybe I was reading the wrong groups, who knows. There were a hell of a lot of them in those days, if you recall, and it was all pretty noisy.

But Linux seemed then and over the several years that followed to come up over and over again. Others—not so much. It is what it is.

Comment: Single case anecdote. (Score 4, Interesting) 469

I had been trying to afford a Unix installation at home as a CS student. All I knew was the Unix vendors. I was not aware of the social structure of the Unix world, various distributions, etc. I was crawling university surplus lots and calling Sun and DEC on the phone to try to find a complete package that I could afford (hardware + license and media). Nothing was affordable.

I was also a heavy BBS and UUCP user at the time over a dial-up line. One day, I found an upload from someone described as "free Unix." It was Linux.

I downloaded it, installed it on the 80386 hardware I was already using, and the rest is history. This was 1993.

So in my case at least, Linux became the OS of choice because it had traveled in ways that the other free Unices didn't. It was simply available somewhere where I was.

This isn't an explanation for why Linux ended up there instead of some other free *nix, of course, but by way of explaining the social diffusion of the actual files: I saw Linux distros floating around as floppy images on BBSs and newsgroups for several years, with no hint of the others.

For someone with limited network access (by today's standards), this meant that Linux was the obvious choice.

As to why Linux was there and not the others—perhaps packaging and ease of installation had something to do with it? Without much effort, I recognized that the disks were floppy images and wrote out a floppy set. Booted from the first one, and followed my nose. There was no documentation required, and it Just Worked, at least as much as any bare-bones, home-grown CLI *nix clone could be said to Just Work.

I had supported hardware, as it turned out, but then Linux did tend to support the most common commodity hardware at the time.

My hunch is that Linux succeeded because it happened to have the right drivers (developed for what people had in their home PCs, rather than what a university lab might happen to have), and the right packaging (an end-user-oriented install that made it a simple, step-by-step, floppy-by-floppy process to get it up) while the other free *nix systems were less able to go from nothing to system without help and without additional hardware for most home and tiny lab users.

For comparison, I tried Minix around the same time (I can't remember if it was before or after) and struggled mightily just to get it installed, before questions of its capabilities were even at issue. I remember my first Linux install having taken an hour or two, and I was able to get X up and running the same day. It took me much longer to get the disks downloaded and written. Minix, by comparison, took about a week of evenings, and at the end, I was disappointed with the result.

Comment: Re:Rely on the counterfactual. (Score 1) 211

Yes, in practice it's usually a mix of the two, so the principle is more an abstract model than an argument about real, concrete thresholding.

But the general idea is that by the time someone stops being promoted, if they continue in the job that they are in while not being promoted for an extended period of time, it means that they are likely not amongst the highest-merit individuals around for that particular job and responsibility list—because if they were, they'd have been promoted and/or would have moved to another job elsewhere that offered an equivalent to a promotion.

Comment: Rely on the counterfactual. (Score 5, Informative) 211

The best way to understand the principle is to imagine the counterfactual.

When does a person *not* get promoted any longer? When they are not actually that great at the position into which they have most recently been promoted. At that point, they do not demonstrate enough merit to earn the next obvious promotion.

So, the cadence goes:

Demonstrates mastery of title A, promoted to title B.
Demonstrates mastery of title B, promoted to title C.
Demonstrates mastery of title C, promoted to title D.

Does not manage to demonstrate mastery of D = is not promoted and stays at that level indefinitely as "merely adequate" or "maybe next year" or "still has a lot to learn."

That's the principle in a nutshell—when you're actually good at your job, you get promoted out of it. When you're average at your job, you stay there for a long time.
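
If it helps to see the mechanism, here's a toy simulation of that cadence in Python (my own sketch; the level names, mastery probability, and population size are arbitrary, made-up values):

import random

# Each person's competence at each level is an independent random draw, and
# they are promoted only while they keep demonstrating mastery. Everyone below
# the top ends up parked at exactly the level they failed to master.
random.seed(0)

LEVELS = ["A", "B", "C", "D", "E"]
MASTERY = 0.5          # assumed chance of demonstrating mastery at any level
PEOPLE = 10_000

settled_at = {name: 0 for name in LEVELS}

for _ in range(PEOPLE):
    level = 0
    # promoted out of every level they master; stuck at the first one they don't
    while level < len(LEVELS) - 1 and random.random() < MASTERY:
        level += 1
    settled_at[LEVELS[level]] += 1

for name in LEVELS:
    print(name, settled_at[name])

Roughly half the population never leaves level A, a quarter parks at B, and so on; only the handful who master everything reach the top. That's the counterfactual in a nutshell.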

Comment: You are describing engineering on public works (Score 1) 634

projects and in academic research, which are an already tiny, extremely competitive, and ever-smaller part of the general pool of engineering labor.

Most of what engineers do is in the broader consumer economy: engineering objects, systems, etc. that people don't really need, for people who are already amongst the world's wealthy (i.e. consumers in the largest national economies), in order to enrich still wealthier people who don't really need any more enrichment.

I may be a woman underneath it all, because despite being gainfully employed in a high-skill position that makes use of my Ph.D., I can't stand my job, which is all about making stuff with little direct bearing on daily life to help make rich people even richer—yet of course it is taken deadly seriously by everyone in the company, and there is a general disdain for and scoffing at "causes," like say, preventing climate change, expanding human knowledge and capability, or helping to address the massive wealth inequality on the planet.
