
Comment Re:agreed (Score 1) 426

No, the main issues with Vista were that, for most of its life, its driver support sucked (it was v1.0 of a new line, what do you expect?), with many really broken drivers out of the gate (because they released the OS way too soon for the hardware manufacturers to be ready), and its broken security model, which incessantly asked users whether they were sure they wanted to let this or that program do something or other.

They fixed both of these issues (for the most part) in Win7, which is why people still want Win7.

Comment Re:income (Score 1) 371

I'm sure that's the case overall. Adding in such managers as fast food and chain store shift managers (as well as project managers in tech) will do that for you. What's the average management salary level for people managers in high-tech? That's what the discussion's about.

I'd peg the average engineer's salary at about where you put it, but the average for managers is going to be in the $120-140K range.

Comment Re:Incentive Bug Finding (Score 1) 331

You already know that insecure use of these languages can lead to serious security breaches throughout the system. We have several methods to deal with this kind of insecurity, but they all cost something: development time, more people, more process, or simply not being able to do certain things. All of which suck.

Honestly, at this point, I really don't see much choice other than putting most of the web on lockdown. We've built our libertarian utopia and, due to the intrusion of the real world, it's become a bit of a crapfest. It's time for us to grow up and actually figure out how to govern the place (or at least parts of it) for the greater benefit of all of us, even at the cost of some of our liberties (and, before you yell "I am BennyF's BFF and he who is willing to give up... blah, blah, blah", I'm hoping this governance would be democratic, representative, and permanent rather than temporary, resulting in a greater enjoyment of this resource for all into the future), rather than letting the whole shebang collapse in a riot of fraud and idiocy.

Comment Re:Obligatory: "There's Plenty of Room at the Bott (Score 1) 151

None of the linked articles even mention Feynman's name.

Why should they? Not many current astrophysics papers mention Galileo, either. Nor do most papers in modern computing reference the work of John von Neumann.

In science, an original idea or suggestion by someone, no matter how famous, is built upon by others, whose work is built upon by still others, until someone actually turns an incomplete idea into a field of study. By that time the literature has evolved to view the problem slightly differently, perhaps more completely, perhaps from a point of view that's more useful for research. And then the papers by those who made these changes become the ones that get referenced. It's the cycle of scientific research. And don't think it's because we've forgotten our roots: if you asked the author of this paper, I'm pretty sure he'd start with either Shannon or Feynman. We leave older references off because they're often not directly relevant to the research at hand, and, frankly, your space is already so limited you don't want to spend any on name checks.

But come on, do you really think a 55 year old paper is going to be at the top of impact rankings when computed against current research in a field moving this fast? And, even if so, isn't it more likely this work has been superseded by others? IT'S BEEN 55 GOD DAMN YEARS, FOR CHRISSAKE!!! I think your hero worship is showing. At least find a more modern reference.

Google

Google Expands Safe Browsing To Block Unwanted Downloads 106

An anonymous reader writes "Google today announced it is expanding its Safe Browsing service to protect users against malware that makes unexpected changes to their computers. Google says it will show a warning in Chrome whenever an attempt is made to trick you into downloading and installing such software. PUA stands for Potentially Unwanted Application, sometimes also called a Potentially Unwanted Program or PUP. In short, these broad terms encompass any download the user does not want, typically because it displays popups, shows ads, installs toolbars in the default browser, changes the homepage or search engine, runs several processes in the background that slow down the PC, and so on."

Comment Re:comparing hypes (Score 1) 98

Interesting indeed! It's almost like a puff piece for them, with an underlying message:

See how well Gartner pushed the "Internet of Things" meme! We took it from nothing to peak hype in only three years! Very efficient for your PR dollar, isn't it? You want to know the "new thing", don't you? Heck! You want to invent the "new thing"! In fact, you have a new thing you're inventing right now, don't you? Well, if we write enough reports for you, your category of new thing will be in the buzz and hype forefront! You'll have investors crawling down your shorts looking for jewels! And they're so inexpensive! Remember - nothing to "hype leader" in three years!

'Cause that's what they do - write reports reinforcing what the industry wants to hear about itself to be used as PR. They do it at all levels, too. I've read many of these things at the product level, too. Basically, discount any sales estimates by about 4/5 and lengthen the time frame of any graph by about 150% and it might be accurate*. Hell, I'd love to be in that business, but even I don't have brass ones big enough to "invent the future" like that.

*Which is an interesting measure in and of itself: how much do you have to distort a graph of any prediction to make it match what actually happened?

Comment Re:Blame HR ...(what about the Recruiters/Agents?) (Score 1) 278

How do the Recruiters/Agents submit their chosen candidate applications to HR?

They don't, if they're smart. They may start there, but they parlay their contacts there into contacts in the engineering department whom they start to contact directly to find out about openings. Really, it's all about networking now from the top down. Positions have become too specialized to allow random people to apply. Chances are your manager also knows enough people who need jobs that he doesn't have to go through HR (except for the final paperwork), anyway.

Comment Why? (Score 1) 278

Because they can be.

People want the jobs, so they use the awful online systems. It's their first step in hating the company they may eventually work for, so it's a head start anyway.

The real issue is that the job-match market is so crowded, you have dozens of competing companies selling crappy SaaS job-app systems that various companies use to auto-sort applicants. The problem is that each of these wants to be the only one in the market and thus doesn't want to share information with any of the others. So you end up having to re-enter and re-edit the same information for each crappy system and each company you want to apply to.

HR departments, job-app sellers? You want to be my friend? You want me to like your company? Here's how:

  • Download my fucking resume from LinkedIn. No I don't like them much, either, but at least it's centralized, accessible, and won't waste my time.
  • No, I don't want to send you a Microsoft Word formatted resume. Frankly, text is much more easily parseable and, because I want a decent-looking resume that looks the same on all screens, I'm using PDFs anyway. There are enough people using non-Microsoft products to create these nowadays (heard of Google Docs?) that requiring resumes in Word format shows that (a) you're locked in the 1990's or (b) you're a recruiting firm too small and cheap to afford the tools that would let you edit a PDF.
  • There have been improvements in matching algorithms such that you don't have to go with Boolean criteria for your filters - use approximate matching and grading cutoffs rather than absolute criteria.
  • Always send an auto-response letting me know my application has been received and is under consideration. If possible, send me a response if I'm no longer under consideration, too. There used to be a thing called common courtesy - if you can't handle that, you're going to look like jerks (and most of you and, by extension, your companies look that way right now).
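To make the approximate-matching point concrete, here's a minimal sketch (in Go, with hypothetical skill names, weights, and cutoff) of scoring a candidate against weighted criteria instead of rejecting on any single missed Boolean keyword:

```go
package main

import (
	"fmt"
	"strings"
)

// score computes a weighted match between a candidate's skills and the
// job's desired skills, rather than rejecting on any one missing keyword.
func score(candidate []string, wanted map[string]float64) float64 {
	have := make(map[string]bool)
	for _, s := range candidate {
		have[strings.ToLower(s)] = true
	}
	var got, total float64
	for skill, weight := range wanted {
		total += weight
		if have[strings.ToLower(skill)] {
			got += weight
		}
	}
	if total == 0 {
		return 0
	}
	return got / total
}

func main() {
	wanted := map[string]float64{"go": 3, "sql": 2, "kubernetes": 1}
	candidate := []string{"Go", "SQL", "Python"}
	s := score(candidate, wanted)
	fmt.Printf("match score: %.2f\n", s) // 5 of 6 weighted points
	if s >= 0.6 { // a grading cutoff, not an all-or-nothing filter
		fmt.Println("pass to human review")
	}
}
```

The point is the cutoff: a candidate missing one "required" keyword still surfaces if they cover most of the weighted criteria, which a pure Boolean filter would never allow.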

For those of you looking - just be aware that it's not personal. The people who make or run these systems are just relatively incompetent fucks. You'd have better luck using networking, anyway.

Comment Re:Wyvern = Wyrm (Score 1) 306

Yes. A company I worked with tried a blah-XL based language, too. The thing was a nightmare. If you didn't have a decent editor handy, you were up shit creek. It was easy to indent poorly, leading to misunderstandings about scope and structure, while simultaneously being syntactically bulky and hard to read. XML is a great notation to hate.

Whatever your language was, if it had anything at all to do with XML syntactically (other than throwing its syntax away), it's probably an example of this: A programmer had the problem of designing a data representation. He chose XML and now he has two problems.

Comment Computing is bigger than any one language! (Score 1) 637

I'm no fan of Java-based curricula, for the same reason I'd be no fan of Fortran-based curricula. Computing isn't about one language. Each language and system shows you one hyperplane of a vast multidimensional space. The best programmers know lots of languages, and choose wisely among them — or even create new ones when appropriate.

In the production world, there are times where some C++ or Java code is appropriate ... and there are times when what you want is a couple of lines of shellscript and some pipes ... and there are times when the most sensible algorithm for something can't be neatly expressed in a language like C++ or Java, and really requires something like Common Lisp or Haskell. If you need to exploit multiple processors without getting bogged down in locking bullshit and race conditions, you're much better off using Go than Java.
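A minimal sketch of what that buys you in Go: each goroutine owns its own slice of the work and reports over a channel, so the channel receive is the only synchronization and there's no explicit locking anywhere.

```go
package main

import "fmt"

// sumRange sends the sum of xs into out. Each goroutine owns its slice,
// so there is no shared mutable state and no locks are needed.
func sumRange(xs []int, out chan<- int) {
	total := 0
	for _, x := range xs {
		total += x
	}
	out <- total
}

func main() {
	xs := make([]int, 1000)
	for i := range xs {
		xs[i] = i + 1 // 1..1000
	}
	out := make(chan int)
	workers := 4
	chunk := len(xs) / workers
	for w := 0; w < workers; w++ {
		go sumRange(xs[w*chunk:(w+1)*chunk], out)
	}
	grand := 0
	for w := 0; w < workers; w++ {
		grand += <-out // the channel receive is the only synchronization
	}
	fmt.Println(grand) // 1+2+...+1000 = 500500
}
```

The equivalent in Java means a thread pool, futures or a shared accumulator behind a lock, and all the race-condition footguns that come with it.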

(Just last night, at a meetup, I was talking with two bright young physicists who reported that their universities don't do a good enough job of teaching Fortran, which is the language they actually need to do their job. Scientific computing still relies heavily on Fortran, Matlab, and other languages well removed from what's trendy in the CS department — no matter if that CS department is in the Java, Haskell, or Python camp. But if you want to learn to write good Fortran, you basically need a mentor in the physics department with time to teach you.)

And there are times when the right thing to do is to create a new language, whether a domain-specific language or a new approach on general-purpose computing. There's a good reason Rob Pike came up with Sawzall, a logs-analysis DSL that compiles to arbitrarily parallel mapreduces; and then Go, a C-like systems language with a rocket engine of concurrency built in.

(And there's a good reason a lot of people adopting Go have been coming not from the C++/Java camps that the Go developers expected, but from Python and Ruby: because Go gives you the raw speed of a concurrent and native-compiled language, plus libraries designed by actual engineers, without a lot of the verbose bullshit of C++ or Java. Would I recommend Go as a first language? I'm not so sure about that ....)

What would an optimal computing curriculum look like? I have no freakin' clue. It would have to cover particular basics — variable binding, iteration, recursion, sequencing, data structures, libraries and APIs, concurrency — no matter what the language. But it can't leave its students thinking that one language is Intuitive and the other ones are Just Gratuitously Weird ... and that's too much of what I see from young programmers in industry today.

Comment Re:Beards and suspenders. (Score 3, Insightful) 637

I can't believe that you can graduate with a CS degree today without at least one assembly language class, which should teach you at least a bit about bit-twiddling and memory management. Not to mention an OS class that would have you do exercises modifying a Linux kernel, which is written in C.
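The kind of bit-twiddling such a class covers fits in a few lines; here's a small sketch (in Go rather than assembly, with a made-up flag name) of the classic trick of clearing the lowest set bit:

```go
package main

import "fmt"

// popcount counts set bits by repeatedly clearing the lowest set bit:
// x & (x-1) clears exactly one bit per iteration. This is the kind of
// idiom an assembly or architecture class drills into you.
func popcount(x uint64) int {
	n := 0
	for x != 0 {
		x &= x - 1
		n++
	}
	return n
}

func main() {
	fmt.Println(popcount(0b1011)) // 3
	fmt.Println(popcount(0xFFFF)) // 16

	const flagWrite = 1 << 1 // a hypothetical permission bit
	perms := uint64(0)
	perms |= flagWrite                // set the bit
	fmt.Println(perms&flagWrite != 0) // test the bit: true
}
```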

What do they actually teach in a CS degree these days? Don't tell me... Gamification, HTML, CSS, and Javascript, right? Do they actually make you take a database or an algorithms class any more?
