Comment Re:Holy Blinking Cursor, Batman! (Score 1) 230

Indeed, and it didn't need that fsck'ing HARDWARE cursor emulation that the PC needed, either!

(Yes, Hercules, CGA, EGA, and VGA all implemented the text-mode cursor in hardware, blinking included.) VGA (and maybe EGA, I forget) also had a single "sprite" for the "hardware" cursor.

I was coming here to make exactly this point. Cursors used to be hardware sprites that cost no CPU cycles at all. At some point windowing systems took over rendering the cursor, but they still typically used XOR'd sprites to keep things fast and cheap. Then they moved to GPU-accelerated rendering with software fallback paths, and things went downhill from there...
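For anyone who never had to write one of these: the XOR trick works because XOR is its own inverse, so you never have to save the pixels under the cursor. A rough sketch, assuming a linear 8bpp framebuffer and a 16x16 one-bit mask (both just illustrative choices, not any particular system's format):

    #include <stdint.h>

    #define SCREEN_W 320  /* assumed framebuffer width, 1 byte per pixel */

    /* XOR a 16x16 cursor mask (one bit per pixel, one uint16_t per row)
     * into the framebuffer. Call once to show the cursor; calling it
     * again at the same position restores the original pixels exactly.
     * Caller is responsible for keeping the cursor on-screen. */
    void xor_cursor(uint8_t *fb, int x, int y, const uint16_t mask[16])
    {
        for (int row = 0; row < 16; row++) {
            uint8_t *line = fb + (y + row) * SCREEN_W + x;
            for (int col = 0; col < 16; col++) {
                if (mask[row] & (0x8000u >> col))
                    line[col] ^= 0xFF; /* invert the pixel under the cursor */
            }
        }
    }

Moving the cursor is just two calls: one to erase at the old position, one to draw at the new one. No backing store, no compositor.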

I still keep my 80s-era graphics programming books on my shelf as a reminder of how to do fast graphics when all you have is the ability to draw pixels... ;)

-Chris

Comment Interviewers need training, too (Score 5, Interesting) 1001

What I've always found funny about this interview process is that the assumption is always that the interviewer knows the correct answer(s) to the question. It's painful when they don't.

Years ago I interviewed at Google and was asked a question about bit counting (some variation on "given a bit vector, what's the fastest way to count the number of 1s?"). I quickly answered, "well, if your processor has a population count instruction, stream your vector through that and accumulate the result in a register". Having just evaluated bit-counting methods as part of my Ph.D. dissertation, I knew this was the fastest way to do it, assuming the instruction was available (it wasn't on x86 at the time, though POPCNT later arrived with SSE4.2; it was on Power/VMX, and most DSPs support it as well).

After I got a blank stare back from the interviewer, I said, "Oh, you were looking for the lookup table answer". We could have left it at that, but he went on to explain, with some very convoluted logic, how the lookup table would actually be faster than a dedicated instruction and that my answer was clearly wrong. I mentioned a little bit about the number of cycles required for each approach, but he was having none of it. In his mind, I had the wrong answer, even though my second answer was the one he was looking for.
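For the curious, here's a rough sketch of both answers, using the GCC/Clang builtin as a stand-in for the hardware instruction and the usual 256-entry byte table for the lookup answer. Nothing here is benchmarked:

    #include <stdint.h>
    #include <stddef.h>

    /* Hardware answer: one population count per 64-bit word.
     * __builtin_popcountll compiles to a single POPCNT-style
     * instruction on CPUs that have one. */
    size_t count_ones_hw(const uint64_t *v, size_t nwords)
    {
        size_t total = 0;
        for (size_t i = 0; i < nwords; i++)
            total += (size_t)__builtin_popcountll(v[i]);
        return total;
    }

    /* Lookup-table answer: precomputed per-byte bit counts. */
    static uint8_t popcount_table[256];

    void init_popcount_table(void)
    {
        /* table[0] is already 0 (static storage); each entry is the
         * low bit plus the count for the value shifted right by one. */
        for (int i = 1; i < 256; i++)
            popcount_table[i] = (uint8_t)((i & 1) + popcount_table[i / 2]);
    }

    size_t count_ones_lut(const uint8_t *v, size_t nbytes)
    {
        size_t total = 0;
        for (size_t i = 0; i < nbytes; i++)
            total += popcount_table[v[i]];
        return total;
    }

The table version does a dependent memory load per byte; the instruction version chews through eight bytes per op with no memory traffic at all, which was roughly the cycle-count argument I was trying to make.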

It was at that moment that I realized Google was not going to be a good place to work.

-Chris

Comment Re:Don't buy what you can't afford. 3,500 feet, $24 (Score 1) 805

As others are saying, don't live in the Bay Area if you can't afford it. But, if you want housing that's affordable and not too far away, it's not impossible...

There's the whole Central Valley within driving distance of the Bay Area. Sure, a 1-2 hour commute isn't ideal, but with a flexible work schedule and work-from-home options some days of the week, it's totally doable. You can get a nice house with a pool in a small CV town for less than $250k. Hell, in New England, "bedroom communities" are all over the place and feature similar price differences and commute times. (You can even throw in a few hotel nights each month and still come out ahead.)

Fwiw, I grew up in the Central Valley. Day in and day out it's really no different from living anywhere else - you eat, sleep, and work, lather/rinse/repeat. Oh, and you're much closer to the Sierras than you are in the Bay Area, if mountains are your thing. An hour and a half to the slopes beats the 6-12 hours it takes to get to Tahoe from the Bay Area on a weekend.

-Chris

Comment Re:mail.app (Score 1) 216

Of course, even though this is in mail.app, which I use constantly, this is the first I've heard of it.

I wonder how many great features in Apple products people miss simply because Apple refuses to provide sensible documentation and instead relies on users to "discover" features organically or via message boards.

-Chris

Comment Geek repellent! (Score 3, Interesting) 233

So, at the more hardcore geek conferences (Supercomputing comes to mind), there has never really been an issue with booth babes, for a simple reason: geeks are scared to talk to them. Every now and then a company will hire one, only to see a nice exclusion zone form around its booth. Sure, sales guys from other booths will stop by, but none of the intended audience will risk talking to an attractive woman.

Comment Tail wags the dog... (Score 5, Insightful) 293

As a developer/power user who sits at the far end of the bell curve, here's what I see as the folly of Apple's ways.

I switched to Macs after working on a beta version of OS X in the late 90s. Unix plus a sensible desktop was enough to keep me off the Linux train for daily use. That the hardware was well designed and performed well mattered, too. For the next 10 years or so, that held true.

But, in the last 5 years:

- The hardware has stagnated (e.g., I'd really like to buy a Mac mini for my kids, but there's no way I'm shelling out Apple prices for 3-year-old processors).
- New hardware decisions make it difficult to use existing peripherals (music is a hobby - no way am I dropping a few grand on new audio interfaces just because I upgraded my Mac and need to support new ports).
- Apple has ignored sensible design decisions made on the non-Apple side of the world (specifically, touch screens on laptops - my wife has an HP for work and the touch screen is useful; the old studies that claim otherwise are just that, old and dated).
- The OS continues to have a slew of undocumented features that may or may not be useful, but definitely affect performance (the real dig here: just document the features, Apple - I hate discovering things OS X has done for years from random blog posts).
- The iPhone and OS X still don't work well together.

Why does this matter from the perspective of the bell curve and my place on it? Simple: I switched not only my family, but also my company over to Macs. The middle part of that curve was filled by people following people like me into the Mac universe. I'm seriously considering dropping Macs for computer use and (horror of horrors) going back to Windows + Linux. If I go that way, it's just a matter of an upgrade cycle or two before those in my sphere of influence abandon Macs as well.

Apple seems to have forgotten that it was us geeks, the ones who couldn't wait for Linux on the desktop, who helped drive adoption 15 years ago. Kinda like the Democrats forgetting that the working class matters.

-Chris

Comment Data will not save us (Score 1) 635

"The report also calls on the government to keep a close eye on fostering competition in the AI industry, since the companies with the most data will be able to create the most advanced products, effectively preventing new startups from having a chance to even compete."

I call BS on this one. The two companies with arguably the most data anyone has ever accumulated in history are both incapable of producing new products, despite the fact that they know everything about everyone.

Google's only innovation was its advertising platform. It's a cash cow. That cash and the data in its search/mail systems has failed to yield anything new and innovative beyond incremental improvements in search.

Facebook's only innovation was leveraging privilege to build a social network. Remember the early days, when it was limited to Harvard students, then a few other universities, and finally everyone else? That was a brilliant strategy: create artificial scarcity to build demand. They also used that period of limited users to fine-tune the platform into a social network acceptable to a broad user base. Since then, they've made a ton of money and collected a lot of data (granted, mostly people's family pictures and political rants) but haven't done anything innovative.

Innovation will always come from the small disrupters. Both companies made their innovative moves when they were small.

-Chris

Comment Re:We already have one. (Score 2) 635

It won't even be fine for those who own the companies. If the majority of the population has no source of income beyond a basic income provided by the government, the total amount of that basic income effectively caps the size of every market. (To pick round illustrative numbers: if 200 million adults each get $12k a year, total consumer spending tops out around $2.4 trillion, no matter how clever the products are.) To keep the money cycling, businesses will be taxed and the owners will only make modest incomes. Basic math gets in the way here (as it does in a free market, for basically the same reason; the only difference is that corporations redistribute wealth via the market rather than via regulation, limiting the wealth of their customer base and eventually destroying their own source of income).

The parent's point is dead on: "the acquisition of wealth as the primary goal and measure of a person" is the bug in our society, not a feature. Swap that out for a different set of axioms and we can reshape society however we want.

-Chris

Comment Re:Realistic (Score 1) 94

That's where biology becomes problematic. From a convenience perspective, it'd be great to have a small pack of sensors somewhere that let you monitor vitals, especially for people who are sick. The problem is, there's no one place we can put a range of sensors and have them all be accurate enough to be useful. The body is a distributed system and different parts let you measure some things and not others.

Theranos ran into this problem by attempting to perform a wide array of tests on a single drop of blood while ignoring the basic biology behind blood-based measurements. Many of the measurements they claimed to make are of things that occur in low copy numbers in blood, which is why large draws are required to measure them accurately. If something occurs only once per 10M blood cells and you need at least 10 copies to detect it reliably, you'll need at least 100M cells to have a chance at detection. The same holds for most other biological measurements. Sample size and location both matter.
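The arithmetic is just a product, but it's the kind of thing worth making explicit. A back-of-the-envelope sketch using the 10M/10 numbers from the example above:

    #include <stdio.h>
    #include <stdint.h>

    /* Minimum cells you must sample if the analyte shows up once per
     * `cells_per_copy` cells and you need `min_copies` copies for a
     * reliable call. Numbers match the example in the text. */
    int main(void)
    {
        uint64_t cells_per_copy = 10000000;  /* 1 copy per 10M cells */
        uint64_t min_copies     = 10;        /* copies needed to detect */
        uint64_t cells_needed   = min_copies * cells_per_copy;

        printf("cells needed: %llu\n", (unsigned long long)cells_needed);
        return 0;  /* prints 100000000, i.e. 100M cells */
    }

And that's the optimistic version; real assays want margin well above the bare detection minimum, which pushes the required draw even higher.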

-Chris

Comment Re:Sue the CEO (Score 2) 94

Not sure about Pebble specifically, but CEOs at VC-funded companies typically don't have high salaries. Usually the top engineers and salespeople make more than the CEO. The CEO's compensation is deferred in the form of equity, which only turns into cash after an acquisition or other liquidity event. In this case, taking the $740MM offer would have meant a nice payday for the CEO; $40MM probably didn't even make the investors and creditors whole.

Public company and profitable private company CEOs are almost always overpaid, but startup CEOs rarely are.

-Chris

Comment Re:Realistic (Score 2) 94

Probably true for smartwatches, with battery life being the main technology issue that needs to be resolved. Once batteries are better (or power consumption is lower), you'll be able to pack enough processing power and radios into a watch form factor to eliminate the need to carry a phone. Or, for those of us who don't like wearing jewelry, we can carry Zoolander-sized phones. Win-win either way.

Fitness bands, on the other hand, are most likely a fad. People are always looking for silver bullets for weight loss and exercise. There's always a small market among athletes who find the gear improves their training, but the vast majority of these devices sell to consumers who aren't really using them as anything more than a feel-good product. Plus, the science behind fitness bands is mostly bogus. Beyond GPS tracking for pacing, there's not much they can do accurately enough in their form factor; biology, not technology, gets in the way. (E.g., for heart rate monitoring, we already know chest straps are the most accurate option, but most people won't wear one, regardless of what it's tethered to.)

-Chris

Comment Self-reporting is not accurate (Score 4, Insightful) 57

These sites are dangerous. I just went through the process of setting salary ranges for a number of new hires, and the discrepancies between the self-reported sites and the commercial data brokers are fairly large.

As best I can tell, most people reporting their salaries on Glassdoor (for example) are junior people who are inflating their title or experience, rounding up their salary, or both. Also, the higher up you go in titles, the wider the variance. Without information about sample size, it's hard to know whether the range for, say, a CTO in Springfield is really $80k-300k or whether just two people happened to report their salaries (or aspirational salaries).

Self-reported salary sites are simply too easy to game to be reliable. If I wanted to depress salaries in Springfield, I could just submit some carefully designed "employees" to skew the stats. Alternatively, employees appear to be doing exactly that already, in the hope of getting raises.

Once you're out of the "junior" part of your career (say, 5 years in, regardless of your title), you tend to know your market value and what your salary trajectory will be (if not, talk to your co-workers about pay - that's how executives keep their pay high, though they communicate via lawyers, board members, and SEC filings). At that point, you're not going to report to these sites.

Employees and job seekers have ready access to these sites and use the data when negotiating raises. The problem is that HR departments have access to commercial databases compiled from actual pay-stub data. This sets employees up for some awkward conversations when they try to justify a 150% pay increase plus a company Ferrari because someone on Glassdoor claimed that's their compensation.

-Chris
