Comment Re:Why yes, I would. (Score 1) 209

I'd still probably feel better, at least for the first couple of generations of such a device, if I could hold a remote control in my hand. Maybe 3 switches:
1. Pause - I'm in pain, just hold still for a moment.
2. Safety - in a controlled manner, remove the needle, apply pressure to stop the bleeding, whatever. Basically just return to a safe state.
3. Kill - something is horribly wrong. Remove all power from the system. Yes, the needle might still be in my arm, but for whatever reason, even the safety may not work correctly, so manual human intervention may be needed.

Of course, having those switches (especially the Kill switch) would probably make it less safe, as people would panic and press them when everything is really fine. Maybe give those switches to the machine's operator, I guess.
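The three-switch hierarchy above could be sketched as a tiny state machine (all names are hypothetical; this illustrates the escalation order, not any real device's firmware):

```python
from enum import Enum, auto

class State(Enum):
    RUNNING = auto()   # normal operation
    PAUSED = auto()    # hold still; resuming is possible
    SAFED = auto()     # needle out, pressure applied; controlled but one-way
    KILLED = auto()    # all power removed; manual intervention required

# Hypothetical escalation rule: a switch press can only move the machine
# toward a safer (more conservative) state, never back toward RUNNING.
SEVERITY = {State.RUNNING: 0, State.PAUSED: 1, State.SAFED: 2, State.KILLED: 3}

def press(current: State, requested: State) -> State:
    """A switch press never de-escalates: the safer of the two states wins."""
    return current if SEVERITY[current] >= SEVERITY[requested] else requested

def resume(current: State) -> State:
    """Only a pause is recoverable; Safety and Kill are one-way."""
    return State.RUNNING if current is State.PAUSED else current
```

So a Kill press during a pause still kills, but pressing Pause after Kill does nothing - which is roughly the behavior you'd want if a panicked patient mashes buttons.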

I think people would probably be more comfortable with something like an AR device for phlebotomists; it could use the same data to show them where the best vein is, and how to poke - but the needle is still in the human's hands.

Comment Re:Sigh (Score 1) 445

Fixing light pollution doesn't necessarily mean not having lights at all. For example, my new house has a standard lamp that shines in all directions by the back door. The effect is that it very dimly illuminates everything, but also shines brightly in my eyes so that its illumination is only a small improvement over not having it turned on. A better light design would cast the light downwards so that it's not in my eyes, rather than outwards. It might also be mounted higher, so that the downwards light covers the needed area - or it could be replaced with several smaller lights. Among the benefits of a design like that:
- I probably need to see the ground (or things on or near the ground) more than the fences or treetops that the current light illuminates - by focusing the light on the ground, it will be brighter there rather than wasting its light where I don't need it.
- The light won't shine in my eyes, which will help my night vision.
- The wattage of the light could be reduced, since its output is focused on only the areas that need illumination.
- If most lights in an area get this treatment, then the night sky becomes visible. (I once took a photo in Death Valley of the Milky Way. When I returned to my home in $SUBURBAN_SPRAWL, I took another photo with identical settings (aperture, ISO, shutter speed, etc.) to see what I'd get - the result was a photo that was completely blown out because there was so much light bouncing around in the night sky.)

I couldn't find a good comparison photo with a quick google, but somebody else probably can - there is quite a difference between a yard that is illuminated properly and one that just has a giant floodlight spraying light in every direction.

Comment Re:Benchmarks, trustworthy? (Score 1) 82

Let me start this by saying I'm no fan of Intel - quite frankly, many of their business practices are a little suspect, and they've had some downright nasty ones before (like selling a bundle of CPU + Northbridge for less than the CPU alone, and then saying the OEM buyer violated the agreement if they decided to toss the Northbridge in the garbage in favor of a different manufacturer's chipset). But I don't see a slam-dunk case for antitrust in this alone.

The first reason is that there may actually be technical reasons for doing this; CPUs from Intel and from other vendors each have a variety of errata, and the errata generally do not match up. It may be desirable for Intel to write their compiler to take their own SSE errata into account, while using simpler (x87) support for other vendors to avoid dealing with their errata. That's just an example, but there could be some totally legitimate reasons for doing what they do (while also, of course, helping them out by making everything else slower).
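The kind of runtime dispatch at issue can be sketched like this (a simulation in Python, since the real check lives in compiler-generated startup code; the vendor strings are real CPUID values, everything else here is hypothetical):

```python
# Hypothetical model of a compiler's runtime dispatcher: at program startup
# it inspects the CPU and picks a code path. The controversy is that some
# dispatchers keyed on the CPUID *vendor string* rather than on feature flags.

def pick_code_path(vendor: str, has_sse2: bool) -> str:
    # Vendor-keyed dispatch: only "GenuineIntel" parts get the fast path,
    # even if another vendor's CPU reports SSE2 support.
    if vendor == "GenuineIntel" and has_sse2:
        return "sse2"
    return "x87"

def pick_code_path_fair(vendor: str, has_sse2: bool) -> str:
    # Feature-keyed dispatch: any CPU advertising SSE2 gets the fast path.
    return "sse2" if has_sse2 else "x87"
```

An AMD CPU with SSE2 gets the slow path under the first scheme and the fast path under the second - which is exactly the asymmetry the benchmarks expose, though per the errata argument above, Intel could claim the conservative fallback is deliberate.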

A second reason is that just writing a shitty compiler that only works well on your own products is not antitrust by itself, no matter how easy adding support for other products would be. I think you might have a case for antitrust if their compiler had a strong position over the market and there wasn't other good competition out there. As far as I know (which is not very far at all), that isn't the case - GCC is widely used. Companies that choose to purchase ICC may choose it because they only expect to use Intel CPUs, and are willing to pay extra for better performance on them.

Finally, is there any reason why one could not use both compilers? I'm not exactly a compilation infrastructure expert, but I would expect that it would be possible to compile the same code with different compilers and put it into the same binary. I believe this is what GCC was doing (albeit with only a single compiler, but still two distinct assembly code paths) when I wrote PowerPC code and asked GCC to optimize it for both the PPC 7400 and the PPC 970. (I could be wrong - maybe it's only using optimizations available for both - but in any case, this shouldn't be an intractable problem.)

Now if Intel uses their CPU stronghold to sell their compiler, AND they get a stronghold in the compiler business, AND they use their compiler to ensure that their CPU stronghold is maintained/strengthened, I think that'd be a more reasonable case for calling in the antitrust goons. For now, though, I think they're still missing that middle link.

Comment Re:Benchmarks, trustworthy? (Score 4, Insightful) 82

To be fair, any use of a canned benchmark to judge which system to buy is pretty silly. The best benchmark you can make is something that is identical to your intended workload; e.g., play a game or use an application on several systems, and see which feels better to you.

Taking some code written in a high-level language and compiling it for a platform is a great benchmark - if that's what you're going to be doing with the system. But you'd better be using the compiler you'll be using on the system. If you need a free compiler, you should test GCC on both. If you are considering buying Intel's compiler (it's not free, is it?), then add it in as another test to see if it's worth the extra outlay of cash. Intel puts a lot of work into making compilers very good on its systems, so if you're going to use the Intel compilers for Intel systems, it's perfectly valid to compare that against using GCC on an ARM platform, if that's what you'd be using on ARM.

But if most of what you're running will be compiled in GCC for either platform, yes, you should absolutely test GCC on both.

That said, much of what's noted isn't necessarily intentional wrongdoing. For the example of breaking functionality, it's quite possible that the compiler made a perfectly valid optimization to get rid of 31 of the 32 loop iterations. One of my professors once told a story about how he wrote a benchmark, and upon compiling it, found that he was getting some unbelievably fast results. As in literally unbelievable - upon investigation, he discovered that the main loop of the benchmark had been completely optimized away, because the loop was producing no externally visible results. (As an example, if the loop were to do "add r3 = r2, r1" 32 times, a good compiler could certainly optimize that down to a single iteration; as long as r2 and r1 are unchanging, you only need to do the add once. Similarly, even if r1 and r2 were changing on each iteration, you would need to use the result in r3 from each iteration of the loop - otherwise the compiler could reduce the loop to only its final iteration, pre-computing the values that would be in r2 and r1 at that point.)
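The register example can be sketched in Python (function names hypothetical; a real compiler performs this transformation on machine code, not source, but the arithmetic equivalence is the same):

```python
def naive(r1: int, r2: int) -> int:
    # The benchmark loop as written: the same add, 32 times over.
    r3 = 0
    for _ in range(32):
        r3 = r2 + r1          # operands never change, so neither does r3
    return r3

def optimized(r1: int, r2: int) -> int:
    # What the optimizer is entitled to emit instead: a single add.
    # If r3 were never used at all, it could emit nothing whatsoever -
    # which is exactly what happened to the professor's benchmark.
    return r2 + r1
```

Both versions return identical results for any inputs, so the transformation is valid - and a benchmark timing the first version is really timing the second.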

So perhaps it's a bad benchmark - but I wouldn't default to calling it malicious, just that the benchmark isn't measuring what you might want it to measure. And quite frankly, most users aren't going to be doing anything that even vaguely resembles a benchmark anyway, so they really have little justification to make a buying decision based on them.

Comment Safeguards to protect privacy (Score 3, Interesting) 133

The UN chief says that appropriate safeguards are needed to protect privacy - well, they WERE doing a great job... until Snowden came around.

Think about it - what better way to protect your privacy than by not even telling you that they're invading it? If neither you nor anybody else in the public knows that your privacy has been violated, then obviously it hasn't been, because it's being kept private!

Then Edward Snowden came along and ruined the whole thing - simply knowing that our privacy has been violated means that it IS being violated. If it weren't for him, all our data would still be safely kept private (in the hands of the NSA).

Comment Re:Poor premise (Score 3, Insightful) 229

+1 to this...
Intel has great foundries and process engineers, and they have been pretty consistently ahead of TSMC and other foundries. There are also a million reasons to NOT use Intel. For one, there is no way Apple will ever be Customer #1 at Intel - Intel will always always ALWAYS be customer #1 at their own fabs. If there's limited capacity, Apple would lose out to Intel. TSMC might not be willing to put Apple on a pedestal over all their other customers, but they at least won't be 2nd place to anybody - in a limited-capacity situation, Apple would get a fair share of some sort, rather than zero.

There's also an argument to be made for spreading the wealth around; Intel got their leadership position because everybody bought CPUs from them, giving them huge piles of cash to invest in R&D, making it hard for everybody else (eg AMD) to compete because they don't have the process advantage that Intel does.

Also, TSMC isn't a competitor to Apple, but Intel is trying to be one with their mobile chips. TSMC sells fab space to whoever wants it, but doesn't design any chips of its own or sell any devices. Intel isn't quite a direct competitor with Apple, but there may be some desire not to give them any more profits that could be used to fund R&D of mobile chips/devices that could be used by Apple's competitors. The revenue TSMC earns will go into further process R&D, since that's their only business.

So there are all kinds of reasons to not use Intel for fab, even assuming that they would offer it to Apple.

Comment Re:Accept the difficulty ahead (Score 2) 207

Agreed, the best way to go is some place that is at the intersection of your skill set. If you compete against other candidates where the requirement (and your qualifications) consist of just "Electrical Engineer", you'll lose to somebody with more (or more recent) experience. But if the requirements include knowledge of flight systems, the military, piloting, etc. as well as Electrical Engineer, then you have a chance to stand out in a pile of resumes. All the same, don't forget to brush up on your EE fundamentals (at least the ones relevant to the type of job you're applying for) before you start interviewing.

Having seen them at a college career fair a number of years ago, I'd imagine that a place like Astronautics Corporation of America (avionics manufacturer) would be interested in your skill set. There are probably numerous companies and contractors around the country that do similar things; the trick is finding them, and then selling yourself with a resume tailored to their needs and your strengths.

Comment Re:Genius judge (Score 4, Interesting) 540

Internships are even flourishing in US industries where paid internships are the norm. I'm in Engineering, and I've never heard of anybody doing an unpaid internship. My alma mater's current statistics say interns in my field from the past year earned between $13 and $38 per hour, and $20/hour on average. (Full-time work after graduation pays $20-$53/hour, $35 on average.)

Supply and demand factors a lot into this - good engineers are usually in demand, and there are many companies that will pay top dollar for both interns and full-time workers. In many industries, though, there is an excess supply of workers relative to jobs. This is how you end up with newspapers that have unpaid internships for journalism students - there are so many people that can do the job that they'll work for free. Similarly, you get people that are caught up in the aura of the silver screen; they want to be big-time actors or movie producers, and they see that unpaid internship as their ticket in; but there are far too many of them for far too few jobs. (Especially if you count the labor pool that isn't lured to that particular industry, but is just generally qualified for that line of work - such as fetching lunch and coffee, answering phones, and assembling office furniture, as these interns did. Seems like an appropriate use for an MBA...)

I think the judge's decision as summarized above makes sense - if they're doing real work, they deserve at least minimum wage. If you just want to run them through training classes and exercises, then by all means, they can do no work for no pay.

The reason to hire paid interns is this: they get some money and experience. You get a worker that costs less, and a trial period to see if you like them. If they perform very well, you invite them back for another internship (if they're still in school) or a full-time job (if they're almost done), and bring their increased experience with them. If they perform poorly, then you know not to hire them again. This process is far easier for the employer than hiring somebody only to find out they stink, and then firing them. A sizable percentage of my employer's full-time workers started as interns; and a sizable percentage of interns are invited back for full-time positions.

Comment Re:"War against jailbreaking?" (Score 2) 321

Agreed, this. I once talked to an Apple engineer who works on security; this was the whole reason to plug the holes found by jailbreakers. After all, if you can visit a website that gives you root, you could visit a website that gives Sergei in eastern Russia root too. He could steal your saved passwords, or make collect calls, or send spam, or do thousands of other things to earn some quick money once he has control of your device. The jailbreakers just provide Apple with a convenient security testing service for free.

Comment Re:Seems an unnecessary feature (Score 1) 398

I'm not sure I see the distinction of why keyless start is "irritating and stupid". It doesn't matter if it's keyless or keyed - if you leave the key in the car, or leave the car running, it can be stolen. If you turn the car off and take the key with you, you're good. Assuming you don't want somebody to steal everything out of your car, you need to lock it anyway.

Or maybe you're distinguishing something with keyless start from keyless entry? I'm not really sure. In any case, my car does both, and it's great. To unlock the door, I grab the handle and pull. To turn the car on, I press the Start button. Having to use a key isn't usually a big deal (unless you're carrying a couple of bags of groceries and don't have a hand free to dig through your pocket/purse), but it sure is nice to not have to bother with one.
