Comment Re: Hard to believe (Score 4, Interesting) 166

IE 11 implements W3C standards better than any other browser. WebKit might have more checkmarks on html5test, but those features are not implemented the way the W3C specifies.

CSS3 animations are a good example: Chrome does not render them correctly without hacks.

It is not IE 6 anymore, and Sun and IBM deliberately subverted and changed the proposed standards IE 6 was being developed against. It was not designed to break web pages. Mozilla and Netscape were worse in 2001, believe it or not.

Comment Re: If you hate Change so much...... (Score 1) 516

Skeuomorphic design, with its gloss, shininess, and gradients, is outdated according to the art professors.

Look at iOS 8: the buttons that represent real objects are gone. OS X? Yosemite is flat, with brighter colors and the gradients gone. Android M? Same. Furniture? That too is now minimalist in color and design.

This is the new thing.

Even Chrome's icon no longer looks like 3D plastic circa 2011; it is flat and slightly frosted.

The old way is out of date now. You all whined that skeuomorphism sucks ("look at the leather in the OS X Address Book!"). Well, the art professors heard you. You got it.

Comment Re:Operating at 20W gives zero improvement. (Score 1) 114

AMD was not too bad (before Haswell) if you had a multithreaded workload.

Thanks to the Xbox One and its 8 cores, games will run better on AMD: since many are crappy Xbox ports, they will become more heavily threaded.

For a cheap box to run VM images in VirtualBox/VMware Workstation, edit video, or compile code, AMD offers great value, and the BIOS does not cripple the virtualization extensions, unlike the cheap Intel boards.
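
If you want to check what a given box exposes, here is a minimal sketch (assuming a Linux host; the helper name is mine) that scans /proc/cpuinfo for the VT-x/AMD-V flags. Note it only shows whether the CPU advertises the extension; a locked-down BIOS can still block it, in which case KVM usually reports "disabled by BIOS".

# Minimal sketch, assuming a Linux host: report whether the CPU advertises
# hardware virtualization (vmx = Intel VT-x, svm = AMD-V). A locked-down
# BIOS can still leave the feature unusable even when the flag is present.
def virtualization_flags(cpuinfo_path="/proc/cpuinfo"):
    with open(cpuinfo_path) as f:
        for line in f:
            if line.startswith("flags"):
                flags = set(line.split(":", 1)[1].split())
                return {"vmx": "vmx" in flags, "svm": "svm" in flags}
    return {"vmx": False, "svm": False}

if __name__ == "__main__":
    print(virtualization_flags())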

FYI, I switched to an i7-4770K for my current system, so I am not an AMD fanboy. But I paid through the nose for hyper-threading and 4 cores; I wanted good single-threaded IPC as well.

AMD dropped the ball twice: first by abandoning the superior five-year-old Phenom II, which is still roughly 25% faster per clock tick than their newest parts, and second by selling off their fabrication plants last decade to prop up the share price. They are stuck at 28 nm while Intel is busy moving to 10 nm. How can you compete against that? Worse, GlobalFoundries is more interested in ARM chips because AMD's demand is too low. Ouch.

Even if you love Intel, it is in your best interest for AMD to stick around, for the sake of competition and lower prices.

Comment Re:Operating at 20W gives zero improvement. (Score 1) 114

In SWTOR I got a doubling of FPS from moving from a Phenom II Black Edition to an i7-4770K.

I would be surprised if that were not the case. The i7-4770K came out 5 years after the Phenom II, and a lot happened in that time, including the entire Phenom line being discontinued and succeeded by newer architectures. I'd be more interested in a comparison between the i7-4770K and its 30%-cheaper contemporary, the FX-9590 (naturally, expecting the i7-4770K still to win to some degree if we focus purely on single-thread performance, but is that worth it? Once SWTOR is no longer CPU-bound you wouldn't see any difference between the two at all).

Here is the kicker: the half-decade-old Phenom II is faster per clock cycle than the FX series based on the Bulldozer architecture. AMD really messed up; the design was optimized around its graphics in the hope of winning that way. In other words, those mocking it call it Pentium 4 2.0.
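
To make the per-clock claim concrete, here is a tiny sketch of how a per-clock (IPC-style) comparison works; the scores below are invented for illustration, not real benchmark numbers.

# Hypothetical numbers only: divide a single-threaded score by the clock
# speed to compare architectures independently of frequency.
chips = {
    "Phenom II (3.2 GHz)": {"score": 100, "ghz": 3.2},
    "FX (4.0 GHz)": {"score": 105, "ghz": 4.0},
}

for name, c in chips.items():
    print(f"{name}: {c['score'] / c['ghz']:.1f} points per GHz")

# Even though the FX posts a higher absolute score at a higher clock,
# the older chip comes out ahead once you normalize per GHz.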

Comment Re:Operating at 20W gives zero improvement. (Score 2, Informative) 114

Do you have a link for that? It's not that I disbelieve you; I strongly suspect Intel would do that crap. I'd like to read more about it, though, if you have a link handy, and then stash the link for the next time this benchmark comes up.

Personally, I like the Phoronix Linux benchmarks. They're more meaningful for me since I use Linux, and they're all built with GCC, which is trustworthy.

http://www.phoronix.com/scan.p...

The i7-4770 occasionally blows away the FX-8350 by a factor of 2, but in many benchmarks they're close, and Intel loses a fair fraction. The 4770 is the best overall performer, but not by all that much. It seems that the choice of CPU is fairly workload-dependent.
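
If you want to boil a workload-dependent spread like that down to a single overall number, the usual trick is a geometric mean of the per-benchmark ratios. A quick sketch with invented ratios (not Phoronix data):

import math

# Hypothetical per-benchmark ratios (i7-4770 score / FX-8350 score),
# invented purely to illustrate the summary step.
ratios = [2.0, 1.1, 0.9, 1.3, 0.85, 1.05]

# The geometric mean treats a 2x win and a 0.5x loss symmetrically,
# which is why it is preferred over an arithmetic mean for ratios.
geomean = math.exp(sum(math.log(r) for r in ratios) / len(ratios))
print(f"Overall: {geomean:.2f}x")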

For servers, I still prefer the Supermicro 4-socket Opteron boxes: 64 cores, 512 GB of RAM, 1U. Nice.

The i7-4770K is a fairly high-end Intel chip. I own one, but I would not expect to find one in a sub-$700 box. It is not a Xeon, but it is just one notch down from the $900 Extreme Edition, so it is second from the top among consumer, non-server chips.

Sites like tomshardware.com make it look like a Pentium or an i3 can blow the latest AMD Black Edition out of the water. However, biased or not, my real-world experience says otherwise; many games are optimized for Intel and use NVIDIA-specific DirectX extensions in their studio software, which boils a lot of AMD users' blood, but it is the truth.

In SWTOR I got a doubling of FPS moving from a Phenom II Black Edition to an i7-4770K. True, it has fewer cores, but apps are still optimized for single-threaded work, and I do have 4 real cores plus 4 hyper-threaded ones for my VMs.

The reason is that games are crappy Xbox ports. The 360, I think, was single- or dual-core, so games were single-threaded; therefore they kick ass on Intel. The only good news is that the Xbox One is changing this with its 8 AMD cores, forcing game makers to optimize more for ATI.

I expect newer games to be more competitive on tomshardware.com and other sites as a result, since they are more threaded and ATI-optimized.

But the damage is done, and power is much better with Intel chips as they leave AMD further and further in the dust with smaller process nodes, since AMD sold its foundries. GlobalFoundries only cares about ARM chips, so sorry AMD, stay in 2010 ... Intel is going 10 nm next year and will finally put the last nail in the coffin. ... I pray NOT!

Comment Re:Real Engineering (Score 1) 176

In some places it is illegal to call yourself an engineer if you aren't really one (unlike software "engineers").

Alright, sounds fair. Why can't a consortium or guild provide this certification? For coders who just need something done, there would be programmers, who earn less, and therefore no need for H-1Bs; for critical architects and SOA work on critical projects, you could have certified engineers.

We could have two grades, rather than averaging the two and ending up without enough talent for one use yet too expensive for the other.

Comment Re:I have an H1-B employee (Score 1) 176

Where are you getting the impression that inexperienced kids out of college are regularly getting $70,000 a year to start?

It seems you are just making stuff up to argue against. The odd part is that you would do that to argue against a rising standard of living for some group of working people. Are you the kind of person who wants to tear others down when they are getting more than you? Because that's a more harmful attitude to have for society in general than some businesses trying to pay less for more.

Really? Go look at the people posting here, and at some of the job ads, if you do not believe me that kids start at $60k a year.

Not to sound assholish, but I thought only Indians would be doing IT, so I dropped out of CS to do business. Worst mistake ever! I am happy just to make $50k with a few years of experience, so yes, I do not believe they deserve a raise, and HR and accounting need a way to contain costs.

Only on Slashdot is it claimed that there are starving programmers all making $30k a year thanks to those horrible, greedy H-1B recruiters. I do not see it. I am being honest: I see it as no different from CEOs whining about making less than a million.

What is a good salary for a starting or experienced programmer? $100k a year? Unless they are specialized or own their own companies, programmers should not make that much. Simple: the corporations are just trying to rebalance the market. American programmers make nearly twice as much as graduates of any other major, so I have little sympathy.
