
Comment: Re:20 generations (Score 1) 222

by perpenso (#48231013) Attached to: High Speed Evolution
Interesting. However, the staged nature of the photos is a non-issue. The photos are not evidence themselves, they are merely illustrative of a concept. As for whether birds or bats are the main predators and whether moths rest in the canopy or on trunks, those are excellent questions to pursue.

Comment: Re:20 generations (Score 1) 222

by perpenso (#48229773) Attached to: High Speed Evolution
I recall a video shown in school where moths evolved from light colored to dark colored and back again fairly quickly, depending on whether the light-colored bark of local trees was covered with dark soot from coal-burning factories. Factory started up, moths changed rather quickly. Factory shut down, moths changed back rather quickly. The moths with the wrong camouflage suffered greater predation from birds.

Comment: Re:Shot in the back (Score 2) 279

Here's the deal.

It's expensive to stay on high alert all the time. All those extra guards, guns, maintenance, etc. That costs money. Up here, after 9/11, we maintained high alert at the bases for a couple of years, then decided to go back to more or less how things were before. Not quite: back in 2000 I could walk onto the base by only flashing my ID, and once I got in by showing a post-it note that said PASS on it. As it stands now, I do require an actual valid pass to get onto the base. However, the security on the base itself is lower than that of my local YMCA. (The base passes are easy to forge and don't get scanned or recorded; the gym requires an active membership and records your entry times.)

What I'm getting at here is that when you're on guard duty at the War Memorial, you're there to be a meet-and-greet kind of soldier. The only shooting you're expecting is some selfies with the kilted guy (meaning you) and maybe a couple of shots at the bar after work. You're not guarding anything. It's a public sculpture that's maybe 50 feet per side. There's literally nothing there to defend. (I've been there a few times; years ago for work I stayed at the Lord Elgin and worked in the next-door building, housing some PW stuff.)

Now, here's the other thing: bullets. You have to track the shit out of them. If you gave the guards at the War Memorial live ammo, it would be a complete clusterfuck. If you're giving someone ammo, you're expecting them to get shot at, right? Which really means they should be wearing armour as well, not the ceremonial dress uniform (which only offers protection against thrown bullets). So you've got to get them armour, bullets, and a real gun, plus track all that stuff from day to day. What if the gun got dropped and discharged? What if you stopped for a picture and someone took your gun or cut themselves on the bayonet? What if the magazine fell out and the ammo sprayed all over the ground? Now the person guarding is presenting the image of a drunkard scrambling around for their car keys in the dark.

Weird scenarios, but all significantly more likely than a schizophrenic walking up to you and shooting you in the back in cold blood on a boring Hump Day morning.

Comment: In our college? (Score 1) 289

by Sycraft-fu (#48225925) Attached to: How Sony, Intel, and Unix Made Apple's Mac a PC Competitor

The big ones I can think of are Cadence SPB, Ansys HFSS, Ansys Fluent, Dassault SolidWorks, Dassault Abaqus, Rocscience RS3D, Agilent ADS, Bentley MicroStation, PTV Vision, Intel Fortran, and Xilinx ISE.

There are more, but those are the ones I can think of off the top of my head that we use the most.

Comment: 50s 60s 70s business deferred costs to "now" ... (Score 1) 662

... Back in the 50s, 60s, early 70s -- before large scale automation and computerization -- businesses had big labor expenses but somehow managed to stay in business ...

Your argument fails due to what is perhaps the most common caveat in statistics and economics: "all other things being equal". There are huge factors that make those decades different: post-WW2 effects, deferred labor costs, etc.

Yes, by deferring employee costs to future decades. For example, a company like General Motors in the 50s, 60s, and early 70s negotiated lower hourly labor rates by offering increased retirement benefits. Basically, the CEOs of the 50s, 60s, and early 70s effectively shifted costs from those decades to, well, "now". This shifted cost was one of the major factors in GM's "recent" near bankruptcy.

The other factor that you failed to consider is that the US emerged from WW2 with not only the only intact manufacturing base but also an expanded and modernized one. Plus a population that did not see their savings, and often their homes and worldly possessions, lost; rather, a population that had been earning good wages during the war and, with no real place to spend their money, had saved it.

So in the US we had a population flush with cash, a huge demand for consumer goods, and no competition. It was a business environment where a company could survive the dumbest practices.

Now add a huge government stimulus as the Marshall Plan helped rebuild Europe and Japan. This created a huge demand for heavy industry goods and services.

This US industrial and manufacturing dominance had a long tail, as it took decades for the former industrial nations to recover from the war. In other words, a lot of the profitability of the 50s, 60s, and early 70s was part of that long tail of the postwar years.

Comment: US had wage and price controls in the 1970s ... (Score 1) 662

Yes and we could also elect a dictator who would set price controls and order stores to sell certain items. It worked great in Venezuela.

No need to go that far. The US instituted wage and price controls in the 1970s in an attempt to fight inflation; it didn't work.

Comment: Silicon Valley is a terrible example ... (Score 1) 662

When the minimum wage went up in San Jose, the downtown pizza parlor raised the per-slice price by $0.25 USD and the per-pie price by $1.00 USD. Business remained steady and the world didn't come to an end. Never mind that states with a higher minimum wage have higher job growth.

San Jose is the largest city in Silicon Valley, third largest city in California, and 10th largest city in the United States.

Silicon Valley is a terrible example to demonstrate the effects of a minimum wage increase and the corresponding increases in local product/service costs. The area is too wealthy, which distorts the reaction to $1 more per pizza.

"The median household income is $90,000, according to the Census Bureau. The average single-family home sells for about $1 million. The airport is adding an $82 million private jet center."
http://www.usatoday.com/story/...

Comment: Re: Am I missing the point? (Score 1) 124

You'd think that would be a standard feature, but apparently it bears special mention.

I miss the older FolderShare and then Live Mesh for that very reason. I think it might have been before "cloud" was a buzzword, when folks still thought about networks and file storage in a traditional way.

SkyDrive came out and I was fine with the giveth, but then came the taketh away. I remember being excited about the Live Framework developer API. The ideas presented don't seem especially innovative at the end of 2014, but they were at the time.

Still, the implementation of those ideas is lacking. I can't use my phone apps on my computer, and my tablet and my phone can have the same app but individual copies of local data. It's rather inconvenient and at times humorous.

Comment: Even more than that (Score 2) 289

by Sycraft-fu (#48218129) Attached to: How Sony, Intel, and Unix Made Apple's Mac a PC Competitor

Want to know a big reason people have been getting Macs, one that Apple doesn't like to admit? You can run Windows on them now. The Intel switch made it viable to run Windows on them, natively if you want, and good virtualization tech means it runs fast inside OS X. That lets people get their shiny status symbol but still use the programs they need.

We've seen that at work (an engineering college). Prior to the Intel conversion, there were almost no Mac users. The thing is, engineering software just isn't written for the Mac. There is actually some stuff now, but even so the vast majority is Windows or Linux. Back in the PPC days, there was almost nothing. So we really had only two stubborn faculty who used Macs: one because he did no research and just played around, and one because he wrote his own code and was stubborn. That was it; you just couldn't do your work on them.

Now? All kinds of faculty and students have Macs. PCs are still dominant, but we see a lot more Macs. However, every one of them has Windows on it. For some, it is all they have. Seriously, we have two guys who buy Macs but have us install Windows on them; they don't use the Mac OS, they just want the shiny toy. A number have Boot Camp, and many have VMware. Regardless, I've yet to see one, faculty, staff, or student, who didn't put Windows on it to be able to do the work they need to.

So that is no small part of how Intel helped Apple gain market share.

Comment: Also (Score 2) 290

by Sycraft-fu (#48210809) Attached to: Will Fiber-To-the-Home Create a New Digital Divide?

Speed matters less with each step up. Going from a modem to broadband is amazing; going from something like 256 kbps DSL to 20 Mbps cable is pretty damn huge; going from 20 Mbps cable to 200 Mbps cable is nice but fairly minor; and going from a few hundred Mbps to a gigabit is hardly noticeable.

I have 150 Mbps cable at home, and I get what I pay for. Games from GOG and Steam download at 18-19 MB/sec. It is fun, and I can download a new game in minutes... however, outside of that I notice little difference from the 30 Mbps connection I stepped up from. Streaming worked just as well before, web surfing was just as fast, etc. The extra speed matters little to none in day-to-day operations.
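Just to put rough numbers on the diminishing returns, here is a back-of-the-envelope sketch. The 20 GB game size is a made-up example, and it ignores protocol overhead, server throttling, and disk speed, so treat the output as ballpark only:

    # Rough download-time arithmetic for a hypothetical 20 GB game at the
    # connection speeds mentioned above.
    game_size_gb = 20  # hypothetical game size, decimal gigabytes

    for mbps in (20, 150, 1000):
        mb_per_sec = mbps / 8                       # megabits/s -> megabytes/s
        seconds = game_size_gb * 1000 / mb_per_sec  # decimal GB -> MB
        print(f"{mbps:>5} Mbps: about {seconds / 60:.0f} minutes")

Roughly two hours at 20 Mbps, under twenty minutes at 150 Mbps, and a few minutes at a gigabit: each step up shaves off less wall-clock time that you actually notice.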

Same thing at work. I'm on a campus and we have some pretty hardcore bandwidth, as campuses often do; so much that it is hard to test, since the testing site is usually the limit. Downloading large stuff, it is nice, though really not that much less time than at home. I don't really mind the difference between a 2-5 minute wait and a 15-20 minute wait for a program. Surfing, streaming, etc. are all 100% the same, no difference at all; speed seems to be limited by waiting for all the DHTML crap on a site to render, not by the data download.

While geeks get all excited about bigger, better, more when it comes to bandwidth, for normal use what matters is just to have "enough", and "enough" turns out to be not all that much. It'll grow with time, of course; higher-rez streaming, larger programs, etc. will demand more bandwidth. But still, this idea that there is a meaningful difference between uber-fast Internet and just regular fast Internet is silly.

It will not create any meaningful divide.

Comment: Am I missing the point? (Score 5, Insightful) 124

They copied some data across a local network. Then they compared that to how long it took to transfer the same data to remote servers across their internet connection. 1.36 GB in 41 seconds is about 33 MB/s, which is either extremely underwhelming for local network performance (I suspect a magnetic hard drive bottleneck) or extremely impressive for a fat internet pipe, and neither has much to do with the software in question.
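For reference, here is the same arithmetic spelled out, plus a comparison against the raw line rate of gigabit Ethernet (assuming decimal gigabytes; the figures are approximate):

    # Throughput implied by the transfer quoted above, and how it compares to
    # the theoretical ceiling of gigabit Ethernet (~125 MB/s before overhead).
    size_gb = 1.36   # amount transferred, decimal gigabytes
    seconds = 41     # time taken

    mb_per_sec = size_gb * 1000 / seconds   # ~33 MB/s
    mbps = mb_per_sec * 8                   # ~265 Mbps
    gigabit_ceiling_mb_per_sec = 1000 / 8   # 125 MB/s line rate

    print(f"{mb_per_sec:.0f} MB/s ({mbps:.0f} Mbps), "
          f"vs {gigabit_ceiling_mb_per_sec:.0f} MB/s for gigabit Ethernet")

About a quarter of what a gigabit LAN can do, which is why a spinning disk (or a very fast internet pipe) is the more likely explanation than the software.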

Comment: Nah looks like an attempt to restrict speech (Score 1) 324

by Sycraft-fu (#48205311) Attached to: Hungary To Tax Internet Traffic

Even in the US such an amount wouldn't be a tax in the sense of raising revenue; that is a lot per GB, even at US income levels. In Hungary, given the lower income levels, it is even more restrictive. It is for sure an attempt to stifle usage, not a legitimate revenue measure.

Comment: I think the way you tune it can be bigger (Score 2) 108

by Sycraft-fu (#48191475) Attached to: Which Android Devices Sacrifice Battery-Life For Performance?

I mean, sure, if you play heavy games a lot then maybe this matters, but most of your use is standby and cell network stuff. I've got my Note 3 lasting 3-4 days on a charge. How?

1) Turning off background services that slurp up battery. It just took some looking at the battery monitor (a rough sketch of doing that over adb is at the end of this comment) and then considering what I needed and didn't.

2) Turning off additional radios like Bluetooth and GPS when I don't need them. It doesn't take long to hit the button if I do, and even when they aren't doing things actively they can sip some juice.

3) Having it on WiFi whenever possible. In good implementations on modern phones it uses less power than the cell network. Work has WiFi and I have a nice AP at home so most of the time it is on WiFi.

4) Using WiFi calling. T-Mobile lets you route voice calls through WiFi. When you do that, it shuts down the cellular radio entirely (except occasionally to check on things) and does all data, text, and voice via WiFi. It uses very little juice, and an hours-long call only takes a bit of battery.

The WiFi calling thing has been really amazing. When you shut down the cellular radios, battery life goes way up. Not just in idle, but in use. Prior to that (when I first got the phone, T-Mobile was having trouble with the feature), standby life was good, though not as good as it is now, but talk time seriously hit the battery. Two to three hours of calls could drain it almost completely. Now? I can do that, no issue, and still have plenty left.
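For anyone curious about step 1, this is roughly what I mean by "looking at the battery monitor", done from a desktop instead of the phone UI. A minimal sketch, assuming the Android SDK's adb tool is on your PATH and USB debugging is enabled; the filtering keywords are just illustrative:

    # Pull Android's per-app/per-radio battery accounting over adb and skim
    # it for the radios discussed above. dumpsys batterystats is a stock
    # Android shell service; this script just captures its output.
    import subprocess

    def battery_stats() -> str:
        result = subprocess.run(
            ["adb", "shell", "dumpsys", "batterystats"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout

    if __name__ == "__main__":
        for line in battery_stats().splitlines():
            # Only print lines mentioning the usual battery-hungry radios.
            if any(k in line.lower() for k in ("wifi", "bluetooth", "gps", "radio")):
                print(line)

The phone's built-in battery screen shows the same accounting in friendlier form; this is just handy if you want to eyeball the raw numbers before deciding what to turn off.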

"Gotcha, you snot-necked weenies!" -- Post Bros. Comics

Working...