Comment Re: Stock ROMs are shit (Score 1) 211

And you bought that BS? They want you to use their online storage services. Period. All of Google's hardware is designed to coax you into using their online services by limiting the local storage, which saves them pennies on the hardware but gives their cloud services more exposure. Why else would they be the ONLY Android and Chromebook manufacturer that omits such a widely used storage expansion port?

Thankfully, with an OTG cable, most of their phones and tablets can use flash drives. I'm a Nexus 7 tablet owner... love the device, though a bit sad it's reached end of service life. Hope other ROM makers will support it now that Google has relegated it to the dustbin.

Comment Re:already exceeding expectations (Score 2) 1544

Most of us in the USA are glad Hillary isn't president (including some who voted for her), but we aren't very happy about Trump being president either. We're stuck with a two-party system and mostly pick the lesser of two evils.

Clinton's voting history has been gung-ho for wars, so I can't argue with you there. In that arena, Trump is very likely the better president for foreign relations. He's also crass, likely to insult our enemies and allies alike, and prone to a bit of bullying, but I don't think he's the sort of president who wants to start WWIII... or even spend as much as we currently do on wars in the Middle East.

Trump is mostly dangerous on the domestic side -- he has no concept of how macroeconomics works or how international trade and tariffs affect economies. His policies would likely lead us into a recession and/or a trade war... but I think he's finally beginning to wise up to that as he backs off from his campaign promises.

Comment Re:Apple also lost education to Chrome OS (Score 1) 228

One doesn't compete in education for the money. It's to introduce the next generation to your walled garden so they'll go home and buy compatible products and eventually use them at home, work, and play -- because they learned your platform early on, everything works together... and switching is a pain.

Apple has often given very steep educational discounts. They'd be smart to just donate computers to schools and universities and take the tax deduction... especially for unsold inventory before the next models come out.

Comment Re:Funny thing about 8K... (Score 1) 192

Not really. For 20/20 vision ("good" eyesight), the limit is closer to 5K, so almost everyone will notice the difference from 4K to 8K because it surpasses that 5K threshold. But that's not the limit of human eyesight: those of us with 20/10 vision or better can discern up to 11K or more. Lots of pilots have "eagle eye" vision in the 20/10-or-better range, and laser eye surgery can also get you past 20/10.

You can read up on a decent article about it here:
http://www.red.com/learn/red-1...

My bet is it'll hit a barrier at 16K where no one will be able to appreciate anything higher, but we aren't there yet.
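To put rough numbers on the acuity claims above, here's a back-of-the-envelope sketch (not the RED article's math, which uses its own viewing-distance assumptions). It assumes 20/20 vision resolves roughly 1 arcminute per pixel, 20/10 roughly 0.5 arcminute, and that the screen fills the stated horizontal field of view:

```python
# Back-of-the-envelope only: horizontal pixels a viewer could resolve,
# assuming the screen spans `fov_degrees` of horizontal field of view and
# the eye resolves `acuity_arcmin` arcminutes per pixel.
def discernible_horizontal_pixels(acuity_arcmin, fov_degrees):
    return round(fov_degrees * 60 / acuity_arcmin)  # 60 arcminutes per degree

for label, acuity in [("20/20", 1.0), ("20/10", 0.5)]:
    for fov in (30, 60, 90):
        px = discernible_horizontal_pixels(acuity, fov)
        print(f"{label} vision, {fov:2d} deg FOV: ~{px} horizontal pixels")
```

With a screen filling roughly 90 degrees of view, those assumptions land near the 5K (20/20) and 11K (20/10) figures above.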

Comment Re:I'm sure there's a reason... (Score 2) 192

Binocular human vision is only useful for judging depth out to about 6 meters (roughly 20 ft). Beyond that, objects appear in nearly the same position to both eyes, so the brain can't gauge distance from parallax and relies on other cues instead. The average distance between human eyes is only about 6 cm, and at more than 20 ft the angular difference between the two views is too small for the brain to register. That's partly why a pier in the distance always looks close enough to walk to... and then, 30 minutes of walking down the beach later, it's still far away.
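A rough sketch of the geometry, assuming a 6 cm interpupillary distance: the angular difference between the two eyes' lines of sight falls off quickly with distance.

```python
import math

# Parallax (vergence) angle between the two eyes' lines of sight when
# fixating a point `distance_m` away, assuming eyes `ipd_m` apart (~6 cm).
def parallax_arcmin(distance_m, ipd_m=0.06):
    angle_rad = 2 * math.atan((ipd_m / 2) / distance_m)
    return math.degrees(angle_rad) * 60  # degrees -> arcminutes

for d in (0.5, 2, 6, 20, 100):
    print(f"{d:6.1f} m: ~{parallax_arcmin(d):6.1f} arcminutes")
```

Past 6 m the angle is already down to tens of arcminutes, and the difference between, say, 50 m and 500 m is only a few arcminutes, which is why the brain leans on other cues at that range.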

Comment Re:Get ready for more glued-in/soldiered on parts (Score 1) 71

One could, but why? The entire supply chain is also in Asia. Why ship all the parts to the USA just to assemble a finished product instead of shipping the final, tested product? It's easier to work with local supply chains on just-in-time orders and defect returns than to deal with international shipping. Ocean freight from China to the USA takes weeks, and the full reorder lead time can stretch into months. If there's a problem with a shipment (parts rusted in transit, banged-up casings, etc.), do you really want to idle your factory waiting on the next one, or pay through the nose for air freight? You could build a warehouse and stock it with months of parts inventory... but that becomes a liability when the higher-ups change models and make your inventory worthless, or when a hurricane, earthquake, or flood damages it.

Unless your plan includes moving the entire supply chain to the USA -- screws, aluminum sheeting, camera lenses, RAM, CPUs, chipsets, and all -- it's a bad plan. That would be a mighty feat considering the foundries are all in Asia, they cost a fortune to build, and Asia has fewer regulations, cheaper land, and still-cheaper labor. You could source some of the parts from companies already in the USA, but not at the prices and volumes you can get in Asia. The most important parts couldn't be built in the USA without building new chip foundries.

//used to work in international supply chain management -- trust me, you don't want to even bother with this idea.

Comment Re:Most depressing thing I've read all week (Score 5, Insightful) 139

This is true, but the CPU isn't the bottleneck in your examples. For user input (especially games), the user is the bottleneck. Games benefit from parallelization mostly for rendering graphics; the game logic isn't the bottleneck, and the latency of the response to user input is imperceptible to a human. Most of the time, the RAM and CPU are waiting on the human and already have everything loaded. If a choice requires loading a different zone, the game can even predict which zone is likely, pre-load it before the input arrives, and dump it if the prediction was wrong (a rough sketch of the idea follows). Even then, it's the disk I/O that's the bottleneck, not the CPU.
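As a minimal sketch of that pre-loading idea (the function and file names here are hypothetical, not any real engine's API), the slow disk read runs on a worker thread while the game is still waiting on the player:

```python
import concurrent.futures

executor = concurrent.futures.ThreadPoolExecutor(max_workers=1)

def load_zone(zone_name):
    # Stand-in for the slow part: streaming zone assets from disk.
    with open(f"assets/{zone_name}.pak", "rb") as f:
        return f.read()

def on_player_near_exit(predicted_zone):
    # Kick off the load for the most likely next zone before the player commits.
    return executor.submit(load_zone, predicted_zone)

def on_player_chose_exit(chosen_zone, predicted_zone, speculative_load):
    if chosen_zone == predicted_zone:
        return speculative_load.result()  # right guess: data is already (mostly) loaded
    speculative_load.cancel()             # wrong guess: discard the speculative work
    return load_zone(chosen_zone)         # ...and pay the full load cost now
```

Either way, the time goes to the disk read, not to the CPU doing the predicting.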

As for databases, the biggest bottleneck is the storage medium. Depending on the database and how it's partitioned, you can even run many transactions against it simultaneously, so long as they don't touch the same tables. Ramping up the CPU clock does little to nothing if the I/O to the database's storage is slow, because the database won't unlock a region for the next transaction until the last one has been written at least to a buffer, if not to the final storage medium.

For that example, the best way to improve DB performance is to add RAM, add cache, and increase the clock speed of both... if possible, let entire tables, or even the whole database, live in RAM and only write to disk periodically as a save state. Even DDR4-2400 runs its clock at around 1.2 GHz; with transfers on both the rising and falling edges, that's effectively 2400 MT/s. What is your 20 GHz CPU going to do with the eight-odd cycles between every transfer slot, let alone the thousand-plus cycles a trip out to DRAM would cost it? Current Intel CPUs have deep pipelines, and even with a sizable, faster cache, such a core would choke on RAM latency... especially on large sequential database transactions. RAM is already hot enough to fry eggs on, so it'll take the next memory technology before we see real gains there. Maybe in a year or two.
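Putting rough numbers on that (the 20 GHz core is hypothetical, and the ~80 ns DRAM random-access latency is a ballpark figure, not a spec):

```python
# Rough arithmetic only.
CPU_HZ = 20e9                   # hypothetical 20 GHz core
DDR4_TRANSFERS_PER_SEC = 2.4e9  # DDR4-2400: 1.2 GHz clock, double data rate
DRAM_LATENCY_SEC = 80e-9        # ~80 ns random access (assumed ballpark)

print(f"CPU cycles per DRAM transfer slot: ~{CPU_HZ / DDR4_TRANSFERS_PER_SEC:.0f}")
print(f"CPU cycles stalled on one trip to DRAM: ~{CPU_HZ * DRAM_LATENCY_SEC:.0f}")
```

Caches hide most of that, but a miss-heavy sequential database workload would leave such a core idle most of the time.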

I'm curious what exactly you'd want to run at 20 GHz through general-purpose CPU registers that couldn't be done better and faster with specialized hardware extensions. For instance, x265/HEVC video playback can push a CPU to nearly 100% usage, yet with dedicated H.265 decoding hardware the CPU barely breaks 1% playing the same video on a similar architecture and clock speed. If you have a single thread that needs to run repetitively at very high speed, you'd be better served by an FPGA or other dedicated hardware for whatever you're trying to do than by a general-purpose CPU.

Comment Re:Most depressing thing I've read all week (Score 4, Interesting) 139

Intel gave up on increasing clock speeds way back when they hit 4 GHz. They hit a wall, they're done, and I wouldn't expect them to revisit it; that's when they went multi-core. Every computer does better with dual core over single core, and most do better with quad core than dual core (even if a single program isn't written for multiple cores, different programs can be assigned to different cores). With VR tech and GPUs sharing the die, multi-core is likely to remain the area of development for some time. As always, expect new physics and graphics extensions as well as codecs.

Multi-core means managing the power and speed of each core individually, letting some power down while ramping up one or two, to keep the thermal and power envelopes within tolerance. The biggest metric for Intel is performance per watt, since data centers worry about power usage for both the machines and the air conditioning.

I don't think there's enough of a market of enthusiasts wanting 20 GHz clock speeds for Intel to bother researching new materials to pull that off... assuming it's even possible without extreme cooling.

Comment Re:I prefer regulations that promote safe operatio (Score 2) 150

Trust me when I tell you the FAA is way, way worse. I had family running a small business doing aerial photography with drones over a decade ago, back when the FAA had no rules at all about drones. The person piloting the drones was an actual airplane pilot and filed a flight plan for every flight.

We're talking cutting-edge helicopter drones with high-end cameras and zero regulation. Business was booming -- real estate agencies contracted with them, and police agencies used them for tracking fugitives. YET the local competition, full-sized-aircraft photographers, complained to the FAA constantly (and lied about locations, times, and flight plans in their complaints). The FAA dragged its feet for YEARS on creating the regulations my family actually wanted, so they could show they were operating within them (since there weren't any rules at all to go by). Instead of creating regs, the FAA basically imposed a moratorium on flying drones for anyone without military clearance until it could write them. So my family members lost their business to FAA threats, allegations, and eventually the moratorium.

Now those family members fly drones for a military contractor... again cutting-edge stuff, only this time top secret. Things ended up working out fine, but if not for the FAA's incompetence and poor regulation of an emerging industry, my family could have been franchising aerial video and photography services long before it became so common that anyone could fly a drone without any aviation experience.

Comment Re:Pretty much (Score 4, Informative) 113

AMD began as a second-source supplier of Intel chips. Every time they improve the x86 architecture, they end up cross-licensing the improvements with Intel in exchange for Intel's. AMD has pulled ahead twice in its history, and both times Intel crushed them so badly they almost didn't recover -- once through illegal market pressure, and the second time by revamping its CPUs to blow AMD out of the water on specs. Intel now has AMD's 64-bit tech.

Buying ATI gave AMD a market where it could actually compete and maybe even succeed. NVIDIA is solid competition there, but nowhere near what Intel is on the CPU side. AMD's APUs are the synthesis of ATI's graphics and AMD's 64-bit tech. Intel has a few moves available: improve its own integrated graphics, or buy NVIDIA to put inside its CPUs. Both are unlikely. The most likely outcome is that Intel licenses the tech from AMD in their next cross-licensing deal.

AMD makes its money on the low end and in the game console market. They have no hope of ever taking on Intel directly, so they'll settle for a cut of every chip Intel makes through a licensing deal instead. Wash, rinse, repeat. Intel won't let them die, since AMD is Intel's best evidence that it isn't a monopoly. (ARM is great, but it has a long way to go before it's a real competitor, especially in the laptop/desktop market.) AMD will likely do quite well in the GPU market going forward, especially with VR being the next big thing.

Comment Re:Also nothing supports it (Score 1) 76

I can play H.265 1080p content on my 3-year-old laptop without any issue; VLC barely budges a single core. My Nexus 7 (2013) handles H.265 720p files just fine in VLC, though it hits the CPU really hard (1080p plays the audio, but the video is jerky). Almost all ARM chips produced in the last year or two support H.265.

The only thing I have that probably couldn't handle H.265 is a 6 year old smart TV... but, I could easily get a Roku or something for that.

I'd say it won't be long before they switch to H.265. Sure, the licenses aren't cheap, but once you factor in the bandwidth savings from the smaller files, I can't imagine it not being worth it for Netflix to switch. It's just a matter of time, testing, re-encoding, and making sure customers' devices are ready. It wasn't that long ago that Netflix re-encoded everything from the masters to H.264; they dragged their feet on that for quite a while, and they pay for an H.264 license too. VP9 isn't that impressive. Maybe the next iteration will be better, but for now H.265 is the best codec out there.
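As a purely illustrative calculation (the bitrate, the ~40% savings figure, and the viewing volume below are assumptions, not Netflix numbers):

```python
# Illustrative: rough delivery bandwidth saved by moving a streaming catalog
# from H.264 to H.265, using a commonly cited ~40% bitrate reduction at
# similar quality. All inputs are assumptions.
h264_mbps = 5.0         # assumed average 1080p H.264 stream bitrate
hevc_savings = 0.40     # assumed fraction of bitrate saved at similar quality
hours_per_month = 1e9   # hypothetical hours streamed per month

gb_per_hour = h264_mbps * 3600 / 8 / 1000             # Mb/s -> GB per hour
saved_pb = gb_per_hour * hevc_savings * hours_per_month / 1e6
print(f"~{saved_pb:.0f} PB of delivery bandwidth saved per month (illustrative)")
```

At that scale, even an expensive license looks cheap next to the bandwidth bill.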

Comment Re:Ubuntu makes to much decisions for me... (Score 4, Informative) 137

I'm not sure I follow. Ubuntu will let you install the proprietary drivers and will let you file bug reports for issues, but if the closed-source driver turns out to be the culprit, they'll refer you to AMD -- because AMD is the only one with the source code, and thus the only one able to fix the bug. That's about as much support as one could ask for.

The open source drivers are the default install, but you certainly can replace them with the proprietary closed source drivers.

Here's the How To from Ubuntu for the most recent 16.04 LTS:
https://help.ubuntu.com/commun...

As for open- vs. closed-source quality, recent benchmarks show the Mesa 13 drivers are pretty close to the closed-source ones on most chipsets, though latency is still a touch higher. I doubt you'll ever get full parity unless AMD phases out the closed-source drivers by fully opening the source code. There are probably things in there they license and/or don't care to share with competitors, though.

Comment Re:Nope (Score 3, Insightful) 468

Economics has never had to deal with this level of AI before, and Milton Friedman died 10 years ago, so I doubt he had much to say on the topic.

In a world where robots with AI can do just about every blue-collar job and almost all white-collar work better, faster, and cheaper -- what do you propose? AI is already replacing most clerical work and has begun encroaching on tattoo artists and surgeons.

Seriously, who would hire a human being to do any job if they can have a one-time-purchase AI to do the same job that is literally superior in every way?

Ask the Rust Belt about all the manufacturing jobs that went to Mexico after NAFTA, and to China as well... jobs that are now moving from China to Ethiopia or being replaced by robots. That's right -- China has been cutting thousands of jobs and replacing them with robots, because it's cheaper than even the pittance they paid Chinese labor.

Look at the 2 million 18-wheeler driving jobs and the additional 5 million delivery and taxi jobs in the USA. When vehicles become fully self-driving, that's 7 million jobs gone over the course of just a few years. It would be one thing if people had time to prepare, learn new skills, and find a new job that a robot with AI wouldn't threaten, but AI is taking over jobs in every field. There's even a robotic pharmacy dispenser at my local hospital -- sure, it's stocked by a real pharmacist, but it does that pharmacist's job plus multiple pharmacy-tech jobs in one.

The Industrial Revolution let people do more work. The Information Age let people do more work, and do it globally instead of just locally. The AI revolution will make it so that few people can find work, because the AI is built to REPLACE people, not to help them do more. Sure, displaced workers could try to find work in an area an AI just can't handle. But what would that be? Software coding?

No matter the field, as AI improves, its capabilities will compound. No job is truly safe from its encroachment.

Comment Re:Communism (Score 5, Interesting) 134

This is an interesting argument. On the one hand, getting something for free can lead to laziness and complacency. Yet somehow we let children go nearly 18 years without earning a paycheck. Oh, sure -- some get an allowance for chores or a paper route, and some even flip burgers in their teens, but it's not enough to live on. It's as if we let their wealthier parents take care of all their basic needs, while they can go out and earn discretionary income if they're motivated enough! Why, it's pure Leninist communism on the family scale!

Or, you know, maybe in a world where human physical labor is obsolete and even many white-collar jobs are obsolete, we should prepare for a world where just about every job is obsolete -- where the wealthy owners of the land and the corporations use the immense wealth built on robot labor and artificial intelligence to see everyone's basic needs tended to, plus a bit of discretionary money to buy their products, so the whole system doesn't collapse under its own weight. Because if you have an AI/robot workforce and so does every other company on the planet, no one has a paycheck to buy products, the economy collapses, and your AI/robot infrastructure crumbles because it's useless to make things for people who can't afford them.

Hyperbole? Nope. China is replacing its human workforce with robots. Read that again and let it sink in. China, where workers are paid less per year than many Americans make in a week, has decided to replace thousands upon thousands of human beings with robots... because it's cheaper. Self-driving cars are going to be a thing in the next 5 to 10 years -- so much for those 2 million American trucking jobs, plus another few million taxi drivers... and Uber/Lyft. I've seen whole departments hollowed out to the core and replaced with automated systems. The other day I saw a robotic tattoo artist! Seriously: it scans your body, preps the needle, and will do a complete sitting for a tattoo given the design.

There is no job that's safe. Legal clerks are being replaced with automation. Nurses, pharmacists, even surgeons. The more creative and nuanced the job, the longer the hold-out... but it's coming. The Information Age made globalization possible; the AI age will make massive global joblessness a reality. Who would hire a human being if an AI and/or robot could do the job cheaper, faster, longer, and more reliably? Most kiosks cost around $30K, and McDonald's is rolling them out nationwide to replace the people who used to take your order (or at least to avoid hiring more than a couple of order-takers per site). Many auto-manufacturing robots are cheaper than union labor. In the USA, we have union workers sitting in seats on robot arms, and the arm moves the worker into place to screw in a bolt; in foreign plants, that human is replaced by a robotic hand that does the job better. How long before the unions give in and let the USA plants do the same?
