
Comment Re: terror alert (Score 1) 140

My money would be on the iPhone 6s. The early devices had a battery problem where they would suddenly shut down long before they were supposed to fully discharge. This leads me to wonder if the batteries have a dendrite problem, which (in addition to causing early shutdowns) could potentially cause them to overheat and catch fire.

Comment Re:Extrapolation? (Score 1) 68

So Deep AI is the same as Deep Learning? Deep Learning isn't AI, though those who like it call it that. When Deep Learning can predict a future trend, it will be useful. Identifying the start of a trend because something does what something else once did isn't the same thing.

When Deep Learning can look at the economy and predict the valuation curve of a house as it rises and falls over 20 years, that'd be something interesting. "Bob lives in ZIP 90210 and has previously bought blue boat shoes, so his firstborn is likely gay" is simple probability applied to more data than a human can conveniently sift through; it has no "intelligence" at all and is not a path to anything that would have been called AI 20 years ago.

AI will exist only when we've finally shifted the definition far enough to allow non-AI to be classified as AI.

Comment Re:Extrapolation? (Score 1, Interesting) 68

Extrapolation using Big Data is AI. Extrapolation using small data is extrapolation. Didn't they teach you this in AI school? The AIs that "learn" don't. They just avoid wasted CPU when requests fit known patterns. If something is outside the pattern, it's as dumb as the first time it was run. Data tends to group into a normal curve (or something like it), and "AI" as they describe it groups things into similar bundles.

If a smart programmer were to spend years with BI/BAs and work out the value of the parameters, the AI would be 100% useless. AI (in this context, which isn't an actual AI) doesn't do anything other than look at past trends and apply them to new data. Being human, we assume it's doing it the way we would, which would be actual intelligence. But it does so in a computer-like iterative manner that has no "insight" into the patterns and could *never* predict a new pattern; it just counts simple numbers.

Doing lots of math fast looks like AI. So call it AI and claim your AI is the best AI, and nobody does it like you. Smoke and mirrors.

Comment Re:battery life a braindead argument (Score 4, Interesting) 217

Falsehood #2. Wifi is still a pretty uncommon feature, and even when present is fairly problematic, finicky, and requires an unreasonable number of steps to initiate.

Actually, I've never found it finicky. The problem is that the actual maximum speed of wireless is GARBAGE for transferring photos, much less video. Wi-Fi is more than an order of magnitude too slow to be practical. Anybody who thinks otherwise has almost certainly never shot photos with anything more capable than a toy iPhone camera.

To give some context, my brand-new, high-end 5D Mark IV shoots photos that range from 30 to 70 megabytes each depending on RAW settings. Even though it supports 802.11n, if memory serves, all devices in IBSS mode (without infrastructure Wi-Fi) are limited to 802.11g speeds. So in practice, unless you bring a Wi-Fi router along with you (no camera supports the captive-portal Wi-Fi that you'll find in every hotel on the planet), you'll be limited to only 54 megabits per second.

At 54 megabits per second, transferring a typical daily run of 500 photos at 70 megabytes each takes almost an hour and a half, and that's actually slightly optimistic. I do use the wireless functionality to transfer a few pics at a time from my camera to my iPhone while traveling so that I can quickly post pics from my real camera on Facebook. It works well for that, because I'm only grabbing five or six pics at a time, and I'm getting a much smaller JPEG copy instead of a RAW file.
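A back-of-the-envelope sketch of that math (this assumes the full 54 Mbps nominal line rate, which real-world Wi-Fi never sustains, so the real number would be worse):

```python
# Estimated transfer time for a day's shoot over 802.11g.
# The nominal link rate ignores protocol overhead and interference,
# so this estimate is, as noted above, slightly optimistic.
photos = 500
mb_per_photo = 70            # megabytes per RAW file
link_mbps = 54               # 802.11g nominal rate, megabits per second

total_megabits = photos * mb_per_photo * 8
seconds = total_megabits / link_mbps
print(f"{seconds / 60:.0f} minutes")   # ≈ 86 minutes, i.e. almost an hour and a half
```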

At night, though, the flash card comes out of the camera and goes into the side of my laptop, where I spend only about four or five minutes to import that entire batch of photos. If Apple had bothered to keep their SD card reader hardware up-to-date, it would take under two minutes, but the two minutes saved isn't worth the hassle of trying to dig a flash card reader out of my bag.
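For comparison, a rough sketch of the implied card-reader throughput, assuming the same hypothetical 500-photo, 70 MB batch from above (the import times are the ones I quoted; the per-second figures are just derived from them):

```python
batch_mb = 500 * 70          # same batch as above, in megabytes

# ~4.5 minutes on Apple's old built-in SD reader
old_reader_mb_s = batch_mb / (4.5 * 60)
# ~2 minutes on a current, up-to-date reader
new_reader_mb_s = batch_mb / (2 * 60)

print(f"{old_reader_mb_s:.0f} MB/s vs {new_reader_mb_s:.0f} MB/s")
```

Either way, it's an order of magnitude faster than the Wi-Fi path.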

With a laptop that lacks a flash reader, however, the entire equation changes. Suddenly my choices are to either dig out an SD card reader (which is always hard to find in a camera bag) or carry a retractable USB 3.0 cable (which turns out to be easier to keep accessible, because it is so thin) and use the camera itself as a reader, albeit with the same poor performance as Apple's old SD card reader, while draining the camera battery the whole time. Both choices are approximately equally bad, and the decision to hobble their hardware by removing such a convenient way of importing content makes me seriously question Apple's commitment to the photography market.

Then again, I never used Aperture. If I had, I'd probably have much stronger negative comments....

And finally, Falsehood #5. What universe are you from? Have you even shopped for cameras ever? I cannot even fathom where you're pulling all this nonsense from.

Pretty much. Apart from cell phones (where nobody uses the micro-SD slot anyway), the only cameras that use micro-SD are the little action cameras built by GoPro. All pro cameras use either CF or full-size SD, because when the camera isn't a tiny toy, the size savings of micro-SD aren't enough of a benefit to make up for the smaller contacts and the resulting decrease in reliability and robustness.

Nothing you say is true to the point where you're either delusional or trolling.

Trolling, I'd imagine. Either that or it's an Apple employee astroturfing. Hard to say which.

Comment Re:Way more braindead to take large hit on battery (Score 1) 217

A Mac Pro is not 'portable' if you still have to lug a screen around.

Yes, and? The same argument applies to a "laptop" that you MUST constantly lug a power adaptor around with. That is not a device amenable to use on the patio or in bed. In fact, the choice Apple made seems tailor-made to the case you lay out - which is how most people (including myself) use laptops, and why suffering a huge hit in battery life just to go beyond 16GB is a non-starter.

I absolutely need 32GB in order to be properly productive.

I find this very dubious, considering much of the work I do is very memory intensive - iOS development, a lot of high-resolution photography including some huge panoramic work, and more recently some neural network stuff. Would I *like* more memory? Yes. Do I *absolutely need* more memory? No, I get by OK with what I have now. If anything, the real memory need people have for modern tasks is not main system memory but GPU memory, and there Apple is still doing OK with 4GB - though if anything, I would have liked to see more there.

Most modern professional apps are good at partitioning tasks into smaller chunks of memory, and the main SSD on the new MacBook Pros is extremely fast, so those applications run quite well (this mainly affects the pano work I do).

If you look at Activity Monitor sometime, I think you'll be surprised at how much memory you actually use...

Comment Re:False premise (Score 1) 478

When everyone in the house had a computer, a house would have four computers. Now the personal device for everyone is the phone, and the PC is one per house. So demand dropped to a lower (but still healthy) baseline. And the cost of the computer dropped as well, with the average computer now cheaper than a premium phone.

The death of the PC is overstated - fear-mongering by those who benefit from the one-PC-per-person numbers we saw around 2000.

Comment Way more braindead to take large hit on battery (Score 1) 217

for people that have a legitimate need for more than 16GB of ram battery life is a secondary factor

What laptop owners would that really be true of, though? A handful, even among pros... If it's going to be plugged in all the time and battery life is of secondary or no concern, then why not just use a Mac Pro? It's also fairly portable and will be much faster (yes, even before any updates to the current model).

I personally cannot see Apple releasing a laptop with an option that has way worse battery life just to add more RAM at the very top end - nor making a whole separate variant of the motherboard to support that option, nor adding that complexity to the existing design.

In the end, it's just a matter of a single year before truly top-end purchasers will get a laptop with more than 16GB of RAM. For the past few years CPU improvements have not been all that large, so not being able to buy now is not that huge a hit...

What I'm hoping to see in the next revision of the MacBook Pro is an even better GPU.

Comment Credit card chargeback. (Score 4, Informative) 85

Go to your card provider (Visa/MC/Discover/Amex) and tell them to remove the charge because the service was not rendered and/or the charge was improper.

They will.

Once AT&T starts getting a lot of chargebacks, they will do something about it.

I had this sort of thing happen to me years back in NYC with Verizon. I called to cancel, was given a confirmation # and everything, and was still billed again the next month. When I called again, furious, the manager I was escalated to said that they could not offer a refund because they did not have that policy. I said I don't care about policy, give me a refund, and he said there was literally no way for him to do that in the system and suggested (of course) that I accept the service for a month, since I'd already paid for it, and then if I didn't want it next month, I could call and cancel [n.b. AGAIN] then.

I hung up on him, dialed Visa, and had them charge it back. Of course THAT got Verizon's attention and a day or two later I was called by retention or some similar department to offer me a discount if I would stay on, along with a lot of apology garbage.

I told them I'd rather eat a bug.

Comment Re:Just what the world needed most urgently... (Score 1) 186

Interesting, given that micro plugs are rated for more insertions than minis:
"The newer Micro-USB receptacles are designed for a minimum rated lifetime of 10,000 cycles of insertion and removal between the receptacle and plug, compared to 1,500 for the standard USB and 5,000 for the Mini-USB receptacle."
