Comment What this is really about (Score 5, Interesting) 253

This is not about volatility or any other type of consumer protection. This is about protecting their points programs.

Basically, people were buying crypto coins on their credit cards, getting the points for the purchase, then immediately selling the coins and paying off the card. The points more than made up for the transaction costs.

Since all these cards were being paid off before any interest accrued, the credit card companies were losing a lot of money on the transactions in the form of reward payouts.
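
To put made-up but plausible numbers on it (actual reward and fee rates vary by card and exchange):

    /* Illustrative only: assumes 2% rewards and ~1.5% total transaction costs. */
    #include <stdio.h>

    int main(void) {
        double purchase = 10000.0;           /* buy $10k of coins on the card */
        double reward   = purchase * 0.020;  /* 2% points/cash back           */
        double fees     = purchase * 0.015;  /* exchange + transaction costs  */
        printf("net per cycle: $%.2f\n", reward - fees);  /* positive if paid before interest */
        return 0;
    }

Run that cycle a few times a month and the reward payouts add up, with the card company eating the cost.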

The CC firms will never admit this publicly, but given the number of people I knew who were doing this, I'd bet this is the real reason they're disallowing the purchases.

Comment Re:How I Handle Messaging Authority? (Score 1) 62

(reposting as myself to get some mod points)

My kids can message all they want with people I approve of. How do I enforce this? All messaging happens through my phone while I'm present (Facetime with relatives, text messages to plan playdates, photo sharing, etc).

Anything else that tries to set up messaging accounts for my kids? If messaging/chat features can't be disabled, my kids can't use it.

-Chris

Comment Re:Hand-typing Forms (Score 4, Insightful) 135

Handwriting, speech recognition, and image processing along with their machine learning foundations do not impress the older /. crowd because they are not new technologies.

Dragon has been doing speech recognition better than Siri for almost 20 years. Simple command-based systems that only recognize a few words have been around longer than that.

Handwriting recognition for constrained tasks is also not new. The US Postal Service has had ZIP-code OCR systems since the 1980s.

Feature detection in images is not new, either. The only thing that's really changed there is we have the processing power to do it at scale.

Going beyond the applications, all the modern "AI" systems are simply classifiers on steroids. Processing power and greater storage capacity allow us to work on larger data sets, but in the end, we're just creating complex hyper-planes to bin data into one bucket or another.
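
To make the hyper-plane point concrete, here's roughly what the math boils down to - a single linear decision boundary with made-up weights (real systems learn w and b from data and stack many of these):

    #include <stdio.h>

    #define DIM 3

    /* Bin a point by which side of the hyperplane w.x + b = 0 it falls on. */
    static int classify(const double w[DIM], double b, const double x[DIM]) {
        double score = b;
        for (int i = 0; i < DIM; i++)
            score += w[i] * x[i];
        return score > 0.0 ? 1 : 0;   /* bucket A or bucket B */
    }

    int main(void) {
        double w[DIM] = {0.8, -0.3, 0.5};  /* "learned" weights (made up here) */
        double b      = -0.1;              /* "learned" bias (made up here)    */
        double x[DIM] = {1.0, 2.0, 0.5};   /* a feature vector to classify     */
        printf("bucket: %d\n", classify(w, b, x));
        return 0;
    }

Everything else is about how you find w and b, and how many of these you chain together.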

Machine learning algorithms are powerful tools, and it's great that we have the compute resources to really leverage them, but there's nothing here that wasn't obvious 30 years ago. The only question was when we'd have the compute power to start doing the cool things we knew they could be used for.

(ok, I'll give a little credit to the deep learning researchers for bringing neural nets back into vogue, since those were written off 30 years ago during the AI winter, but they're still just classifiers from the mathematical perspective).

-Chris

Comment Re:Alamo Drafthouse (Score 1) 370

Was just coming here to post that the reasons to go to the Alamo are really the only reasons to go to a theater at all.

You left off my personal favorite: fried pickles

(in general, their food sucks, but there are a few bright spots on the menu).

Fried pickles, beer, and the great pre-preview content (including the PSAs) are what keep me going there.

Comment Re:Holy Blinking Cursor, Batman! (Score 1) 236

Indeed, and it didn't need that fsck'ing HARDWARE cursor emulation that the PC needed, either!

(Yes, Hercules, CGA, EGA and VGA had the text-mode cursor in hardware (including the blinking)). VGA (and maybe EGA, I forget) also had a single "sprite" for the "hardware" cursor.

I was coming here to make exactly this point. Cursors used to be hardware sprites that required no additional CPU cycles. At some point windowing systems took over the task of rendering the cursor, but they still typically used XOR'd sprites to keep things fast and efficient. Then they started using GPU-optimized code with software emulation on the CPU as a fallback, and things went downhill from there...
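
For anyone who never had to do it by hand, the XOR trick really is about as cheap as software cursors ever got: draw the sprite once to show it, draw it again at the same spot to erase it, no backing store needed. A rough sketch against a flat 8-bit framebuffer:

    #include <stdint.h>

    /* XOR-blit a cursor sprite onto a flat 8-bit framebuffer.
     * Calling it a second time at the same (x, y) restores the
     * original pixels, so no save/restore buffer is required. */
    static void xor_cursor(uint8_t *fb, int pitch,
                           const uint8_t *sprite, int w, int h,
                           int x, int y) {
        for (int row = 0; row < h; row++)
            for (int col = 0; col < w; col++)
                fb[(y + row) * pitch + (x + col)] ^= sprite[row * w + col];
    }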

I still keep my 80s-era graphics programming books on my shelf as a reminder of how to do fast graphics when all you have is the ability to draw pixels... ;)

-Chris

Comment Interviewers need training, too (Score 5, Interesting) 1001

What I've always found funny about this style of interview is the built-in assumption that the interviewer knows the correct answer(s) to the question. It's painful when they don't.

Years ago I interviewed at Google and was asked a question about bit counting (some variation on "given a bit vector, what's the fastest way to count the number of 1s?"). I quickly answered, "well, if your processor has a population count instruction, stream your vector through that and accumulate the result in a register". Having just evaluated bit counting methods as part of my Ph.D. dissertation, I knew this was the fastest way to do it, assuming the instruction was available (it wasn't available on x86 at the time, but it was on Power/VMX, and most DSPs support it as well).

After I got a blank stare back from the interviewer, I said, "Oh, you were looking for the lookup table answer". We could have left it at that, but he went on to explain, using some very convoluted logic, how the lookup table would actually be faster than a dedicated instruction and that my answer was clearly wrong. I mentioned a little bit about the number of cycles required for each approach, but he would have none of it. In his mind, I had the wrong answer, even though my second answer was exactly what he was looking for.
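
For the curious, here are both answers side by side - the compiler builtin (__builtin_popcount on GCC/Clang, which maps to a popcount instruction when the hardware has one) and the 8-bit lookup table he was fishing for:

    #include <stdint.h>
    #include <stddef.h>

    /* Hardware-assisted: the compiler emits a popcount instruction where available. */
    static size_t popcount_builtin(const uint32_t *v, size_t n) {
        size_t total = 0;
        for (size_t i = 0; i < n; i++)
            total += (size_t)__builtin_popcount(v[i]);
        return total;
    }

    /* The "expected" interview answer: an 8-bit lookup table, one byte at a time. */
    static uint8_t bit_table[256];

    static void init_table(void) {
        for (int i = 1; i < 256; i++)
            bit_table[i] = (uint8_t)((i & 1) + bit_table[i >> 1]);
    }

    static size_t popcount_table(const uint8_t *v, size_t n) {
        size_t total = 0;
        for (size_t i = 0; i < n; i++)
            total += bit_table[v[i]];
        return total;
    }

The table version costs a memory load per byte; the instruction handles a whole word at a time, which is the kind of cycle-count argument the interviewer didn't want to hear.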

It was at that moment that I realized Google was not going to be a good place to work.

-Chris

Comment Re:Don't buy what you can't afford. 3,500 feet, $24 (Score 1) 805

As others are saying, don't live in the Bay Area if you can't afford it. But, if you want housing that's affordable and not too far away, it's not impossible...

There's the whole Central Valley within driving distance of the Bay Area. Sure, a 1-2 hour commute isn't ideal, but with a flexible work schedule and work-from-home options some days of the week, it's totally doable. You can get a nice house with a pool in a small CV town for less than $250k. Hell, in New England, "bedroom communities" are all over the place and feature similar price differences and commute times. (You can even throw in a few hotel nights each month and still come out ahead.)

Fwiw, I grew up in the Central Valley. Day in and day out it's really no different from living anywhere else - you eat, sleep, and work, lather/rinse/repeat. Oh, and you're much closer to the Sierras than you are in the Bay Area, if mountains are your thing. An hour and a half to the slopes is much nicer than the 6-12 hours it takes to get to Tahoe on weekends from the Bay Area.

-Chris

Comment Re:mail.app (Score 1) 216

Of course, even though this is in mail.app, which I use constantly, this is the first I've heard of it.

I wonder how many great features in Apple products people miss simply because Apple refuses to provide sensible documentation and instead relies on users to "discover" features organically or via message boards.

-Chris

Comment Geek repellent! (Score 3, Interesting) 233

So, at the more hardcore geek conferences (Supercomputing comes to mind), there has never really been an issue with booth babes, for a simple reason: geeks are scared to talk to them. Every now and then a company will hire one, only to see a nice exclusion zone form around its booth. Sure, sales guys from other booths will stop by, but none of the intended audience will risk talking to an attractive female.

Comment Tail wags the dog... (Score 5, Insightful) 293

As a developer/power user who sits at the far end of the bell curve, here's what I see as the folly of Apple's ways.

I switched to Macs after working on a beta version of OS X in the late 90s. Unix + a sensible desktop was enough to keep me off the Linux train for daily use. That the hardware was also well designed, with a good level of performance, mattered too. For the next 10 years or so, that held true.

But, in the last 5 years:

- The hardware has stagnated (e.g., I'd really like to buy a Mac Mini for my kids, but there's no way I'm shelling out Apple prices for 3-year-old processors).
- New hardware decisions make it difficult to use existing peripherals (music is a hobby - no way am I dropping a few grand on new audio interfaces just b/c I upgraded my Mac and need to support new ports).
- Apple has ignored sensible design decisions made on the non-Apple side of the world (specifically, touch screens on laptops - my wife has an HP for work and the touch screen is useful; the old studies that claim otherwise are just that, old and dated).
- The OS continues to have a slew of undocumented features that may or may not be useful, but definitely affect performance (the real dig here: just document the features, Apple - I hate discovering things OS X has done for years from random blog posts).
- The iPhone and OS X still don't work well together.

Why does this matter from the perspective of the bell curve and my place on it? Simple: I switched not only my family, but also my company over to Macs. The middle part of that curve was filled by people following people like me into the Mac universe. I'm seriously considering dropping Macs for computer use and (horror of horrors) going back to Windows + Linux. If I go that way, it's just a matter of an upgrade cycle or two before those in my sphere of influence abandon Macs as well.

Apple seems to have forgotten that it was us geeks, the ones who couldn't wait for Linux on the desktop, who helped drive adoption 15 years ago. Kinda like the Democrats forgetting that the working class matters.

-Chris

Comment Data will not save us (Score 1) 635

"The report also calls on the government to keep a close eye on fostering competition in the AI industry, since the companies with the most data will be able to create the most advanced products, effectively preventing new startups from having a chance to even compete."

I call BS on this one. The two companies with arguably the most data anyone has ever accumulated in history are both incapable of producing new products, despite the fact that they know everything about everyone.

Google's only innovation was its advertising platform. It's a cash cow. That cash and the data in its search/mail systems have failed to yield anything new and innovative beyond incremental improvements in search.

Facebook's only innovation was leveraging privilege to build a social network. Remember the early days, when it was limited to Harvard students, then a few other universities, and then finally everyone else? That was a brilliant strategy: create artificial scarcity to build demand. They also leveraged that period of limited users to fine-tune the platform and create a social network that was generally acceptable to a broad user base. Since then, they've made a ton of money and collected a lot of data (granted, it's mostly people's family pictures and political rants) but haven't done anything innovative.

Innovation will always come from the small disrupters. Both companies made their innovative moves when they were small.

-Chris
