
Comment Depends on what homebrew means... (Score 1) 181

The last few years we have seen Microsoft, Nintendo, Sony and Apple all bring out means to thwart homebrew development. The app store on both Android and iOS have taken many homebrew devs over to try and break the market.

Well, I guess it depends on your definition of homebrew, but I think it is hard to make a game for iOS or Android that wouldn't be let into the store (unless you, say, crash on launch, or are noticed grabbing all the user's contacts without permission). It is in fact far simpler than it was to get your own games onto the Dreamcast! You get the real dev kit for very cheap (cheaper than the hardware you are developing for), and while the hardware to host the development on isn't free, it isn't exactly expensive (hardware dev systems for the 16-bit era ran to $30k; now it is just a Mac mini, or pretty much any old PC for Android).

On the other hand, if homebrew has to mean "we figured out how to get onto the hardware ourselves and made our own pseudo dev kit," then yes, Android and iOS are hurting that effort, because who really wants to go to all that bother when they could just get down to making a game?

Comment Re:FUD (Score 1) 375

to get into the big market (Android), it's well worth it

The market may be big, but does it pay? A lot of small developers have reported that Android apps make a whole lot less than iOS apps ("an order of magnitude" sticks in my mind, but a quick Google search shows a lot of 4x articles and a smattering of 11%; I didn't see an order of magnitude in the first page of results).

Assuming the 4x number is true, is it worth getting $1.25/app and writing C/C++ for 80% of the app, then writing the last 20% in ObjC and again in Java -- or are you better off writing it all in ObjC and then starting work on the next app? (I imagine the right answer depends on how well served your core logic is by ObjC and the available frameworks, on the total sales involved, and on whether you have another app that would make similar money or are "played out" of good ideas.)

Comment Re:FUD (Score 1) 375

The business logic for your app should be written in a platform agnostic way, and will be trivial to port.

Sure except...

...different platforms have different optimal workflows and capabilities. This frequently drives changes into what you would think of as platform-agnostic code. It is especially true of games, but it is true of most software. The effects can vary from just having a bad port (maybe a non-native feel, or just plain a kooky UI) to needing to rewrite large parts of the "agnostic" code. That can be costly and time consuming. Also, if you have future versions of the product, you need to decide whether to port those changes back to the original platform (or platforms) or hold them apart. Both have their own sets of issues.

...even code that can be made platform agnostic isn't always as simple to write, or as fast, in platform-agnostic form. For example, use of Core Data on OS X/iOS is very platform-specific, but it is tied to how your objects persist across executions, and even to how you represent the objects. It can save an enormous amount of effort (save/load is trivial, undo/redo can be close to trivial, and so on). When it is the perfect fit, as much as a third of the code you would normally need to write goes away.

Or look at Android: writing the "platform agnostic" part in Java gives you garbage collection, so you spend very little time hunting down memory leaks (you might end up with a few places that forget to nil out a pointer and pin down extra memory for too long, but this isn't as common or as painful as memory leaks in C/C++). No debugging pointers that dangle into the wrong types of objects or into system heap structures. That can save a whole lot of time.

However, a platform-agnostic core (business logic, or gameplay engine, or whatever) won't be able to use any of that. You have to restrict yourself to the intersection of what every platform you want to port to will have. I would be surprised if it cost you as much as having to write it twice, but not if it cost you a good 33% more than writing it platform-specific.

Then you have the platform-specific (UI?) part of your application. It could be pretty small for something like a bug tracker, or very large for a game or maybe a bike-ride activity tracker. If making the core agnostic costs you 33% more, and the platform-specific part is significant, the new platform has to bring in a very large percentage of the original platform's revenue before it is worth doing versus writing the faster, cheaper, but less flexible core logic and then moving on to a new project (or the next version of the current project).

I know this is sad when the platform you love is the underdog, but economics isn't called the dismal science for nothing.

Comment Re:designed to fend off malware (Score 2) 230

There is a checkbox in System Prefs to turn it off. Or, if you control-click the app and select Open, it will launch (and be whitelisted for future launches).

It is really so people don't double-click an app whose icon looks like an MP3. Or maybe they won't launch what looks like Photoshop but isn't. If it gets enough adoption from third parties, I can see it being a huge help to the average user. If it gets low adoption, it'll be more useful for folks who really know what is going on.

Comment Re:because - (Score 3, Interesting) 793

I think C's originators (or at least the one still living) have changed their minds about some of them. Looking at the Go language, which targets the same programming niche, some of these things have been addressed.

The semicolons are implied in most places now (as a side effect this enforces a brace style many people dislike, but it happens to be my preference -- so even though I'm happy with C's semicolons, this is a borderline positive change for me).

Declaration syntax has been made "more sane," which isn't surprising; by the time K&R wrote the C book they had already started regretting it (one of the exercises was to parse C declarations into "English" -- look at what the authors wrote about it).

Go revamped switch (and a lot of the control flow operators).

Some of those changes might just be because computers have gotten a wee bit faster in the last 25 years or so; what constituted a great tradeoff on a computer with a 64K (split I+D) address space, maybe 512K max RAM, and clock speeds measured in a few MHz (oh, and these were multi-user computers) is a wee bit different from what makes a good tradeoff now. (Semicolons, I think, wind up here.)

Some are likely changes they would have made even on the original system. (Most of the control-flow changes wind up here, likely the variable declarations too.)

Comment Not so much (Score 0) 492

Sure, I admit there are similarities. Both are giant, greedy companies. Both gobble up competitors, and when they are prevented from that, they both launch competing products. I view Google's "Don't be Evil" lip service as about as transparent and self-serving as the 1990s- and 2000s-era MS open source lip service.

On the other hand, Google's own products are fairly decent; MS's are largely crap. Most times when MS buys a company, the "adopted" products go rapidly to crap, while Google's "adopted" products tend to trundle along for a while. MS was a credible platform vendor, and most of its assaults on other companies were against those that built on top of MS's own infrastructure. Google has only made one Android-related purchase that I can recall. Maybe that is an area ripe for future abuse, but for the moment they have not had their own "it ain't done 'til WordPerfect won't run" moment.

If Google is the new MS, then at least the trains run on time. (most days) It ain't much, but at least it is something.

Comment ATT+VZW is the best option in CA (Score 1) 134

CA is a mighty big place, and I haven't traveled all that much of it. However, I do happen to have phones on ATT's 3G network that can act as hotspots, and USB networking devices for VZW and Sprint. I don't have T-Mobile because the coverage map looked like it wasn't really useful. I also have an RV, and have left "major city areas" quite a bit. I don't currently have any 4G networking gear (unless you count all the 3G stuff the ITU reclassified, in which case I have 4G but no LTE).

In my experience there is a lot of coastline that has no service from anyone. There are some inland areas where hills or mountains block signal from everyone. Some places I can get ATT but not VZW; some places I can get VZW and not ATT. Around the coast it was pretty even. Inland, VZW seems to have a bit of an edge, but not a lot. Many places I could get VZW and/or ATT but not Sprint. I can't recall any place I could get Sprint but neither ATT nor VZW, though looking at coverage maps there might be such places; I've just never attempted to get signal there. However, there is no substitute for actually testing in your location. Everyone has said VZW has the best coverage for years, but for years my house had no VZW service while it did have spotty ATT service (recently VZW started serving the area, and around the same time ATT's service picked up a lot as well).

Most places where I could get both ATT and VZW 3G, the ATT was faster. Sometimes it was even faster when the device showed "fewer bars".

The "reasonable best option" I would see is to get one device on ATT and one on VZW, and ignore the rest. As for my VZW device, I don't know if they still sell them or not. They used to have a $50/month plan for 10GB. It looked like a no-contract plan, but the way it was set up, when you stop paying the monthly fee they want the device back or hit you with a big disconnect fee (and they charge for the device up front); even so, it was still a bit cheaper than other VZW data devices, just not by as much as it first looked. Things may have changed since then, so look around, but make sure you give them a peek. My ATT device is an iPhone (was a 3GS, then a 4; now my wife and I take turns getting a new one each year). Another option is the new iPad: a large up-front cost, but a month-by-month plan (no fee to cancel, no fee to restart). From what I have read on the net, only the VZW one currently supports hotspot sharing; ATT still hasn't gotten its ducks in a row there. Depending on what you want to do with the internet, you might be just fine only having access on the iPad anyway.

I have no data for Nevada. Last time I was in Arizona I didn't have a VZW device, but ATT seemed fine pretty much everywhere.

Comment Re:Small Claims DDOS (Score 3, Interesting) 730

Don't sue YouTube; that would be covered under the ToS. Sue the party that claims to have reviewed the audio and decided you are infringing their copyright. That sounds like a lie that damages your reputation -- in other words, libel (libel is the written form; slander is the spoken one).

I'm not a lawyer, I don't even play one on TV, but this seems like the kind of thing you could take to small claims court for say $1000 or so and win.

Of course you would be better off with a lawyer, but I don't see how you could sue for enough to actually pay the lawyer. If you have access to a free half-hour consult or something, you could ask a real lawyer; they might have a different opinion.

Comment Re:Maybe, maybe not. (Score 1) 785

I expect that this new developer will underestimate the work load, miss deadlines, go over budget, not document anything and possibly even quit before the project ends.

I was the new guy once. I underestimated the workload, but "the new guy" tends to have a fair bit of free time and a willingness to use it for work, so that didn't hurt anyone (except myself). I didn't have any budget, so I didn't go over it (unless you count hardware, and I stayed under budget there). I didn't miss many deadlines -- fewer than many senior engineers. I didn't quit during any major projects. However, I sure didn't document squat. I made vast, complex systems and left behind only a sketch of how to operate them, and nothing on how the internal bits were lashed together.

To whomever became my maintenance engineer after I left, I'm sure sorry. Hope you lived.

I believe that allowing a senior developer to learn new skills on the clock or even be sent to company paid training is a big moral booster and makes the guys job more interesting too.

I bet it would be. The only place I worked at that did even a little of that basically just sent us to a few classes (or hired some really interesting folks to come teach), but that was less than a week a year. It was more morale-building than useful to the company. For example, you can't teach software guys much of use about VHDL in a week. Sure, it can be interesting to them, but not very useful to the company.

Actually, I'm wrong, come to think of it: the place I'm at currently has a "mostly once a week" hour-long "class" on something -- maybe a new technology, or maybe some API that has been around for a while that someone thinks more people ought to know about. It is nice, but for almost every topic it is really only enough to give you the idea that if you hit a problem with a specific shape, you should go look into the FOO API, or the BAR language, and do some real research into how to apply it. Seriously, how much OpenCL (for example) do you think anyone can teach or learn in an hour?

Every place I've worked has really just expected the senior folks to learn new things on their own time (with maybe a few pointers). Largely we did.

Comment Maybe, maybe not. (Score 1) 785

the new grad knew a hot emerging technology that a client wanted.

Did the senior engineer know the technology? Sure, the technology may not have existed when they were hired, but for a senior engineer to deserve the title, they learn or invent new technologies. If the senior engineer did know the technology, or could pick it up before the new hire becomes productive (i.e., learns company procedures and enough politics to operate effectively), then the new hire isn't worth the extra cash. If the senior engineer doesn't know the new technology and can't pick it up fast enough, then the new guy deserves more -- at least until the senior engineer catches up.

NOTE: I don't mean to imply a senior engineer knows all new technologies, nor that they can always guess what might be important to their company -- just that they can and do keep up to date on things. Sure, sometimes a new library or language they missed, or dismissed as "no better than this other thing I already know," becomes more important than they had guessed, but they ought to be able to pick it up. For example, prior to the iPhone becoming popular, it might have been reasonable not to know any Obj-C. Someone coming out of college might know Obj-C, but a senior engineer with knowledge of C and any sort of Smalltalk-style OO language should be able to pick up Obj-C really fast -- likely fast enough that they are making real programs while the new guy is still learning the ins and outs of the bug-tracking system, and who to listen to and who to ignore in meetings.

Comment Re:A Better Question: (Score 1) 214

My personal pet theory is that yes, it really does matter. For a long time, doubling every 18 months was the arbitrary goal: Intel could say "if we don't hit X by Y, then AMD will overtake us, because they double every 18 months," and AMD could say "if we don't hit X by Y, then Intel will smash us, because they double every 18 months." So each poured whatever they needed into R&D to make it more or less happen. Sometimes one got further ahead than the other and got to roll in the hay for a while (or be unable to fill all the market demand).

If Moore's Law wasn't sitting there prodding them to double or die every 18 months, my guess is they would have sat back and gone, "Will they REALLY invest $12.6 billion in a new fab? That seems stupid; I bet they will try to get by with a 10% bump from a new microarchitecture, so we should aim for 15%." We would have seen a LOT more severely lopsided product matchups, but overall I'm guessing slower growth.

Not that it matters so much now, since they don't seem to be able to keep up with doubling processing capability (unless your problem set is very parallel... and even so, Amdahl's Law will get them sooner or later: add as many CPUs as you like, at some point there is a non-parallel part of your problem space, and that part's performance will dominate, even with an infinite number of CPUs).

Comment Re:Apple specific? (Score 1) 122

So then why do they not just have a line-in jack?

A dock connector gets you line-level output, which needs less fiddling to get acceptable results from. I'll also bet there are far more people with a dock connector on their iPod/iPhone than there are folks who walk around with a 3.5mm male-to-male cable.

However I think a 3.5mm line-in would be a good thing to have in addition to a dock connector.

Comment Re:Damage Meters built into client (Score 1) 175

The threat meter is arguably much more "necessary". I need to know, in real time, during the fight, if I'm about to pull aggro.

The default UI can be told to make a sound and flash the edges of the screen when you get "close" to pulling aggro (90% or so I think). Not at all perfect, but will do in a pinch. The tank is more in need of an addon, as nothing tells them when someone else is getting close (as far as I know).

Comment Re:Damage Meters built into client (Score 1) 175

So instead of clicking on a player's name in the raid frame and then hitting 1 for a flash heal or 2 for a shield

I used to use Grid+Clique for that, but with a minor change you don't need addons:

Macro: /cast [@mouseover] flash heal

Macro: /cast [@mouseover] shield

Put flash heal on 1 and the shield on 2. Mouse over whoever needs the flash and press 1 -- no "click, then press." It works nicely (you can even set it up to fall back to your target or yourself if there is no mouseover).

I've discarded Clique. I still use Grid, as it can be configured to display a LOT more information in a small space than the built-in unit frames and, just as importantly, it can be configured NOT to display information you don't care about.
