Not the same exact thing but can you see why trying to back up your confirmation bias can be asinine?
I imagine it will be approximately the same as, or less than, the uptake for Obj-C when iPhones became a thing, which is "not terribly impressive".
Suddenly becoming one of the fastest growing programming languages in use and making several top ten lists isn't terribly impressive? Ok...
And now you're telling me that Swift -- which is essentially a tweaked Obj-C -- is "the biggest new language in a long time"? You can't even USE the language to program on anything other than OS X and iOS!
So, one of the most popular platforms on the planet (Apple is going to sell 71 million iPhones this quarter alone) isn't significant? Also when you say that it's a "tweaked Obj-C" that shows you have no idea what you're talking about.
I'm not seeing it, man. If a single popular smartphone and 10% desktop OS market share were enough for a language to piggyback off of to mainstream adoption, Objective-C would be mainstream for cross-platform development. And it's not.
I like how you cite a number for OS X but not for iOS.
One last thing? Apple's only the "world's biggest company" because it overcharges for all its shit products, and stupid people don't see what a bad deal they're getting. In importance to the programming community, they're well below Google and Microsoft. Don't believe me? Take a look at C#'s popularity versus Obj-C.
Wow, where to begin. First you try to poison the well by saying that yes, Apple is the world's biggest company, but only because it overcharges for its "shit products". However, iOS is sitting at 44% market share, second only to Android at 47%. But Android is only at 47% because it's on everything from high-end Samsung devices to the crappy devices you can get at the checkout line at your local grocery store. Your disdain is for a company whose OS is only #2 to an OS that literally built its empire on "shit products".
But that's not the best part. The best part is that your example of a well done programming language is C#. I love C#. I've made my living in C# for close to a decade. It's a fantastic language. It is also, like Swift, a proprietary language designed by one company for their own proprietary OS. That's your yardstick. Yes, there is an always-behind implementation by the open source community but it's also a language that's over ten years old, as opposed to Swift which is literally six months old come Monday.
Again, this is a new, modern programming language introduced by the biggest company on earth for one of the biggest platforms on the planet and the uptake on it is unprecedented. C# didn't experience uptake this quickly because Microsoft had to explain what it even was first.
I've been doing Obj-C for a few years now and I'm using Swift in a new project.
Swift all the way, mainly because Swift is just a much nicer language. Obj-C has a bizarre late-80's syntax which is not found anywhere else, so it's very strange. Except for random places where it's not. There was a half-assed "Objective-C 2.0" which introduced dot notation, but not everywhere or consistently. There are tons of things you can do with it that are unsafe and shouldn't work (I found a lot of these while translating some Obj-C code to Swift).
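To make that concrete, here's a small sketch of the same string operations in both syntaxes (the Obj-C lines are shown as comments for comparison; this example is mine, not from the parent post):

```swift
import Foundation

// Obj-C message send:      [greeting stringByAppendingString:@"!"]
// Obj-C 2.0 dot notation:  greeting.length  (properties only; method calls keep the brackets)
// Swift uses one uniform dot-and-parens syntax for everything:
let greeting = "Hello"
let shout = greeting.appending("!")
let length = shout.count
```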
There's still going to be a bunch of Cocoa stuff to mess with (i.e., there's no intrinsic date concept so you have to mess with NSDate) but at this point learning Objective-C is a waste of time. At best you will have a few more online resources to consult with versus Swift but Swift is the biggest new language in a long time - a language designed by the biggest company on earth for one of the most popular platforms on the planet. The uptake is more or less unprecedented.
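For instance, a minimal sketch of the NSDate dance (everything here is plain Foundation; no helpers assumed):

```swift
import Foundation

// Swift ships no language-level date type, so you reach into Cocoa's
// Foundation framework for NSDate:
let now = NSDate()
let epochSeconds = now.timeIntervalSince1970         // seconds since 1970-01-01 UTC
let anHourLater = NSDate(timeIntervalSinceNow: 3600) // now plus one hour
```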
Anyone who prefers Obj-C just doesn't want to learn something new. Apple didn't invent a new language because of hipness reasons, they did it because their platforms are saddled with this shitty language which is missing modern conventions and is difficult to learn and use.
Just use Swift.
but is there any reason to not have windows that simply rotate 180 degrees so that they can be cleaned from the inside?
Fifty or more floors up, the wind flying through would be enough to scatter anything in your office not nailed down.
You would have to design office spaces so that window washers could get in and clean the windows, which is tricky and messy, especially given that a lot of windows belong to people who have offices with locked doors.
There's probably a ton of architectural issues involved with a building where very high up you could potentially have openings on a regular basis. One day one of the revolving windows doesn't close right and Susie from accounting trips and falls and lands on the improperly closed window and falls to her death.
A stock trader on a bad day knows the window can be opened, so he jumps to his death.
This is something a lot of smart people have thought about for decades and the end result is no, there's not a better way. But let's not stop a bunch of computer engineers on Slashdot from thinking they have a better solution after a couple of minutes brainstorming.
The second person you're referring to does not have Ebola. The deputy did not come in contact with the patient.
The patient was not at home when the deputy went into the apartment.
The family of the patient was home but they were not showing symptoms (still aren't) and so they could not have spread the disease even if they have it.
Ebola is not an airborne virus so a facemask would have been pointless.
Basically you're a moron, and the fact that you're spreading this on a site famous for science facts and propagating the truth is just sad.
My take on OpenArena was based largely on this comment from last year which reads in part:
I had done all the work necessary to update the OpenArena port to the latest version at the time, and then played "follow the patchlevels cause their dev practices suck" for several more versions. I edited their wiki, writing out directions for getting the game running from source on FreeBSD, which was pretty easy to do...Which they promptly deleted and said, "just use the Linux version."
When I was working on the port I asked them repeatedly what the build deps were and such...They didn't know. They generally just banged on it and installed stuff until OA built and ran. Never once did they actually document what it took to build the game. They were truly representative of the kids-table level of QA/RE that seems to be commonplace in the small-project OSS development community at large. How many times did they make a major release, followed quickly by several patches to fix minor oversights that resulted in major problems and could have been avoided with a checklist of "what to check before we release?"
The person you reference, Time Doctor, who heads up the ioquake3 project, is the polar opposite: someone who's probably done more for Linux gaming than just about every other developer combined. Also, the original poster said he had to switch to using ioquake3's code for the FreeBSD port because of the OA assholes.
Time Doctor posted a follow-up comment:
The experience of working with the OpenArena project was similar to that described by HEMI_426. At this point they have cut off communication with us and I would be surprised, but happy, if that relationship ever improves.
So, now we are attempting to create our own freely distributable, creative commons licensed, game to distribute whenever anyone downloads ioquake3 that won't be "adults only" and won't have anything to do with OpenArena's direction.
Time Doctor is widely credited as being the "go to" person if you want to make a Linux port of your game and don't know how. He's personally responsible for the Humble Bundle having Linux games, which is one of the biggest catalysts for the recent surge in Linux gaming and may have led to SteamOS.
Your AC hit and run bashing makes me wonder if you're part of the OA project, which if true basically means that no, it hasn't changed.
I'm a Quake Live Pro subscriber (got a year as part of a QuakeCon package) and I've been playing Quake 3, then Quake Live, since Quake 3 was released in 1999.
I'm sure the hardest of the hardcore players will find something to complain about, but really, the changes are fine. If the changes attract a lot of new players at the expense of the old guard, then fine, the game will be better for it in the long run. The real test will be when the game hits Steam soon and a critical mass of people have access to it. And as even the article points out, there's still a way to play the old way, and the most popular mode, duels, is unchanged.
And really, the original Quake 3 engine has been open source for nine years now; if the old guard really wants to, all you would need to do is make a version that uses Quake 3's assets but adds a matchmaking system or a server browser that's up to 2014 standards. To some extent that's what the original goal of Quake Live was; whether or not it ever achieved that is debatable, but I know I can fire up QL and be in a game I like in less than a minute. If you think you can do better, go for it.
And as I write that it occurs to me that this is to some extent what the OpenArena project was supposed to be about but nine years in all we have is a dodgy 0.8 release and a core group of developers who are reportedly representative of the absolute worst qualities of the open source movement (slow to release, hostile to newcomers, actively sabotaging any FOSS ports that aren't Linux, etc.). So to some extent people who did think they could do better (albeit slightly different aims - OA wants to not rely on Q3 assets) have tried and not really gotten much of anywhere with it.
Perhaps just recompiling it against the latest SDK (still targeting 4.0 or whatever) would be sufficient.
I'd say have some sort of "hey, are you still there?" email from Apple, but making sure you can still compile the thing and re-submit it would be enough of a barrier that people with the Justin Bieber Slideshow apps wouldn't bother.
App doesn't compile in the latest SDK? Well you better get on that. Don't like it? Go to a non-curated platform.
Just an idea.
Also, define "ran fine" - ran at max settings at 60fps with no stuttering or framerate drops? It definitely ran acceptably in some configurations on release, but no one could max it out at a high resolution on day one.
But yeah, that was a new idea at the time: a game so graphically advanced that it outstripped the hardware of the era. Before that, the assumption was that as long as you had the beefiest system, any game on the market could run perfectly. Games like Quake 3 just gracefully added features like curved surfaces when it was possible to do so. Crysis and ports like GTA4 were the first to say "no, your shit still can't run the max".
To some degree it was about the messaging (had the mode been labeled "extreme" instead of "high" it might not have bothered the high end people so much) but really I think the initial issue was that the demo they released proved to everyone that it ate shit on their system. I had a 7800GT (I think) and even at the lowest settings it was crap.
For a recap: they came out with Crysis (the first one) in 2007, and it didn't sell as much as they wanted it to. They blamed piracy. I'm sure the game was pirated, probably a lot, but I don't think that's why it wasn't selling like they wanted it to. It wasn't selling like they wanted it to because it was released at a time when PCs weren't powerful enough to run it. By which I mean, in 2007 when it launched it was literally impossible to run it at the best settings. Like, it was impossible to build a PC that could run it at max settings at a high resolution at a high framerate.
And people knew this because they released a demo. You got a first hand look at how this game was going to turn your PC into a slideshow. So people didn't buy the game because they knew they didn't have the pipe to smoke it. Releasing a demo probably hurt Crysis' initial sales.
And this wasn't unforeseen: in the runup to the game's release, people expressed surprise that EA, who had been all about cross-platform development (when not cutting off the PC entirely), was releasing a game just for the PC which a lot of people couldn't run.
So, the game didn't sell either because of system requirements or piracy or both. And again, I'm not saying the game wasn't pirated, I'm just saying that Crytek claimed this was the only reason it wasn't selling, and in no possible way could it be linked to the fact that they released a game which just told every PC owner on earth their system wasn't good enough.
That's not the real dick part to me though. The real dick part was when the CEO said their "proof" of piracy was that the patch for the game was downloaded more times than the copies of the games that had been sold.
OK, think way back to 2007. Hard as it is to believe, Crysis wasn't on Steam. Back then it wasn't a given that your PC game would be on Steam. Consider Fallout 3 was released in 2008 on disc-only, no digital services at all, and had GFWL baked in. Two years after that Fallout: New Vegas launches as a Steamworks title on Steam on day one, no GFWL in sight. The switch was quick but in 2007 it hadn't happened yet.
So by that logic when Crytek released a patch for Crysis, people had to go manually download it. So I can see a shred of logic to the idea that if more people are downloading the patch than buying the game then some number of pirated copies are getting patched.
The thing is, the statement doesn't make sense. How many more times are we talking here? I know back then I personally downloaded the patches a few times, usually after I would format and reinstall the game (this being before Steam made that sort of thing unnecessary). If the patch was downloaded 10x as much then you might have a point. But how do you even know how many times it was downloaded? The file was mirrored everywhere (I think FilePlanet still existed, etc.); did you add up all the downloads? Do all those services even give download numbers? Why are you not providing more evidence for your case?
Crytek's CEO also lamented how the Call Of Duty games were selling more copies. At the time, Crysis had sold less than a million copies whereas the CoD game of the year had sold ten million, and the CoD games had the advantage of being on consoles as well. Disregarding the fact that Crysis would hit the 1M mark soon (and according to Wikipedia had sold over 3M overall as of 2010), the CoD game sold better due to better marketing and just generally being a better game.
To be fair, this was that dark era in PC gaming of the console games selling 9-10x their PC counterparts, to the point where some developers wanted to drop the PC entirely. However, if Crytek wanted to get into console gaming, just do it; don't give us some sort of "you're all horrible software pirates" argument on your way out the door.
So they released Crysis 2 on PC, 360 and PS3. How did that go? They sold about three million copies, less than the original game has sold on PC alone. Crysis 3's sales figures have not been fully revealed.
THIS is the problem I have with the "piracy is the problem" argument. Yes piracy is a problem but there's so much more to it and going to console development didn't fix their issues. Their real issue seems to be that they can't run a company worth a damn.
Anyone can write software
No they can't.
If you switched me with a sales guy I could do the sales guy's job on a technical level. I could talk to people, I could make calls, I could ask for money. I couldn't do it nearly as well as the sales guy and I'm an antisocial introvert so I'm all wrong for the job on a proficiency level but I could do their job, albeit poorly.
The sales guy can't do my job at all. He wouldn't know an IDE from his own ass. He has no idea how to write code. He sure as fuck doesn't know how to interface with COM objects or write cross platform code or perform code signing to get apps to run on mobile devices. Not "I can do it but not do it well", he can't do it at all. Not even halfass.
This isn't me trying to say programmers are special snowflakes; this is me saying that programming is fundamentally more difficult than anything normal people do, and most normal people don't actually want anything to do with it, which is why you often hear "hire a programmer" and not "learn how to program".
So with all due respect, no not just "anyone" can write software.
The Fire Phone runs Mayday, Amazon's live tech support service for devices.
I haven't experienced it myself but when I see the Amazon Kindle Fire commercials where they demonstrate you can talk to a live Amazon person to help you use your tablet, my first thought was "that would be great for my parents", especially since it would lessen the number of calls I would get from them on how to do something with their technology device du jour.
You would think that something locked down like an iOS device wouldn't lend itself to needing this kind of tech support help, but in certain areas - especially phone calls - there's a certain level of resistance to technology complexity with the older crowd. It sounds like I'm being mean with regards to age but I have known several older people over the last few years who went out and bought an iPhone because it was the new shiny thing and then took it back because they couldn't figure out how to use it or didn't like how complicated it made things. As much as it makes perfect sense to you and me that the phone is a more generalized computing device nowadays and wanting to make a phone call is basically launching a program, the older set knows that you used to just open the fucking thing and start dialing.
I'm not sure if the Fire Phone will make all that better (in particular I can almost guarantee my parents would fucking hate the 3D screen thing) but I do think perhaps there's an untapped market out there for people who want a less-smartphone. After all, isn't that basically what "locked down" Android tablets like the Kindle Fire and the Nook are? Google, Apple and Microsoft are all trying to outdo each other on technical whiz-bang, and this entry from Amazon doesn't seem to impress the Slashdot crowd at all. Maybe this one is for our parents?
Or state very clearly (not in the fine print) that said device or software will likely cease to work past some date, but is guaranteed to work until that date.
They have done exactly that for many many years.
Look at the back of the Battlefield 1942 box - the game was released in September 2002 and it states that they only guarantee it can be played online until September 2003. This isn't in the fine print, it's on the back of the package like you said it should be, so that you can read it before you buy the game.
This caused quite a stink back in 2002 because people thought it meant that they absolutely would cause the games to stop working at that time, but really EA was just covering their ass because they had been sued already by people who didn't get that Ultima Online required a subscription fee because it wasn't spelled out on the box well enough.
Instead, EA has supported online play for BF1942 through GameSpy for close to 12 years now. And you think they're assholes for going way beyond what they promised and not releasing source code. And your other suggested fix is exactly what they did over a decade ago when they fucking released the game, but you're too goddamn stupid to know what you're talking about.
My first gig out of college was for the same University I graduated from, and I worked on a mainframe doing COBOL programming, and some scripting in a proprietary language called NATURAL which I've never seen used anywhere else, ever.
One project I was handed was to update the 1098-T form. It's basically the IRS tax form for tuition writeoffs. Every year we had to produce a 1098-T form for every student which basically detailed what they paid in tuition. Every year the form was a little different (of course) so every year our generation program had to be updated.
What I got handed was basically a program which drew the form and then printed the data on the correct parts of the form. And when I say drew the form, I don't mean we had a PDF or JPEG or whatever of the form; we actually recreated the form with whatever bog-standard graphics package we used. Like, you would literally say go to (X1,Y1) and draw a line to (X2,Y2), then a line from (X2,Y2) to (X3,Y3), etc. It was like programming in LOGO, but for a legal purpose and without the cool turtles.
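A toy version of that draw-it-by-hand approach, in Swift rather than NATURAL (the character grid and the drawBox helper are invented for illustration; the real thing drove a mainframe graphics package, not a text grid):

```swift
// A 10-row by 40-column "page"; every box on the form is drawn by
// explicit coordinate calls, just like the original program's line commands.
var grid = [[Character]](repeating: [Character](repeating: " ", count: 40), count: 10)

// Hypothetical helper: draw a rectangle's four edges into the grid.
func drawBox(x: Int, y: Int, w: Int, h: Int) {
    for i in x..<(x + w) {
        grid[y][i] = "-"
        grid[y + h - 1][i] = "-"
    }
    for j in y..<(y + h) {
        grid[j][x] = "|"
        grid[j][x + w - 1] = "|"
    }
}

// One cell of the form, plus the data printed inside it:
drawBox(x: 0, y: 0, w: 20, h: 4)
let label = "Tuition: $4,200"
for (i, ch) in label.enumerated() { grid[2][2 + i] = ch }
print(grid.map { String($0) }.joined(separator: "\n"))
```

Now imagine adjusting those coordinates by hand every year and waiting hours for a printout to see whether you got them right.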
This doesn't sound like such a big deal, and it wasn't too bad, but what was tedious was the fact that you would program all this in, run the program against a single fake student's data, and then head to the printer. The printer, in this case, was in the print room, three floors down and a few hallways away. Then you waited for the printout, which would print as soon as every job in the (virtual) line ahead of yours was done. The time it took to accomplish this was basically random.
And when you found out whatever small change you made didn't work, had the wrong effect, got the numbers backwards, etc., you got to do this all over again. Make small change, compile, run, wait hour(s) for result, lather, rinse, repeat. All with no GUI, no preview, no nothing. Oh, and the program I was handed wasn't even a 1:1 recreation: comparing the form it printed the previous year to the actual 1098-T form from the IRS' site, it had basically the same info as the source form but it wasn't a dead-on match. I'm guessing this was good enough for the IRS, and either no one had ever bothered to make this thing picture-perfect, or the motivation to do so got lost along the way. Lord knows I wasn't going to do it either.
Over time of course you started to average out how long it took to get the printout and you'd wait at least that long before going to get it. And of course this wasn't anywhere near as bad as "come back tomorrow to see if it worked", but that whole process sucked and I don't miss that job at all.
Oh, and this was in ~2002 or so. I didn't really want to be a mainframe programmer but I had little experience in a shitty economy and I was told/promised that they'd be moving to an "all new web-based system within the next six months". When I quit two and a half years later to move to a better gig, it was still "within the next six months". I learned a lot from that job, I guess.