iOS

Swift Vs. Objective-C: Why the Future Favors Swift 270

snydeq writes: InfoWorld's Paul Solt argues that it's high time to make the switch to the more approachable, full-featured Swift for iOS and OS X app development. He writes: "Programming languages don't die easily, but development shops that cling to fading paradigms do. If you're developing apps for mobile devices and you haven't investigated Swift, take note: Swift will not only supplant Objective-C when it comes to developing apps for the Mac, iPhone, iPad, Apple Watch, and devices to come, but it will also replace C for embedded programming on Apple platforms. Thanks to several key features, Swift has the potential to become the de facto programming language for creating immersive, responsive, consumer-facing applications for years to come."
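The "more approachable" claim is easier to judge with a concrete snippet. This example is not from the article, just a minimal illustration: filtering and summing a list takes a few lines of Swift thanks to type inference, closures, and chained higher-order functions, where the equivalent Objective-C would involve NSArray, NSNumber boxing, and block syntax.

```swift
// Illustrative sketch, not from the article: sum all prices under $5.00.
// Prices are in cents to avoid floating-point rounding issues.
let pricesInCents = [199, 499, 99, 999]

let totalUnderFiveDollars = pricesInCents
    .filter { $0 < 500 }   // keep items under $5.00
    .reduce(0, +)          // sum what's left

print(totalUnderFiveDollars) // 797
```

Note that no type annotations are required anywhere: the compiler infers `[Int]` for the array and `Int` for the result, which is a large part of what makes the language feel lighter than Objective-C.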

Comment Re:Integrity much? (Score 1) 127

But all too often a moderator obviously born a Puritan steps in and ruins the fun.

A Puritan? Hardly. The social justice warriors enforcing political correctness everywhere aren't, and have never been, from the Right, let alone the Christian Right.

Evangelicals haven't had serious sway in this country for 75-175 years (depending on your specific issues in question and threshold). It's all about the left wing and grievance-mongering. Reminds me of the mid-90's, before South Park made being politically incorrect palatable to the masses again.

Comment Re:See it before (Score 2) 276

Some people keep saying this, but I have yet to see any personal evidence for this alleged "trend". All of my relatives, friends, and colleagues (without a single exception!) have both a PC and a tablet in their household, if they own a tablet at all. I have never heard of anyone who uses a tablet but does not also use a PC or laptop.

I also highly doubt that there is any statistical evidence for this trend, i.e., that clearly more people own a tablet and no PC/laptop than own both. That's probably true for mobile phones, but not for tablets. Tablets are additional throwaway, low-lifespan gimmicks rather than replacements for PCs and laptops.

There are some, but mostly they are people who weren't using personal computers to produce anything in the first place. In the early-to-mid 90's a common refrain (I remember my parents saying it at one point) was "Why do we need a computer?" For those people, virtually every answer to that question (communicating with others, organizing, entertainment, writing text documents, etc.) is served by a combination of smartphone and/or tablet with access to internet-enabled applications. Maybe a tablet + keyboard (or a Chromebook) for extended writing sessions.

The business users, academics, and developers are still there, but they now make up a much smaller fraction of the overall computer market. When you add back in enterprise users where corporate policy aligns well with thin client / network computing paradigms, you get an even smaller fraction that needs local personal computing... basically just those above, plus those who need or want local control for reliability, network reliability, custom performance (eg, gamers) or philosophical reasons.

So a need for local apps will still be present for some folks, but the large surge of users from the late 90s and 2000s doesn't need a true PC. It's a shame, because it raises the barrier to entry from consumer to developer (Apple products, ironically, were great at *reducing* that back in the HyperCard-installed-on-all-Macs days), but it's good because it lowers the barrier to entry for access to computing resources generally.

tl;dr: Desktop apps (and "personal computers that aren't smartphones or tablets") are going to shrink back down to the market they were before the late 90's. Congrats, all of us, on becoming geeks again. =)

Software

Ask Slashdot: What's the Future of Desktop Applications? 276

MrNaz writes: Over the last fifteen years or so, we have seen the dynamic web mature rapidly. The functionality of dynamic web sites has expanded from the mere display of dynamic information to fully fledged applications rivaling the functionality and aesthetics of desktop applications. Google Docs, MS Office 365, and Pixlr Express provide in-browser functionality that, in bygone years, was the preserve of desktop software.

The rapid deployment of high speed internet access, fiber to the home, cable, and other last-mile technologies, even in developing nations, means that the need for offline access to functionality is increasingly moot. It is also rapidly doing away with the problem of lengthy load times for bulky web code.

My question: Is this trend a progression to the ultimate conclusion where the browser becomes the operating system and our physical hardware becomes little more than a web appliance? Or is there an upper limit: will there always be a place where desktop applications are more appropriate than applications delivered in a browser? If so, where does this limit lie? What factors should software vendors take into consideration when deciding whether to build new functionality on the web or into desktop applications?
Games

Psychologist: Porn and Video Game Addiction Are Leading To 'Masculinity Crisis' 950

HughPickens.com writes: Philip Zimbardo is a prominent psychologist from Stanford, most notable for leading the notorious Stanford prison experiment. He has published new research findings based on the lives of 20,000 young men, and his conclusion is stark: there is a developing "masculinity crisis" caused by addiction to video games and pornography. "Our focus is on young men who play video games to excess, and do it in social isolation — they are alone in their room," says Zimbardo. "It begins to change brain function. It begins to change the reward center of the brain, and produces a kind of excitement and addiction. What I'm saying is — boys' brains are becoming digitally rewired."

As an example, Zimbardo uses this quote from one young man: "When I'm in class, I'll wish I was playing World of Warcraft. When I'm with a girl, I'll wish I was watching pornography, because I'll never get rejected." Zimbardo doesn't think there's a specific time threshold at which playing video games goes from being acceptable to excessive. He says it varies by individual, and is based more on a "psychological change in mindset." To fight the problem, he suggests families need to track how much time is being spent on video games compared to other activities. He also called for better sex education in schools, which should focus not only on biology and safety, but also on emotions, physical contact, and romantic relationships.
Debian

Linux Mint Will Continue To Provide Both Systemd and Upstart 347

jones_supa writes: After Debian adopted systemd, many other Linux distributions based on it made the switch as well. Ubuntu has already rolled out systemd in 15.04, but Linux Mint is providing both options to users. The Ubuntu transition was surprisingly painless and no one really put up a fight, but the Linux Mint team chose the middle ground. The Mint developers feel the project should wait for systemd to become more stable and mature before making it the default and only option.

Comment Re:It's the Millenials (Score 1) 405

65-85 = Gen X
85-05 = Millennials
05-25 = Digital Natives

Gotta disagree on that, judging from totally unscientific personal experience watching each incoming set of undergraduates at my local state university.

65-80 = Gen X
80-88 = Gen Y (if that)
90+ = Millennials

For political and cultural purposes, becoming "politically aware" somewhere around 2006 is about where I'd draw the line. A quick determiner is to ask them how much they remember about 9/11. If it's vague things about the adults being worried, or their 3rd grade teacher bringing them in for an announcement, they're probably a Millennial.

Comment Gen X - Gen Y - Millennial differences (Score 1) 405

Whether it's due to accelerating change, proximity, or whatever, there's arguably a pretty large difference even across those 10 years or so. Born in '79, I graduated HS in 1996, which puts me right at the borderline of Gen X and the early Gen Y's. I spent several years working at McDonald's before leaving college to work in the tech industry (just in time for the dot com implosion, natch).

I could more or less imagine friends of mine over the next few years also working at McDonald's... I can't imagine college friends now (born in the early/mid 90's) doing it -- it's seen as beneath them.

Comment Re:No, but your own choices are. (Score 1) 179

Of course, YMMV, but I've never seen so much hate and vitriol directed at any president as what Obama has had to endure. Endless anti-Obama bumper stickers, even after he has no more terms to run for! And of course all the endless propaganda about how he's a secret muslim out to destroy the country. I find that the liberals tend more to argue the policy, whereas the cons do the name-calling and conspiracy theories. I never pay attention to how many friends I have on FB, so I can't say how many cons de-friended me. I don't defriend people for having a different point of view, though I may hide them if I just can't take the constant stream of hate.

Were you politically involved, or anywhere near a college campus, during the 2000's? The Bush hatred was strong. They didn't call it "Bush Derangement Syndrome" for nothing. And this was even before 9/11 and the 2003 Iraq War... Liberals never really got over the Florida election recount, hence faculty members turning their backs on him during mid-2000 commencement speeches.

Of course, the Internet was quite different then, and social networking as we know it was basically pre-infancy, but various political blogs developed strongly during this time, and all it takes is to scroll back into the 2002-2007-era Daily Kos or Democratic Underground archives to see outrage arguably on the same level as what you might find today. (I'm discounting the New World Order conspiracy theorists, who are along the same lines as the 9/11 Truthers, but accusations of a conspiracy around faking a birth certificate frankly pale in comparison to accusations of a conspiracy to attack your own country because Halliburton.)

I mean, I can't even imagine the outrage that would be present on the left if someone came up with a cover like this in an alt weekly with Obama on it: http://americandigest.org/sidelines/2012/08/if_anyone_deser.html. Meanwhile, people got bent out of shape at one parody New Yorker cover.

Part of this might be related to the slight age gap between the average liberal and average conservative, at least in the broad range of folks I know. Many people who are (now) conservative are roughly in their 30's and have strong memories of the 2000's and 9/11. Those in their 20's came of age in the Obama era and don't have as much recollection of the political state before c. 2007/08.

Businesses

Technology and Ever-Falling Attention Spans 147

An anonymous reader writes: The BBC has an article about technology's effect on concentration in the workplace. They note research finding that the average information worker's attention span has dropped significantly in only a few years. "Back in 2004 we followed American information workers around with stopwatches and timed every action. They switched their attention every three minutes on average. In 2012, we found that the time spent on one computer screen before switching to another computer screen was one minute 15 seconds. By the summer of 2014 it was an average of 59.5 seconds." Many groups are now researching ways to keep people in states of focus and concentration. An app ecosystem is popping up to support that as well, from activity timing techniques to background noise that minimizes distractions. Recent studies are even showing that walking slowly on a treadmill while you work can have positive effects on focus and productivity. What tricks do you use to keep yourself on task?

Comment Re:No, but your own choices are. (Score 1) 179

If you de-friend someone (or large groups of someones), their stories are basically not going to be on your feed in the first place, and liberals have been shown to be more likely to de-friend conservatives over political differences than conservatives de-friend liberals: http://www.washingtonpost.com/blogs/the-switch/wp/2014/10/21/liberals-are-more-likely-to-unfriend-you-over-politics-online-and-off/

In my experience, the reason for this is that conservatives push out a lot of hate in their postings and liberals don't. No one wants to read a lot of nasty name-calling.

In my circle, it's been largely the other way around... or at least it used to be, circa 2008 (Obamamania) through early 2012. By the time of the actual election things had moderated somewhat, and it's been better since. But during that time my feed was *filled* with pro-left-wing, anti-right-wing links full of vitriol, often to ThinkProgress or Salon, with lots of associated name-calling ("Those damn Rethunglicans", etc.)

I've been heavily involved in the arts community over the years, and had (and still do have) many friends still in college. The liberal skew was *extremely* strong.

Right before the 2008 election, when *everyone* was changing their profile pic to the Obama "HOPE" image/logo, I replaced mine with the McCain/Palin logo. My friend count dropped by 10 in the first 30 minutes.

Comment Re:No, but your own choices are. (Score 5, Insightful) 179

Which differs from XX year olds who have no basic understanding of liberal principles, or presume that there's no other possible motivation for some random liberal policy than abject hatred (especially of America!) and/or slavish devotion to the government that is stealing their money/freedom/religion in what way exactly?

My point is that it's very hard NOT to have a "basic understanding of liberal principles", because they're the "default" view you see in most media and entertainment, and in most humanities coursework. "Income inequality is ipso facto bad" and "raise the minimum wage" are not difficult to understand; there's no need to assert a hatred of America. OTOH, "raising the minimum wage won't really help" is not easy to understand (at first), and it's quite easy to assert that someone who'd say that is "greedy" and wants more money at everyone else's expense, and leave it at that.

Comment No, but your own choices are. (Score 5, Informative) 179

If you de-friend someone (or large groups of someones), their stories are basically not going to be on your feed in the first place, and liberals have been shown to be more likely to de-friend conservatives over political differences than conservatives de-friend liberals: http://www.washingtonpost.com/blogs/the-switch/wp/2014/10/21/liberals-are-more-likely-to-unfriend-you-over-politics-online-and-off/

Unless you're a complete recluse or are making a conscious effort to sequester yourself from any popular culture, it's virtually impossible to be in your teens or 20's and not be exposed to various legitimate liberal political stances -- most often during college years. OTOH, it's quite easy to never interact with any "real life" legitimate conservative arguments, other than straw men that the liberal political arguments are using.

Thus you end up with 25-year-olds who have no basic understanding of conservative economic principles, or who presume that there's no other possible motivation for some random socially conservative policy than abject hatred and/or slavish religious belief.

Earth

Global Carbon Dioxide Levels Reach New Monthly Record 372

mrflash818 writes: For the first time since we began tracking carbon dioxide in the global atmosphere, the monthly global average concentration of carbon dioxide gas surpassed 400 parts per million in March 2015, according to NOAA's latest results. "It was only a matter of time that we would average 400 parts per million globally," said Pieter Tans, lead scientist of NOAA's Global Greenhouse Gas Reference Network. "We first reported 400 ppm when all of our Arctic sites reached that value in the spring of 2012. In 2013 the record at NOAA's Mauna Loa Observatory first crossed the 400 ppm threshold. Reaching 400 parts per million as a global average is a significant milestone."
