Bitcoin

Bitcoin Volatility Puts Miners Under Pressure 290

Posted by timothy
from the coulda-shoulda-woulda dept.
An anonymous reader writes "The virtual currency Bitcoin lost 21 per cent of its value yesterday, bringing its total loss this year to 44 per cent. Reports have suggested that this rapid fall is squeezing the computer systems supporting the currency and is raising alarm about its future viability. Bitcoin's value fell to $179.37, 85 per cent lower than its record peak of $1,165 at the end of 2013. In total, nearly $11.3bn has been wiped off Bitcoin's value since its 2013 high. The decline has raised concern for Bitcoin 'miners', who support the transactions made in the digital currency and whose profits are squeezed as its price falls against traditional currencies." The Coindesk article in the linked story gives a blow-by-blow on yesterday's valuation drop; right now, Bitcoin has jumped back up and stands at just over $216.
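The percentage figures in the summary are easy to sanity-check; a minimal sketch, using only the dollar amounts quoted above:

```python
# Sanity-check the percentage figures quoted in the summary.
peak = 1165.00       # record high at the end of 2013
after_drop = 179.37  # price quoted after yesterday's drop

drop_from_peak = (peak - after_drop) / peak * 100
print(f"Down {drop_from_peak:.0f}% from the peak")  # ~85%, matching the summary

# A 21% one-day loss implies a prior-day price of roughly:
prior_day = after_drop / (1 - 0.21)
print(f"Implied prior-day price: ${prior_day:.2f}")
```

The implied prior-day price is not stated in the story; it simply follows from the quoted one-day loss.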
Input Devices

Apple Awarded Gesture-Control Patent 105

Posted by Soulskill
from the nothing-is-obvious-to-the-uspto dept.
mpicpp points out a report that Apple has been awarded a broad patent for gesture control of a computer interface (8,933,876). The company inherited the patent through its acquisition of motion-sensor company PrimeSense in 2013. (PrimeSense's technology is used in Microsoft's Kinect gesture control system.) Here's the patent's abstract: A method, including receiving, by a computer executing a non-tactile three dimensional (3D) user interface, a set of multiple 3D coordinates representing a gesture by a hand positioned within a field of view of a sensing device coupled to the computer, the gesture including a first motion in a first direction along a selected axis in space, followed by a second motion in a second direction, opposite to the first direction, along the selected axis. Upon detecting completion of the gesture, the non-tactile 3D user interface is transitioned from a first state to a second state.
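The claimed gesture shape (a first motion along a chosen axis, then a second motion back the other way) can be illustrated with a short sketch. The patent does not publish an implementation; the function name, the travel threshold, and the detection logic below are illustrative assumptions only, not PrimeSense's or Apple's actual method:

```python
def detect_push_pull(points, axis=2, threshold=0.1):
    """Illustrative detector for the gesture shape the abstract describes:
    a first motion along `axis`, followed by a second motion in the
    opposite direction along the same axis.  `points` is a sequence of
    (x, y, z) hand coordinates from a depth sensor; `threshold` is a
    hypothetical minimum travel distance for each leg of the gesture."""
    track = [p[axis] for p in points]
    if len(track) < 3:
        return False
    # First leg: find the point of maximum excursion from the start.
    turn = max(range(len(track)), key=lambda i: abs(track[i] - track[0]))
    first_leg = track[turn] - track[0]
    if abs(first_leg) < threshold:
        return False
    # Second leg: travel after the turnaround, in the opposite direction.
    second_leg = track[-1] - track[turn]
    return first_leg * second_leg < 0 and abs(second_leg) >= threshold

# A push-then-pull along z completes the gesture; a push alone does not.
push_pull = [(0, 0, z) for z in (0.0, 0.1, 0.2, 0.3, 0.2, 0.1, 0.05)]
push_only = [(0, 0, z) for z in (0.0, 0.1, 0.2, 0.3)]
print(detect_push_pull(push_pull))  # True
print(detect_push_pull(push_only))  # False
```

On gesture completion, a real system would then transition the interface between the two states the abstract mentions; that state machine is omitted here.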
Microsoft

Ask Slashdot: Is an Open Source .NET Up To the Job? 421

Posted by Soulskill
from the good-steps-or-irrelevant-steps dept.
Rob Y. writes: The discussion on Slashdot about Microsoft's move to open source .NET Core has centered on:

1. whether this means Microsoft is no longer the enemy of the open source movement
2. if not, then does it mean Microsoft has so thoroughly lost the web server arena that it's resorting to desperate moves?
3. or nah — it's standard Microsoft operating procedure. Embrace, extend, extinguish.

What I'd like to ask is whether anybody that's not currently a .NET fan actually wants to use it? Open source or not. What is the competition? Java? PHP? Ruby? Node.js? All of the above? Anything but Microsoft? Because as an OSS advocate, I see only one serious reason to even consider using it — standardization. Any of those competing platforms could be as good or better, but the problem is: how to get a job in this industry when there are so many massively complex platforms out there. I'm still coding in C, and at 62, will probably live out my working days doing that. But I can still remember when learning a new programming language was no big deal. Even C required learning a fairly large library to make it useful, but it's nothing compared to what's out there today. And worse, jobs (and technologies) don't last like they used to. Odds are, in a few years, you'll be starting over in yet another job where they use something else.

Employers love standardization. Choosing a standard means you can't be blamed for your choice. Choosing a standard means you can recruit young, cheap developers and actually get some output from them before they move on. Or you can outsource with some hope of success (because that's what outsourcing firms do — recruit young, cheap devs and rotate them around). To me, those are red flags — not pluses at all. But they're undeniable pluses to greedy employers. Of course, there's much more to being an effective developer than knowing the platform so you can be easily slotted in to a project. But try telling that to the private equity guys running too much of the show these days.

So, assuming Microsoft is sincere about this open source move,
1. Is .NET up to the job?
2. Is there an open source choice today that's popular enough to be considered the standard that employers would like?
3. If the answer to 1 is yes and 2 is no, make the argument for avoiding .NET.
IOS

Ask Slashdot: Objective C Vs. Swift For a New iOS Developer? 211

Posted by Soulskill
from the past-vs-future dept.
RegularDave writes: I'm a recent grad from a master's program in a potentially worthless social science field, and I've considered getting into iOS development. Several of my friends who were in similar situations after grad school have done so and are making a healthy living getting contract work. Although they had CS and Physics degrees going into iOS, neither had worked in Objective-C, and both essentially went through a crash course (either self-taught or through intensive classes) in order to get their first gigs. I have two questions. First, am I an idiot for thinking I can teach myself either Objective-C or Swift on my own without any academic CS background (I've tinkered in HTML, CSS, and C classes online with some success)? Second, if I'm not an idiot for attempting to learn either language, which should I concentrate on?
Sci-Fi

Battlestar Galactica Creator Glen A. Larson Dead At 77 186

Posted by timothy
from the free-to-roam-the-galaxy-etc. dept.
schwit1 writes Glen A. Larson, the wildly successful television writer-producer whose enviable track record includes The Six Million Dollar Man, Quincy, M.E., Magnum, P.I., Battlestar Galactica, Knight Rider and The Fall Guy, has died. He was 77. From the article: Battlestar Galactica lasted just one season on ABC from 1978-79, yet the show had an astronomical impact. Starring Lorne Greene and Richard Hatch as leaders of a homeless fleet wandering through space, featuring special effects supervised by Star Wars' John Dykstra and influenced by Larson's Mormon beliefs, Battlestar premiered as a top 10 show and finished the year in the top 25. But it was axed after 24 episodes because, Larson said, each episode cost "well over" $1 million.
Android

Ars Dissects Android's Problems With Big Screens -- Including In Lollipop 103

Posted by timothy
from the point-fifth-world-problems dept.
"When it comes to tablets, Google doesn't even follow its own design guidelines." That's the upshot of Ars Technica writer Andrew Cunningham's detailed, illustrated look at how Android handles screens much larger than seven inches, going back to the first large Android tablets a few years ago, but including Android 5.0 (Lollipop) on the Nexus 10 and similar-sized devices. Cunningham is unimpressed with the use of space for both practical and aesthetic reasons, and says that problems crop up in areas that are purely under Google's control, like control panels and default apps, as well as (more understandably) in third-party apps. The Nexus 10 took 10-inch tablets back to the "blown-up phone" version of the UI, where buttons and other UI elements were all put in the center of the screen. This makes using a 10-inch tablet the same as using a 7-inch tablet or a phone, which is good for consistency, but in retrospect it was a big step backward for widescreen tablets. The old interface put everything at the edges of the screen where your thumbs could easily reach them. The new one often requires the pointer finger of one of your hands or some serious thumb-stretching. ... If anything, Lollipop takes another step backward here. You used to be able to swipe down on the left side of the screen to see your notifications and the right side of the screen to see the Quick Settings, and now those two menus have been unified and placed right in the center of the screen. The Nexus 10 is the most comfortable to use if it's lying flat on a table or stand, and Lollipop does nothing to help you out there.
Operating Systems

FreeBSD 10.1 Released 123

Posted by timothy
from the longstanding-contributions dept.
An anonymous reader writes Version 10.1 of the venerable FreeBSD operating system has been released. The new version of FreeBSD offers support for booting from UEFI, automated generation of OpenSSH keys, ZFS performance improvements, updated (and more secure) versions of OpenSSH and OpenSSL, and hypervisor enhancements. FreeBSD 10.1 is an extended support release and will be supported through January 1, 2017. Adds reader aojensen: As this is the second release of the stable/10 branch, it focuses on improving the stability and security of 10.0-RELEASE, but also introduces a set of new features including: vt(4), a new console driver; support for FreeBSD/i386 guests on the bhyve hypervisor; support for SMP on armv6 kernels; UEFI boot support for amd64 architectures; support for the UDP-Lite protocol (RFC 3828) on both IPv4 and IPv6; and much more. For a complete list of changes and new features, the release notes are also available.
Advertising

Why the Time Is Always Set To 9:41 In Apple Ads 109

Posted by samzenpus
from the ticking-away-the-moments-that-make-up-a-dull-day dept.
jones_supa writes If you have looked carefully, the clock was traditionally set to 9:42 in Apple advertisements. You could see it across various commercials, print ads, and even on Apple's website. The explanation is simple: that's the time in the morning that Steve Jobs announced the very first iPhone in 2007. Around 42 minutes into his keynote address, he said "Today Apple is going to reinvent the phone." The picture of the phone was carefully scheduled to pop up at that moment. "We design the keynotes so that the big reveal of the product happens around 40 minutes into the presentation," Apple's Scott Forstall confirms. The time was slightly tweaked in 2010, when the very first iPad was released, so that when it was revealed, it displayed a different time: 9:41.
Crime

Is the Outrage Over the FBI's Seattle Times Tactics a Knee-Jerk Reaction? 206

Posted by samzenpus
from the wait-a-second dept.
reifman writes The Internet's been abuzz the past 48 hours about reports the FBI distributed malware via a fake Seattle Times news website. What the agency actually did is more of an example of smart, precise law enforcement tactics. Is the outrage online an indictment of Twitter's tendency towards uninformed, knee-jerk reactions? In this age of unwarranted, unconstitutional blanket data collection by the NSA, the FBI's tactics from 2007 seem refreshing for their precision.
Apple

Apple A8X IPad Air 2 Processor Packs Triple-Core CPU, Hefty Graphics Punch 130

Posted by samzenpus
from the give-me-the-numbers dept.
MojoKid writes When Apple debuted its A8 SoC, it proved to be a modest tweak of the original A7. Despite packing double the transistors and an improved GPU, the heart of the A8 SoC is the same dual-core Apple "Cyclone" processor, tweaked to run at higher clock speeds and with stronger total GPU performance. Given this, many expected that the Apple A8X would be cut from similar cloth — a higher clock speed, perhaps, and a larger GPU, but not much more than that. It appears those projections were wrong. The Apple A8X chip is a triple-core variant of the A8, with a higher clock speed (1.5GHz vs. 1.4GHz), a larger L2 cache (2MB, up from 1MB) and 2GB of external DDR3. It also uses an internal metal heatspreader, which the Apple A8 eschews. All of this points to slightly higher power consumption for the core, but also to dramatically increased performance. The new A8X is a powerhouse in multiple types of workloads; in fact, it's the top-performing mobile device on Geekbench by a wide margin. Gaming benchmarks are equally impressive. The iPad Air 2 nudges out Nvidia's Shield in GFXBench's Manhattan offscreen test, at 32.4 fps to 31 fps. Onscreen favors the NV solution thanks to its lower-resolution screen, but the Nvidia device does take 3DMark Ice Storm Unlimited by a wide margin, clocking in at 30,970 compared to 21,659.
Sony

How Sony, Intel, and Unix Made Apple's Mac a PC Competitor 296

Posted by samzenpus
from the back-in-the-day dept.
smaxp writes In 2007, Sony's supply chain lessons, the network effect from the shift to Intel architecture, and a better OS X for developers combined to renew the Mac's growth. The network effects of the Microsoft Wintel ecosystem that Rappaport explained 20 years ago in the Harvard Business Review are no longer a big advantage. By turning itself into a premium PC company with a proprietary OS, Apple has taken the best of the PC ecosystem while avoiding its disadvantages.
Programming

Building Apps In Swift With Storyboards 69

Posted by samzenpus
from the build-it-better dept.
Nerval's Lobster writes Apple touts the Swift programming language as easy to use, thanks in large part to features such as Interface Builder, a visual designer provided in Xcode that allows a developer to visually design storyboards. In theory, this simplifies the process of designing both screens and the connections between screens, as it needs no code and offers an easy-to-read visual map of an app's navigation. But is Swift really so easy (or at least as easy as anything else in a developer's workflow)? This new walkthrough of Interface Builder (via Dice) shows that it's indeed simple to build an app with these custom tools... so long as the app itself is simple. Development novices who were hoping that Apple had created a way to build complex apps with a limited amount of actual coding might have to spend a bit more time learning the basics before embarking on the big project of their dreams.
Programming

Ask Slashdot: Swift Or Objective-C As New iOS Developer's 1st Language? 316

Posted by timothy
from the two-roads-diverge-but-do-they-loop dept.
macs4all (973270) writes "I am an experienced C and Assembler Embedded Developer who is contemplating for the first time beginning an iOS App Project. Although I am well-versed in C, I have thus far avoided C++, C# and Java, and have only briefly dabbled in Obj-C. Now that there are two possibilities for doing iOS Development, which would you suggest that I learn, at least at first? And is Swift even far enough along to use as the basis for an entire app's development? My goal is the fastest and easiest way to market for this project; not to start a career as a mobile developer. Another thing that might influence the decision: If/when I decide to port my iOS App to Android (and/or Windows Phone), would either of the above be an easier port; or are, for example, Dalvik and the Android APIs different enough from Swift/Obj-C and Cocoa Touch that any 'port' is essentially a re-write?"
Bug

Apple Yanks iOS 8 Update 203

Posted by samzenpus
from the our-bad dept.
alphadogg writes Within hours of releasing an iOS 8 update to address assorted bugs in the new iPhone and iPad operating system, Apple has been forced to pull the patch, which itself was causing iPhone 6 and 6 Plus users grief. Reports filled Apple support forums that the iOS 8 update was cutting off users' cell service and making Touch ID inoperable. The Wall Street Journal received this statement from Apple: "We have received reports of an issue with the iOS 8.0.1 update. We are actively investigating these reports and will provide information as quickly as we can. In the meantime we have pulled back the iOS 8.0.1 update."
