
+ - ACM and the Professional Programmer->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Vint Cerf was a co-designer of the TCP/IP protocols and can rightly claim to be one of the "Fathers of the Internet." He's a vice president at Google and was president of the ACM from 2012-2014. On behalf of that organization, he has an essay that gently questions the relevance of the ACM now that so many (most?) software engineers and other computer science professionals are not members, and may not have formal CS education at all. As he puts it: "The question that occupies my mind, especially as the membership in ACM has not grown in a way commensurate with the evident growth of programmers, is whether and how ACM can adapt its activities and offerings to increase the participation of these professionals.""
Link to Original Source

+ - Who Must You Trust?->

Submitted by CowboyRobot
CowboyRobot (671517) writes "In ACM's Queue, Thomas Wadlow argues that "Whom you trust, what you trust them with, and how much you trust them are at the center of the Internet today."
He gives a checklist of what to look for when evaluating any system for trustworthiness, chock-full of fascinating historical examples.
These include NASA opting for a simpler but more reliable chip; the Terry Childs case; and even an 18th-century "semaphore telegraph" that became the vehicle for a very early steganographic attack.
FTA: "Detecting an anomaly is one thing, but following up on what you've detected is at least as important. In the early days of the Internet, Cliff Stoll, then a graduate student at Lawrence Berkeley Laboratories in California, noticed a 75-cent accounting error on some computer systems he was managing. Many would have ignored it, but it bothered him enough to track it down. That investigation led, step by step, to the discovery of an attacker named Markus Hess, who was arrested, tried, and convicted of espionage and selling information to the Soviet KGB.""

Link to Original Source

+ - QA Testing at EA and Netflix->

Submitted by CowboyRobot
CowboyRobot (671517) writes "To millions of gamers, the position of QA (quality assurance) tester at Electronic Arts must seem like a dream job. But from the company's perspective, the overhead associated with QA can look downright frightening, particularly in an era of massively multi-player games. Hence the appeal of automated QA testing, which has the potential to be faster, more cost-effective, more efficient, and more scalable than manual testing. While automation cannot mimic everything human testers can do, it can be very useful for many types of basic testing. Still, it turns out the transition to automated testing is not nearly as straightforward as it might at first appear. Joining the discussion is Jafar Husain, a lead software developer for Netflix. Previously he worked at Microsoft, where one of his tasks involved creating the test environment for the Silverlight development platform."
Link to Original Source

+ - The NSA and Snowden: Securing the All-Seeing Eye->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Bob, Toxen, one of the developers of Berkeley Unix among other things, has an article at the ACM in which he describes how the NSA could have prevented Edward Snowden from releasing classified information. "The NSA seemingly had become lax in utilizing even the most important, simple, and cheap good computer security practices with predictable consequences, even though it has virtually unlimited resources and access — if it wants it — to the best computer security experts in the country... Consider that one of Snowden's jobs was copying large amounts of classified data from one computer to a thumb drive and then connecting that thumb drive to another computer and downloading the data. He likely secreted the thumb drive on his person after downloading the data he wanted and took it home. This theft could have been prevented rather easily with the use of public-key encryption... The NSA should have had a public/secret key pair created for each sysadmin who needed to transfer data and a separate account on each computer for each sysadmin to transfer this data.""
Link to Original Source

+ - The Curse of the Excluded Middle->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Erik Meijer, known for his contributions to Haskell, C#, Visual Basic, Hack, and LINQ, has an article at the ACM in which he argues that "Mostly functional" programming does not work. "The idea of "mostly functional programming" is unfeasible. It is impossible to make imperative programming languages safer by only partially removing implicit side effects. Leaving one kind of effect is often enough to simulate the very effect you just tried to remove. On the other hand, allowing effects to be "forgotten" in a pure language also causes mayhem in its own way. Unfortunately, there is no golden middle, and we are faced with a classic dichotomy: the curse of the excluded middle, which presents the choice of either (a) trying to tame effects using purity annotations, yet fully embracing the fact that your code is still fundamentally effectful; or (b) fully embracing purity by making all effects explicit in the type system and being pragmatic by introducing nonfunctions such as unsafePerformIO. The examples shown here are meant to convince language designers and developers to jump through the mirror and start looking more seriously at fundamentalist functional programming.""
Link to Original Source

+ - Don't Settle for Eventual Consistency->

Submitted by CowboyRobot
CowboyRobot (671517) writes "At ACM's Queue magazine, Wyatt Lloyd of Facebook writes that "Systems that sacrifice strong consistency gain much in return. They can be always available, guarantee responses with low latency, and provide partition tolerance. We coined the term ALPS for systems that provide these three properties—always available, low latency, and partition tolerance—and one more: scalability. Scalability implies that adding storage servers to each data center produces a proportional increase in storage capacity and throughput. Scalability is critical for modern systems because data has grown far too large to be stored or served by a single machine. The question remains as to what consistency properties ALPS systems can provide. Before answering this, let's consider the consistency offered by existing ALPS systems. For systems such as Amazon's Dynamo, LinkedIn's Project Voldemort, and Facebook/Apache's Cassandra, the answer is eventual consistency.""
Link to Original Source

+ - Samsung's Position On Tizen May Hurt Developer Recruitment->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Samsung isn’t making it easy for developers. The company may have released a handful of SDKs for its latest devices, but Samsung’s non-committal approach to its Tizen platform is probably going to cost it developer support. Samsung’s first smartwatch, released in October last year, ran a modified version of Google’s Android platform. The device had access to about 80 apps at launch, all of which were managed by a central smartphone app. Samsung offered developers an SDK for the Galaxy Gear so they could create more apps. Developers obliged. Then Samsung changed direction."
Link to Original Source

+ - Please Put OpenSSL Out of Its Misery->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Writing for the ACM, Poul-Henning Kamp claims that "OpenSSL must die, for it will never get any better." The reasons being that OpenSSL has become a dumping ground of un-organized contributions. "We need a well-designed API, as simple as possible to make it hard for people to use it incorrectly. And we need multiple independent quality implementations of that API, so that if one turns out to be crap, people can switch to a better one in a matter of hours.""
Link to Original Source

+ - Can't Find a Parking Spot? There's an API for That->

Submitted by CowboyRobot
CowboyRobot (671517) writes "California-based Streetline has partnered with cities, universities, parking garages and transit agencies in the United States and internationally to build a large, smart parking network. The network is powered by ultra-low-power wireless sensors that it deploys in its partners’ locations. The data collected by those sensors is analyzed by the company’s Smart Parking Platform to determine parking availability. The system distinguishes between metered and unmetered spaces, and it can identify special kinds of spaces, such as electric vehicle (EV) and accessible spaces."
Link to Original Source

+ - A Primer on Provenance->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Lucian Carata and colleagues at the University of Cambridge describe the importance of knowing where data comes from and what processes may have altered it. In "A Primer on Provenance", they describe how sometimes the transformations applied to data are not directly controlled by or even known to developers, and information about a result is lost when no provenance is recorded, making it harder to assess quality or reproducibility. Computing is becoming pervasive, and the need for guarantees about it being dependable will only aggravate those problems; treating provenance as a first-class citizen in data processing represents a possible solution."
Link to Original Source

+ - Converter Makes JSON as Understandable as a Spreadsheet->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Eric Mill, a developer at the Sunlight Foundation, has created a JSON-to-CSV converter where users can simply paste JSON code into a box and have the code automatically reformatted and re-colored, then converted into an easily readable table of data. A complete CSV of the table data can also be downloaded. The converter uses JavaScript so it runs without a server and the JSON conversions happen inside the browser."
Link to Original Source

+ - Multipath TCP->

Submitted by CowboyRobot
CowboyRobot (671517) writes "The Internet relies heavily on two protocols. In the network layer, IP (Internet Protocol) provides an unreliable datagram service and ensures that any host can exchange packets with any other host. Since its creation in the 1970s, IP has seen the addition of several features, including multicast, IPsec (IP security), and QoS (quality of service). The latest revision, IPv6 (IP version 6), supports 16-byte addresses. The second major protocol is TCP (Transmission Control Protocol), which operates in the transport layer and provides a reliable bytestream service on top of IP. TCP has evolved continuously since the first experiments in research networks. MPTCP is a major extension to TCP. By decoupling TCP from IP, TCP is at last able to support multihomed hosts. With the growing importance of wireless networks, multihoming is becoming the norm instead of the exception. Smartphones and data centers are the first use cases where MPTCP can provide benefits."
Link to Original Source

+ - Eventually Consistent: Not What You Were Expecting?->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Storage systems continue to lay the foundation for modern Internet services such as Web search, e-commerce, and social networking. Pressures caused by rapidly growing user bases and data sets have driven system designs away from conventional centralized databases and toward more scalable distributed solutions, including simple NoSQL key-value storage systems, as well as more elaborate NewSQL databases that support transactions at scale.

Eventual consistency is increasingly viewed as a spectrum of behaviors that can be quantified along various dimensions, rather than a binary property that a storage system either satisfies or fails to satisfy. Advances in characterizing and verifying these behaviors will enable service providers to offer an increasingly rich set of service levels of differentiated performance, ultimately improving the end user's experience."
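
A toy Monte Carlo sketch in Python (the delay model is an assumption for illustration, not the article's): viewing consistency as a spectrum means you can quantify, say, how likely a read is to observe the latest write as a function of how long after the write it arrives:

    import random

    def p_fresh(t_ms, mean_delay_ms=50.0, trials=100_000):
        """P(read sees newest write) if replication delay ~ Exp(mean_delay_ms)."""
        hits = sum(random.expovariate(1.0 / mean_delay_ms) <= t_ms
                   for _ in range(trials))
        return hits / trials

    for t in (10, 50, 100, 250):
        print(f"read {t:>3} ms after write: P(fresh) ~ {p_fresh(t):.2f}")

Numbers like these are what let a provider offer differentiated service levels rather than a single take-it-or-leave-it "eventually."
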

Link to Original Source

+ - Rate-Limiting State->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Writing for ACM's Queue magazine, Paul Vixie argues, "The edge of the Internet is an unruly place." By design, the Internet core is stupid, and the edge is smart. This design decision has enabled the Internet's wildcat growth, since without complexity the core can grow at the speed of demand. On the downside, the decision to put all smartness at the edge means we're at the mercy of scale when it comes to the quality of the Internet's aggregate traffic load. Not all device and software builders have the skills and budgets that something the size of the Internet deserves. Furthermore, the resiliency of the Internet means that a device or program that gets something importantly wrong about Internet communication stands a pretty good chance of working "well enough" in spite of this. Witness the endless stream of patches and vulnerability announcements from the vendors of literally every smartphone, laptop, or desktop operating system and application. Bad guys have the time, skills, and motivation to study edge devices for weaknesses, and they are finding as many weaknesses as they need to inject malicious code into our precious devices where they can then copy our data, modify our installed software, spy on us, and steal our identities."
Link to Original Source

+ - IBM Invests $100M to Help Developers Build Cognitive Applications->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Hard on the heels of a recent move to make its Watson supercomputer a service that developers can invoke via RESTful APIs, IBM is now making $100 million available to help developers build cognitive computing applications that can run on top of Watson. The funding is part of a larger $1 billion investment that IBM is making to create a formal Watson Group and is designed to encourage independent software vendors to become part of an ecosystem that IBM is trying to build around Watson."
Link to Original Source
