+ - There's No Such Thing as a General-purpose Processor->

Submitted by CowboyRobot
CowboyRobot (671517) writes "David Chisnall of the University of Cambridge argues that despite the current trend of categorizing processors and accelerators as "general purpose", there really is no such thing and believing in such a device is harmful.

"The problem of dark silicon (the portion of a chip that must be left unpowered) means that it is going to be increasingly viable to have lots of different cores on the same die, as long as most of them are not constantly powered. Efficient designs in such a world will require admitting that there is no one-size-fits-all processor design and that there is a large spectrum, with different trade-offs at different points.""

Link to Original Source

+ - JavaScript and the Netflix User Interface->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Alex Liu is a senior UI engineer at Netflix and part of the core team leading the migration of Netflix.com to Node.js. He has an article at ACM's Queue in which he describes how JavaScript is used at Netflix. "With increasingly more application logic being shifted to the browser, developers have begun to push the boundaries of what JavaScript was originally intended for. Entire desktop applications are now being rebuilt entirely in JavaScript—the Google Docs office suite is one example. Such large applications require creative solutions to manage the complexity of loading the required JavaScript files and their dependencies. The problem can be compounded when introducing multivariate A/B testing, a concept that is at the core of the Netflix DNA. Multivariate testing introduces a number of problems that JavaScript cannot handle using native constructs, one of which is the focus of this article: managing conditional dependencies.""
Link to Original Source

+ - Security Collapse in the HTTPS Market->

Submitted by CowboyRobot
CowboyRobot (671517) writes "HTTPS has evolved into the de facto standard for secure Web browsing. Through the certificate-based authentication protocol, Web services and Internet users first authenticate one another ("shake hands") using a TLS/SSL certificate, encrypt Web communications end-to-end, and show a padlock in the browser to signal that a communication is secure. In recent years, HTTPS has become an essential technology to protect social, political, and economic activities online. At the same time, widely reported security incidents (such as DigiNotar's breach, Apple's #gotofail, and OpenSSL's Heartbleed) have exposed systemic security vulnerabilities of HTTPS to a global audience. The Edward Snowden revelations (notably around operation BULLRUN, MUSCULAR, and the lesser-known FLYING PIG program to query certificate metadata on a dragnet scale) have driven the point home that HTTPS is both a major target of government hacking and eavesdropping, as well as an effective measure against dragnet content surveillance when Internet traffic traverses global networks. HTTPS, in short, is an absolutely critical but fundamentally flawed cybersecurity technology."
Link to Original Source

+ - Why Is It Taking So Long to Secure Internet Routing?->

Submitted by CowboyRobot
CowboyRobot (671517) writes "We live in an imperfect world where routing-security incidents can still slip past deployed security defenses, and no single routing-security solution can prevent every attacks. Research suggests, however, that the combination of RPKI (Resource Public Key Infrastructure) with prefix filtering could significantly improve routing security; both solutions are based on whitelisting techniques and can reduce the number of autonomous systems that are impacted by prefix hijacks, route leaks, and path-shortening attacks."
Link to Original Source

+ - Certificate Transparency->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Ben Laurie (of Google, among many other organizations) explains how we cannot have a secure Web as long as we rely so much on the current method of depending on third-party certificate authorities. "Ultimately, we want to ensure that Web users are actually talking to whom they think they're talking to, and that no one else can intercept the conversation. That's really an impossible goal—how can a computer know what the user is thinking—but for now let's reduce the problem to a slightly different one: how to ensure the Web user is talking to the owner of the domain corresponding to the URL being used." His solution is to use "public, verifiable, append-only logs." Ideal certificate transparency allows everyone to participate, introduces no latency, relies on no third party, and is automatic."
Link to Original Source

+ - ACM and the Professional Programmer->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Vint Cerf was a co-designer of the TCP/IP protocols and can rightly claim to be one of the "Fathers of the Internet." He's a vice president at Google and was president of the ACM from 2012-2014. On behalf of that organization, he has an essay that gently questions the relevance of the ACM now that so many (most?) software engineers and other computer science professionals are not members, and may not have formal CS education at all. As he puts it: "The question that occupies my mind, especially as the membership in ACM has not grown in a way commensurate with the evident growth of programmers, is whether and how ACM can adapt its activities and offerings to increase the participation of these professionals.""
Link to Original Source

+ - Who Must You Trust?->

Submitted by CowboyRobot
CowboyRobot (671517) writes "In ACM's Queue, Thomas Wadlow argues that "Whom you trust, what you trust them with, and how much you trust them are at the center of the Internet today."
He gives a checklist of what to look for when evaluating any system for trustworthiness, chock full of fascinating historical examples.
These include NASA opting for a simpler, but more reliable chip; the Terry Childs case; and even an 18th century "semaphore telegraph" that was a very early example of steganographic cryptography.
FTA: "Detecting an anomaly is one thing, but following up on what you've detected is at least as important. In the early days of the Internet, Cliff Stoll, then a graduate student at Lawrence Berkeley Laboratories in California, noticed a 75-cent accounting error on some computer systems he was managing. Many would have ignored it, but it bothered him enough to track it down. That investigation led, step by step, to the discovery of an attacker named Markus Hess, who was arrested, tried, and convicted of espionage and selling information to the Soviet KGB.""

Link to Original Source

+ - QA Testing at EA and Netflix->

Submitted by CowboyRobot
CowboyRobot (671517) writes "To millions of gamers, the position of QA (quality assurance) tester at Electronic Arts must seem like a dream job. But from the company's perspective, the overhead associated with QA can look downright frightening, particularly in an era of massively multi-player games. Hence the appeal of automated QA testing, which has the potential to be faster, more cost-effective, more efficient, and more scalable than manual testing. While automation cannot mimic everything human testers can do, it can be very useful for many types of basic testing. Still, it turns out the transition to automated testing is not nearly as straightforward as it might at first appear. Joining the discussion is Jafar Husain, a lead software developer for Netflix. Previously he worked at Microsoft, where one of his tasks involved creating the test environment for the Silverlight development platform."
Link to Original Source

+ - The NSA and Snowden: Securing the All-Seeing Eye->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Bob, Toxen, one of the developers of Berkeley Unix among other things, has an article at the ACM in which he describes how the NSA could have prevented Edward Snowden from releasing classified information. "The NSA seemingly had become lax in utilizing even the most important, simple, and cheap good computer security practices with predictable consequences, even though it has virtually unlimited resources and access — if it wants it — to the best computer security experts in the country... Consider that one of Snowden's jobs was copying large amounts of classified data from one computer to a thumb drive and then connecting that thumb drive to another computer and downloading the data. He likely secreted the thumb drive on his person after downloading the data he wanted and took it home. This theft could have been prevented rather easily with the use of public-key encryption... The NSA should have had a public/secret key pair created for each sysadmin who needed to transfer data and a separate account on each computer for each sysadmin to transfer this data.""
Link to Original Source

+ - The Curse of the Excluded Middle->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Erik Meijer, known for his contributions to Haskell, C#, Visual Basic, Hack, and LINQ, has an article at the ACM in which he argues that "Mostly functional" programming does not work. "The idea of "mostly functional programming" is unfeasible. It is impossible to make imperative programming languages safer by only partially removing implicit side effects. Leaving one kind of effect is often enough to simulate the very effect you just tried to remove. On the other hand, allowing effects to be "forgotten" in a pure language also causes mayhem in its own way. Unfortunately, there is no golden middle, and we are faced with a classic dichotomy: the curse of the excluded middle, which presents the choice of either (a) trying to tame effects using purity annotations, yet fully embracing the fact that your code is still fundamentally effectful; or (b) fully embracing purity by making all effects explicit in the type system and being pragmatic by introducing nonfunctions such as unsafePerformIO. The examples shown here are meant to convince language designers and developers to jump through the mirror and start looking more seriously at fundamentalist functional programming.""
Link to Original Source

+ - Don't Settle for Eventual Consistency->

Submitted by CowboyRobot
CowboyRobot (671517) writes "At ACM's Queue magazine, Wyatt Lloyd of Facebook writes that "Systems that sacrifice strong consistency gain much in return. They can be always available, guarantee responses with low latency, and provide partition tolerance. We coined the term ALPS for systems that provide these three properties—always available, low latency, and partition tolerance—and one more: scalability. Scalability implies that adding storage servers to each data center produces a proportional increase in storage capacity and throughput. Scalability is critical for modern systems because data has grown far too large to be stored or served by a single machine. The question remains as to what consistency properties ALPS systems can provide. Before answering this, let's consider the consistency offered by existing ALPS systems. For systems such as Amazon's Dynamo, LinkedIn's Project Voldemort, and Facebook/Apache's Cassandra, the answer is eventual consistency.""
Link to Original Source

+ - Samsung's Position On Tizen May Hurt Developer Recruitment->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Samsung isn’t making it easy for developers. The company may have released a handful of SDKs for its latest devices, but Samsung’s non-committal approach to its Tizen platform is probably going to cost it developer support. Samsung’s first smartwatch, released in October last year, ran a modified version of Google’s Android platform. The device had access to about 80 apps at launch, all of which were managed by a central smartphone app. Samsung offered developers an SDK for the Galaxy Gear so they could create more apps. Developers obliged. Then Samsung changed direction."
Link to Original Source

+ - Please Put OpenSSL Out of Its Misery->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Writing for the ACM, Poul-Henning Kamp claims that "OpenSSL must die, for it will never get any better." The reasons being that OpenSSL has become a dumping ground of un-organized contributions. "We need a well-designed API, as simple as possible to make it hard for people to use it incorrectly. And we need multiple independent quality implementations of that API, so that if one turns out to be crap, people can switch to a better one in a matter of hours.""
Link to Original Source

+ - Can't Find a Parking Spot? There's an API for That->

Submitted by CowboyRobot
CowboyRobot (671517) writes "California-based Streetline has partnered with cities, universities, parking garages and transit agencies in the United States and internationally to build a large, smart parking network. The network is powered by ultra-low-power wireless sensors that it deploys in its partners’ locations. The data collected by those sensors is analyzed by the company’s Smart Parking Platform to determine parking availability. The system distinguishes between metered and unmetered spaces, and it can identify special kinds of spaces, such as electric vehicle (EV) and accessible spaces."
Link to Original Source

+ - A Primer on Provenance->

Submitted by CowboyRobot
CowboyRobot (671517) writes "Lucian Carata and colleagues at the University of Cambridge describe the importance of knowing where data comes from and what processes may have altered it. In "A Primer on Provenance", they describe how sometimes the transformations applied to data are not directly controlled by or even known to developers, and information about a result is lost when no provenance is recorded, making it harder to assess quality or reproducibility. Computing is becoming pervasive, and the need for guarantees about it being dependable will only aggravate those problems; treating provenance as a first-class citizen in data processing represents a possible solution."
Link to Original Source
