Comment: Re:fees (Score 1) 379

I live in Boston. In most of the city, there is only one consumer provider that meets the current definition of broadband (25Mbps or faster): Comcast. (A few parts of the city have a second one, RCN. And you can get business-oriented broadband services from other companies but they are crazy expensive.) Slower services are available: DSL from Verizon and CLECs, service from Sprint and FreedomPop on what is left of the Clearwire network, LTE from cell phone providers, and satellite internet if you have a clear view of the sky and can put up the dish.

Comment: Re:If you hate Change so much...... (Score 1) 498

by Shirley Marquez (#49155277) Attached to: Users Decry New Icon Look In Windows 10

I got curious when I saw the quote, and I figured that if I did, some other reader probably would as well.

Dijkstra is most famous for his letter to the ACM, published as "Go To Statement Considered Harmful". A recent study showed that the GOTO statement as used by current programmers is not harmful - but that is largely Dijkstra's doing (along with all the other people who pushed for modular programming and better control flow). Nowadays about the only use of GOTO is as a way of breaking out of loop structures when the language doesn't have another way to do it, but I go back far enough to remember the horrible spaghetti code that people used to write. Heck, I wrote some of it myself.
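
To make that surviving use concrete, here is a minimal C sketch (the function and data are made up for illustration) of using goto to escape a nested loop in a single jump, the case a language without labeled break leaves you no cleaner way to handle:

    #include <stdio.h>

    /* Search a 4x4 grid; goto lets both loops exit at once. */
    int find(int grid[4][4], int target, int *row, int *col) {
        for (int r = 0; r < 4; r++) {
            for (int c = 0; c < 4; c++) {
                if (grid[r][c] == target) {
                    *row = r;
                    *col = c;
                    goto found;        /* one jump out of both loops */
                }
            }
        }
        return 0;                      /* not found */
    found:
        return 1;
    }

    int main(void) {
        int g[4][4] = { {1,2,3,4}, {5,6,7,8}, {9,10,11,12}, {13,14,15,16} };
        int r, c;
        if (find(g, 11, &r, &c))
            printf("found at (%d,%d)\n", r, c);   /* prints (2,2) */
        return 0;
    }

The alternative is a "done" flag tested in both loop conditions, which is more verbose and arguably harder to read.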

Comment: Re:If you hate Change so much...... (Score 1) 498

by Shirley Marquez (#49148353) Attached to: Users Decry New Icon Look In Windows 10

On your quote... the story is more complicated than "it originated in California". Excerpts from the Wikipedia article on OOP ( https://en.wikipedia.org/wiki/... ):

"Terminology invoking "objects" and "oriented" in the modern sense of object-oriented programming made its first appearance at MIT in the late 1950s and early 1960s. In the environment of the artificial intelligence group, as early as 1960, "object" could refer to identified items (LISP atoms) with properties (attributes)..."

"The formal programming concept of objects was introduced in the 1960s in Simula 67, a major revision of Simula I, a programming language designed for discrete event simulation, created by Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Center in Oslo."

"The Smalltalk language, which was developed at Xerox PARC (by Alan Kay and others) in the 1970s, introduced the term object-oriented programming to represent the pervasive use of objects and messages as the basis for computation."

So yes, the term comes from California. But the early work was done elsewhere.

Comment: Re:Operating at 20W gives zero improvement. (Score 1) 113

by Shirley Marquez (#49139041) Attached to: AMD Unveils Carrizo APU With Excavator Core Architecture

Not really surprising.

Getting the most out of any processor requires processor-specific optimization. Unfortunately for AMD, Intel has the lion's share of the market, so developers pay more attention to getting software to run well on Intel processors. Some of the top tier games that get used for benchmarks have been hand-optimized for Intel, as have productivity applications such as video encoders and Photoshop. (The last two have also benefited historically from Intel having better SIMD implementations. That is probably still true. But an A-series AMD processor with properly optimized OpenCL code might be better still.)

Intel is in the developer tools business as well. They sell a compiler that generates code that is very good for Intel processors and very bad for AMD. Any application that is built with Intel tools is going to make AMD look bad.

Finally, there is the OS issue. Because of the way AMD used paired cores with some shared elements (cache and FPU), getting the most out of the FX series processors requires changes to the process scheduler. (The simplified version: threads of the same process and multiple instances of the same application should be assigned to paired cores; unconnected applications should be spread to different core pairs whenever possible. That maximizes the effectiveness of the shared cache. The shared FPU is of little concern unless you have applications that do math with long doubles; it can do two 64 bit operations simultaneously but only one 128 bit operation.) The most popular OS on the market, Windows 7, has not made the necessary adjustments, nor has any earlier version. Windows 8 and later have, as have recent Linux kernels. Mac OS probably has not, but Apple has never made a computer with an AMD processor so it isn't relevant unless you own a Hackintosh.
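
When the OS doesn't do this for you, an application can approximate the policy itself with CPU affinity. A minimal Linux sketch follows; the core numbering is an assumption (on FX-style parts, cores 0-1, 2-3, and so on typically form the shared-cache pairs), and the worker is just a placeholder:

    #define _GNU_SOURCE
    #include <pthread.h>
    #include <sched.h>

    /* Pin one thread to one logical core. */
    static void pin_to_core(pthread_t t, int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(t, sizeof(set), &set);
    }

    static void *worker(void *arg) { (void)arg; return 0; /* cache-sharing work */ }

    int main(void) {
        pthread_t a, b;
        pthread_create(&a, 0, worker, 0);
        pthread_create(&b, 0, worker, 0);
        pin_to_core(a, 0);   /* first core of a module... */
        pin_to_core(b, 1);   /* ...and its partner, so the two threads share cache */
        pthread_join(a, 0);
        pthread_join(b, 0);
        return 0;
    }

Unrelated processes would get the opposite treatment: placed on different core pairs so they aren't competing for the same shared cache.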

Comment: Re:Question In Headline (Score 1) 151

by Shirley Marquez (#49121033) Attached to: Is Sega the Next Atari?

1830 was also republished recently by Mayfair Games. That's one more.

Hasbro is actually pretty good about licensing games when there is continuing interest but not potential for mass-market numbers. But they don't own the rights to all the old Avalon Hill games; some were sold off back when AH still existed, or were under contracts where the rights reverted to the designers.

Comment: Re:How does this compare to radio? (Score 1) 303

by Shirley Marquez (#49120661) Attached to: Pandora Pays Artists $0.001 Per Stream, Thinks This Is "Very Fair"

In the US, broadcast radio stations pay no performance royalties at all. That's right, zero. They do pay songwriter royalties. They are also likely to receive promotional funds from record companies that at least offset any royalties they pay.

Spotify is an interesting case because it has both free and premium tiers, and the rate of pay for the two sets of listeners is very different. A listen by a premium subscriber is currently worth about 10 times as much as one by a free listener. Basically, the way it works is that 70% of their subscription revenue gets divided among all the listens by premium members, and 70% of their advertising revenue gets divided among all the free plays. (I suspect there are a few additional complications but that's close enough for our discussion.) The gap between the two rates may narrow in the future if the company sells more ads and/or manages to charge more for them.
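
As a toy illustration of that split, here is a C sketch of the two per-stream pools. All the numbers are made up placeholders, not Spotify's actual figures:

    #include <stdio.h>

    int main(void) {
        /* Hypothetical one-month figures, chosen only to show the mechanism. */
        double subscription_revenue = 1000000.0;
        double ad_revenue           =  100000.0;
        double premium_streams      = 70000000.0;
        double free_streams         = 90000000.0;

        /* 70% of each revenue pool, divided over that tier's plays. */
        double premium_rate = 0.70 * subscription_revenue / premium_streams;
        double free_rate    = 0.70 * ad_revenue / free_streams;

        printf("per premium stream: $%.5f\n", premium_rate);  /* ~$0.01000 */
        printf("per free stream:    $%.5f\n", free_rate);     /* ~$0.00078 */
        return 0;
    }

With these placeholder numbers the premium rate comes out roughly an order of magnitude higher, which matches the ratio mentioned above.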

Some people think that both of Spotify's payment rates are too low. Some others think the rate for free plays is too low and want to restrict their content to premium members, but Spotify won't let them do that; it's all or nothing. The all-or-nothing approach may be better in the long term, because it will increase the value of the free tier and make it more attractive to advertisers. Spotify also believes that it is good for business, because it's easier to get people into the fold first and then upsell them on getting rid of ads than it is to make them pay from the start. (Reference: http://www.buzzfeed.com/reggie... )

There is also the question of how the expected upcoming product from Apple will affect the on-demand streaming market. Apple already owns Beats Music but hasn't promoted it heavily since the acquisition, probably because they plan to replace it with a new Apple-branded service. Most analysts believe that Apple won't offer a free tier; Beats does not, though they do offer a free trial. If a significant amount of music bypasses Spotify because artists don't like the low payment rate for free Spotify plays (this has already happened with a few, such as Taylor Swift), Spotify may have to change its position and allow premium-only content.

Comment: Re:Good grief... (Score 1) 672

by Shirley Marquez (#49120477) Attached to: Bill Nye Disses "Regular" Software Writers' Science Knowledge

I took a computer architecture class where that was the end point. We started by defining a simple architecture. (Each pair of students did their own; the available resources in the FPGA we were using pretty much limited us to 8 bit architectures.) Next was to write an assembler and an emulator for our processors. (We used Java in the class, largely because its cross-platform nature meant that students could code on whatever computer they owned and the TAs would be able to run the programs. Any reasonably modern high level language would have served as well; these were not the kind of programs that used fancy language features.) The final stage was to write a VHDL description of the CPU, load it into a board, and run code on it.
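
For a sense of what the emulator stage looks like, here is a minimal sketch of a fetch-decode-execute loop for a toy 8-bit accumulator machine. It's in C rather than the Java we used, and the instruction set is invented for the example; the real assignments used each pair's own architecture.

    #include <stdint.h>
    #include <stdio.h>

    enum { OP_HALT = 0, OP_LOAD = 1, OP_ADD = 2, OP_STORE = 3 };

    int main(void) {
        /* A tiny program: mem[12] = mem[10] + mem[11]. */
        uint8_t mem[256] = {
            OP_LOAD, 10, OP_ADD, 11, OP_STORE, 12, OP_HALT,
            [10] = 2, [11] = 3
        };
        uint8_t acc = 0, pc = 0;

        for (;;) {                            /* fetch-decode-execute */
            uint8_t op = mem[pc++];
            if (op == OP_HALT) break;
            uint8_t addr = mem[pc++];         /* every other opcode takes one operand */
            switch (op) {
            case OP_LOAD:  acc = mem[addr];                   break;
            case OP_ADD:   acc = (uint8_t)(acc + mem[addr]);  break;
            case OP_STORE: mem[addr] = acc;                   break;
            }
        }
        printf("mem[12] = %u\n", mem[12]);    /* prints 5 */
        return 0;
    }

The real thing grows from there - more opcodes, registers, and flags, plus an assembler that emits the byte array - but that loop is the core of it.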

That was the most intense class I took during my education. (It's a graduate level course but I took it as part of an undergraduate degree program.) The one class was nearly a full time job.

Comment: Re:Good grief... (Score 1) 672

by Shirley Marquez (#49120165) Attached to: Bill Nye Disses "Regular" Software Writers' Science Knowledge

In my opinion, a proper CS program should include some education in computer architecture. EE isn't really necessary, but you can do VHDL or Verilog programming to implement computers and that's probably low level enough for the kind of understanding that a computer scientist needs.

But a lot of the people who go to college to learn about programming aren't in CS programs; they are in software engineering programs. Sometimes those programs are called CS, but they contain nothing like theory of computing and are strictly about planning and writing code. Given their narrower intent, it is reasonable for such a program (IF it is properly named as an SE program rather than CS) to omit education on computer architecture. I do think it's a good subject for anybody who really wants to understand computers; any student of any aspect of computing whose university offers a course on computer architecture should strongly consider taking it.

The other question is how much of the other sciences a CS or SE graduate is required to study. At the top tier schools, those degrees are generally part of a science or engineering program that requires a broader base in the sciences. (For example, any degree from MIT requires two semesters of physics, one each of chemistry and biology, and two more semesters of science from a list of eligible courses, one of which cannot be in the student's major. They also require two semesters of calculus.) Many lower-tier schools will let you get a CS or SE degree without taking any coursework in the sciences other than computing classes. That is probably why Nye is dismissive of students who don't come from a major university.

Comment: Re: heres another lie. (Score 1) 237

by Shirley Marquez (#49109439) Attached to: Ten Lies T-Mobile Told Me About My Data Plan

You have named one of the big risks of iOS development. In general, Apple is reluctant to approve apps that compete with their own apps; they allow competing apps from major tech companies because it would be too unpopular not to, but a smaller developer is at their mercy. One of my nightmare scenarios would be to come up with a new idea, spend a year developing an iOS app, and then have Apple reject it because they were secretly working on the same thing.

Android does not have that particular risk. Competing with an app from Google might be difficult but at least the company will let you try; they have no prohibition on apps that compete with their own. Another key difference is that sideloading is possible; even if the Google Play store won't carry your app, you can offer it through other channels. Gambling apps and apps with sexually explicit content can be sold for Android though not through Google's store. Reputable third party stores will still ban the other kinds of content that Google prohibits, such as spyware, Trojan horses, and other kinds of malware.

Comment: Re:Fool me once, shame on you... (Score 1) 252

by Shirley Marquez (#49109391) Attached to: No Tech Bubble Here, Says CNN: "This Time It's Different."

Facebook has network effects in its favor. Basically, Facebook is popular because Facebook is popular; people want to be on the social network that their friends are on. The company also has Instagram and the Oculus Rift.

Uber has a good idea but it's not one that can be protected. They have no way to keep customers and drivers from defecting to Lyft or other services. There is no particular advantage other than habit to using Uber today just because you used Uber yesterday.
