
Comment Re:System76 (Score 1) 237

I have a System76 17" Bonobo, 2012 model: 2.4GHz i7, 16GB RAM, NVidia GPU. It is an excellent machine, but has one extremely annoying flaw: the internal wifi chip, an RTL8188ce, does not work properly. It appears to be a known driver issue that has been open for a long time (http://askubuntu.com/questions/205575/12-10-x64-rtl8188ce-intermittent-slow-internet-connection). None of the suggested software workarounds worked for me, so I have to use a small-form-factor USB wifi adapter.

I also have a Dell Latitude E6400 (Core 2 Duo, 4GB) which is a decent Linux platform. However, it has problems waking up from suspend about 20% of the time and requires a power cycle to recover :-(

I also have a Dell Latitude E4300, which is essentially the 13" version of the E6400. Hardware is practically identical. It also has wake-from-suspend problems occasionally.

All those machines are running Ubuntu 14. Whatever silly animal name goes with that. The Dells can be had used for a couple hundred bucks and aren't noticeably less capable for miscellaneous computing and light software development than today's business-class machines. System76 machines are a little pricey, but can replace a desktop pretty effectively.

Comment Re:Ah hah (Score 1) 558

Hmm, interesting. Pin count is definitely a factor that Intel would want to manage for cost's sake. So I retract my previous comment, which naively assumed that "if our architecture can address it, we'll physically enable that to happen". OTOH I see that some people on the Intel forums have successfully run that chip with 64GB of RAM, though 32GB is the maximum "supported" amount.

Comment Re:Ah hah (Score 1) 558

I don't think there are CPUs that are designed to address at most 32GB of RAM. Those limitations are set by the motherboard chipsets, not the CPU. Even 32-bit Intel chips have been able to address 64GB since the Physical Address Extension (PAE) was added lo these many years ago (like, 20th century IIRC).
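
For the arithmetic, PAE widens physical addresses from 32 to 36 bits, which is exactly where the 64GB figure comes from. A throwaway back-of-the-envelope snippet (Python, purely illustrative):

    # Physical address space with and without PAE
    print(2**32 // 2**30, "GiB without PAE")  # 4 GiB (plain 32-bit addressing)
    print(2**36 // 2**30, "GiB with PAE")     # 64 GiB (36-bit physical addresses)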

Comment Re:Transparency (Score 1) 103

Electronic voting vastly reduces the complexity on the collection side, but then the tamperability problem looms supreme. That problem could almost be solved with enough crypto cleverness, except that the public-trust story then requires a bit of numeracy beyond grade-six math.

Perhaps an ecosystem of vote verification tools built upon a well-understood verification algorithm with an open-source reference implementation would alleviate the crypto-innumeracy problem. The voter wouldn't have to understand the math; only how to enter a magic number, press a button, and check for the expected result. Multiple implementations help ensure that no one can game the verification process (or just implement the algorithm incorrectly) without being caught out in short order. The verification would have to be implemented in such a way that the voter could tell that their vote was recorded correctly, but could not demonstrate to another that they had voted a particular way. Obviously, the average voter can't tell whether the verification algorithm is implemented correctly, but it's not the voters who would be checking that. Any interested party could do so, and the voting population at large would be using tools that were subject to scrutiny, either by direct examination of the code or by comparison of output with the reference implementation.

Also, the raw vote data of all elections (which of course would contain no data allowing voters to be personally identified) should be public, so that watchdog organizations can check that the outcome is in accordance with the votes cast. Such a data store should allow vote counts to be verified, and allow any randomly chosen voter to check that their vote is recorded correctly. I'm reasonably certain this would be straightforward to accomplish with a cryptographic voting system, but it would be basically impossible with physical ballots.
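
To make the "magic number" idea concrete, here's a toy sketch in Python (purely illustrative, with made-up names; a real system needs far more cryptographic care, and this naive version only demonstrates the "check that your vote was recorded" half -- it does nothing for the "can't prove your vote to someone else" property, which is the genuinely hard part):

    import hashlib, secrets

    def issue_receipt(vote):
        """Machine side (toy): commit to the vote with a random salt.
        The commitment goes on the public bulletin board; the salt is
        handed to the voter as their 'magic number'."""
        salt = secrets.token_hex(16)
        commitment = hashlib.sha256(f"{salt}:{vote}".encode()).hexdigest()
        return salt, commitment

    def verify_receipt(salt, vote, bulletin_board):
        """Voter-side tool (toy): recompute the commitment and check
        that it appears in the published record."""
        commitment = hashlib.sha256(f"{salt}:{vote}".encode()).hexdigest()
        return commitment in bulletin_board

    board = set()
    magic_number, published = issue_receipt("candidate_b")
    board.add(published)
    print(verify_receipt(magic_number, "candidate_b", board))  # True
    print(verify_receipt(magic_number, "candidate_a", board))  # False

The point is only that the voter-side check reduces to "type in a number, press a button, look for a match" -- and multiple independent implementations of that check are what keep it honest.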

Comment SICP, AoA (Score 1) 637

Read "Structure and Interpretation of Computer Programs" by Abelson & Sussman (http://mitpress.mit.edu/sicp/) and "The Art of Assembly Language Programming" by Hyde (http://www.plantation-productions.com/Webster/www.artofasm.com/index.html). Do at least some of the exercises. Bask in the knowledge that the Java that makes up your peers' sole exposure to the art and science of programming is a mere corner case in the coherent universe of Turing-complete symbol systems. (Recommendations from my idiosyncratic experience, obvs.)

Comment Re:Agile doesn't mean that the project won't fail (Score 1) 349

"Then this document was fed into a series of code generation engines"

Who specified, wrote, debugged these magical code generation engines? Who verified that the (Nx)100KLOC that emerged from them was correct?

Nice that the project is still in operation. Are you sure it's doing what the spec says?

Comment Re:CS (Score 1) 704

I took a course on human/computer interaction last year, and in that course it became clear to me that there is a significant empirical scientific aspect to software development, though this may not be widely recognized as a distinctly separate thing from the main "engineering" aspect. So I'd say there are three streams within "CS":
  1. The theory part, which really is a branch of mathematics. Automata, complexity, compressibility, etc.
  2. The software-development part, which is an art that aspires to become a branch of engineering. (Yes, you can study and quantify and measurably improve the outcomes of a development process, so there is an empirical element here, also. But most software development as practiced is, I would guess, more art than science. Organizations that I've been involved in that actively pursue things like SEI levelling seem to lose interest after a while.)
  3. The human/computer interaction part, which is an empirical science in its infancy that has overlap with cognitive science and psychology.

Comment Mary Doria Russell, MT Anderson (Score 1) 1365

"The Sparrow", by Maria Doria Russell, is the most heartbreaking first contact story I've read. Summary: we are just about guaranteed to do it wrong, with tragic consequences, because aliens will be, y'know... ALIEN.

"Feed" by M T Anderson is the third side of the 1984/Brave New World triangle, and the one I see as most likely to describe our future:

1984: government controls all information.
BNW: nobody has to control information because no one really cares about it.
Feed: there's so much information (85% of it BS) that it's impossible to sort the wheat from the chaff.

Comment Re:Drivers, traffic lights, and sensors (Score 1) 423

I have spent many years in the traffic-management industry, and I can tell you that municipal governments spend enormous effort getting traffic light synchronization right. It is by no means a trivial task, and it's simply impossible to please all the drivers all the time -- especially those who insist on violating the parameters that make synchronization practical in the first place, primarily speed limits. You know, in the US, those white signs with the big black numbers on them? Those things TELL YOU THE SPEED for which the signal corridor is synchronized. You may have to wait through a red, but once you get to green, if you follow the posted speed limit (which hardly anyone actually does IME) you shouldn't hit another for quite a while. I find that in my town I can consistently avoid stopping along a particular 7-mile corridor by driving within about 2mph of the posted limit.
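
For the curious, the core of corridor synchronization is mostly arithmetic (an illustrative sketch, not how any particular controller vendor actually does it): each signal's green is offset from the first intersection by the travel time at the design speed, wrapped to the common cycle length.

    def green_wave_offsets(distances_ft, design_speed_mph, cycle_s):
        """Toy 'green wave': offset each signal's green start by the
        travel time from the first intersection at the design speed,
        modulo the shared cycle length."""
        speed_fps = design_speed_mph * 5280 / 3600  # mph -> feet/second
        return [round((d / speed_fps) % cycle_s, 1) for d in distances_ft]

    # Four signals along a corridor, 35mph design speed, 90-second cycle
    print(green_wave_offsets([0, 1800, 3500, 5200], 35, 90))
    # A driver holding 35mph arrives at each signal just as it turns green;
    # a driver doing 50mph runs ahead of the wave and catches reds.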

It's also important to keep in mind that there are often constraints that prevent perfect synchronization. The intersection of two major corridors will often foul up sync on both streets, since the motion of vehicle platoons is rarely amenable to perfect interlacing at the intersection point. And then there's the whole "nobody follows the bloody speed limits" thing.

Did I mention that no one pays any attention to speed limits?

Comment SICP (Score 1) 396

Read "Structure and Interpretation of Computer Programs" by Abelson and Sussman. Do as many of the exercises as you can make time for. You will end up a better programmer than most CS graduates.

That book does two things that I found revelatory:

1) Exposes you to functional programming and important features thereof (such as referential transparency) from the very beginning.

2) Starts with a very high-level, functional model of computation (the Scheme language, which is essentially lambda calculus), and proceeds toward increasingly low-level models of computation, culminating in a Scheme evaluator running on a register machine. Many programming books tell that story in reverse, starting with memory and registers and instruction sets, and then explaining how high-level language constructs map onto those low-level concepts. By starting at the functional level and elaborating increasingly detailed models of physical computation, Abelson and Sussman drive home the point that computation is an abstraction that can be implemented, and thought about, in lots of different ways.
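
To give a flavor of the "computation is an abstraction you can implement yourself" point, here is the sort of thing the book has you build, shrunk to a toy and written in Python rather than Scheme (nowhere near the real metacircular evaluator, just the shape of the idea: evaluation is a plain function over expression-shaped data):

    import operator

    # Expressions are nested lists, e.g. ["+", 1, ["*", 2, 3]]
    OPS = {"+": operator.add, "-": operator.sub,
           "*": operator.mul, "/": operator.truediv}

    def evaluate(expr):
        if isinstance(expr, (int, float)):   # numbers evaluate to themselves
            return expr
        op, *args = expr                     # otherwise apply the operator
        return OPS[op](*[evaluate(a) for a in args])  # to evaluated arguments

    print(evaluate(["+", 1, ["*", 2, 3]]))   # 7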

It really is a mind-blowing experience. I read it after 15 years as a professional programmer in languages from assembly through C to Lisp and Prolog, and I learned A LOT from it.

Comment Yet more anecdotal evidence... (Score 1) 395

...as if there's any other kind.

I use Sprint, and I've never had any problems with service or coverage, except that where I live the only available data service is EVDO. But I can live with that, since I use my DSL most of the time anyway. I do a significant amount of travel to various US cities, and my phone always works.

A few weeks ago I went into a local Sprint store to get a phone for my mom. I explained that I already had six lines on my "Family Everything" plan, which technically isn't allowed (s'posed to be 5 max), but I was grandfathered in from an older plan and had been a Sprint customer for like nine years. I explained that even though I was already over the line limit, I wanted to add mom's new phone as a NEW LINE to my EXISTING 3000-MINUTE PLAN. "No problem", said the sales droid. "Cool!" thinks I.

One month later I get my Sprint bill, and it's $250 higher than the previous month. WTF????? Turns out, they added a totally new account for the new phone and gave it a shitty 200-minute pool. And mom -- whose idea of high-tech is a toaster oven with a timer, and who was practically peeing her pants in fear when I presented her with the phone -- had apparently overcome her reservations and burned up the freakin' airwaves to the tune of 800 minutes.

So I called Sprint "Customer Care" and explained the situation. AMAZINGLY, the service person promptly moved mom's phone into my Everything plan, and told me she'd submit a refund request for $220. "Great!" I said. The other $30 was what I'd been expecting to pay for the new line anyway.

Next day, I get a call from a different layer of the Sprint customer service hierarchy. "Sorry," she says, "We can't process your refund because your plan is invalid -- you have seven lines on a 5-line plan, and the computer just won't let us issue refunds for invalid plans." I was welcome to keep my seven lines, she said, but, oh dear poor us, our computers just won't (sniffle) allow us (sob) to audit your account for a refund.

"That is the stupidest goddamn thing I've ever heard," says I. "The provisioning system will allow low-level service-trons to configure plans that the accounting system can't audit? C'mon, pull the other one." But she was immovable on this point. And I immediately made plans to switch providers as soon as I'd got my soul out of hock from the three recent phone upgrades I'd done.

One month later, I get my Sprint bill, and lo and behold, there's a mysterious $219 credit.

So now I am happy... but I'm not sure why.

Comment Re:3 Languages are a good start (Score 1) 537

I'd add a declarative language to that list, such as Prolog, Mercury, Haskell, or (the pure subset of) Scheme. A year of working with Prolog taught me more about programming (and computer science in general) than any other experience of my 20-year career.

Anyway, after a while they all start to look like Lisp.
