
Comment Re:ORACLE = One Raging Asshole Called Larry Elliso (Score 2) 405

Yes, and CPU implementation is a further subdivision of computer elements and architecture.

Then the details of register use are an even further subdivision.

I've done work in the field since the '70s. The last time I had to worry about CPU architecture was as a junior in college in 1971, when I was porting Spacewar from a PDP-1 to a PDP-8. These days it's mostly about algorithms in high-level languages.

Aren't these details important to understand for what might be non-obvious reasons?

To give an example, take a look at gaming consoles. Performance and graphics get better over time because programmers write "to the hardware" with increasing precision as the product ages, and this provides a benefit that's almost completely unseen in modern general purpose computing.

The technique of unrolling loops ("expanding the loop") to increase performance is a common tactic taken by programmers, no? I know I've done it with scripts before. Are modern compilers so good at optimizing binary code that extending this from a high-level language down to the assembly it compiles to is simply not a reasonable thing to expect a programmer to do?

I know I'm showing my ignorance on the subject, but of all the things I've researched for my own enrichment, C/C++-style languages constantly fail to make sense to me... and I'd like to know more about the interplay between them and what actually goes on at the level of the CPU's own instructions. It seems like even getting fluent enough to understand what's going on in a dozen lines of ASM is an impossible dream. :P

Comment Re:simple (Score 1) 262

Meanwhile, for those of us with root, iFile is freaking fantastic....

I'd really like to see Apple implement an approval process for privileged code. Establish the world's tightest NDA for a private source code auditing program, or require that all privileged code be open source.

Allow people to do what they want with their phones if developers can make the peer-reviewed cut, preserving the quality of the user experience Apple prides itself on but can only achieve by locking you out of the device you own. What most jailbreakers really want is not root... we just want our experience to be open to the same innovation that computers have facilitated since their inception, and we don't want to make the compromises ourselves to get there. As Apple has made abundantly clear: the compromise is their job.

Comment Re:ORACLE = One Raging Asshole Called Larry Elliso (Score 2) 405

Sorry if you don't like pointers. But the coddling of CS students with Java courses these days is just laughable.

I don't quite understand it either. Isn't the point of Computer Science to teach people the principles behind the operations of the computers we use every day?

If you're not learning the premise of CPU registers, memory allocation, jumping, returning, and all that happy crap that I honestly wish I understood better, what in the hell is the actual science behind modern computing then? Doesn't it make sense to start with the raw assembler involved in the top three (or so) CPU architectures and then move up from there, learning the foundation that they all compile down to in the end anyway, so that you can see the impact that your compiled code will have as you're writing it?

Primarily I'm just agreeing with your notion that Java is likely an inappropriate medium for conveying the concepts behind the magic we interact with every day, in the millions of CPU cycles that can transpire during a single keystroke. Consider Windows alone: taking privileged, direct hardware access for things like USB communication and keyboard/mouse input and locking it right-the-hell down, so the operating system can extend those things into user mode without letting them compromise the kernel, is a titanic feat of engineering... and I've known people with bachelor's degrees in computer science who seem basically oblivious to how any of that works. It's entirely possible that they didn't care, but I digress.

Granted, my work with interpreted scripting languages makes me find pointers, type casting, memory allocation, and garbage collection to be inane, silly chores that I shouldn't have to worry about to Get Things Done... but someone should, and their credentials should reflect that. The fundamental principles underlying modern computer systems and their continuing evolution depend on it.

Comment Re:Am I on Slashdot? (Score 1) 295

My God, the luddites have taken over Slashdot tonight.

When I have 10Gb at home, I'll:

* Boot every PC from a remote server. No need even for local swap.

I was doing this with my own desktop until I finally decided to buy some SATA 3 SSDs. Even drag-and-drop file copies worked great, though they'd typically settle at 60-80 MB/s once the local RAM cache filled up. The SSDs in my machine effortlessly push 150-200, and they're Xen-based PV devices from Windows' perspective.

They're not in my SAN simply because I don't know how to set up multipath iSCSI, and my switches don't support 802.3ad. I suppose I could plug my machine directly into my SAN four times... but that borders on absurd for a home setup. It would be far simpler to just have Moar Bandwidth on my local links... which I would totally do if it were affordable!

That said, I'm quite curious about your home setup as far as network booting goes (see the sig :P). Any chance you've blogged about it or something?

Comment Re:Copper? (Score 4, Insightful) 347

Expensive, but rock solid and quick (vs fast).

Every time I see a statement like this, it reminds me we could really use some better single-word descriptors to disambiguate a connection that is

  • High vs. low bandwidth
  • High vs. low latency
  • All possible combinations of the two

Not that we don't understand what you meant of course! I just have a feeling that "fast" or "quick" will be rather ambiguous ways to describe a network connection for a rather long time :P

Comment Re:iPad with AirDisplay (Score 1) 141

I tried the same thing and couldn't stand the awful performance; it turns out the performance woes are caused by the virtual display adapter. Try Splashtop instead, and just force your computer to connect a second monitor to your existing video card. You can usually force it in Windows, but a simple resistor in a DVI adapter will also work as a dummy plug.

Even at 2048x1536, I was able to watch movies, play 3D games, and more, all as if the display were connected locally. It's rather impressive, though it's a shame that Splashtop's Virtual Display product doesn't work this way. The only thing that isn't automated in my approach is the attach/detach, and Splashtop liked to fuck up the display positioning, so I'd fix it manually. I ended up ditching the setup for two reasons: I don't have more than one OEM iPad charger, and I haven't seen a non-OEM one wired properly for the iPad, so it'll die if I use anything other than the stock charger; and the app doesn't run in the background, which meant I had to unfuck the positioning every time I locked the device or got up from my desk and took it with me....

I probably should have written them about it but I didn't care enough and attached a second monitor. If you use this daily, you might be interested though!

Comment Re:I'm not paying $1000 for a damaged phone (Score 1) 75

I know you're trying to prove that the AC is a dickhead, but seriously: if you think the answer to "why is the computer in my pocket doing X or Y instead of Z" is "because you're a fool for having a computer in your pocket when clearly the only thing that belongs there is a cordless phone from 1992," you're deluding yourself and insulting the audience of this website. The level of connectivity a smartphone provides is only the very tip of the iceberg of what the future holds for the entire human race, and you're an even bigger idiot for denying that than the person who buys a smartphone without ever downloading an app or visiting a website on it.

The problem here isn't a lack of due diligence on the part of the consumer; it's a lack of ethics on the part of the manufacturers and service providers. They know full well what their real place is in the OEM/ISV/ISP trifecta that makes up modern computing, but the public at large hasn't yet figured out that extending this paradigm into the pocket isn't a privilege one ought to pay through the nose for. And as long as they can keep it that way, they'll take the home-field advantage to the bank: forever locking down hardware before they sell it and charging you $10 for 3 minutes of full-speed bandwidth utilization.

Comment Re:There are two kinds of programming languages... (Score 1) 312

I'm so glad I get to use this legacy software!

Said nobody ever.

Nor did I. I just pointed out, and rightly so, that a lot of software these days suffers from enormous bloat and lack of optimization.

But if it does the job, and it's fast and useful on modern hardware, who cares?

If you happen to listen to the Security Now! podcast, Steve Gibson complains every week or two about Chrome taking up more and more memory (a whopping 250 MB) to display a single window with an empty tab. To that I say: I'm having trouble giving a flying fuck about a quarter gigabyte of RAM when EVERYTHING on my system can't even seem to gobble up HALF of the 16 gigs I have. And that's on Windows. With SuperFetch.

If the code runs well on a modern system, and it allows programmers to focus on performance by being lazy with memory, what the hell does it matter? This isn't 2005. People figured out that RAM is an important statistic in the computer they want to purchase, and OEMs started realizing that if they DOUBLE the memory in a computer for a pathetically small amount of money, it turns out that their products won't be titanic piles of shit.

Programmers are starting to realize that they can offload their sloppiness to memory management and focus on CPU cycles. If Chrome is the shining example of why that's a good thing, then personally, I think we're doing something right!

Perfect code isn't impossible; it's just a ton of time, work, and money. There's nothing wrong with shuffling the budget to take advantage of the realities of today.

Comment Re:Spectrum? (Score 1) 128

Wouldn't the most sensible approach be to carve out the physical area the spectrum occupies into different three-dimensional "slices" then?

Consider a more focused triangulation of the individual handsets, to the point where you can get within a meter or so of accuracy. Use low bandwidth, non-hogging links to determine a phone's location, and then migrate the data channel to a number of dedicated antennas that are dynamically aimed at a handset. Boost the power and constrain the FoV to what's appropriate for a device at a given distance, and hand off between focusable antennas on the same tower as the target moves.

Literally just shoot people with high-bandwidth spectrum they can have all to themselves.

Comment Re:iTunes (Score 1) 519

So there obviously is a proper way to do this, but how does it work?

Shouldn't the service sit idle, wait for an announcement from the USB stack about a device plug-in event, check whether it's an iSomething, and then signal the user-mode side that iTunes should launch? Isn't this what filter drivers are for? Couldn't they stick that filter between the iSomething and the USB root hub?

Comment Re:No iTunes for the Windows Store (Score 2, Informative) 519

I do not work for Microsoft, and as the owner of an iPod, which requires iTunes to transfer music from my computer onto the device, I can tell you that the Windows version of iTunes is probably the shittiest piece of software ever written.

I actually thought that iTunes was just absolutely awful because it was iTunes. And then I got a Mac.

Turns out that iTunes (while it's still a feature-overpacked piece of trash) is really only this terrible on Windows. On OS X, it just sucks because it's crowded and confusing, but it does run pretty well.
