Comment Re:wow, really? (Score 1) 807
Maybe. You'd be better off with 8GB. I run a stack like yours and going from 4->8 was a godsend.
Eclipse (or in my case Aptana) really likes to suck the juice, so to speak.
But the total cost of not upgrading is $0.
It is not $0. You have to factor in the cost of everything you give up by not upgrading. For example, the inability to run modern browsers (i.e. the cost of giving up the ability to visit modern websites). The cost of not being able to run modern software (i.e. the cost of not being able to view and edit modern file formats). The cost of not being able to use modern hardware (i.e. plugging in a solid state disk, plugging in a modern digital camcorder, or plugging in one of those Lytro cameras). The cost of electricity (old hardware is way less energy efficient).
Then there is the cost of not being able to find replacement components should something break. If you lost your CPU, what would it cost to replace it in terms of not just replacement cost, but the cost to find the part, and the cost of downtime while waiting for the part? If the motherboard blew up, could you find an exact replacement? If not, will the replacement be compatible with the rest of your hardware? Will you still be able to find drivers for it?
No. The longer I've been in this industry, the more I've become convinced that there is a significant cost to *not* upgrading, and the *longer* you put off upgrades, the more expensive it will be once you eventually upgrade. It is easier, far less risky, and thus far less costly to make incremental upgrades than it is to make sweeping changes every so often. You will at some point have to bite the bullet and upgrade, or you and your skills will become irrelevant. Might as well do it in small doses rather than as one large wholesale change.
4GB has been like the minimum threshold for any halfway decent computer built in the last four or five years. It is peanuts for ram. These days, 8GB or even 16GB should be considered the minimum for any new machine.
This is supposed to be a fucking tech site! A "massive amount of memory" should be a desktop machine built in some guy's garage with 2TB of ram on it. The machine should be running some nightly build of an obscure, hand-compiled Linux distro with a nightly build of Firefox (ideally with a completely different rendering engine spliced in, just for bonus points). Instead we have some fuckwit who is running what seems to be a rickety old P4 bitching about a goddamn three-year-old browser, and I'm sitting here replying to somebody who thinks 4GB of ram is "massive". How fucking depressing is that?
That I even need to type this on Slashdot of all places astounds me. No wonder people consider Slashdot irrelevant. These topics shouldn't even be open for debate. You are a nerd. This is a tech site. What the fuck is there to debate? More Ram == Good. Newer Versions == Better. Always. Upgrade your fucking browser. Build a new machine. Embrace fucking change. What the hell people?
Slashdot. News for tech luddites. Stuff that mattered 10 years ago.
Here's the full list of Reddit comments relating to that topic:
http://www.reddit.com/r/technology/comments/onplj/feds_shut_down_megaupload/
God help us when people cite reddit as a source of truth.
This has to be the dumbest fucking story I have ever read on this site. I can't tell if this article is serious or meant for a laugh. Sadly I think it is serious.
As I write this, there is a nearby access point named "CIA Surveillance Van". You think that is the fucking CIA? Should I submit a story about that?
Jesus fucking christ. This is a new low for this site.
I watched the video on their site, and afterwards YouTube showed a video for this:
mywowcomputer.com
They look identical except for the logo. Is one of them a clone?
You'd still be able to do that with an X server running on top of Wayland, and there are various ways Wayland apps can be displayed on a remote server, even on top of X.
P.S. I would love to have some guarantees that X would survive and I would be able to run a GUI app remotely, but something tells me that the days when I could take that for granted are numbered.
Wayland should be able to support remote applications, it just probably won't use the X11 protocol. I've read that some of the possibilities for remote applications in Wayland are more efficient than the X11 protocol (though that doesn't mean they can't be used with X).
What's motivating Wayland adoption is, I believe, not a desire for fancy animations and useless effects, but for smoother graphics. Nowadays, with X, there are all kinds of glitches and other imperfections. Wayland is also supposed to make drawing graphics to the screen more efficient and to simplify the graphics stack.
It would be good if there were a practice of entrusting the code and all game assets to some foundation that would release the full game under an open-source license 20 years after release.
The premise does not support the conclusion.
Most people I know don't want to customize anything, they just want to use their computers to accomplish a task. The difference between an ordinary user and a super l337 hacker who goes as far as adding panel applets is huge.
Well, according to the Seattle PI, they are accused of stealing more than $750,000 in computer equipment and other items. So no, these guys did just a little bit more than run a $20 charge on some dude's card.
Even iSSH, for that matter (I still haven't figured out how to consistently copy in that app)?
I'd say RDP, the program, has some of the gestures figured out. Two-finger tap = right click. Double tap = double click. The problem is how to translate things like "click, hold and drag" or "slide the slider". A lot of that might be the protocol itself (doesn't Windows have accessibility hooks so things know "this widget should behave like a scrollbar"?).
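To make that mapping concrete, here is a minimal sketch, in Python, of the kind of gesture-to-mouse translation table such a client might keep internally. Everything here is hypothetical (none of these names come from a real RDP client); the point is just that the simple gestures map one-to-one, while a drag has to become a down/move/up sequence:

# Hypothetical sketch of a touch-to-mouse translation layer for a remote
# desktop client. None of these names come from a real RDP implementation.

# Simple gestures map directly onto one or two mouse events.
SIMPLE_GESTURES = {
    "single_tap":     ("left_click",),
    "double_tap":     ("left_click", "left_click"),
    "two_finger_tap": ("right_click",),
}

def translate(gesture, touch_points):
    """Turn a recognized gesture into a list of synthetic mouse events."""
    if gesture in SIMPLE_GESTURES:
        x, y = touch_points[0]
        return [(event, x, y) for event in SIMPLE_GESTURES[gesture]]

    if gesture == "long_press_drag":
        # The awkward case: a drag is not one event but a sequence of
        # button-down, intermediate moves, and button-up.
        (x0, y0), *path, (xn, yn) = touch_points
        events = [("left_down", x0, y0)]
        events += [("move", x, y) for x, y in path]
        events.append(("left_up", xn, yn))
        return events

    return []  # Unknown gestures get dropped.

# Example: a two-finger tap at (120, 45) becomes a right click there.
print(translate("two_finger_tap", [(120, 45)]))

Even then, the client only ever sees coordinates; whether the thing under the finger is a scrollbar is exactly the kind of information that would have to come from the protocol or from accessibility hooks on the Windows side.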
I dunno. It is one of the reasons Flash is not supported: Flash interfaces were designed for a mouse. A touch interface is a whole new ballgame, still uncharted waters. There is no mouse, but there are perhaps ten fingers that can control an interface.
I think the game makers will be the ones to figure out how to exploit the possibilities. I have tons of games that would never work with a mouse.
How are most of these cheesy CSI-type programs created? I would assume they are done in Flash. Are they usually interactive, in other words, if the actor presses a button does it trigger some predefined animation, or is the whole thing one long animation that the actor needs to time against?
Somebody here has to have created one of these...
There must be more to life than having everything. -- Maurice Sendak