Comment Re:Missing piece to this puzzle (Score 2) 165

Read the article - they are working with Facebook on that, but it takes a lot longer. Now, though, they should have 14 months to get it done.

What I wonder, though, is what sort of stuff could be in the account that would incriminate him? Surely the parents of the victim cooperated and provided *her* FB password, which would have given them access to any mutual communication?

Comment Re:One good thing came from MoviePass, at least... (Score 1) 122

Oh, interesting! I am not well versed in how these services work outside the US. Interestingly, the AMC A-List plan only costs about as much as 2 tickets a month... which is why I could justify it. Prior to joining I only saw about 2 movies a month, sometimes fewer.

I wonder if they will see the failure of MoviePass as a chance to (slowly) increase the price of A-List over time? The cost of 5 tickets a month would be closer to $50 or $60 most places here in the US, maybe even more if you figure in the higher price of premium movies (Imax, 3D, etc).

Comment One good thing came from MoviePass, at least... (Score 3, Informative) 122

AMC Stubs A-List. If you aren't familiar, it is AMC's own version of a subscription plan, at more survivable pricing: $20 a month for 3 movies a week, for yourself only, with no blackouts or limits on the type of movie (3D, Imax, etc). If you watch even 2 movies a month it should break even, and anything more nets you savings... while it isn't so dirt cheap that it will kill AMC. I would love it if they'd add an upgrade for another, say, $10 a month that would let you use the 3-per-week allowance to cover others (as long as you were with them)... but that may be more niche than they want to go, or it might not be justifiable price-wise. If I take my family to the movies, AMC still gets a lot of money (wife + 3 kids) even if my ticket is already covered under the A-List plan, plus any food we buy. I don't think AMC would have come up with this idea if it weren't for the competition from MoviePass.
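To put numbers on the break-even claim, here is a quick back-of-the-envelope sketch in Python; the $20/month figure is the A-List price above, while the ~$10 average ticket price is just my assumption (it varies a lot by market and format):

```python
# Back-of-the-envelope break-even math for AMC A-List.
# The $20/month subscription price is from the comment above;
# the ~$10 average ticket price is an assumption and varies by market.

SUBSCRIPTION_PER_MONTH = 20.00   # A-List price mentioned above
AVG_TICKET_PRICE = 10.00         # assumed average ticket price

def monthly_savings(movies_per_month: int) -> float:
    """Savings vs. buying individual tickets for the same number of movies."""
    return movies_per_month * AVG_TICKET_PRICE - SUBSCRIPTION_PER_MONTH

for movies in range(1, 6):
    print(f"{movies} movies/month: net {monthly_savings(movies):+.2f} USD")
# At 2 movies/month the plan roughly breaks even; anything beyond that is savings.
```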

Comment Re:None (Score 1) 95

As I mentioned, my reply was not so much directed at the original question as at the comment above mine (the original "None" comment).

Regarding the original Ask Slashdot question, I am not sure if I am qualified to answer... but it seems to me that the question is pretty wide-ranging and not very focused. It touches on 2D and 3D visual content creation, consumer software, VR, graphics research, patents, and more. I would think that conferences like SIGGRAPH and GTC showcase a lot of the ongoing research and development in these fields, but I don't follow any specific publications (which is what the question actually asked for).

Comment Re:None (Score 1) 95

Oh for Pete's sake... so does that mean nothing NVIDIA or Intel says can be true, just because they sell GPUs / CPUs?

Sure, a company selling something might stretch the truth to get folks to buy things - but that doesn't mean everything said (or written) by anyone selling a product is automatically incorrect. Where I work we don't do much advertising or make crazy claims: we run real-life software to see how it performs, and then publish the results publicly. We could keep that info to ourselves, if we only wanted to benefit our company and our customers, but we don't. We could also make wild claims without backing them up, but again, we describe what we test and how we test it, and then let the results speak for themselves.

I'm not even going to plug the name of the company or link to our website again, since that isn't even the point of the conversation we were having. I didn't bring it up until you tried to accuse me of just being "an IT guy who reads Intel's marketing" (effectively claiming I don't really know what I am talking about, and that I am not qualified to comment on this stuff). Linking to my extensive writings and research was the fastest way I knew of to prove you wrong.

Now, can we all just go back to being civilized? Computer hardware has made huge advancements, but you have a fair point that for an average, basic computer user it hasn't been a hugely dramatic shift in the last 10 years. I would compare it to cars: there are huge advancements in electric and hybrid tech (especially batteries), as well as the beginnings of smart / self-driving cars, but the vast majority of drivers are still using gas-powered vehicles that aren't all that different from the cars of 10 years ago. So car tech *is* constantly being researched and improved, but it trickles down to most folks on a much slower timetable. Is that sort of the idea you were trying to get across?

Comment Re:None (Score 1) 95

That is just my personal system, which is actually running a CPU from ~ 4 generations ago (so about 5 years old). I'm on the cusp of an upgrade (likely this year) to a 6-core at over 4GHz, and if I were into any applications that benefited from higher core counts I could get a 16-core AMD or 18-core Intel processor. It all comes down to what an individual user needs, wants, and of course can afford.

If your point was more along the lines of "basic Internet and office application usage isn't any more complex today than it was 10 years ago, so a computer with similar specs will still do the trick" - then you'd have been fairly correct. But many areas of computer use can and do take advantage of far more powerful hardware today: gaming, video editing, 3D rendering, photogrammetry, machine learning, and more. Since the original question was about graphics advancements, it isn't really fair to answer that 'nothing much has changed' (a rough paraphrase of your comment).

Comment Re:None (Score 1) 95

As I noted, my comment was just in regards to the sweeping - and very incorrect - statement made by 110010001000. I work with computers, but I do not personally perform research to design new hardware or software approaches to graphics. The original question is also very wide-open, so much so that I do not feel I can directly answer it... but I didn't want to leave such a disparaging comment about computer technology unchallenged.

Could you perhaps enlighten me as to what you think I missed regarding the original "article" (really just a user-submitted question)?

Comment Re:None (Score 1) 95

The top-end desktop CPUs from Intel in 2008 were the first generation of the Core i7 series, and the maximum amount of RAM they supported was 24GB. Today, the top-end Intel and AMD desktop processors (Core X series and Threadripper alike) support 128GB of memory - and if you go over to the single-socket Xeon W you can get 512GB. Dual-socket platforms support more now, and supported more back then, but we are looking at roughly 5 times the RAM capacity today compared to then. If you don't want to look at the high end, 2-4GB was typical for an average desktop back then, while 16GB is easy to get now and many systems have even more.

On the GPU side, as an AC also noted below, memory capacity has gone up substantially. The top-end desktop GPUs in 2008 had 1GB of video memory, while top-end cards today are at 11 or 12GB (depending on whether you consider the Titan series to still be desktop cards, or whether you consider the line to max out at the 1080 Ti). Workstation cards go even higher, with capacities double what you find on consumer cards (24GB on the Quadro P6000, for example).
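Just to make those ratios explicit, here is a quick sketch using the figures above (exactly which parts count as "top-end" is debatable, so treat these as rough numbers):

```python
# Rough capacity ratios, 2008 vs. 2018, using the figures quoted above.

ram_2008_gb = 24    # max RAM on a first-gen Core i7 platform
ram_2018_gb = 128   # Core X / Threadripper desktop platforms

gpu_2008_gb = 1     # top-end desktop GPU video memory in 2008
gpu_2018_gb = 11    # e.g. GTX 1080 Ti (12GB if you count Titan cards)

print(f"Desktop RAM ceiling: ~{ram_2018_gb / ram_2008_gb:.1f}x higher")  # ~5.3x
print(f"GPU memory:          ~{gpu_2018_gb / gpu_2008_gb:.0f}x higher")  # ~11x
```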

And to debunk your statement that "The computer you had in 2008 is essentially the same as you have now in 2018":

In 2008, I had a dual-core CPU running ~2.5GHz with 8GB of memory, a GPU with 512MB of memory, and a brand-new 80GB Intel SSD that could push about 250MB/s read and write speeds.

Today I have a quad-core CPU running at ~3.4GHz with 32GB of memory, a GPU with 8GB of memory, and two SSDs (500GB at ~550MB/s and 400GB at over 1500MB/s).

So my CPU has doubled in core count, gained roughly a third in clock speed, and is several times faster overall thanks to a myriad of other improvements. My GPU is on the order of 30 times faster with 16 times the memory. My SSDs have over 10 times the total capacity and are roughly 2 to 6 times faster (and mine aren't even cutting-edge SSDs).
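For what it's worth, here is the same comparison as a rough Python sketch using the specs I listed; the 30x GPU speedup is an eyeball estimate rather than a measured benchmark, so the sketch only covers the numbers that can actually be computed from the specs:

```python
# 2008 vs. 2018 personal machine, using the specs listed above.

old = {"cores": 2, "clock_ghz": 2.5, "ram_gb": 8,  "vram_gb": 0.5,
       "ssd_gb": 80,  "ssd_mb_s": 250}
new = {"cores": 4, "clock_ghz": 3.4, "ram_gb": 32, "vram_gb": 8,
       "ssd_gb": 900, "ssd_mb_s": 1500}   # 500GB + 400GB drives, faster drive's speed

print(f"CPU cores:      {new['cores'] / old['cores']:.0f}x")                       # 2x
print(f"Clock speed:    +{(new['clock_ghz'] / old['clock_ghz'] - 1) * 100:.0f}%")  # ~36%
print(f"System RAM:     {new['ram_gb'] / old['ram_gb']:.0f}x")                     # 4x
print(f"GPU memory:     {new['vram_gb'] / old['vram_gb']:.0f}x")                   # 16x
print(f"SSD capacity:   {new['ssd_gb'] / old['ssd_gb']:.1f}x")                     # ~11x
print(f"SSD throughput: up to {new['ssd_mb_s'] / old['ssd_mb_s']:.0f}x")           # ~6x
```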

Oh, and generational changes? They still average 18-24 months from Intel on the CPU side and NVIDIA on the GPU side - sometimes a little longer, sometimes shorter. From 2008 to 2018 we've had ~8 generations in the Core series from Intel, while NVIDIA has gone from Tesla to Fermi, then to Kepler, Maxwell, Pascal, and now Volta (though that has only shown up on GPU-compute oriented cards thus far). So 8 generations over 10 years from Intel works out to about 1.25 years per generation, or 15 months. 6 generations over 10 years for NVIDIA averages about 1.67 years, or 20 months.
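The cadence arithmetic is simple enough to sanity-check in a few lines (generation counts as listed above):

```python
# Average time between architecture generations, 2008-2018, per the counts above.

YEARS = 10
intel_generations = 8    # Core i-series generations, Nehalem through Coffee Lake
nvidia_generations = 6   # Tesla, Fermi, Kepler, Maxwell, Pascal, Volta

print(f"Intel:  ~{YEARS / intel_generations * 12:.0f} months per generation")   # ~15
print(f"NVIDIA: ~{YEARS / nvidia_generations * 12:.0f} months per generation")  # ~20
```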

Comment Re:None (Score 2) 95

Ha ha ha ha... wow, that is... wow, so wrong.

GPUs have increased many-fold in performance since 10 years ago. Not even the fastest video card from back then could power a VR headset today, or support modern gaming on a 4K monitor. CPUs have made less of an increase in raw clock speeds, but have made huge jumps in core count and instructions per clock (especially in specialized areas, like vector units). RAM capacities have gone through the roof. Drive technology has made the jump from HDD to SSD, and then from SATA-based SSDs to PCI-Express.

Yes, the difference from one generation to the next is usually relatively small - but with generational changes every 18 to 24 months, over the course of a decade you are looking at much bigger improvements than your comment suggested. And that is all without talking about things like using GPUs for general-purpose computation, which has vastly improved performance in many areas of computing.

By the way, this is intended more to refute the parent comment than as a direct answer to the subject of the main post.

Comment Re:Sprint - Low Price Family Plan w/ Unlimited Data (Score 1) 226

I can't speak for Dallas, but in the Seattle area there are certainly places where I get no or very poor reception. It usually seems to be a combination of location and being indoors - the lower floor of my church, for example - but I've run into a few dead spots as I drive around as well (there are two on my way to and from work). I think that is one area where Verizon (and I assume AT&T) have an edge, but for me the unlimited data plan and price have been much more important than a dropped call or data interruption here and there.
