Graphics Hardware

NVIDIA Turing-Based GeForce GTX 1660 Ti Launched At $279 (hothardware.com) 94

MojoKid writes: NVIDIA has launched yet another graphics card today based on the company's new Turing GPU. This latest GPU, however, doesn't support NVIDIA's RTX ray-tracing technology or its DLSS (Deep Learning Super Sampling) image quality tech. The new GeForce GTX 1660 Ti does, however, bring with it all of the other GPU architecture improvements NVIDIA Turing offers. The new TU116 GPU on board the GeForce GTX 1660 Ti supports concurrent integer and floating-point instructions (rather than serializing integer and FP instructions), and it also has a redesigned cache structure with double the amount of L2 cache versus its Pascal predecessor, while its L1 cache has been outfitted with a wider memory bus that ultimately doubles the bandwidth. NVIDIA's TU116 has 1,536 active CUDA cores, which is a decent uptick from the GTX 1060, but fewer than the current-gen RTX 2060. Cards will also come equipped with 6GB of GDDR6 memory at 12 Gbps for 288GB/s of bandwidth. Performance-wise, the new GeForce GTX 1660 Ti is typically slightly faster than the previous-gen GeForce GTX 1070, and much faster than a GTX 1060. Cards should be available at retail in the next few days, starting at $279.
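
For anyone who wants to sanity-check the quoted 288GB/s figure, here is a minimal sketch of the usual GDDR bandwidth arithmetic in Python. The 192-bit bus width is an assumption (the summary only gives the 12 Gbps per-pin data rate and the resulting bandwidth), so treat it as illustrative rather than an official spec.

    # Back-of-the-envelope check of the 288GB/s memory bandwidth figure.
    # ASSUMPTION: a 192-bit memory bus (not stated in the summary above).
    def gddr_bandwidth_gb_per_s(data_rate_gbps_per_pin, bus_width_bits):
        # Peak bandwidth = per-pin data rate * bus width, divided by 8 bits per byte.
        return data_rate_gbps_per_pin * bus_width_bits / 8

    print(gddr_bandwidth_gb_per_s(12, 192))  # -> 288.0 GB/s, matching the summary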
  • by Anonymous Coward

    I currently have a 1070, and this mess of cards is very weird.

    The 2080s are too expensive for me, the 2060s and 2070s don't offer that much more for the price, and these mid-range cards are about the same.

    I had been hoping that NVIDIA, which is putting out all sorts of weird cards at so many levels, might put out a 2060 Ti or 2070 Ti.

    Otherwise it's see y'all in 2-3 years.

    At least the 1660 Ti seems to offer good performance for the money.

    • Well, there are a couple of advantages to the 1660 Ti over the 1070 (which I also have). One is better framerates at resolutions higher than 1080p (and I'm at 1440p), and then there's lower power consumption. Part of me wants to upgrade for those two facts alone. Selling my 1070 and getting one of these would make the 1660 Ti only come out to about $80, so it's worth considering (rough math sketched after these comments).

    • The 2080s are too expensive for me, the 2060s and 2070s don't offer that much more for the price

      They do offer more; they just offer something useless. You're paying for ray-tracing hardware and DLSS anti-aliasing, which are largely unused by games so far. It's their "killer feature", but one without a contract to kill yet.

      Mind you, NVIDIA haven't had a good record of making sensible release choices. E.g. the 1070 Ti, which didn't make sense (for them) as it effectively destroyed the market for the 1080 given its price and performance. Then there's the 1060: with 3GB, with 6GB, some with GDDR5, some with GDDR5X...

  • I want something that just works; I can code, tinker, and bug fix. No more s3cr3ts! https://www.youtube.com/watch?... [youtube.com]
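
As a footnote on the upgrade math in the first reply above, here is a minimal sketch of how the "about $80" figure works out. The roughly $200 resale value for a used GTX 1070 is an assumption implied by the commenter's number, not a price quoted anywhere in the story.

    # Hypothetical net upgrade cost from the comment above.
    # ASSUMPTION: the used GTX 1070 sells for roughly $200 (implied, not quoted).
    gtx_1660_ti_msrp = 279      # launch price from the summary
    assumed_1070_resale = 200   # hypothetical used-market value
    print(gtx_1660_ti_msrp - assumed_1070_resale)  # -> 79, i.e. "about $80"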
