GeForce 16 series

GeForce 16 series
Top: Logo of the GeForce 16 series
Bottom: Nvidia GeForce GTX 1660 ASUS TUF Gaming X3
Release date: February 22, 2019
Codename: TU11x
Architecture: Turing
Models: GeForce GTX series
Transistors
  • 4.7B 12 nm (TU117)
  • 6.6B 12 nm (TU116)
  • 10.8B 12 nm (TU106)
Fabrication process: TSMC 12 nm (FinFET)
Cards
Entry-level
  • GeForce GTX 1650
  • GeForce GTX 1650 (GDDR6)
  • GeForce GTX 1650 (TU106)
Mid-range
  • GeForce GTX 1650 Super
  • GeForce GTX 1650 Ti
  • GeForce GTX 1660
  • GeForce GTX 1660 Super
  • GeForce GTX 1660 Ti
API support
Direct3D: Direct3D 12.0 (feature level 12_1)
OpenCL: OpenCL 3.0
OpenGL: OpenGL 4.6
Vulkan: Vulkan 1.2
History
Predecessor: GeForce 10 series
Variant: GeForce 20 series
Successor: GeForce 30 series

The GeForce 16 series is a series of graphics processing units developed by Nvidia, based on the Turing microarchitecture and announced in February 2019.[1] Sold alongside the 20 series, the 16 series covers the entry-level to mid-range market segment that the 20 series does not address. As a result, the media have mainly compared it to AMD's Radeon RX 500 series of GPUs.

Architecture

The GeForce 16 series is based on the same Turing architecture used in the GeForce 20 series, but omits the Tensor (AI) and RT (ray tracing) cores exclusive to the 20 series. The 16 series does, however, retain the dedicated integer cores used for concurrent execution of integer and floating-point operations.[2] On March 18, 2019, Nvidia announced that a driver update in April 2019 would enable DirectX Raytracing on 16 series cards starting with the GTX 1660, together with certain cards in the 10 series, a feature previously reserved for the RTX 20 series.[3]
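Applications detect this support through the standard Direct3D 12 capability query rather than anything specific to the GTX cards. The following C++ sketch (an illustration, not code from Nvidia or from the cited announcement) shows how a program can ask the installed driver whether it exposes DirectX Raytracing; the file name in the build comment is hypothetical.

```cpp
// Minimal sketch: query the Direct3D 12 raytracing tier reported by the
// installed GPU driver. On GeForce GTX 16 series cards the reported tier
// depends on the driver version (see the April 2019 update described above).
// Build (Windows): cl /EHsc dxr_check.cpp d3d12.lib   (file name is illustrative)
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the minimum feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("Could not create a Direct3D 12 device.");
        return 1;
    }

    // D3D12_FEATURE_D3D12_OPTIONS5 carries the raytracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5)))) {
        std::puts("Feature query failed.");
        return 1;
    }

    if (options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        std::puts("Driver exposes DirectX Raytracing on this GPU.");
    else
        std::puts("DirectX Raytracing is not exposed on this GPU/driver.");
    return 0;
}
```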

Products

The GeForce 16 series launched on February 22, 2019, with the announcement of the GeForce GTX 1660 Ti.[4] The cards are PCIe 3.0 x16 cards, produced with TSMC's 12 nm FinFET process. On April 22, 2019, coinciding with the announcement of the GTX 1650, Nvidia announced laptops equipped with GTX 1650 GPUs.[5] The TU117 die does not support Nvidia Optical Flow,[6] which is useful for motion-interpolation software. All TU117 GPUs use Volta's NVENC encoder (a marginal improvement over Pascal's) instead of Turing's.[7]

| Model | Launch | Code name(s) | Transistors (billion) | Die size (mm²) | Core config[a] | L1 cache (KB) | L2 cache (KB) | Base clock (MHz) | Boost clock (MHz) | Memory clock (MT/s) | Pixel fillrate (GP/s)[b] | Texture fillrate (GT/s)[c] | Memory size (GB) | Memory bandwidth (GB/s) | Memory type | Bus width (bit) | Single precision GFLOPS (boost) | Double precision GFLOPS (boost) | Half precision GFLOPS (boost) | TDP (W) | Launch price (USD) |
| GeForce GTX 1650[7] | April 23, 2019 | TU117-300-A1 | 4.7 | 200 | 896:56:32:14 | 896 | 1024 | 1485 | 1665 | 8000 | 53.28 | 93.24 | 4 | 128 | GDDR5 | 128 | 2661 (2984) | 83.16 (93.24) | 5322 (5967) | 75 | $149[8] |
| GeForce GTX 1650 (GDDR6)[9][10] | April 3, 2020 | | | | | | | 1410 | 1590 | 12000 | 50.88 | 89.04 | | 192 | GDDR6 | | 2527 (2849) | 79 (89) | 5053 (5699) | 75 | $149 |
| GeForce GTX 1650 (TU106)[11] | June 29, 2020 | TU106-125-A1 | 10.8 | 445 | | | | | | | | | | | | | | | | 90 | N/A |
| GeForce GTX 1650 (TU116)[12] | July 1, 2020 | TU116-150-KA-A1 | 6.6 | 284 | | | | | | | | | | | | | | | | 75 | $149 |
| GeForce GTX 1650 Super[13] | November 22, 2019 | TU116-250-KA-A1 | | | 1280:80:32:20 | 1280 | | 1530 | 1725 | | 55.2 | 110.4 | | | | | 3916 (4416) | 122 (138) | 7832 (8832) | 100 | $159 |
| GeForce GTX 1660[4] | March 14, 2019 | TU116-300-A1 | | | 1408:88:48:22 | 1408 | 1536 | | 1785 | 8000 | 73 | 135 | 6 | | GDDR5 | 192 | 4308 (5027) | 135 (157) | 8616 (10053) | 120 | $219 |
| GeForce GTX 1660 Super[14] | October 29, 2019 | TU116-300-A1 | | | | | | | | 14000 | | | | 336 | GDDR6 | | | | | 125 | $229 |
| GeForce GTX 1660 Ti[4] | February 22, 2019 | TU116-400-A1 | | | 1536:96:48:24 | 1536 | | 1500 | 1770 | 12000 | 88.6 | 177.1 | | 288 | | | 4608 (5437) | 144 (170) | 9216 (10875) | 120 | $279 |

Blank cells carry the value from the row above (merged cells in the source table).
  a. ^ Shader processors; texture mapping units; render output units; streaming multiprocessors
  b. ^ Pixel fillrate is calculated as the lowest of three numbers: number of ROPs multiplied by the base core clock speed, number of rasterizers multiplied by the number of fragments they can generate per rasterizer multiplied by the base core clock speed, and the number of streaming multiprocessors multiplied by the number of fragments per clock that they can output multiplied by the base clock rate.
  c. ^ Texture fillrate is calculated as the number of TMUs multiplied by the base core clock speed.
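The processing power columns follow directly from the shader count and clock speeds: counting one fused multiply-add as two operations, single-precision throughput is 2 × shaders × clock, with double precision at 1/32 and half precision at twice that rate, as the table's ratios show. The short C++ sketch below reproduces the GeForce GTX 1650's figures as a worked example; it is illustrative arithmetic, not an official Nvidia formula.

```cpp
// Illustrative arithmetic only: reproduces the GeForce GTX 1650 row of the
// table above (896 shaders, 1485 MHz base clock, 1665 MHz boost clock).
#include <cstdio>

// GFLOPS = shaders x FLOPs-per-clock x clock (MHz) / 1000
static double gflops(int shaders, double clock_mhz, double flops_per_clock) {
    return shaders * flops_per_clock * clock_mhz / 1000.0;
}

int main() {
    const int shaders = 896;
    std::printf("FP32 base:  %.0f GFLOPS\n", gflops(shaders, 1485, 2.0));       // ~2661
    std::printf("FP32 boost: %.0f GFLOPS\n", gflops(shaders, 1665, 2.0));       // ~2984
    std::printf("FP64 boost: %.2f GFLOPS\n", gflops(shaders, 1665, 2.0) / 32);  // ~93.24
    std::printf("FP16 boost: %.0f GFLOPS\n", gflops(shaders, 1665, 2.0) * 2);   // ~5967
    return 0;
}
```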

Reception

Tom's Hardware criticized the GTX 1650, noting that the 8GB variant of the RX 570 is "[...] faster, less expensive and better able to handle games with big memory requirements."[15]

Forbes described the GTX 1650 as "[...] a tempting choice for a super-small and power-efficient gaming PC [...]".[16] However, "[...] when it comes to value and bang for your buck, AMD’s RX 570 is the clear choice overall."[16]

PC Gamer described the GTX 1650 Super as a "[...] no-brainer" and a "[...] super easy recommendation [...]" over the original 1650, as it costs only $10 more and is "[...] consistently around 30 percent faster [...]".[17] However, they criticized the small amount of VRAM (4 GB) and noted that users whose PC has no PCIe power cables available have to "[...] stick with the vanilla 1650 [...]" because the Super variant has a higher power consumption (100 W versus 75 W).[17] Altogether, it has "great overall value" and for now "[...] claims the crown for best budget graphics card", being consistently faster than the RX 580 and coming close to the RX 590 when VRAM is not a limiting factor.[17]

PC Gamer also found that the GTX 1660 Super "[...] almost makes the 1660 Ti look redundant [...]" and noted that "[...] it outperforms anything AMD currently offers in the same price range [...]", being "[...] about 20 percent faster than the RX 590 [...]".[18]

References

  1. NVIDIA Newsroom (February 22, 2019). "New GeForce GTX 1660 Ti Delivers Great Performance Leap for Every Gamer, Starting at $279 | NVIDIA Newsroom". Nvidianews.nvidia.com. Retrieved October 22, 2019.
  2. "The NVIDIA GeForce GTX 1660 Ti Review, Feat. EVGA XC GAMING: Turing Sheds RTX for the Mainstream Market". AnandTech.
  3. "Accelerating The Real-Time Ray Tracing Ecosystem: DXR For GeForce RTX and GeForce GTX". Nvidia.
  4. "16: THE NEW SUPERCHARGER - GEFORCE GTX 1660 Ti". Nvidia. February 22, 2019. Retrieved February 22, 2019.
  5. "GEFORCE GTX 1650 GAMING LAPTOPS". Nvidia.
  6. "NVIDIA Optical Flow SDK". Nvidia.
  7. "GeForce GTX 1650 Graphics Card". Nvidia. Retrieved October 22, 2019.
  8. "Turing Now Starts at $149: Introducing GeForce GTX 1650". Nvidia.
  9. "GeForce GTX 1650 Graphics Card". Nvidia. Retrieved April 3, 2020.
  10. Dexter, Alan (April 3, 2020). "Nvidia's GTX 1650 gets GDDR6 because 'the industry is running out of GDDR5'". PC Gamer. Retrieved April 4, 2020.
  11. "GeForce GTX 1650 (TU106)". Retrieved June 29, 2020.
  12. "NVIDIA GeForce GTX 1650 GDDR6 TU116". Retrieved July 20, 2020.
  13. "GeForce GTX 1650 SUPER Graphics Card". Nvidia. Retrieved November 22, 2019.
  14. "GeForce GTX 1660 SUPER Graphics Card". Nvidia. Retrieved November 22, 2019.
  15. Angelini, Chris (April 24, 2019). "Nvidia GeForce GTX 1650 4GB Review: This Turing Fails the Test". Tom's Hardware. Retrieved February 7, 2021.
  16. Leather, Antony. "Nvidia GTX 1650 Versus AMD RX 570: What's The Best Budget Graphics Card?". Forbes. Retrieved March 27, 2021.
  17. Walton, Jarred (November 23, 2019). "Nvidia GeForce GTX 1650 Super review". PC Gamer. Retrieved February 7, 2021.
  18. Walton, Jarred (October 29, 2019). "Nvidia GeForce GTX 1660 Super review". PC Gamer. Retrieved February 7, 2021.
