Desktop Computer GPU History
I've owned many video cards through the years. The same can be said for processors, motherboards, hard drives, and other things. As a way to chronicle the history of the GPUs I've used, I'm listing them here along with some of my memories associated with each one.
- S3 ViRGE DX - This is the first video card I ever bought. It cost $20 and took a month to arrive, which seemed like an eternity as a school-age kid. While the ViRGE DX supposedly supported 3D, I could never seem to get it to work.
- nVidia RIVA TNT - Creative Labs branded--I remember the distinctive yellow box art. It cost about $120 which was a huge sum at the time as a student. I remember imagining how powerful it was as I waited for it to arrive. I believe I even imagined an array of them powering a Holodeck at one point. This is also my first personal experience with 3D acceleration under my own control--I finally had it for myself instead of just seeing it now and again at older friends' houses. I remember being absolutely floored with the difference it made when playing Quake online. I could finally fire up GLQuake and GLQuakeWorld. I was enamoured with the colored lighting effects on powerups like the blue glow from a person with the quad damage rune. The smooth textures and fluid framerate were also astounding when coming from the low-res, software-rendered, 12-fps world I was used to for so long. I still have this one, and it's actually in use today in a storage server.
- nVidia RIVA TNT2 Ultra - This one was Guillemot branded, and I remember being very proud of owning it. I don't recall how I came to own it, but considering its launch price was excessive compared to the TNT generation before it, it's likely I had some assistance. Its full name was "Maxi Gamer Xentor 32", admittedly a dumb-sounding name. That said, I spent many hours playing Quake 2 with it. I later sold the card to my dad for a system where he needed an AGP video card and made about $100 back. In 2012 or 2013, I regained possession of it due to its complete and utter obsolescence; the machine he was using it in had been completely upgraded. I don't know if it still works, frankly, but I plan to try it during a retro computing spurt some time. I do know the fan has long since ceased spinning, as is the custom for those tiny 40mm fans from that era.
- nVidia GeForce 3 Ti200 - This time around I had an eVGA branded card. This was significant at the time, as their Ti200 boards were renowned for their overclockability compared to the competition. I don't think I ever managed to overclock it to the levels of a Ti500 like some others had, but I definitely did overclock it. I remember playing Quake 3 Test and later Quake 3 Arena with this card and having a real blast doing it. It made all the difference in the world, as I had been trying to play Q3A on my by-then very aged TNT2 Ultra. It had worked, but it wasn't nearly as fast as my brother's Matrox Millennium II. Another brother of mine eventually traded his Ti500 down to me, and I passed my Ti200 on to the one with the Millennium. I don't believe I know where either of those two cards is anymore, unfortunately. Incidentally, this is the last 3D game I can remember my brother with the Matrox card ever playing with me. He never really developed a serious interest in computer games as I had.
- nVidia GeForce 3 Ti500 - A trade-up enabled by my brother passing the card down after an upgrade as mentioned above.
- ATI Radeon 9500 - This was my first experience with an ATI video card. It was technically ATi at the time, but I prefer the later all-caps spelling these days. It was substantially faster than the GeForce 3 Ti500 I was still using, but it struggled with Doom 3 nonetheless. I remember having to play at 800x600 instead of the resolutions I normally preferred, like 1280x960. By this time I had a 21" 1600x1200 monitor. I believe this card failed within a year and was replaced once via warranty. I don't believe I have this card anymore, but I did hold onto it for many years after its retirement. It even made its way into a cardboard gaming computer that I took to high school for fun during Electronics class at one point. The fan had died (surprise!), and I had rigged up another 40mm fan by hot gluing it to the heatsink. You'll never guess what happened during a long session of Doom 3 one time.
- ATI Radeon 9700 All-In-Wonder - At the time I was using it, I never did figure out for sure whether this card was equivalent to a Radeon 9700 or a Radeon 9700 Pro, but today I'm pretty sure it was the former. I had used capture cards for my video capture hobby in the past, but my brother had suggested I get an AIW for a better experience. Admittedly, my history with Pinnacle PCTV capture cards had been mixed: lots of problems and poor software. The Radeon card captured better, but it didn't come with any software to actually make use of that capability. I ended up re-using the capture program that had been bundled with the Pinnacle card in the end. As for gaming, I mostly played multiplayer Shifter and Team Aerial Combat (Tribes mods) by this point. No plans had been explicitly laid out as such, but as it turned out I would have this card for a very, very long time.
- ATI Radeon X800 Pro - This card was obtained via a trade-up program that I found on ATI's website in the last half of 2005. They offered to trade anyone up to an X800 Pro if they had a qualifying trade-in, and it so happened that I did with the Radeon 9700 All-In-Wonder. The ultimate cost to me ended up being something like $100--it was a super-good deal. Better yet, this card was unlockable into an X800 XT Platinum Edition, the highest SKU available at the time with the exception of the minor X850 XT PE revision that had come out that spring. I remember playing the Source Forts HL2 modification online around this time and being really proud of having a (nearly) flagship again. The core reason I was itching for an upgrade, though, was so that I could play the newly released Battlefield 2142 better. While not apparent at the time, the Battlefield series would time and again force me to try to upgrade hardware through the years with its poorly optimized performance. I eventually gave this card away to a friend at college after it had killed two motherboards it was plugged into in class. Yes, he was fully aware of why I was giving it away and what it had apparently been doing. I'm not sure what he did with it.
- ATI Radeon X1900 XT - While not the highest-end SKU in the series line-up, it was close enough to fit my needs. This was part of my first major system upgrade in years. I remember my brother being surprised that I hadn't gone for the high-end XTX model. This card is the one I used throughout the rest of my college years, and I remember playing lots and lots of Counter-Strike: Source with it. I was really into the Zombie Mod scene at the time. This was back before Valve's Left 4 Dead had launched. Eventually, the card started exhibiting artifacts and had to be replaced. I remember talking to tech support, but I don't think it ended up being replaced under warranty. I no longer have this card.
- ATI Radeon 2900 Pro - When this generation launched, I was floored by its sheer design. The 512-bit memory bus was technically impressive. Yes, the core was big and hot without really nailing nVidia's offerings to the wall, but I loved it nonetheless. I bought a 1 GB Pro as, once again, the highest-end SKU was priced just out of my reach. This was in an age where 512 MB was considered the high-end norm, but it is perhaps the first sign of an appreciation I later developed for cards with as much VRAM as possible. VRAM is a big decider in how long a card will remain viable these days, more so even than the cores themselves, it seems. This card ended up in one of my dad's computers for a while, I believe. It may have eventually failed. I'm not sure where it is now.
- ATI Radeon 3870 - When the die shrink of R600 launched, I had to upgrade. It was essentially the same as the 2900 Pro I had had before, with the same 320 shaders, 16 ROPs, and 16 TMUs. It only featured a 256-bit memory bus after the shrink, but performance didn't change noticeably. If anything, it was very slightly faster due to the other minor tweaks made to the core during the shrink. The card itself was single slot and astoundingly inexpensive--$159 by the time I ordered in April of 2008 despite its launch price of $269 just six months earlier. This upgrade was somewhere between a sidegrade and a very slight downgrade, technically speaking, but it was done more out of a want to play with the new 3870 than out of any technical need. This card sits on my shelf for bench duty to this day despite being purchased so long ago.
- ATI Radeon 4850 - I remember when these launched back in 2008. I was watching the build-up to the launch closely on tech news sites and forums. I bought a pair of them with rush shipping from Newegg for $189 each on July 1, just a week after launch. This was my first experience with multi-GPU gaming, something I had wanted to play with for a long time but could never justify the price of. I've been multi-GPU ever since, though not always from the moment I first buy into a new generation. I remember being amazed at how little power the 4000 series pulled when idle. Load power wasn't even that bad. My strongest memory associated with gaming on these cards would be playing Team Fortress 2. Unfortunately, this was at a time when I had a dual socket machine which had a lot of cores but very low clock speeds. My gaming performance suffered. The VRM heatsink with its grid of copper cylinders really grabbed me aesthetically, and it was perhaps that insignificant detail which pushed me toward buying them as early as I did. I know that's strange, but hey. Over time, I ended up buying two more used 4850s as a means to have some cheap quadfire fun in Battlefield Bad Company 2. In quadfire the cards had plenty of shader power with their combined total of 3200 SPs, but with only 512 MB of usable VRAM (memory isn't pooled across CrossFire cards) they weren't able to play demanding, modern games sufficiently. It was fun but a little disappointing.
- ATI Radeon 5850 - Before trying my hand at 4850 quadfire, I had already moved on to a Radeon 5850 in my desktop. I later bought a second 5850 to help out with Battlefield Bad Company 2 when it launched, but I knew at the time it would be a gamble. I had used multi-GPU setups long enough by that point to know that not all games benefited from a second card. My fears were confirmed when I installed the second card, too. While my framerate counter showed nearly double what I had had with a single card, the experience was no smoother. It was just as stuttery and jerky as it had been when I was playing at 25 fps. This observation was made a few years before the whole microstutter debate started up online, and I felt somewhat vindicated when it became clear that my experience was exactly what was happening under the hood--ATI's cards weren't pacing the frames from the two GPUs in any meaningful way. Instead, many frames were effectively dropped despite having been technically rendered and registering on a framerate counter.
- AMD Radeon 6970 - When the 6000 series launched in 2010, I was again ready for an upgrade. This was also the first Radeon series launched under the AMD brand name, four years after AMD's 2006 acquisition of ATI. My biggest concern was making Bad Company 2 run better. My friends and I weren't playing it as much by this time, but it was still in the rotation. Performance was much, much better than it had been with my 5850s despite my only having a single card and a framerate that wasn't any higher than the pair of cards had managed to produce. Thanks again, frame timing. I later added a second 6970 from the used market when Diablo 3 launched. They could be had for about $260 by that point, which is what I paid to the eBay seller I bought mine from. Years later I picked up two more 6970s for a second go at quadfire. The cards were down to about $125-150 each by that point, and I had bought them not a month before GPU prices went crazy from cryptocoin mining. Prices soared to nearly double after I bought mine. heh. Great timing for once. Here I am today, actually. The number of 6970s in my desktop varies over time as I use them for bench tasks or want to stuff other cards in my machine, but I still own four working ones. For the record, I've seen my desktop pull as much as 1.1 kW from the wall when playing Battlefield 4. It's definitely time for an upgrade. Now that GPU prices have crashed since they are no longer cost effective for miners to crunch with, I plan to pick up a used 290 or 290X, or maybe even a pair of one or the other, on eBay for a great price.
- AMD Radeon 270X - I finally upgraded from Radeon 6970s to Radeon 270Xs in the summer of 2014. Following the largest electricity bill I had ever received, I knew it was time to move off of 40nm. The 270X is just a tweaked Radeon 7870 from what I can tell. The specifications are the same, but AMD went as far as to give it a new die name. They've never done that before for a rebrand, so I have to think something is different even if it is mostly 7000-series in terms of architecture. The cards use significantly less power than the Radeon 6970s I was using before despite being about 45% faster. Every single watt drawn 24/7 costs about a dollar per year in electricity, so the drop from 160+ to ~125 watts at idle will save a significant amount over time. Unfortunately, I kept having trouble with displays not waking back up after power saving mode had been triggered. The system also frequently BSODed if I tried to redetect my displays following this event, and YouTube HTML5 videos would randomly go black during playback. I eventually had to revert to a 6970 pending driver fixes for a time. As of 2015 I'm back on a single 270X, though, and everything seems to work well with a single card and current drivers.
- AMD Radeon 390 - Purchased on 2015-06-24 and received the next afternoon. Between the delivery and the drugs they gave me during surgery that morning, I was having a great day. I experimented with overclocking for a while, but ultimately it proved unnecessary; there just isn't enough headroom to make a meaningful difference. I liked the idea of an 8 GB Radeon 290 with higher stock clocks, which is why I went for the 390 instead of one of the aftermarket 8 GB 290 variants despite the slight price premium.
- AMD Radeon Vega 64 - Purchased on 2017-10-30 for $517 from Newegg (via eBay) and delivered four days later on a Friday. I was pretty excited to receive something new after having passed on upgrading to a Fury or Fury X once they became affordable. Cryptocurrency miners had exhausted the stock of most high-end AMD cards, and the best AMD card available until the launch of Vega was the midrange RX 580. That card is essentially equivalent in performance to the Radeon 390 I already had.
I waited patiently for about three months for the post-launch pricing of Vega cards to return to a reasonable state. Prices had exploded to about 30% or more over MSRP after the very limited initial stock was gone. Newegg periodically dropped their pricing, and within the few weeks before I finally purchased I had seen them list cards at $599, a week later at $570, and a week later at $550. I considered purchasing at that point, but someone pointed out that Newegg was listing the Gigabyte Vega 64 on eBay for $517. At 3% over MSRP, I figured it was finally a good time to buy. Later that day Newegg listed the Sapphire Vega 64 on their regular site for its proper $500 MSRP. That's just the way things go.
Truth be told, the Vega 56 model is the card to get. It's just a bit slower than the Vega 64 and $100 less expensive. I had always liked Fury's 4096 shaders, though, and knew that I wanted the full-die product this time around instead of the one-step-down-but-better-value-ratio model I would historically purchase. It's simply cool technology. At $500, the price point isn't too extreme.
I'm looking forward to playing PUBG and seeing how the performance improves. I get about 45-60 fps most of the time with the Radeon 390. Unfortunately, the new card will be a hassle to use for a while. I've been running Debian for about a year now, and AMD is in the middle of getting a display code (DC, formerly "DAL") overhaul merged into the Linux kernel. It looks to be scheduled for kernel version 4.15, which at the time of writing is the upcoming branch; 4.13 is the latest stable branch and 4.14 is going through release candidates. Without the new display code, no video output will be available from Vega-based cards. The support is available in AMD's closed source binary drivers already, but I've not messed with those in modern times due to how nice in-kernel driver support is. For the most part it "just works" without any hassle. Such is not the case for third party kernel module based drivers.
Update: Now that I've received the card and have installed it, I have some comments regarding my thoughts and expectations in that last paragraph. As it turns out, I do have video output under Debian despite being on a 4.13 kernel. Basic frame buffer graphics work (both text mode and graphical), but there is no hardware 3D, and presumably other advanced function blocks like UVD and HDMI audio aren't available either. As such, the system falls back to the Gallium software renderer. It works great for desktop use, other than not supporting the features f.lux requires to change the color temperature of my screen at night. The desktop experience is as quick and responsive as usual. Video is still smooth, and basic 3D works fine. I can play Quake, but modern games get single-digit frame rates even with a 16 core CPU and video settings set to minimum at very low resolutions. I was able to get Counter-Strike: Source running at about 15 fps with some bots, so that was fun to play with.
Accelerated 3D in Windows yields a good experience. Playing PUBG, I've seen my framerates increase quite substantially. I at times exceed 100 fps in-game, with 85-90 fps being pretty common. Low frame rate areas that used to yield 45 fps are now probably around 70 fps. Dips and minimums seem to have been raised to a degree that makes aiming less troublesome. I'm pretty satisfied with the results, so I suppose this was money well spent. One interesting side effect is a faint buzz in my right audio channel when my frame rate in PUBG hits about 90 fps. It gets quieter below that point and louder above, so I can roughly tell what my frame rate is at any given time from the sound. heh. I tried moving the video card up two slots in case it was inducing a current in the output from my sound card, but there was no change. It must be getting in somewhere else.
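As an aside, the watt-per-year rule of thumb mentioned in the 270X entry above is easy to sanity check. Here's a minimal sketch, assuming an electricity rate of about $0.114/kWh (the rate constant and function name are my own illustration, not from anything official):

```python
# Back-of-the-envelope cost of an always-on electrical load.
# At roughly $0.114/kWh, one watt drawn 24/7 works out to
# about a dollar per year, matching the rule of thumb.
HOURS_PER_YEAR = 24 * 365            # 8760 hours
RATE_USD_PER_KWH = 0.114             # assumed rate; substitute your own

def annual_cost(watts, rate=RATE_USD_PER_KWH):
    """Dollars per year for a constant draw of `watts`."""
    return watts * HOURS_PER_YEAR / 1000 * rate

print(annual_cost(1))          # one always-on watt: about $1/year
print(annual_cost(160 - 125))  # idle drop from 160 W to 125 W: about $35/year
```

At a different electricity rate the dollar-per-watt-year figure scales proportionally, but the shape of the argument is the same.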