Monday, June 16, 2014

My Seiki 4K


4K is on the way. You've seen 4K displays at trade shows, like the one on the Coolux stand at LDI showing their media server, and it's likely that you will start to see them in video production circles. I decided to get one as a second monitor for my workstation for a couple of reasons. First, a large 4K monitor has four times the pixels of a 1080p display, which means plenty of real estate for several windows at once. Second, I had heard that Seiki 4K TVs are cheap. I did some research and read several reviews, almost all of which had good things to say about the Seikis (and it's the same brand Coolux had at LDI, so I had seen one in person).

I opted for the 39" model (diagonal measure), which is 34" wide, and since I'm viewing it from about 42" away, I get a viewing angle of about 44 degrees. That's slightly wider than the 40 degrees or less that THX recommends, but much narrower than the 60 degrees some people advocate. I think it works great as is. Or I should say, I think it's going to work great at that viewing distance.
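If you want to check the math for your own setup, here's the calculation behind that 44 degrees as a quick Python sketch (it assumes a flat screen and a centered viewer):

import math

screen_width = 34.0  # screen width in inches
distance = 42.0      # viewing distance in inches

# The half-width of the screen and the viewing distance form a
# right triangle, so the full horizontal viewing angle is twice
# the arctangent of (half-width / distance).
angle = 2 * math.degrees(math.atan((screen_width / 2) / distance))
print(round(angle, 1))  # 44.1

Plug in your own screen width and seating distance to see where you land relative to the THX and 60-degree guidelines.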

I say that because, although I've had the TV connected to my computer for a couple of weeks now, I have not been able to get it to display anything higher than 1920 x 1080. The problem, or so I thought, was my graphics card.

My computer has a GeForce GTX560 SC 2G graphics card that I bought about three years ago. I couldn't get it to output HDMI to the display, so I assumed it was too old and that I needed a new graphics card. But how do you find the right one?

I started doing some research and quickly confirmed what I had long suspected: it's very difficult to choose a graphics card based on the manufacturers' specs. The spec sheets are filled with terms that might make sense to someone, but not to me. Seriously, does anyone really know how much better a graphics card with 4.3 TFLOPS will perform compared to one with 3.79 TFLOPS? Or one with 2048 stream processors versus one with 1792? Not me. So I simply looked for a card capable of 4K resolution (4096 x 2160) with as much video RAM as I could afford.

I narrowed my options and then started reading reviews. When one review called the Gigabyte AMD Radeon HD 7950 3GB the "best bang for the buck," my interest was piqued. This is a graphics card built by Gigabyte around the AMD Radeon HD 7950 graphics processing unit (GPU), and out of 124 reviews, 84 people gave it five out of five stars. I had my chosen card. Thinking I would save a few bucks, I bought a used one for about $150, less than half the price of a new one. Since Gigabyte warranties its graphics cards for three years, I figured it was a safe buy.

Maybe I should have paid the price for a new one, because when it arrived, there was a problem. I installed it fairly quickly and easily, but the HDMI output didn't work. I swapped the cable and tested the monitor's HDMI input with another source, which convinced me the problem was the graphics card. I called tech support (not a bad experience at all), and they suggested I flash the card's BIOS with the latest firmware. After a 24-hour glitch (their server was down for a whole day), I downloaded the flashing utility and the BIOS file, but it wouldn't install. I kept getting an error, so I had to send the card in for repair.

In the interim, I used the VGA output from the graphics built into the motherboard to drive my second monitor (the Seiki 4K). When I ran anything graphics-heavy, like grandMA2 onPC along with the grandMA 3D visualizer, there was a one- to two-second delay between clicking a button and seeing the action occur. That kind of latency will drive you crazy, and it makes the program almost completely unusable. But it's the perfect illustration of why we pay good money for a good graphics card.

Actually, we don't pay nearly as much as we used to for high-performance graphics cards. About five to seven years ago, I researched a graphics card recommended by a software manufacturer, and it was $1,500. I opted for a much cheaper one, which worked well enough. The last card I had before the HD 7950 was the GeForce GTX560 SC 2G, which cost about $200 new. It actually still works well. In fact, after I removed the HD 7950 to send it in for repair, I reinstalled the 560 and decided to update its BIOS. In the process I learned that it actually supports 4K resolution, and for a brief moment I thought I would get to see 4K on my monitor for the first time. But it wasn't to be. It turns out that my CPU, an AMD Athlon 64 X2 dual-core processor, doesn't support the new BIOS, so no 4K for me, for now at least.

I thought about buying a new processor so I could update the firmware and output 4K. I did some research (there's always more research to do!) and found that if I replaced the CPU, I would also have to replace the motherboard: the system requirements for 4K output from the 560 call for a chip with an AM3+ socket, and I have an AM2+ motherboard. But the prices are very reasonable, about $100 for a new CPU and only about $60 for another motherboard. Since I already have two graphics cards, both of which are 4K capable, maybe I'll build another workstation. Workstations are getting scarcer as most people migrate to tablets and smartphones.

In the meantime, I'm still waiting for my 7950 graphics card to come back from repair. More to come...
