
4K TVs for Computer Monitors?



I've got a few friends who have bought 4K TVs to use as computer monitors with DAW software and they seem to be happy, but I haven't seen their setups as they're in a different city. I've been looking at 4K computer monitors in the 30-32 inch sizes and they're $800 and up, which is a bit out of line for me. Tonight I bought a Samsung 40" 4K TV with a resolution of 3840 x 2160 for $447 at Walmart with free shipping. The Samsung 40" is just about the perfect width to sit between my BM5s for the optimum near-field setup.

I'm running an EVGA SuperClocked GeForce GTX 760 2 GB w/ G-SYNC video card in my DAW PC, which has a max resolution of 4096 x 2160, so it should support this TV. If it doesn't work out I can always put it in the bedroom.

Times are changing, and I'm just curious whether anyone else besides my friends is doing this, or thinking about doing it, or just has opinions.

Thanks a lot


If I were looking for a new screen, I might consider that route myself. I've thought about it, but I already own a 52" 1080p TV and three 27" wide (1280) computer monitors (two for my tower system, one to augment my laptop), so I'm not in the market at this time. The technical benefits don't outweigh the money spent right now. I'd rather drop another $450 on my mortgage or keep it in the bank for the inevitable next crisis (water heater, furnace, refrigerator, pet medical bill, etc.).

 

But... going 4K is reasonable at this point, something I wouldn't have considered under any circumstance short of winning the lottery just a year or two ago, so +1 for sharing the idea.


Check the refresh rates on the TVs. Last I heard they are typically optimized for 32 FPS.

Refresh rates on newer 2K and 4K TVs are 100 to 200Hz and higher. I'm using an eight-year-old Samsung 37" 1080p Smart TV that can go to 100Hz but is set to a 60Hz refresh rate by the signal from my GPU. TV sets don't work in fps; that's something for video cards.
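If you want to confirm what refresh rate the GPU is actually sending the TV (rather than what the TV's box says), you can ask Windows for the current display mode. A minimal sketch, assuming a Windows machine with the pywin32 package installed (pip install pywin32):

# Sketch: ask Windows what mode the primary display is actually running in.
import win32api
import win32con

# None = the primary display; ENUM_CURRENT_SETTINGS = the mode in use right now
mode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"{mode.PelsWidth} x {mode.PelsHeight} @ {mode.DisplayFrequency} Hz")
# e.g. "3840 x 2160 @ 60 Hz" if the GPU is driving the TV at 4K60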


You want to make sure they have a gaming or PC mode; that will disable a lot of the "enhancements" and cut the input lag. On the 4K Samsung I occasionally use, renaming the HDMI 1 input to "PC" enables gaming mode.


I've got a few friends that have bought 4K TVs for computer monitors using DAW software and they seem to be happy,

 

I've got a 4K TV and I also run DAW software, but I don't use the 4K in the studio with the DAW. I've thought about it and I can see the advantages, but I'm operating under the theory that the TV has the potential to add interference to the audio signal, which would make it a really bad idea. I could have it wrong, but as long as the two LED monitors I'm using with the DAW are giving me what I want, I don't see the point in swapping them for the 4K.

 

I do have an old desktop hooked up to the 4K in the living room and it's really nice for watching music videos and some gaming; I would recommend that. If money is an issue you could take a look at RCA. The picture quality isn't as good as the higher-end makes, but judged simply on bang per buck it might be the best deal. I noticed that Walmart had a 60-inch RCA 4K on Black Friday for under $300. At that price, even if it only lasted a couple of years, it wouldn't be much different from swapping out a video card. On the other hand, the higher-end 4Ks can be major investments.


3 weeks later...

I'm a long-time member of [H]ardforum. Many helpful guides are available there, one of which is directly related to this topic.

 

2015 Samsung 4k TV as a Monitor Set Up Guide

 

There are differences between makes and models of TVs, so search first, and if you haven't found an answer, ask and ye shall receive.


I've got a few friends who have bought 4K TVs to use as computer monitors with DAW software and they seem to be happy, but I haven't seen their setups as they're in a different city. I've been looking at 4K computer monitors in the 30-32 inch sizes and they're $800 and up, which is a bit out of line for me. Tonight I bought a Samsung 40" 4K TV with a resolution of 3840 x 2160 for $447 at Walmart with free shipping. The Samsung 40" is just about the perfect width to sit between my BM5s for the optimum near-field setup.

I'm running an EVGA SuperClocked GeForce GTX 760 2 GB w/ G-SYNC video card in my DAW PC, which has a max resolution of 4096 x 2160, so it should support this TV. If it doesn't work out I can always put it in the bedroom.

 

Times are changing, and I'm just curious whether anyone else besides my friends is doing this, or thinking about doing it, or just has opinions.

 

Thanks a lot

 

Something to consider (if you haven't already discovered it): the GTX 760 only offers HDMI 1.x, so while it can do 4K over its HDMI output, it will limit you to 24 or 30Hz (30 FPS). DisplayPort will get you 60Hz (60 FPS) - you just need an adapter (DisplayPort to HDMI 2.0, specifically; they're a few bucks online, and you can get one where the adapter is part of a cable, so it's one and done).

 

Source: https://www.geforce.com/hardware/desktop-gpus/geforce-gtx-760/specifications (see footnote 1 at bottom)
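If you're curious why the HDMI output tops out at 30Hz, here's a rough back-of-the-envelope check in Python. It assumes the standard CTA blanking for 3840x2160 (a 4400 x 2250 total raster) and HDMI 1.4's 340 MHz single-link TMDS clock limit; treat it as an illustration, not anything official.

# Rough check of why HDMI 1.4 tops out at 4K30: compare the pixel clock each
# refresh rate needs against HDMI 1.4's maximum TMDS clock.

H_TOTAL = 4400                 # 3840 active pixels + horizontal blanking
V_TOTAL = 2250                 # 2160 active lines + vertical blanking
HDMI_14_MAX_CLOCK_HZ = 340e6   # single-link HDMI 1.4 TMDS clock limit

for refresh_hz in (30, 60):
    clock = H_TOTAL * V_TOTAL * refresh_hz
    verdict = "fits" if clock <= HDMI_14_MAX_CLOCK_HZ else "does NOT fit"
    print(f"4K @ {refresh_hz} Hz needs a {clock / 1e6:.0f} MHz pixel clock -> {verdict}")

# 4K @ 30 Hz needs a 297 MHz pixel clock -> fits
# 4K @ 60 Hz needs a 594 MHz pixel clock -> does NOT fit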

 

Second thing to consider: 4K output is not available in Windows XP (this is documented in the driver release notes, or observable if you set up Windows XP on the card). I only mention it because the GTX 760 supports Windows XP and forward; if you use anything newer (Vista, 7, 8, 8.1, 10, Linux, etc.) this is a non-issue.

 

 

 

Check the refresh rates on the TVs. Last I heard they are typically optimized for 32 FPS.

 

 

I've never heard of 32 fps (but that doesn't mean it doesn't exist; there are all sorts of standards for video), but a lot of video/movie content is either 24 or 30 fps, which is what most TVs spend their time dealing with. Most HDTVs that I've seen will support 60Hz (60 FPS) out of the box from their HDMI or VGA inputs. Component inputs tend to be a lot more variable.
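One concrete illustration of how 24 fps content ends up on a 60 Hz screen is the classic 3:2 pulldown pattern. That's my own example rather than anything mentioned above, but a tiny sketch of it looks like this: each film frame is held for alternately 3 and 2 refreshes, so 24 frames fill exactly 60 refreshes.

# Sketch of 3:2 pulldown: mapping 24 film frames onto 60 display refreshes.
def pulldown_32(num_film_frames):
    """Return, for each display refresh, the index of the film frame shown."""
    shown = []
    for frame in range(num_film_frames):
        repeats = 3 if frame % 2 == 0 else 2   # hold frames for 3,2,3,2,... refreshes
        shown.extend([frame] * repeats)
    return shown

refreshes = pulldown_32(24)
print(len(refreshes))     # 60 -> 24 film frames fill one second of a 60 Hz display
print(refreshes[:10])     # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]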

 

To clarify some (and build on what JimmyRJump said): displays work in 'field rate' (refresh rate), which is in hertz (Hz), while video/game content is in 'frame rate' (FPS). The display cannot 'display' content with a higher frame rate than its field rate (that doesn't mean it won't work, it just means the extra frames are discarded). So if your display tops out at, let's say, 60Hz (which is very common for LCDs), then 60 FPS is the most it can display. Conversely, if you have a 240Hz display, you can do up to 240 FPS. As I said, a lot of HDTVs I've seen support 60Hz out of the box, as long as the source can deliver the resolution you want at that setting (see the note above on the GTX 760 and HDMI support - the card *can* do 4K60, just not over its HDMI connection).
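If it helps to picture the "extra frames are discarded" part, here's a toy sketch: a game rendering at 90 FPS on a 60 Hz display only ever gets 60 of those frames onto the screen each second. The numbers are made up for illustration.

# Toy model: which rendered frames actually make it to the screen.
# A 60 Hz display shows whatever frame is newest at each refresh;
# frames rendered between refreshes are simply never displayed.

RENDER_FPS = 90      # hypothetical game frame rate
REFRESH_HZ = 60      # display refresh rate

frame_times = [i / RENDER_FPS for i in range(RENDER_FPS)]   # when each frame is ready
displayed = set()
for r in range(REFRESH_HZ):
    refresh_time = r / REFRESH_HZ
    # newest frame that was ready at (or before) this refresh
    ready = [i for i, t in enumerate(frame_times) if t <= refresh_time]
    if ready:
        displayed.add(ready[-1])

print(f"rendered {RENDER_FPS} frames, displayed {len(displayed)} of them")
# rendered 90 frames, displayed 60 of them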

 

 

 

 

 

Check the refresh rates on the TVs. Last I heard they are typically optimized for 32 FPS.

Refresh rates on newer 2K and 4K TVs are 100 to 200Hz and higher. I'm using an eight-year-old Samsung 37" 1080p Smart TV that can go to 100Hz but is set to a 60Hz refresh rate by the signal from my GPU. TV sets don't work in fps; that's something for video cards.

 

 

 

You have to be careful with the advertised refresh rates on some TVs - some of them really do support 120Hz (usually those also support 3D), but a lot will do some sort of 'motion interpolation' and advertise high refresh rates that simply do not exist (see Wikipedia for more about this: https://en.wikipedia.org/wiki/Motion_interpolation). There are of course TVs that do both - have a higher-than-60Hz refresh rate *and* offer motion interpolation. If the TV is genuinely capable of >60Hz but your graphics card does not offer that mode, you can create a custom display mode. With modern TVs this is generally not risky: if the TV does not like the custom mode it will just tell you (and not display the image), the computer will (by default) revert to the last known good mode after 15 seconds, and all is well.

 

This is a really old guide (the example images are dated) but it roughly covers how to do this in nVidia's drivers: https://www.nvidia.com/en-us/drivers/custom-resolutions/

 

AMD has it more up to date: https://www.amd.com/en/support/kb/faq/dh-032
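Both of those guides walk through the Windows control panels. If you happen to be on Linux, a rough equivalent uses xrandr - this is my own addition, not from either guide, and the timing numbers below are the standard CTA 4K30 ones; the output name HDMI-0 is hypothetical, so check `xrandr` for yours and adjust to what your TV actually accepts.

# Rough Linux/X11 sketch of adding a custom display mode via xrandr.
import subprocess

MODE = "3840x2160_30"
# pixel clock (MHz), horizontal timings, vertical timings, sync polarity
TIMINGS = ["297", "3840", "4016", "4104", "4400",
           "2160", "2168", "2178", "2250", "+hsync", "+vsync"]
OUTPUT = "HDMI-0"   # hypothetical output name - yours may be HDMI-1, DP-0, etc.

subprocess.run(["xrandr", "--newmode", MODE, *TIMINGS], check=True)        # define the mode
subprocess.run(["xrandr", "--addmode", OUTPUT, MODE], check=True)          # attach it to the output
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", MODE], check=True) # switch to it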

 

 

To echo what JimboUK said:

Also, you will probably want to disable "motion interpolation" for most games (if your TV has it), because it usually adds appreciable input latency (as in, you move the mouse and there's half a second or a second of lag before the screen updates) since the processor inside the TV has to work on the image data. In some games it isn't so bad - it just really depends (on some TVs it's also really easy to toggle on/off so you can test it quickly, versus going through nine layers of menus).

 

Finally, if the TV has some sort of 'dynamic contrast' mode you might want to disable it. In my experience this is the biggest 'downside' of TVs for gaming, because dynamic contrast tends to be set up (I assume) for video content like a movie, so it dims the whole picture in very dark scenes. That's fine if I'm watching a movie (or even browsing the web), but it's awful when playing a game like Fallout or Doom that has *a lot* of dark scenes, or something like Skyrim where a lot of the menus have a predominantly black background - the whole picture dims down and makes it hard to read/see what is going on.


FPS (frames per second) has naught to do with refresh rate. The refresh happens at the screen's set value regardless of the fps count from the GPU. FPS is entirely the domain of the GPU, and the TV doesn't care how high the fps of the video is. The video signal is sent from the GPU to the TV and the TV refreshes that signal, be it 60fps or 600fps, at its set refresh rate. The only thing that can happen is so-called screen tearing, when the FPS is higher than the screen's refresh rate. My TV is set to 60Hz but I can play games at over 100fps without any tearing.

 

Film works at 24fps as standard, video at 25fps (PAL) or 30fps (NTSC). The difference comes from the difference in mains frequency between the USA (60Hz) and most of the rest of the world (50Hz) on the electrical grid. Hertz is the number of alternations (cycles) per second of the polarity.
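To spell out the arithmetic behind that (my own illustration, glossing over the fact that modern NTSC is really 29.97 fps): interlaced broadcast TV drew one field per mains cycle, and two fields make a frame, so the frame rate works out to the mains frequency divided by two. 24fps film is its own older standard and isn't derived from the grid.

# Sketch of the mains-frequency relationship described above.
for region, mains_hz in [("PAL (50 Hz countries)", 50), ("NTSC (USA)", 60)]:
    fields_per_second = mains_hz                  # one interlaced field per mains cycle
    frames_per_second = fields_per_second // 2    # two fields make one frame
    print(f"{region}: {mains_hz} Hz mains -> {frames_per_second} fps video")

# PAL (50 Hz countries): 50 Hz mains -> 25 fps video
# NTSC (USA): 60 Hz mains -> 30 fps video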

