
How to buy a 4K TV for all of your TV, movie, and gaming needs

Standard definition is no longer the standard. With streaming services like Netflix offering 4K viewing options, and games consoles supporting native 4K resolution on new titles, while upscaling older ones, it’s a given that as consumers, we’re accepting 4K as the new norm. There’s nothing wrong with 1080p HD TVs, but with the mass move of media towards 4K, you’ll be missing out on the full experience of what’s on offer.

There are a number of factors to consider before blowing a load of cash on any product, but with your TV being the entertainment hub of your home, it pays to do your research. Or you can have someone else do it for you, which is where we come in. We'll guide you through the need-to-know info when it comes to making one of the heftier home purchases you'll be faced with. Then you'll be ready to spread your wings and fly over to our guides on the best Samsung TVs, best LG TVs, best Sony TVs, and best Panasonic TVs for gaming, as well as perusing the best 4K TVs for under $500 and our roundup of the best 4K TVs for gaming, including the 55" TCL 55R617 4K TV. But let's not get too ahead of ourselves. Here's everything you need to know before opening your wallet.


Screen size

The size of a TV screen is measured diagonally and, based on most living rooms today, you'll want a screen of around 55 inches at minimum. The general rule for how far back your sofa should be from the telly is three times the size of your TV. However, 4K TVs pack four pixels into the space that a 1080p TV uses for one, so you can scoot up closer to the screen without losing picture quality, as it displays significantly more detail. Twice as close, in fact: the ideal viewing distance drops to between one and one and a half times the size of your TV, so you should be sitting between 55 and 82.5 inches away from a 55-inch 4K TV. Science!
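If you want to run the numbers for your own living room, the rule of thumb above boils down to a quick calculation. This is just a sketch of the guideline multipliers quoted in this article (1x to 1.5x for 4K, 3x for 1080p), not a formal standard:

```python
def viewing_distance_inches(screen_size_inches, is_4k=True):
    """Return (min, max) recommended viewing distance in inches.

    Rule of thumb: roughly 3x the screen size for a 1080p TV,
    1x to 1.5x the screen size for a 4K TV.
    """
    if is_4k:
        return (screen_size_inches * 1.0, screen_size_inches * 1.5)
    return (screen_size_inches * 3.0, screen_size_inches * 3.0)

# For a 55-inch 4K TV:
low, high = viewing_distance_inches(55)
print(f"Sit between {low:.1f} and {high:.1f} inches away")
```

For a 55-inch 4K set that gives the 55 to 82.5 inch range mentioned above.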

Screen: LCD vs LED vs QLED vs OLED

While it may seem confusing at first glance, there are really only two of these you need to worry about: LED and OLED. LED and LCD refer to the same thing in today's jargon, and QLED is a snazzier version used by Samsung that aims to compete with the alternative, OLED.

LCD (liquid crystal display) screens use a backlight system to create images and, nowadays, the means by which they do so is with LEDs (light-emitting diodes). The nature of the technology means that the screens aren't great when it comes to darker images, struggling to achieve optimum black levels. The images lack the depth of OLED, but there are options available that look to improve the contrast ratio of these TVs. However, you can expect the price to start climbing along with the picture quality.

The majority of LCD TVs use edge-lit local dimming, meaning that the LED placement is along the edge of the screen. This can vary from all four edges, two opposite sides, or just the top or bottom. Areas of the picture can be dimmed, but because the number of LEDs is fairly sparse, it's debatable as to how effective this method of local dimming can be. You don't want it to be barely noticeable, but you want to avoid having large portions of the screen dimmed, adversely affecting picture quality. Because edge-lit screens don't have many LEDs, you'll find them to be more energy efficient and to have fairly slim profiles. 

Cheaper LCD TVs utilise back-lit dimming, meaning that rather than individual areas of the screen dimming, the entire image brightens or darkens as one. As you'd imagine, this isn't ideal, and it's best to opt for edge-lit of the two, which is fast becoming the new standard.

The best option for LCD TVs, however, is full-array local dimming, with the screen housing a grid of LEDs that can lighten or darken in zones. You may notice a halo, or blooming, effect as the light from a brighter zone bleeds into a darker area next to it, but this is the best - and the priciest - option when it comes to LCD TVs. The sets won't be as thin or as energy efficient as the edge-lit option, but if you want to enjoy the full benefits of HDR, which we'll address further down, it's worth spending the extra money.

QLED (Quantum dot Light Emitting Diode) is a fancier LCD option being touted by Samsung as a rival to OLED. It aims to marry the best features of LCD TVs - brightness and colours - with the best of OLED - true black and superior contrast - although they're not quite the finished article yet. The quantum dots still need a backlight for now, but further down the line, they'll be able to emit their own. QLED is probably the best option for vibrant colours and luminosity, but blooming is still an issue, so the picture still won't offer deep blacks like an OLED would. Plus, the tech inside means that the sets won't be as thin as their OLED counterparts. 

OLED (organic light-emitting diode) screens produce their own light and can be shut off, pixel by pixel, for a true black, which gives great contrast due to the lack of blooming. Unlike LCD TVs, OLEDs have better viewing angles because image quality isn't lost when the screen is viewed from the side. On the downside, they don't necessarily match the brightness of some LCD sets, and they are more expensive. There's also a chance that OLEDs will run the risk of burn-in, or ghost images, over time but that remains to be seen. 


HDR

HDR (high dynamic range) is a must for 4K TVs, and newer models should have this feature as standard. If it doesn't say HDR, don't bother with it. HDR offers increased colour, brightness, and contrast over standard dynamic range, and there are a few different formats to consider, although the main two are HDR10 and Dolby Vision.

HDR10 is the basic version, offering 10-bit colour depth compared to the 8-bit colour depth of standard dynamic range TVs. Because manufacturers can implement the technology without incurring any fees, this is the most widely adopted of the two formats. It also works in the simplest way, relaying metadata at the start of a video that essentially tells the TV how to display it in HDR.
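That jump from 8-bit to 10-bit is bigger than it sounds. Shades per colour channel double with each extra bit, and the total palette is the per-channel count cubed across red, green, and blue - a quick back-of-the-envelope sketch:

```python
def total_colours(bits_per_channel):
    """Shades per channel is 2**bits; the total palette is that
    number cubed, one factor each for red, green, and blue."""
    shades = 2 ** bits_per_channel
    return shades ** 3

print(total_colours(8))   # 16,777,216 (~16.7 million colours)
print(total_colours(10))  # 1,073,741,824 (~1.07 billion colours)
```

So 10-bit colour gives you roughly 64 times as many colours as 8-bit, which is where HDR's smoother gradients come from.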

HDR10 Plus is Samsung's answer to Dolby Vision: instead of relaying metadata once at the start of a video, it's done on a scene-by-scene basis. Both HDR10 Plus and Dolby Vision use dynamic metadata, allowing for greater picture quality than HDR10's static metadata.

Dolby Vision supports 12-bit colour depth, but manufacturers need to pay licensing fees to use it, so it doesn't pop up as often. You can get TVs that offer both HDR10 and Dolby Vision, so if you'd rather not bet on which format wins the battle of tech adoption for HDR, these sets will future-proof you for either outcome.

Refresh rate

The refresh rate refers to how many times per second the on-screen image refreshes; the higher it is, the smoother motion looks. It's particularly important if you're planning on playing video games on your TV - for which you'll also want low input lag (the time it takes for the image to travel from your console to your TV).

The standard refresh rate is 60Hz - that's 60 times per second - but you'll want a native 120Hz refresh rate. Be wary of anything touting an 'effective refresh rate' with higher numbers. This method uses motion-smoothing post-processing and can give your video output that undesirable 'soap opera' effect. Stick to a native 120Hz and you'll be fine.
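To put those Hz figures in gaming terms, the refresh rate sets how long each frame stays on screen. A quick sketch of the arithmetic:

```python
def frame_time_ms(refresh_hz):
    """Milliseconds between screen refreshes at a given refresh rate."""
    return 1000 / refresh_hz

print(frame_time_ms(60))   # ~16.7 ms per refresh
print(frame_time_ms(120))  # ~8.3 ms per refresh
```

A native 120Hz panel refreshes every ~8.3ms instead of every ~16.7ms, which is why fast motion looks noticeably smoother.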

HDMI input

Ideally, you'll want to find yourself a TV with at least four HDMI ports that support HDMI 2.0, although HDMI 2.1 is the newest version. HDMI 2.0 can support 4K video at up to 60fps compared to HDMI 1.4's 30fps, so it's a given that it's the minimum you'll want for your Ultra HD telly. HDMI 2.1 supports 4K video at up to 120fps, and 8K video at up to 60fps, but it's still very early in its rollout. It's backwards compatible with devices that use older HDMI versions, but for most people, this isn't something that needs to be on the list of must-have features, although it's something to consider if you like to be on the cusp of new technology and want to future-proof your setup.
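The version differences above can be captured in a small lookup. This uses only the 4K frame-rate figures quoted in this article, so treat it as a cheat sheet rather than an exhaustive reading of the HDMI spec:

```python
# Maximum 4K frame rate by HDMI version, per the figures above.
HDMI_MAX_4K_FPS = {"1.4": 30, "2.0": 60, "2.1": 120}

def supports_4k_at(version, fps):
    """True if the given HDMI version can carry 4K video at the given frame rate."""
    return HDMI_MAX_4K_FPS.get(version, 0) >= fps

print(supports_4k_at("1.4", 60))  # False - hence 2.0 as the minimum
print(supports_4k_at("2.0", 60))  # True
print(supports_4k_at("2.1", 120)) # True
```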

HDCP (High-bandwidth Digital Content Protection) is something else to keep an eye out for when buying a 4K TV. HDCP is basically an anti-piracy measure, but you can run into problems if your 4K TV doesn't support HDCP 2.2. Put simply, tellies with only HDCP 2.1 may block the signal from HDCP 2.2 devices (consoles, Blu-ray players, etc.). Older and cheaper TVs are the ones most prone to being stuck on HDCP 2.1, so be sure to check the specs before you buy.

You're now equipped to go forth and find yourself the 4K TV of your dreams. Or to the extent that your wallet allows, at least. If you're a PC gamer, you can still use a 4K TV with your rig; just check out our guide on how to use a 4K TV with your PC. And once you've got your fancy new Ultra HD TV set up, don't forget to peruse our recommendations for the best HDMI cables for gaming.

Some online stores give us a small cut if you buy something through one of our links. Read our affiliate policy for more info.  

Shabana Arif
Shabana was born looking like a girl wearing a Pikachu hoodie, so when such things became popular, she fitted right in. She writes guides, reviews and features for GR+ when she isn't screaming at Dark Souls 2 on YouTube.