After the term “4K” became common on UHD TVs, many phone, tablet, and laptop reviews started referring to 1440p displays, like the one on the Galaxy Note 4, as “2K”. “2K” sounds a lot bigger than the “1080p” we’ve been using for ages, right?
Except it isn’t. By the film industry’s own convention, Full HD (1920×1080) IS “2K”, and QHD (2560×1440, four times the pixel count of 1280×720 HD) is NOT: it’s closer to 2.5K. More to the point, “K” is a rubbish term for consumer electronics. It comes from digital cinema, where aspect ratios and resolutions aren’t standardized or exact, so rounding the horizontal resolution to the nearest thousand makes sense there. Even if you stretch “2K” to cover 1440p QHD on the grounds that 2560 is less than 3K, you’re using a term meant to sound impressive that is factually indistinguishable from the unimpressive norm.
| Format | Resolution | Aspect ratio | Total pixels |
| --- | --- | --- | --- |
| Digital Cinema Initiatives 2K (native) | 2048 × 1080 | 1.90:1 (256:135, ~17:9) | 2,211,840 |
| DCI 2K (CinemaScope cropped) | 2048 × 858 | 2.39:1 | 1,757,184 |
| DCI 2K (flat cropped) | 1998 × 1080 | 1.85:1 | 2,157,840 |
| 1080p HDTV (Full HD) | 1920 × 1080 | 1.78:1 (16:9) | 2,073,600 |
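The pixel counts in the table are just width times height, which makes the “2K” overlap easy to check yourself. A quick sanity check in plain Python (resolutions taken from the table above):

```python
# Verify the table's pixel counts and aspect ratios: width x height.
resolutions = {
    "DCI 2K (native)": (2048, 1080),
    "DCI 2K (CinemaScope crop)": (2048, 858),
    "DCI 2K (flat crop)": (1998, 1080),
    "1080p HDTV (Full HD)": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels, aspect {w / h:.2f}:1")
```

Note how close the 1080p row sits to the DCI rows: every entry is roughly two megapixels, which is exactly why calling 1440p “2K” while excluding 1080p makes no sense.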
Professional video cameras record in 5K or 6K, and the editor crops down from there. Consumer electronics, by contrast, are standardized, so it’s nonsense to use a term meaning “about 4,000 pixels wide, give or take a couple hundred” for a TV with a standardized resolution of 3840×2160, or “about 5,000 pixels wide” for an iMac at exactly 5120×2880. Just say 2880p, 2160p/UHD, 1440p/QHD, 1080p/FHD, and 720p HD, like we always have. No one is ever going to call FHD “2K” because “1080p” is too deeply ingrained, so we need to give up “4K” too. It’s confusing to use a second term that’s nearly double the previously common one just because it refers to the other axis: it makes people think 4K is 4× 1080p in every direction, or 16× the pixels, when it’s really only 4×.
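The 4× vs. 16× confusion above comes down to linear scale versus area. A minimal sketch of the arithmetic, using the standard UHD and FHD resolutions:

```python
fhd = (1920, 1080)   # 1080p Full HD
uhd = (3840, 2160)   # "4K" UHD

# UHD doubles each axis relative to FHD...
width_ratio = uhd[0] / fhd[0]    # 2.0
height_ratio = uhd[1] / fhd[1]   # 2.0

# ...so the total pixel count quadruples; it does not grow 16x.
pixel_ratio = (uhd[0] * uhd[1]) / (fhd[0] * fhd[1])

print(width_ratio, height_ratio, pixel_ratio)  # 2.0 2.0 4.0
```

A 16× jump would require quadrupling each axis (7680×4320), which is what the industry calls 8K, not 4K.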