Last month Intel and Micron announced they are developing 3D XPoint memory tech, which will help to power 8K gaming in the future. But when will 8K – or even 16K – hit the mainstream? PCR asks the experts.
When asked how long until 8K becomes the norm, AMD’s gaming scientist Richard Huddy told PCR: "It will do, I’ve got no doubt about that, but we’re a little while from it. We’re only just at the stage where we can run a 4K display at a reasonable refresh rate.
"If you want to run a high-end game that has some fairly aggressive graphics settings on a 4K display, then you need a couple of Fury Xs or something comparable to that to give you 60fps that provides a quality gaming experience.
"Now, if you’re going to go for 8K gaming, you’re going to double up the pixels in both width and height. Four times as much horsepower is going to be needed.
"I think what we’ve done with HBM1… we’ll continue to improve upon it in future generations. So there’s great opportunity for us to go beyond 4K gaming. But it will still take quite a lot of graphics horsepower before we can deliver a really good experience there."
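Huddy’s arithmetic is easy to check: doubling a display’s resolution in both width and height quadruples the pixel count, and hence roughly quadruples the rendering work. A quick sketch (the resolution figures are the standard 16:9 pixel dimensions, not numbers from the interview):

```python
# Pixel counts for the standard 16:9 resolutions, to check the
# "doubling width and height means four times the work" claim.
RESOLUTIONS = {
    "4K":  (3840, 2160),
    "8K":  (7680, 4320),
    "16K": (15360, 8640),
}

def pixels(name):
    """Total pixel count for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

ratio_8k = pixels("8K") / pixels("4K")    # 8K has 4x the pixels of 4K
ratio_16k = pixels("16K") / pixels("4K")  # 16K has 16x the pixels of 4K
print(ratio_8k, ratio_16k)
```

So a rig that needs two Fury X-class cards for 60fps at 4K would, all else being equal, need roughly four times that horsepower at 8K.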
Intel and Micron’s 3D XPoint tech claims to be 1,000 times faster than NAND, which could drastically improve the performance of games in the years ahead.
When asked if 8K is really the future of PC gaming, Intel’s UK channel sales manager Matt Birch said: "It is one future, as there are a lot of new experiences the technology can enable. We believe that technologies like 3D XPoint are critical to these new experiences though and we will continue to innovate."
"When you get to 16K, improving on that buys you nothing – that’s kind of a done deal then. But if you think about how much extra horsepower that is, that’s a lot from where we are at the moment."
Richard Huddy, AMD
So where does 16K gaming come into all of this? Should we even be thinking about that yet?
Huddy explained: "I did see some stuff from the most recent Google I/O, where they were talking about a kind of ideal graphics system being able to support 16K per eye.
"Oh my!" he gasped. "They have some pretty serious aspirations there. And I understand why. If you look at the human visual system, it’s typically claimed that a person with 20/20 vision needs something like 8K vertically and 16K horizontally to get everything right.
"There’s a thing in computer technology called the Nyquist limit, where if you want to represent a signal, then you have to sample at twice its frequency in order to represent that signal faithfully. This means if the human eye goes up to 8K by 8K, then really the display needs to go up to 16K by 16K, so that you don’t get any strobing or other effects that would emerge from rendering at just 8K by 8K.
"That’s a phenomenon that comes from the fact that although the image is perfect, the dynamics of the image aren’t quite perfect at 8K. When you get to 16K, improving on that buys you nothing – that’s kind of a done deal then. But if you think about how much extra horsepower that is, that’s a lot from where we are at the moment.
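The Nyquist limit Huddy refers to can be illustrated numerically: a signal sampled below twice its frequency becomes indistinguishable from a lower-frequency one, which is the aliasing (“strobing”) he describes. A toy example of my own, not from the interview:

```python
import math

def sample(freq_hz, rate_hz, n):
    """Sample a sine wave of freq_hz at rate_hz, returning n samples."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

# A 6 Hz sine sampled at 8 Hz violates the Nyquist limit (we would need
# at least 12 Hz). The samples are identical, up to a sign flip, to those
# of a 2 Hz sine at the same rate: the signal has aliased.
undersampled = sample(6, 8, 16)
alias        = sample(2, 8, 16)
aliased = all(abs(a + b) < 1e-9 for a, b in zip(undersampled, alias))
print(aliased)
```

The same logic applies spatially: if the eye resolves detail up to some spatial frequency, the display must offer twice that resolution to avoid aliasing artefacts.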
Huddy went on: "If I’m claiming we can do 4K gaming for a pair of eyes, one 4K display on a high-end gaming rig with a couple of graphics cards, if you simply scale up to 16K per eye, then you really are doing a total of 16 times as much work – four times as much on x and four times as much on y, and you’re maybe doubling the eye’s load as well.
Experts predict when 8K and 16K will hit the mainstream (left to right): AMD’s Richard Huddy, Intel’s Matt Birch and YoYoTech’s CK
"You’ve got something like 30 times as much horsepower required for that and we’re already using two of the highest end graphics cards that we can build.
"We have a good future, and will 8K be in there? Yeah, absolutely it will. And I think when we get to 16K we’ll finally say: ‘Okay, that’s a done deal. We’ve done everything that the human visual system needs for folks with 20/20 vision.’ But we’ve got a way to go before we’re ready to genuinely supply that."
So 16K is the limit, but when will we get there? How many years will we have to wait until we get such extreme resolutions?
"You’ve got a total of 32 times as much horsepower that’s required, which is two to the power of five. That takes us about a decade or maybe 12 years to do," Huddy said. "That’s five doublings. That’s not bad. It’ll be a done deal. That’s only taking it to the visual quality we can do right now.
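That timeline is simple to sanity-check: 32 times the horsepower is five doublings (2⁵ = 32), and at a doubling roughly every two years that comes to about a decade. A sketch of my own arithmetic; the two-year doubling cadence is an assumption, not a figure from the interview:

```python
import math

WORK_MULTIPLIER = 32   # 16K-per-eye VR vs. a single 4K display today
DOUBLING_YEARS = 2     # assumed GPU performance doubling cadence

doublings = math.log2(WORK_MULTIPLIER)  # number of doublings needed
years = doublings * DOUBLING_YEARS      # rough time to get there
print(doublings, years)
```

Stretch the doubling cadence to around 2.4 years and the same five doublings land on Huddy’s upper estimate of 12 years.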
"So the visual quality of each pixel will be at the kind of standard that we can achieve now – and the display will be as good as the eye could ever need. But we can do better with each pixel. In a dozen years’ time, I think we’ll be able to offer something like a VR-style environment which is very close to, if not actually hitting, photo-real. And that’s really nice.
"For the last 20 years I’ve been giving talks about photo-real rendering and saying it’s about ten years away. But now I can actually nail it down and say we should be able to hit that in 12 years’ time on a VR system at 16K. Compromise some small amount and take it down to 4K or 8K, and it’s clear that photo-real rendering is actually something that’s around five years away."
"8K and 16K sound good, but to be honest even 4K is not yet fully adopted by everyone – I don’t think anything beyond that would be of benefit at this stage."
CK, YoYoTech
So according to AMD’s predictions, 16K will become the norm in the year 2027, while photo-realistic rendering should be with us as early as 2020.
Intel says 8K (and eventually 16K) adoption will depend on a number of factors.
Matt Birch added: "We have to look at a few drivers of new technologies. The adoption will depend on the ecosystem readiness, including hardware, content and the market adoption rate. Intel works closely with all industry players to ensure that IA can support and bring the best technology to end users when the industry is ready.
"First, there need to be devices in the market to deliver beyond 8K, from monitors and TVs to the cables and graphics. Secondly, game developers and the film and movie industries will need to deliver content to create demand. Until these two items are cost efficient and readily available, 8K and beyond will not be the norm."
MD of gaming system builder YoYoTech, CK, says that while it’s promising for the future, 4K has not yet taken off in a major way, so the market should focus on growing that first.
"It’s imaginary at this stage," he said. "8K and 16K sound good, but to be honest even 4K is not yet fully adopted by everyone – so let’s get some content going first.
"I’m hoping in the coming months, with Skylake and DDR4, things will improve, but at the moment 4K hasn’t really taken off. It’s really good technology to have and shout about, but it’s something that’s still in the distant future.
"I don’t think anything like that would be of benefit at this stage."
What do you think – is 16K the future? And do we even need it?