
Project Scorpio will be the first console to support AMD’s FreeSync technology

Last week, we got to see the final specifications for Microsoft's upcoming console, Project Scorpio. The hardware and its capabilities were all nicely detailed in an exclusive Digital Foundry report, but it seems there are still a few surprises coming to light. It turns out that Scorpio will also be the first console to support FreeSync technology, meaning screen tearing and frame stutter could be eliminated for console gamers.

Obviously, there is a bit more to it than that. Those looking to take advantage of variable refresh rate technology will need a supported display. Scorpio specifically supports FreeSync variable refresh rates over HDMI; DisplayPort output won't be coming to consoles. Bear in mind that fewer monitors on the market support FreeSync over HDMI than support it over DisplayPort.

So that narrows down the list of monitors that could be used with Scorpio, though it can be done. The console will also support AMD's FreeSync 2 update, which pairs variable refresh rates with HDR support. That said, most console buyers are unlikely to be playing on a gaming monitor, so will Scorpio be able to use this technology on TVs?

The answer to that would be yes, though it will take some time for these displays to start rolling out. Scorpio supports the HDMI 2.1 standard, which was only announced at CES back in January, so it will be a while before TV makers catch up and start including the updated HDMI 2.1 port on their sets. With that in mind, FreeSync is unlikely to be a huge selling point for Microsoft's new console, but it is exciting to see variable refresh rate technology start to roll out to more mainstream markets.

KitGuru Says: With variable refresh rate technology, console games should appear to run more smoothly, thanks to the removal of screen tearing and the stutter caused by poor frame times. However, with so many people having already made the jump to 4K, I doubt many will be willing to upgrade their TVs yet again in a year or so just to take advantage of it via HDMI 2.1.
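To illustrate why variable refresh matters, here is a minimal, purely illustrative Python sketch – the frame times and the 60 Hz figure are made up for the example, not Scorpio measurements. On a fixed-refresh display every frame has to wait for the next refresh tick before it can be shown, so uneven frame times turn into visible judder, whereas a variable refresh rate display simply refreshes the moment each frame is ready.

```python
# Toy model: when does each rendered frame actually become visible?
# Illustrative only -- the frame times below are invented, not measured on Scorpio.

FIXED_REFRESH_HZ = 60
TICK = 1000.0 / FIXED_REFRESH_HZ              # ~16.67 ms between fixed refreshes

# A game rendering at an uneven ~50 fps: some frames take longer than one tick.
frame_times_ms = [15.0, 22.0, 18.0, 30.0, 16.0, 21.0]

def fixed_refresh_display(frame_times):
    """Frames only become visible on the next 60 Hz tick after they finish."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft                                # frame finishes rendering at time t
        shown.append((t // TICK + 1) * TICK)   # wait for the following refresh
    return shown

def variable_refresh_display(frame_times):
    """FreeSync-style: the display refreshes the moment a frame is ready."""
    shown, t = [], 0.0
    for ft in frame_times:
        t += ft
        shown.append(t)                        # no waiting for a fixed tick
    return shown

def gaps(times):
    return [round(b - a, 2) for a, b in zip(times, times[1:])]

print("fixed 60 Hz gaps between visible frames:", gaps(fixed_refresh_display(frame_times_ms)))
print("variable refresh gaps:                  ", gaps(variable_refresh_display(frame_times_ms)))
# The fixed display flips between ~16.7 ms and ~33.3 ms gaps (perceived stutter),
# while the variable refresh display's gaps simply track the actual frame times.
```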



Comments

  1. Glad to see this. I'm never going to buy the Scorpio (since it's sounding like a giant waste of money), but this could push Freesync far enough into the market that Nvidia would have to support it.

  2. Gsync is a better version of Freesync, and Nvidia would never cater to this market, but it would be nice if they allowed some support.


  4. Never cater to this market? What do you think was in the very first Xbox? Yep, Nvidia graphics.

  5. It's interesting because NVIDIA cards don't work well with Freesync monitors since they lack driver support, so what will happen when someone tries to connect a computer using an NVIDIA card to a TV with HDMI 2.1?

    Does that mean NVIDIA will release driver support for Freesync? And what will happen to G-SYNC then?

  6. Yeah, but we are on about Freesync 2 anyway, which is very much better than the current Gsync implementation since it actually allows proper HDR pass-through.

  7. Gsync also allows 10-bit HDR, so that's not true.

    http://wccftech.com/g-sync-hdr-monitors-available-q2-2017/

  8. I was referring to Freesync monitors, not consoles.

  9. I said the current Gsync implementation, which does not do so either at the moment. It also requires new Gsync monitor models, not existing ones, to do so.

    The point was that the advantages current Gsync has over Freesync would be removed with the update to Freesync 2, which is what this relates to.

  10. There is no negative impact to using Nvidia cards with a Gsync monitor? It just doesn't have any sync software running; it is a pass-through system. So HDMI 2.1 will also not do anything different, since that is also pass-through.

  11. I know you want to be right, but the fact remains you dug a hole not knowing all the facts. Anyway, they will both require new hardware to be used; the only upside of Freesync 2 is the HDMI port upgrade. Otherwise they're both virtually the same, except Gsync also triggers at every FPS input, not just within a range such as 30–50 Hz or 30–110 Hz. Gsync is also handled by the monitor and not the GPU drivers, so it adapts quicker with less of an effect on response rates or input lag.

    Obviously you pay for what you get. I've used both, and from my experience Gsync feels better overall. I just don't like the extra $150 price tag associated with it; it was worth it for me, but not necessarily for everyone.

  12. Sorry, but I haven't at all. I work for a technical company in this area. You compared it to Gsync and I stated that isn't true in regard to current Gsync. Don't change what was said.

    Freesync 2 will have less overhead than Gsync for HDR because the work is done directly from the source and not on the monitor. Current HDR transports (e.g. HDR10) require tone mapping twice – once from the application to the transport, and again from the transport to the native color space. With Gsync, that chain runs from the source to the Gsync unit and from that unit to the monitor, which adds input lag; Freesync 2 removes that process by doing it on the GPU. (A rough sketch of the two paths is included after the comments.)

    You want as little work on the monitor as possible, so the process should be done by the GPU, not the other way around. You might actually want to read up on how Freesync 2 works before claiming that Gsync response rate or input lag is lower.

    It also requires no extra unit, and games just need to call up the correct part of the AMD driver to get HDR to work. This call is made when the game starts, as Windows does not work in HDR regardless of Gsync or Freesync – it is sRGB only.

    Looking further ahead, AMD are exploring the possibility of a direct API being a factor, which would be faster than all of the above, but that requires Vulkan/DX12 to do all the work at a lower level first. So for now AMD will be offering the best HDR support of the two, if you want to talk about the latest versions.

  13. I feel sorry for the technical company then. They might want to find someone who actually does research and doesn’t change their story. Also you may want to look up how HDR is actually handled by software before hardware.

  14. Feel sorry all you want. I didn't change the bloody story. The words are very clear when I stated "current" – maybe you should read a dictionary on what that word means?

    Yeah, the GPU runs – surprise, surprise – off the software at driver level before anything gets to either the monitor or the Gsync unit, in which case the first stage is the driver (the GPU in basic terms). The Gsync unit adds a further piece of hardware on top of that.

    With Freesync 2 it happens from the source feeding the GPU, so that driver then processes it and feeds it direct to the monitor.

    If you then add the Gsync unit in, you are still going from the GPU/driver to the unit and then to the monitor. Not sure what you are missing, but you seem to have it completely back to front.

    Edit: We have both Gsync and Freesync 2 units in development and thus are well aware of the technology. We are not researching by looking up some Google links; we design products with them in.

  15. The monitor still has to process the frames regardless of Freesync or Gsync, except that on Gsync the drivers aren't telling the monitor what refresh rate to run at through the pipeline like they do with Freesync. Gsync offloads the extra processing work by adding another piece of hardware to handle the data directly with the monitor. There's more I could get into, but I'd rather just point to some sources.

    http://wccftech.com/amd-freesync-nvidia-gsync-verdict/
    https://www.rockpapershotgun.com/2017/02/02/freesync-vs-g-sync-revisited-freesync-2-is-coming/
    http://www.geforce.com/hardware/technology/g-sync/faq#q1

  16. You are indeed correct about the whole refresh rate side and the pipeline for that, but that isn't the same as what was being discussed in regard to HDR and how that is processed.

    That is the issue: it isn't published in detail anywhere yet because it's not released, but the basics are there. "This could be the really critical bit – measures to reduce lag on the monitor side of the equation. Latency is apparently an issue for the image processors in HDR displays when performing internal HDR tone mapping."

    That is the part that causes far more latency than whichever piece of software/hardware is providing the information on which refresh rate to use.

    Further to that, theoretically the driver on the GPU should still be faster at this process than the Gsync unit; however, because different monitors/TVs use different APIs themselves, the driver would have to be written to take advantage of that direct link. What Gsync does is force the monitor to use a specific system that Nvidia can then set their driver up to talk to better than it could talk directly to the monitor.

    With AMD having much more input into what Freesync 2 needs to offer, our understanding so far is that the advantage Gsync has is reduced when not in HDR mode, and that HDR mode will be faster for Freesync 2 regardless at this time, because of the direct processing system they are introducing.

    I don't think you are wrong, in that I believe you are trying to compare with the info you have and extrapolate from current-gen systems, but until you see what Freesync 2 and Gsync do with their latest versions, I am happy to say that Freesync 2 is certainly the default option we would suggest anyone look at generally, and we would currently avoid Gsync with HDR until Windows sorts out the processing there. Just a heads up, but they didn't actually sort out the problem with the Creators Update in regard to this specific element, and they have not provided us with a new time frame since.

  17. I don't disagree – HDR has much higher latency, due to the simple fact that it is pushing a higher stream of bits: 10-bit rather than the usual 8-bit. This added load causes a higher delay as the monitor or TV processes more data before posting it to the display, usually 4-2-4, or 4-4-4 on 12-bit Dolby Vision TVs.

    As far as default options go, Freesync could and should be widely adopted (as in all screens, even Gsync ones), but it isn't, since AMD is really the only hardware vendor that currently supports it, and only consoles and PCs are really using AMD. If Nvidia were in the current consoles they'd be forced to opt for Gsync solutions, which they're not, and that's a good thing for most consumers.

    I would much prefer Freesync 2 as it's much cheaper, but the question I have is whether the console will support HDR12 or Dolby Vision, which uses dynamic HDR instead of static HDR – as in, the metadata is tailored to each scene. Seeing a Dolby Vision movie on an OLED is intense; if you get the chance to watch one on an LG 4K OLED you'll know what I mean. I'm still waiting for them to drop to $700 so I can snag a nice TV.

    http://www.techhive.com/article/3074897/consumer-electronics/dolby-vision-versus-hdr-10-tv-a-format-war-and-more.html

  18. I don't think that Dolby Vision will ever move over to PC gaming in its current form due to licence requirements. HDR10 is also able to use dynamic metadata with HDMI 2.1; whether that happens for the console will be interesting. Samsung are calling it HDR10+.

    Amazon Prime and Netflix are meant to be supporting it, and it is set to become the latest standard for HDR, with others adopting Samsung's standard (which is royalty free). The UHD Blu-ray standard is also set to be updated with this new standard too.

    I can see the market for DV shrinking due to the costs. The biggest mix-up, though, is that Technicolor is looking at having their own standard, which I have been informed is in the works.
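As referenced in comment 12, here is a rough Python sketch of the two HDR paths being argued over in the thread: tone mapping twice via a transport format such as HDR10, with the second pass done by the display's own processor, versus the FreeSync 2 approach of tone mapping once, directly to the panel's native color space. The stage names and millisecond figures are invented placeholders for illustration, not measurements of any real Gsync or Freesync 2 hardware.

```python
# Illustrative-only sketch of the two HDR paths debated in the comments above.
# Stage latencies are invented placeholders, not measured values.

HDR10_STYLE_PATH = [
    # (stage, where it runs, illustrative cost in ms)
    ("tone map: game output -> HDR10 transport",            "GPU/driver",        0.5),
    ("transmit over the link",                              "cable",             0.2),
    ("tone map: transport -> panel's native color space",   "display processor", 15.0),
]

FREESYNC2_STYLE_PATH = [
    # The driver reports the panel's native gamut/brightness to the game,
    # so the game tone maps once, straight to what the panel can actually show.
    ("tone map: game output -> panel's native color space", "GPU/driver", 0.7),
    ("transmit over the link",                              "cable",      0.2),
    ("display directly, no second tone-mapping pass",       "display",    1.0),
]

def total_latency(path):
    """Print each stage and return the summed illustrative latency."""
    for stage, where, ms in path:
        print(f"  {ms:5.1f} ms  [{where}] {stage}")
    return sum(ms for _, _, ms in path)

print("HDR10-style double tone map:")
print("  total ~", total_latency(HDR10_STYLE_PATH), "ms\n")

print("Freesync 2-style single tone map:")
print("  total ~", total_latency(FREESYNC2_STYLE_PATH), "ms")
```

The point the toy numbers are meant to capture is the one made in the thread: the second tone-mapping pass sits in the display's image processor, so moving it onto the GPU removes that monitor-side step, whatever the real latency figures turn out to be.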