When Microsoft Corp. formally introduced its Xbox One video game console in 2013, it faced massive criticism from enthusiast gamers because the system’s graphics processing horsepower was considerably lower than that of Sony’s PlayStation 4. Apparently, the company has taken the criticism seriously and is trying to fix the issue.
The latest Xbox One software development kit (SDK) reportedly features a number of performance enhancements that can greatly improve speed in games, though they require additional work from game developers. Among other things, the new SDK enables better control over the use of ESRAM high-speed memory, which allows developers to greatly improve the performance of the console’s graphics processing unit.
Thanks to the performance enhancements implemented by the software giant, game developer Techland managed to ensure that its upcoming open-world zombie survival game, Dying Light, will run at 1920×1080 resolution at 30 frames per second on both Microsoft’s Xbox One and Sony’s PlayStation 4.
“We were using the latest version just prior to the new release that came out on December 12,” said Maciej Binkowski, a lead game designer at Techland, in an interview with GamingBolt. “In terms of advantages, the main thing is just how much the ESRAM control has improved. The new API allows you to do a lot more with the ESRAM, things devs have always wanted to do but were not easily accessible. This together with better tools (performance investigator for Xbox) allowed us to really improve performance and tweak ESRAM usage.”
Performance Investigator for Xbox is a software tool that helps game developers analyze and debug their code to improve the performance of Direct3D applications. Therefore, while the performance improvements are real, they depend on the work of software makers, not on new capabilities of the console itself. As a result, truly demanding games will continue to suffer from the relatively low graphics processing horsepower of Microsoft’s Xbox One.
KitGuru Says: Thanks to Microsoft’s constant work on performance improvements, the gap between the PS4 and the XB1 is getting narrower. Still, it is unclear how significantly the company can boost the performance of the Xbox One going forward…
Yet ESRAM cannot hold a single 1080p frame in 32MB…
Funny how in the 7th gen everyone was like “I don’t buy a console for graphics” or “you can’t see above 30” or “you’re just elitist”, yet now PS and Xbox fanboys literally go at each other non-stop about graphics and framerates.
GG no RE
The latest game will be able to run at “1080p at 30 FPS”. Need I say more?
Half true, Paul. With a good render (e.g. a CGI movie) you can’t tell 30 from 60FPS, but when it’s a quick real-time job by a graphics card it matters.
But it’s also about image quality: things such as AA, post-processing, bloom and Nvidia’s 3D software, for example. And those effects were thrown out the window when these “next-gen” consoles decided they needed a good frame rate above 720p, all while costing less than a PC.
And yet I can still build a PC for a little less than £300 that can run games at higher settings than the console standard at 1080p, at 40-60fps depending on the game…
Careful, if you start throwing around actual facts like that you are likely to have hoardes of unwashed console worshippers setting fire to your house.
People say this a lot
Would you be able to post a quick PC build for under £300 that can handle a recent game, not something as old as Skyrim?
As if it were actual fact… It was uninformed, myth-spreading BS found around troll forums.
1920 × 1080 = 2,073,600 pixels
24-bit color information for each pixel:
2,073,600 × 24 = 49,766,400 bits = 6,220,800 bytes ≈ 5.93MB, which is way less than the 32 M(ega)B(ytes)
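For anyone who wants to check those numbers themselves, here is a quick back-of-the-envelope script (a sketch in Python; the figures are just the arithmetic above restated):

```python
# Sanity check: does one 24-bit 1080p frame fit in 32MB of ESRAM?
width, height = 1920, 1080
bits_per_pixel = 24                    # 8 bits each for R, G, B

pixels = width * height                # 2,073,600
frame_bits = pixels * bits_per_pixel   # 49,766,400
frame_bytes = frame_bits // 8          # 6,220,800
frame_mb = frame_bytes / (1024 ** 2)   # ~5.93

esram_mb = 32
print(f"One frame: {frame_mb:.2f}MB of {esram_mb}MB ESRAM")
print("Fits:", frame_mb < esram_mb)    # True
```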
Lol, true. I actually own an X1 for the exclusives. Having major arguments with my friends atm: they game on PC but believe consoles have 6.5GB of usable RAM, throw some technical words around, and say unified RAM is better and that streaming from RAM on PC (aka what Shadow of Mordor did perfectly fine when it didn’t have enough VRAM) causes major performance issues which consoles don’t… Lol. If fanboys want to start throwing the proverbial, they can. Facts are facts. All said and done, Bayonetta 2 and Sunset Overdrive are my GOTYs, and both are not on PC. Yet I just invested in a second 970.
So, will PS4 games be allowed to perform slightly better now? …Or are they still going to shove parity down our throats?
Microsoft and their bottomless pockets.
Actually it is possible. I heard that only the Z-buffer and some shaders are stored there, not textures, but I might be wrong.
£300? Ha ha, good luck with that. Maybe £500-£600, yeah. I don’t actually care for FPS; I’m happy with 30 if it’s high detail, good AA etc. 60fps is not a must for me. And unfortunately for 60fps fanboys, most game devs are going down the route of 30FPS.
Even then, 32MB is not much at all. Intel have decided to put 128MB in their extreme CPUs in an attempt to future-proof them, and 64MB was their minimum; 32MB is going to achieve f all.
But that is different: consoles are programmed at a lower level and can be optimized more easily, while PCs use a high-level API which has far more overhead and no special treatment at the hardware level, as the API treats all hardware as equal. That is why you can get away with running Far Cry 4 on medium/high settings on an Xbox One, yet a similarly specced APU can’t run that game at anything higher than low/medium settings at playable framerates.
That’s my point: so many people say they can do it for the price of an Xbox One/PS4, but I’ve never seen someone actually do it.
Don’t get me wrong, I’m a PC gamer myself, and I know that I will have to spend about £600 on a new PC, but I use it for virtual machines and photo and video editing too, so I’m happy to pay it.
Actually, that is just the APU. I can build a PC for £450 that could run FC4 on those settings at 1080p no problem. Also, the API is not any better: it uses DX, and OpenGL in the PS4’s case. The X1 has a variant of Windows 8.1 which, btw, uses less RAM with Steam etc. open than the console OS does. The Xbox One uses PC architecture and the same renderer and API as Windows does. Similarly specced. Also, the PS4 has an extremely low fill rate; the GDDR5 is absolutely horrendous for CPU-bound games like AC (hence why it performs worse), and while its GPU memory is rated at 176GB/s, it can only ever get 120 out of it due to the CPU bottleneck.
“That is why you can get away running Far Cry 4 on medium/high settings on a Xbox One”
LOL NOPE. Where the hell do you get your information? Like, god, it must be so convenient to be that stupid. I don’t even want to type anything else to you, because you’re in such goddamn denial that nothing will change that idiotic stance you have.
Hey, chill out, console fanboy; there is no reason to act like an inbred rage monkey. I don’t have a console, so if you want to act like some sort of superior elitist carrying the “PC Gaming race” banner, go play that with somebody else. Thank you.
A mid-grade PC posts better numbers than the consoles. Many have tried to claim that the age of PC gaming is over, when in fact it is just coming into its golden age; you will see a huge surge in PC gamers in the next 2-3 years. And to address the notion that there is no difference between 30 FPS and 60 FPS if both are rendering 1080p: take your 30 FPS into an MP shooter, play against someone running 60+ FPS, and you will change your tune very quickly.
Here you go:
https://www.youtube.com/watch?v=0KUpRXRGVHU
I can tell when CGI is 60 or 30; there’s still a difference. It depends how they render the video. Also, I’m not sure if they render in 30 FPS or 24 FPS; either way, real-time rendering and pre-rendered video are both still video output. For example, YouTube has now implemented 60 FPS video playback, and every YouTube video is pre-rendered and then uploaded to their servers.
You came in like a wrecking ball
It’s 24; the same applies in theaters. DVDs at home tend to be 29.97, odd number lol.
I’ve noticed some games default to 59 FPS for VSync. Very strange indeed.
“60FPS Fanboys…” I believe that is the first time I’ve ever heard the term applied in such a way! As soon as people feel cornered they throw out the ‘fanboy’ insult as if that magically changes everything xD
What I find amusing is that 30fps is now just a target goal; in other words, it will dip (like ASSCreed Unity). Enjoy gaming at sub-30FPS, because that will be “cinematic”!
Maybe the next consoles will reach that mythical 1FPS for the literature feel!
Windows OS $100, controllers $50 (I’m guessing $50). You are now at $550 without taxes.
Or they could be people who can do basic arithmetic. 1920 width x 1080 height = 2,073,600 pixels. With a colour depth of 24 bits per pixel that is 49,766,400 bits. Possibly you have not understood that bits are not the same as bytes. That many bits is 6,220,800 bytes. Or 5.93MB, if you wish. And that is a lot less than 32MB which I hope you appreciate. It is rather sad that you adopt something as fact just because it pleases you, when just a basic check would show it to be false.
Also, the word you want is “hordes”. A hoard is what a dragon sits on.
CPU: A10-7850k £110.52
GPU: Integrated Radeon R7
MoBo: Gigabyte F2A58M-HD2 – £34.30
PSU: Corsair CX430M – £32.99
RAM: Avexir Core (2X4GB) – £58.99
HDD: Seagate Barracuda 1TB – £40.30
Case: Aerocool QS-102 – £22.99
Total: £300.09 (prices based on the best prices from Amazon, Ebuyer, Scan & Overclockers; does not include delivery costs. Quick sum check below.)
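For what it’s worth, a trivial Python check of that total (values copied from the list above):

```python
# Verify the build total from the listed best prices (GBP)
parts = {
    "A10-7850K (CPU/GPU)": 110.52,
    "Gigabyte F2A58M-HD2": 34.30,
    "Corsair CX430M": 32.99,
    "Avexir Core 2x4GB": 58.99,
    "Seagate Barracuda 1TB": 40.30,
    "Aerocool QS-102": 22.99,
}
total = sum(parts.values())
print(f"Total: £{total:.2f}")  # £300.09, delivery not included
```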
Firstly, the APU. Yes it’s AMD, and yes it’s an APU, but you’ll be surprised by the power this thing has. I’ve personally seen the 7850K’s little brother, the 7700K, in action, and it’s a beast for £100. The 7850K features a higher CPU clock speed @ 3.7GHz and 2 additional GPU cores over the 7700K for some extra horsepower.
Its ability to overclock on both fronts, alongside the use of AMD Mantle, means a lot of extra juice can be squeezed out of it. Looking at realistic benchmarks from BF4 – pretty much the standard in game benchmarking – you’ll be at a comfortable 35-40fps @ 1080p using medium settings (comparable to the quality of console settings; don’t hate, this is a budget build, not HEDT) with the occasional yet unnoticeable dip into the high 20s, and an easy 60fps @ 720p.
Since it’s an APU, you know you’re not going to be hit with a CPU-GPU bottleneck either, so that’s always a bonus.
Moving on, the motherboard is nothing too special. I chose a micro-ATX board because you don’t need to spend extra just for additional GPU slots when you’re not even using the provided ones in the first place. It also allows for the use of a smaller case, which means anyone switching from a console can potentially still treat it as one.
Again, a small PSU is used to power it all. 430W is probably still overkill tbh, but hey, at least it allows room for an upgrade in future if you choose to buy an Nvidia GPU. ’Tis modular, so you’re not going to be fucked over with loose wires everywhere.
I selected Avexir Core 2x4GB 1600MHz (anything less and you might as well not bother). Avexir is probably a brand you don’t know, but they’re starting to really make a name for themselves, and these memory modules are by no means unreliable. Plus they have lights. Yeah. Looks nice. And such.
1TB drive, pretty standard. Could go to 500GB for cheaper, but then I realised over 700GB of my own hard drive is purely games, so fuck it, 1TB. Seagate’s a pretty reliable brand too, and surprisingly cheap tbh.
And finally, a case. I didn’t focus too much on this since it’s basically personal preference, but I found the Aerocool QS-102 to be pretty cheap, and it looks like a console, so again, if you’re switching from consoles you’ll probably feel more comfortable with it.
Don’t forget this is all at retail price too. If I were a company mass-producing these, I could get bulk discounts. No idea how much, but let’s assume maybe £50 total. That means at the original console price I could make £100 profit per machine sold – one third of the machine’s cost, which is a lot – and still a £50-£60 profit per machine if it were sold for around the price of a console now.
£300 is not ideal, but some people just have to work with that budget. I agree £500-£600 is the ideal for mid-to-top-range gaming desktops, and that’s what I’ve got at the moment, but nowadays it takes merely double that, around £1,100 to be precise, to build a seriously beastly high-end desktop.
Plus there is no such thing as a “60fps fanboy”. 60fps is the industry standard, and it is currently being compromised. Those people aren’t fanboys; they just have a right to be angry.
Incoming technical post:
Actually, color is probably stored in 32 bits (for performance reasons; it gives each pixel a 4-byte boundary). That would be ~8MB per buffer. But one buffer is not enough; here are two example scenarios:
Forward rendering (classic 3D rendering):
– A front buffer (8MB)
– A back buffer (these two are for double-buffering, which a lot of graphics engines employ) (8MB)
– A depth buffer (historically 8-bit, but a lot of games use 16-, 24- or even 32-bit buffers) (let’s say 6MB)
– A buffer for post-processing effects, like fire or smoke; effects that need to be blended in later (8MB).
That’s 30MB in total. Now, a lot of games actually use a newer rendering technique called deferred rendering/shading. It requires at least these additional buffers:
– A buffer for world-space or view-space normals. Normals are used to calculate how lighting affects an object. Since we don’t use the alpha channel for normals, we could embed some material info in it. (8MB)
– A buffer for world-space or view-space position. This is required for lighting as well. (8MB)
That’s an additional 16MB, bringing the total to 46MB. There are some tricks to reduce the amount of data used, but even with those you end up rather close to the 32MB total.
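To make those numbers easy to play with, here’s a small sketch (my own illustration, following the comment’s assumptions of 32-bit color targets and a 24-bit depth buffer at 1080p):

```python
# Render-target budget at 1080p vs. 32MB of ESRAM,
# following the buffer list in the comment above.
PIXELS = 1920 * 1080
MB = 1024 ** 2

def buffer_mb(bytes_per_pixel):
    return PIXELS * bytes_per_pixel / MB

forward = {
    "front buffer (32-bit color)": buffer_mb(4),  # ~7.9 MB
    "back buffer (32-bit color)":  buffer_mb(4),
    "depth buffer (24-bit)":       buffer_mb(3),  # ~5.9 MB
    "post-processing buffer":      buffer_mb(4),
}
deferred_extra = {
    "normals (+material info)":    buffer_mb(4),
    "world/view-space position":   buffer_mb(4),
}

fwd_total = sum(forward.values())
def_total = fwd_total + sum(deferred_extra.values())
print(f"Forward rendering:  {fwd_total:.1f} MB")   # ~29.7 MB
print(f"Deferred rendering: {def_total:.1f} MB")   # ~45.5 MB
print("ESRAM budget:       32 MB")
```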
If you’re really strapped for cash, SteamOS, nuff said. Plus, a Windows OS goes into the software budget, so instead of buying a game or two you just buy an OS, and then buy $10 games on Steam to balance it all out lol.
That’s because the US, Europe and the UK have slightly different AC frequencies.
FreeSync/G-Sync are coming about because old monitors used AC to time themselves, so they always displayed 60(ish) frames per second, and VSync was needed to prevent tearing.
Now that we have fancy LED displays that can output a frame whenever they like, they are being made to match the GPU’s output.
Long story short, those defaults and settings are there to allow for “old” technology that we are currently phasing out.
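For the curious, the odd 29.97 and 59 numbers mentioned above fall out of NTSC color timing; a quick Python illustration (my own sketch, using the standard NTSC figures):

```python
# Where the "odd" refresh numbers come from: NTSC dropped the field
# rate from exactly 60 Hz to 60000/1001 Hz for color broadcasting.
ntsc_fields = 60000 / 1001   # ~59.94 Hz (why games "default to 59")
ntsc_frames = 30000 / 1001   # ~29.97 fps (the DVD number above)
film = 24                    # theatrical standard

for name, hz in [("NTSC field rate", ntsc_fields),
                 ("NTSC frame rate", ntsc_frames),
                 ("Film", film)]:
    print(f"{name}: {hz:.2f} Hz -> {1000 / hz:.2f} ms per frame")
```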
What they hope:
Aim low, no disappointment.
What you get:
Aim low, performs lower.
I’m still finding it incredible that (going by eren’s post above) 49 million bits can be pushed through a cable 60+ times a second.
Are humans really so slow…?
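Putting a rough number on it (a back-of-the-envelope sketch, using the uncompressed 24-bit 1080p figure from eren’s post):

```python
# Uncompressed 1080p @ 60fps over a cable: rough bandwidth estimate
frame_bits = 1920 * 1080 * 24   # 49,766,400 bits per frame
fps = 60
gbps = frame_bits * fps / 1e9   # ~2.99 Gbit/s
print(f"~{gbps:.2f} Gbit/s of pixel data")  # well within what HDMI carries
```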
It’s more like £400 as a minimum.