The Quadro M4000 sports 1,664 CUDA cores, compared to the previous K4200's 1,344 – an increase of around 24 per cent. The GPU uses the GM204GL revision of Maxwell, whereas the M6000 uses GM200GL. Whilst the K4200 has a GPU core speed of 780MHz, the M4000 runs at a very similar 773MHz.
However, it's highly significant that the ratio of FP64 (64-bit floating point) arithmetic logic units (ALUs) has been further reduced, from 1/24 of the FP32 count in the previous generation to 1/32, which will put the M4000 at a disadvantage in applications that use FP64 operations.
Most 3D content creation software doesn't, but some scientific visualisation software does. If your software definitely makes heavy use of FP64 operations, the Kepler-based K4200 may still be the better choice.
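To see why, a back-of-the-envelope sketch using the core counts and clocks quoted above makes the trade-off concrete. This assumes the conventional figure of 2 FLOPs (one fused multiply-add) per CUDA core per clock; real application throughput will be lower than these theoretical peaks.

```python
# Rough peak-throughput estimate from core count, core clock and FP64 ratio.
# Assumes 2 FLOPs per CUDA core per clock (one fused multiply-add);
# real-world throughput will be lower.

def peak_gflops(cores, clock_mhz, fp64_ratio):
    fp32 = cores * clock_mhz / 1000 * 2      # GFLOPS, single precision
    return fp32, fp32 * fp64_ratio           # GFLOPS, double precision

m4000_fp32, m4000_fp64 = peak_gflops(1664, 773, 1 / 32)   # Maxwell: 1/32 FP64
k4200_fp32, k4200_fp64 = peak_gflops(1344, 780, 1 / 24)   # Kepler: 1/24 FP64

print(f"M4000: {m4000_fp32:.0f} FP32 / {m4000_fp64:.0f} FP64 GFLOPS")
print(f"K4200: {k4200_fp32:.0f} FP32 / {k4200_fp64:.0f} FP64 GFLOPS")
```

Despite its clear FP32 advantage, the M4000 comes out behind the older K4200 in peak FP64 throughput on this estimate, which is exactly why FP64-heavy workloads may favour the older card.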
The other significant improvement over the K4200 is the quantity of memory. The M4000 sticks with the 256-bit memory path, but doubles the quantity of GDDR5 to 8GB, the same as the previous-generation ultra-high-end K5200.
This makes it substantially better equipped to handle huge texture sets than the K4200. The memory is also slightly faster, running at 1,502MHz compared to 1,350MHz.
In other words, the new M4000 has the same quantity and speed of memory as the former K5200, which cost more than twice as much. It offers identical bandwidth, too, at 192GB/sec. However, the K5200 has 2,304 CUDA cores – 38 per cent more than the M4000. Whilst these run at a lower 650MHz, the K5200 should still be comfortably ahead of the new card. So if you purchased a system with one not long ago, you don't have to feel too much like you wasted your money.
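For the curious, the 192GB/sec figure falls straight out of the bus width and memory clock. A minimal sketch, assuming GDDR5's effective transfer rate of four times the base memory clock (double data rate on a doubled I/O clock):

```python
# Memory bandwidth from bus width and memory clock.
# GDDR5 transfers data at 4x its base clock.

def bandwidth_gb_s(bus_bits, mem_clock_mhz, transfers_per_clock=4):
    bytes_per_transfer = bus_bits / 8                       # bus width in bytes
    effective_rate_gt_s = mem_clock_mhz * transfers_per_clock / 1000
    return bytes_per_transfer * effective_rate_gt_s

m4000 = bandwidth_gb_s(256, 1502)   # ~192 GB/s, matching the quoted spec
k4200 = bandwidth_gb_s(256, 1350)   # ~173 GB/s for the previous card

print(f"M4000: {m4000:.1f} GB/s, K4200: {k4200:.1f} GB/s")
```

The same 256-bit bus at the K4200's lower 1,350MHz clock explains why the older card trails in bandwidth despite the identical bus width.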
The M4000 sports four DisplayPort 1.2 connections, each capable of driving a monitor at up to 4,096 x 2,160 4K resolution, with HDMI and DVI-D adapters also available. There's a separate 3D Stereo connector on an additional bracket.
There's support for all the latest 3D APIs, including DirectX 12, OpenGL 4.5 and Shader Model 5. Compute API support includes CUDA (of course) plus OpenCL and DirectCompute, so the card can be harnessed for general-purpose number crunching beyond 3D rendering. Power consumption has risen to 120W from the K4200's 108W, but that's still not huge.
Only a single six-pin PCI Express power connector is required. Being a professional card, the M4000 comes with a standard three-year RTB warranty, although it's also possible to extend this to five years at the time of purchase for a little extra.
How about a fair comparison with a FirePro card?
Not sure there’s one that’s fair to compare it to at the moment. The W8100 has a few areas where it can compete with the K4200, but it’s not going to be able to put up much of a fight against the M4000.
Not sure what you're talking about (given you call yourself a doctor, you must be able to spit out an informed opinion) – why not test it against the direct competitor, the FirePro W7100? Although a weaker GPU than the Quadro, it has a good 20–30 per cent higher GPU frequency and several times greater double precision performance – nothing comparable to the W8100, but still greater than even the Quadro M5000.
I’m a doctor of philosophy – that’s from me signing into Disqus via Twitter! It’s not at all relevant here. Here are some SPECviewperf 12 scores from the W7100, in a RENDA PW-E7F (see review on this site), which isn’t a million miles in spec from the test system for the M4000 review:
catia-04: 56.6
creo-01: 51.57
energy-01: 2.81
maya-04: 55.96
medical-01: 24.77
showcase-01: 44.73
snx-02: 62.24
sw-03: 86.48
Compare these to the scores in this review and you will see that the M4000 totally owns it in most tests. The double precision difference is simply not relevant for most professional 3D content creation applications – that's why NVIDIA has been happy to produce a professional card where this capability has been neutered. Unfortunately, I didn't have a W7100 or W8100 available when I reviewed the M4000, otherwise these would have made an interesting comparison. From previous experience, though, I am confident that both would lose out in most tests.
Win in synthetic benchmarks it truly does.
However how in real life it performs?
Yeah, hard to tell what's happening here – is it just optimisations for a given benchmark, background overclocking and such? Wouldn't be the first time for Nvidia.
I'm still disappointed that reviewers don't test the cards on 3D model sets made by real living designers, architects and the like, maybe even someone from the engineering industry. Point being, the models wouldn't be optimised for benchmarks, just as they wouldn't be in real life.
You’re not Yoda, are you? Speak like him, you do.
Real life testing is always hard. A test needs to offer a level playing field so you can compare results. I have used 3D model sets and SPECapc before (see other articles on this site), but you need a consistent platform and a test you can repeat identically every time. Having a £2,000 workstation lying around just for testing isn't economically viable for most publications. Actually, I can't think of one anywhere in the world that has this kind of facility outside of manufacturers.
The reason you don't see tests like this is not laziness – it's to do with economics, unfortunately. I'm about the only person in the UK who does workstation testing, and I'm actually a freelancer, so I have to make do with what I have to hand. In an ideal world, I'd have a lab full of current kit to compare with. But in reality that's nearly impossible for anyone, even large publications, and especially for niche products like professional workstations.
Get a grip on yourself, you lazy doctor – it is as easy as ever.
Tedious but super-duper easy, and no more financially burdening than any other task in a review channel.
Just as PC review editors now often use underpaid or unpaid grad student work for reviews, articles and so on.
The same can be done by outsourcing custom 3D models with textures in collaboration with universities, in exchange for credit points, with or without a barebones salary for the students.
Jesus, it can be as easy as making relatively custom 3D scene sets: strike a deal like the one with the GPU maker and get a free, regularly refreshed package from a few different model and texture sellers (mixing them in a given scene) in exchange for the usual mention on the test setup page and a few sentences explaining what it's all about, with a few pretty pictures.
And if it's so hard to make a review, why call it that? Wouldn't it be a bit more honest to call it a "first look"?
Hi! I recently bought the card. I'm disappointed because no specific manual is included on the DVD, and it's impossible to find one online – only a generic multi-card installation and power connection manual. I'd like a complete hardware manual explaining, for example, the function of the LED light on the rear, between the DisplayPort connectors.
Would have been interesting to see some power consumption testing. Good review, though. I think I’m going to pick this card up pretty soon.
Getting a new computer for myself with dual Xeon E5-2640 v4 CPUs, but would this even be an upgrade from my Titan Black, which has a lot more CUDA cores, for instance? I do game, run virtual machines, movie creation and some 3D using LightWave/ZBrush.
I use this card every day in an engineering design role (SolidWorks). I have strayed elsewhere over the years, but I always come back to the Quadro cards and I'm always impressed when I return. It's hard to do a side-by-side comparison in a real-world test – who has the time or the equipment?
I think it depends what you run. I have a Dell Precision workstation with the Quadro M4000 and a reasonably good i7 processor; I also have an Alienware 15 R2 with a better processor and a GTX 980 connected via the Graphics Amplifier. For gaming the Alienware machine is much better, but for SolidWorks it's virtually unusable, with laggy performance and glitching, while the Quadro machine is fast and smooth – not a hint of a glitch. Both machines will run both types of software, but the difference is remarkable.
It used to be the case that GeForce cards were optimised for DirectX and Quadro cards for OpenGL – both could do both, but each could only do one very well. I don't know whether that's still how it works. I can't help feeling there's some price gouging attached to the Quadro cards – after all, they use the same chips as the GeForce cards, presumably just with a different config and driver, yet cost much more. It's all a bit cheeky.
Plan is to get more into the more professional stuff, but games will still be a part of my computer time 🙂
Well, I have now ordered the PNY Quadro P5000 card. Now I just need to figure out whether I should keep my 40″ UHD monitor or go for a 5K 27″ or a 34″ ultrawide – the ultrawide seems very nice, I admit.
Thanks for the reply Jaffa99.