16 Comments

  • MattMe - Wednesday, October 4, 2017 - link

    Is it just me, or are there a whole lot of jaggies for 4K in that last screenshot? The lighting looks great, but a lot of the textures don't look very real. I know it's only a benchmark suite; just commenting, really.
  • Communism - Wednesday, October 4, 2017 - link

    That's what happens when an engine tries to do almost everything in shaders and blur filters instead of actually rendering things properly.
  • MrSpadge - Wednesday, October 4, 2017 - link

    Blame the artists, not the engine, for the lack of texture detail.
  • Communism - Thursday, October 5, 2017 - link

    If you don't know the subject matter, then don't comment.
  • peterfares - Wednesday, October 4, 2017 - link

    If anti-aliasing isn't on, there will be jaggies no matter what resolution is being rendered. You'll only notice them if you get up real close at 100%.
  • Cyanara - Sunday, October 8, 2017 - link

    Are you looking at it on a 4K monitor? I'm pretty sure the general idea is that 4K monitors have such small pixels that you won't notice them, and hence you don't need AA. But viewed pixel-for-pixel on a 1080p monitor, you're gonna notice the jaggies.
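
To put rough numbers on that point: at the same physical panel size, 4K has twice the linear pixel density of 1080p, so each one-pixel stair-step is half as wide on screen. A quick back-of-the-envelope sketch (the 27-inch diagonal is just an assumption for illustration):

```cpp
// Back-of-the-envelope: physical size of a one-pixel "jaggy" step on a
// 4K panel vs. a 1080p panel of the same (assumed) 27-inch diagonal.
#include <cmath>
#include <cstdio>

// Pixels per inch for a panel of the given resolution and diagonal size.
double ppi(double w, double h, double diagonalInches) {
    return std::sqrt(w * w + h * h) / diagonalInches;
}

int main() {
    const double diag = 27.0;                           // assumed panel size, inches
    const double ppi4k    = ppi(3840.0, 2160.0, diag);  // ~163 PPI
    const double ppi1080p = ppi(1920.0, 1080.0, diag);  // ~82 PPI
    // An aliased stair-step is one pixel tall, so its physical height is
    // simply the inverse of the pixel density (25.4 mm per inch).
    std::printf("4K    @ 27\": %.0f PPI, step ~%.2f mm\n", ppi4k,    25.4 / ppi4k);
    std::printf("1080p @ 27\": %.0f PPI, step ~%.2f mm\n", ppi1080p, 25.4 / ppi1080p);
    return 0;
}
```

Viewing a 4K capture at 1:1 on a 1080p monitor throws that density advantage away, which is why the jaggies reappear.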
  • DanNeely - Wednesday, October 4, 2017 - link

    "Overall, on our 8-core Skylake X GPU test bench, we looked at 9 cards with and without async compute disabled"

    Does this mean you've finally sorted out all the problems that were interfering with doing game benchmarks on Skylake-X?
  • Ian Cutress - Wednesday, October 4, 2017 - link

    Nate doesn't seem to have issues, but he's only testing cards with one CPU. I'm a few thousand miles away from Nate, so it's not as easy as swapping a CPU in the lab. I have new motherboards coming, which might fix my issue. It might be my CPUs too - I sent our official sample to Nate, while I'm running ES samples. Ideally, I'd be debugging this issue, rather than dealing with launches on other platforms that are taking priority right now.
  • lucam - Wednesday, October 4, 2017 - link

    I heard Matrox is coming back with a solution that can compete with the latest AMD and Nvidia cards. Looking forward to seeing that.
  • BrokenCrayons - Thursday, October 5, 2017 - link

    That's good news. I can finally replace the S3 ViRGE DX in my gaming PC and maybe get better performance. Up to now, even SLI Titans have just been too pedestrian to deliver the FPS necessary to keep up with my ViRGE in bleeding-edge games like Descent: FreeSpace. I'm thinking 640x480 is just too many pixels for only a pair of Titans.
  • Fergy - Friday, October 6, 2017 - link

    Matrox has never had a competitive GPU. Just like 3Dlabs and Bitboys. I hope Apple/Qualcomm/ARM make a desktop GPU.
  • extide - Wednesday, October 4, 2017 - link

    Anyone remember when Futuremark was still MadOnion? Heh, I always loved the music in 3DMark 2000...
  • BurntMyBacon - Thursday, October 5, 2017 - link

    I liked the old name better. Just ran 3DMark 2000 for nostalgia's sake; I had forgotten what the music sounded like. I remember when 3DMark 2000 represented the pinnacle of graphics technology.
  • Communism - Thursday, October 5, 2017 - link

    If you don't know the subject matter, then don't comment.
  • Communism - Thursday, October 5, 2017 - link

    RIP the proper functioning of the comment system; the post above was meant as a reply to one of the earlier comments.
  • EugenM - Friday, January 19, 2018 - link

    Time Spy doesn't support true async compute. Its single-threaded style of async compute is clearly tailored to Nvidia's hardware limitations, leaving massive processing power idle on an AMD RX Vega 64. Had this benchmark implemented proper async compute, the results would be déjà vu of the old days, when benchmarks over-tessellated everything to bog down AMD's dedicated hardware tessellator and let Nvidia's GPU-based tessellation shine, while everything else in the graphics quality department was just about average.

    In this test, instead of being fair and implementing DX12 properly and fully, Futuremark implemented Nvidia's incomplete version of DX12.
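
For context on what "proper async compute" refers to at the API level: in D3D12, an application can create a compute command queue alongside its graphics queue and submit work to both, and whether the two streams actually execute concurrently is then up to the hardware and driver. Below is a minimal sketch of that queue setup, assuming an already-created ID3D12Device; it only illustrates the API shape, not Time Spy's actual implementation, which isn't public:

```cpp
// Minimal sketch: creating the two D3D12 command queues involved in
// async compute. Work submitted to the compute queue *may* run
// concurrently with graphics work on hardware with independent compute
// pipes (e.g. AMD's ACEs); other hardware/drivers may serialize it.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue) {
    // The usual graphics queue: accepts both draw and dispatch work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A second, compute-only queue. Submitting compute work here instead
    // of on the DIRECT queue is what lets capable GPUs overlap it with
    // rendering; error handling is omitted for brevity.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```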
