
44 Comments


  • ZeDestructor - Sunday, June 4, 2017 - link

    Why oh why did they have to include the stupid DVI port? And why not embrace it fully with 3-4 USB-C + 2 HDMI? :(
  • vladx - Sunday, June 4, 2017 - link

    MSI wanted to support all monitors out there, and that includes old ones using DVI.
  • ZeDestructor - Sunday, June 4, 2017 - link

    Ship an HDMI-to-single-link-DVI adapter. If you have a DL-DVI monitor, you either have something new enough to have DP or HDMI 2.0, or can afford an active DL-DVI adapter.

    Besides, if you're buying a GPU with USB-C, you're not exactly in the "cares strongly for legacy ports" end of the market.
  • 0ldman79 - Sunday, June 4, 2017 - link

    It is a lot easier mechanically to convert DVI to HDMI than vice versa. The DVI connector screws in, so it is solid; the HDMI connector is not as mechanically sound. Electrically they can be identical.
  • azrael- - Wednesday, June 7, 2017 - link

    ^ Very much this!
  • DanNeely - Sunday, June 4, 2017 - link

    Your assumption that DL-DVI is only needed for old monitors owned by rich people is wrong. Some of the cheap Korean 1440p monitors shipped in the last few years have been DVI only.
  • peterfares - Sunday, June 4, 2017 - link

    You don't even need to be rich to own one of the older 30" 2560x1600 monitors that only have DL-DVI. They were around $1000, definitely in the budget for any professional who could make use of one. I got my 30" for $1000 six years ago; thankfully that one was new enough to have DisplayPort, but one generation older and it would've had only DL-DVI.
  • DanNeely - Sunday, June 4, 2017 - link

    You can make a reasonable argument that anyone who can afford a $1k monitor should be able to afford an $80 adapter without excess hardship, whether or not that makes them rich. $80 on top of a $250 monitor is another matter, though; the people affected include those who bought the cheapest Korean 1440p monitors, and for them a dongle costing a third of the monitor's price is rather painful, if for no other reason than that for the combined cost of the dongle and the original purchase they could've gotten a better display with more modern connections built in.
  • rtho782 - Monday, June 5, 2017 - link

    The $80 adaptors (I have one, on my 2008 30" monitor) are crap though; they need to be disconnected regularly when they play up.
  • DanNeely - Monday, June 5, 2017 - link

    How old is your adapter design? I know the 1st generation ones were notoriously bad, but I'd seen claims that the newer ones finally managed to get most of the bugs worked out.
  • SodaAnt - Monday, June 5, 2017 - link

    The newer adapters are much better. I had one of the first gen ones which required USB for power and I had all sorts of issues, but I got another one recently which didn't require external power and was only about $30, and it works great.
  • Mr Perfect - Monday, June 5, 2017 - link

    That's true.

    It makes less sense on a $700 video card though. If someone with a $250 monitor can't afford an $80 adapter, how are they splashing out for a $700 1080 Ti?
  • DanNeely - Monday, June 5, 2017 - link

    A lot of people apparently do, though. According to the Steam hardware survey, ~4.5% of gamers have a 980/1070 or better, but only 3% have a 2560x1440 or higher display. Less than 1% have a 4K monitor; even if you add in people with 1440p widescreens and assume almost as many have 2560x1600 screens (that size is bundled in with "other"), you're still only at about 1.3% of Steam gamers having what's likely a $700+ monitor, versus 2% having a 1080/980 Ti who spent at least that much on their GPU. That means a large fraction of high-end GPU gamers only have cheap monitors, for whatever reason: at least 1/3rd of the total, probably significantly more, since I took the largest possible guesstimate for 1600p users and at least some of the 980/780 gamers bought them at release prices and just haven't upgraded yet.
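
    As a rough back-of-envelope of that estimate (a minimal sketch using the approximate survey figures quoted above, not exact Steam data):

        # Approximate figures from the comment above (assumptions, not exact survey data)
        high_end_gpu_share = 0.02     # ~2% of Steam users on a 1080 / 980 Ti class card
        pricey_monitor_share = 0.013  # ~1.3% with what's likely a $700+ monitor

        # Even if every pricey-monitor owner also owns a high-end GPU, the rest of the
        # high-end GPU owners must be using cheaper displays.
        cheap_fraction = (high_end_gpu_share - pricey_monitor_share) / high_end_gpu_share
        print(f"At least {cheap_fraction:.0%} of high-end GPU owners are on cheaper monitors")
        # -> At least 35% ("at least 1/3rd of the total")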
  • rtho782 - Monday, June 5, 2017 - link

    I have a 30" 2560x1600 Dell WFP3007-HC as my 2nd monitor (my first is a RoG Swift).

    I wish the Founders Edition 1080 Ti (which I bought) had a DVI port. I started off with a random GT620 in my other PCIe slot to drive the Dell, but that caused issues, so I bought a DL-DVI adaptor; it occasionally doesn't resume from sleep properly and I have to disconnect/reconnect it.

    DL-DVI is still useful.
  • eek2121 - Sunday, June 4, 2017 - link

    This is a top-of-the-line GPU. Chances are the target market for this product has decent monitors with either DisplayPort or HDMI. Hell, you wouldn't buy a 1080 Ti for 1080p gaming... would you? (I own a 1080 Ti and use it for 1440p gaming... even that is a bit of a waste.)
  • chrnochime - Sunday, June 4, 2017 - link

    I'd buy it for VR (is it even powerful enough for silky-smooth gaming? I have no idea), and nothing else. I couldn't care less if my 24" screen is only 1080p instead of 4K, since I'll be 2' away from it, and I want my screen to not have freakishly tiny fonts.
  • Lord of the Bored - Thursday, June 8, 2017 - link

    Problem is... you can still buy a monitor today with a VGA port, and this card's DVI is a DVI-D connector, so it can't even be adapted to VGA the way DVI-I could.
    Support all the monitors, indeed.

    (Lest this be confused for an actual complaint: we're well past the point where VGA SHOULD be a relevant standard, and I'm all for DVI-D disappearing as well. And I abhor HDMI, which is close to the worst possible video interconnect standard. I would be delighted if this thing was DP/USB-C only, or even straight USB-C. But if they're gonna include a DVI connector, it may as well be DVI-I.)
  • AllIDoIsWin - Monday, June 5, 2017 - link

    Noo.. that doesn't make sense. Nearly everyone I know is still using DVI. HDMI is for the birds. And nearly all monitors support DVI before HDMI, no?
  • lmcd - Tuesday, June 6, 2017 - link

    DVI should no longer exist. HDMI and DisplayPort both totally outclass it in every way.
  • azrael- - Wednesday, June 7, 2017 - link

    And there, my friend, you are wrong. DP outclasses DVI (mostly) but DVI outclasses HDMI. Truth be told HDMI has next to no place in the computing industry.
  • Aikouka - Monday, June 5, 2017 - link

    As a multi-monitor user, the idea of getting rid of DVI permanently is a really bleak thought. I don't think most people realize this, but HDMI and DisplayPort devices don't always stay connected. If you turn off the monitor or it goes into a low-power mode, the monitor will disappear from Windows, as the PC thinks it's no longer connected. (The low-power behavior depends on how the monitor implements it; some monitors allow you to turn off the super-low-power mode, which keeps the connection active while asleep.)

    I bought a GTX 1080 Ti Founders Edition, but I ended up buying another 1080 Ti because I couldn't stand not having DVI. With only DisplayPort, my windows were constantly being placed onto the main display. It's nothing major, but it is highly annoying having to move things back whenever I return to the machine. I tried using a headless display, but once Windows tossed my windows onto the fake display, I stopped using it. (I have to use a headless display with my kitchen Touch HTPC or else I can't remote into it while the HDMI-connected monitor is off.)

    Oh, and as a quick note, I also tried an active DisplayPort adapter (from StarTech), and it had issues when the monitor went to sleep (it went to sleep, then woke up with a "No DVI-D Signal" message... ad infinitum).
  • Morawka - Monday, June 5, 2017 - link

    DisplayPort is still finicky on Windows. Whenever I turn on my 2nd monitor, which uses DP, it moves my audio settings around, and Windows has to re-detect the monitor every time I power it on. With DVI, my monitor still shows up in the Windows display settings page even when turned off, so turning it on and off doesn't mess with my audio settings.
  • rmm584 - Sunday, June 4, 2017 - link

    "Meanwhile the USB-C port and cabling system is intended to support 80Gb/sec (or more) of cable bandwidth."

    Is this right? I thought each USB-C lane is 10Gb/s and the DisplayPort alt mode enables the 4 lanes to be bonded in one direction, giving 40Gb/s of cable bandwidth. I know Thunderbolt 3 doubles the bandwidth per lane using an active cable, so that cable could reach 80Gb/s.
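
    (For what it's worth, here's a quick sketch of the arithmetic behind those figures, using the per-lane rates quoted in this comment as assumptions, not spec citations:)

        # Per-lane rates as quoted in the comment above (assumptions, not spec citations)
        lanes = 4            # high-speed lane pairs in a full-featured USB-C cable
        usb_gen2_rate = 10   # Gb/s per lane (USB 3.1 Gen 2 signalling)
        tb3_active_rate = 20 # Gb/s per lane over an active Thunderbolt 3 cable

        print(f"DP alt mode, all 4 lanes one direction: {lanes * usb_gen2_rate} Gb/s")   # 40 Gb/s
        print(f"Doubled per-lane rate, active cable:    {lanes * tb3_active_rate} Gb/s") # 80 Gb/s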
  • Guspaz - Monday, June 5, 2017 - link

    The port and cabling were intended to support higher speeds than the protocol currently supports, much as has happened in the past: there was no difference between USB 1.0/1.1/2.0 cables, and no difference between USB 3.0/3.1 Type-A cables.
  • MonkeyPaw - Sunday, June 4, 2017 - link

    It almost seems like a better plan would be to design this so the USB-C ports all originate from the motherboard, and the graphics card passes the display signal back. That way the ports can still support all the other functionality without adding complexity to the add-on components. Having USB-C ports of varying capabilities strewn across the PC just seems like a mess that should be avoided.
  • Pinko - Sunday, June 4, 2017 - link

    What is not clear at the moment is whether the USB-C port supports two DP 1.2 streams, as per the Thunderbolt 3 spec, or goes beyond that and supports full DP 1.4 capability. That would mean 8K on a single stream. Does anybody know anything about it?
  • DanNeely - Sunday, June 4, 2017 - link

    Am I correctly understanding that the loss of TMDS backwards compatibility only means that you couldn't hook up a passive USB-C to HDMI/DVI adapter (if such a thing were to exist), but instead would need the same sort of active adapter that goes into a pure-data USB-C port?

    https://en.wikipedia.org/wiki/Transition-minimized...
  • edzieba - Monday, June 5, 2017 - link

    It means you cannot use a Type-C 'DisplayPort' output as a DP++ port (as the other two ports are). It's a silly distinction, as Type-C also offers HDMI Alternate Mode for identical functionality.

    The presence of DP++ ports means there is no good reason to include an HDMI port on the rear of a card. You basically waste a bunch of panel area that could be used for another DP++ port (or for more ventilation) just to avoid the use of a passive adapter that costs single-digit dollars (if not less).
  • chaos215bar2 - Monday, June 5, 2017 - link

    Forget USB data. If you're going to put this port on a high-end GPU, is there a technical reason it can't support Thunderbolt?
  • CharonPDX - Monday, June 5, 2017 - link

    Agreed - they should slap an Alpine Ridge onboard, too. Route 4 of the PCI Express lanes to Thunderbolt, and voila - you have fully functioning Thunderbolt 3 on *ANY* PCI Express x16-equipped computer. Put two USB-C ports on, and you have full DisplayPort functionality, plus you gain both Thunderbolt *AND* USB 3.1 Gen 2 at 10 Gb/s, even on older USB 3.0/3.1-Gen1-only 5 Gb/s systems.
  • Spunjji - Monday, June 5, 2017 - link

    Genius! Hopefully, now that Intel is opening up the standard a bit, we can get some competing controller chips out there to enable stuff like this. The GPU can easily make do with the remaining 12 lanes.
  • DanNeely - Monday, June 5, 2017 - link

    The electrical implementation would be more complex than just "the GPU can use the other 12 lanes" would imply. PCIe links can only be split in powers of 2, which means a basic implementation when using the TB3 port could either take x16 from the mobo, feed 8 lanes to the GPU and 4 to the TB3 controller, and discard the last 4; or take x8 from the mobo with x4 each to the GPU and TB3 controller. For a single-GPU setup the first isn't a particularly bad result, since x8 vs x16 is negligible. Only having 8 lanes to share would start to hurt some games significantly, though (others would barely notice).

    The more expensive options to avoid this would be either a two-card solution (an x16 GPU and an x4 TB3 card with an internal connector linking the two), or a PLX chip on the card to split the PCIe lanes as needed; the latter would let the GPU always have at least 12 lanes of bandwidth. The problem is that the PLX chip maker was bought by a company that's more interested in getting huge per-chip profits from server mobo makers than in selling lots of commodity chips to enthusiast mobo vendors, and it jacked the price up to about $100. Combined with the need for a TB3 controller from Intel and the extra engineering work to make it all function, doing it this way, while best from a technical/performance standpoint, would probably put a $200 price premium on the card.
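
    To make the lane math concrete, here's a minimal sketch of the two basic splits described above (illustrative only; the widths are just the power-of-two PCIe link sizes from this comment, not any real card's design):

        # PCIe links only come in power-of-two widths, so a simple on-card split
        # of the slot's lanes looks like one of these (per the comment above):
        option_a = {"slot": 16, "gpu": 8, "tb3": 4, "unused": 4}  # x16 to the mobo
        option_b = {"slot": 8,  "gpu": 4, "tb3": 4, "unused": 0}  # x8 to the mobo

        for name, cfg in (("A", option_a), ("B", option_b)):
            assert cfg["gpu"] + cfg["tb3"] + cfg["unused"] == cfg["slot"]
            print(f"Option {name}: GPU x{cfg['gpu']}, TB3 x{cfg['tb3']}, "
                  f"{cfg['unused']} of x{cfg['slot']} lanes unused")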
  • lmcd - Tuesday, June 6, 2017 - link

    This could open up a pretty nuts world where an external GPU is connected to the internal GPU for SLI. That'd be pretty fun, honestly. Imagine one of those small ITX machines using a GPU with Thunderbolt for "docking" at home? Amusing to think of docking a desktop, but nevertheless, I'm sure people would enjoy it.
  • nagi603 - Monday, June 5, 2017 - link

    Are there even any long, tested USB-C cables in the works? I mean 5-meter ones. I had to hunt for ages to find one that works with DP 1.2 for my FreeSync display, after repeatedly being told there is no such thing. I'd rather not have to repeat that, but it seems like there are none available over 2 meters.

    I mean bandwidth is nice and all, but if I can't get said bandwidth to the damn display, what good is it?
  • DanNeely - Monday, June 5, 2017 - link

    Apparently, due to the way they're terminated, anything beyond 2m in 5Gb/s mode or 1m in 10Gb/s mode is impossible.

    https://www.reddit.com/r/answers/comments/3py9gn/w...
  • Hxx - Monday, June 5, 2017 - link

    One more reason to go with a liquid-cooled setup. There is no need for that 3-slot behemoth of a cooler on a graphics card.
  • masouth - Monday, June 5, 2017 - link

    There is no need for a liquid-cooled setup on a graphics card either; it's just a choice, like choosing an extra-thick heatsink that takes up 3 slots instead of a 2-slot model.
  • bigboxes - Monday, June 5, 2017 - link

    "leaving it with 2 DisplayPorts, a DVI-D port, an HDMI port, and the USB-C port"

    The supplied pic shows two HDMI ports and a single DisplayPort.
  • wi11ian - Saturday, October 7, 2017 - link

    This lists the same specs:
    https://www.techpowerup.com/gpudb/b4668/msi-gtx-10...
  • JyveAFK - Tuesday, June 6, 2017 - link

    Can the USB port be used to receive data? I.e., if I get this card, plug in power, and then connect it to a USB-C laptop, can I drive a monitor/play games straight off it without using one of those big proprietary external GPU enclosures? Some small (and hopefully cheap) mini-case/power supply for it? (Heck, maybe an old mini-ITX case.)
  • bronan - Wednesday, June 28, 2017 - link

    OMG, unbelievable that a tech specialist calls a Thunderbolt 3 port a USB-C port...
    I've already said many times that the confusion would be massive, and here it shows.
    Nobody sees the difference between the ports, and here it pops up in plain view.
  • 8steve8 - Friday, September 1, 2017 - link

    still not for sale anywhere. USB and DP in one little cable, sold!
  • wi11ian - Thursday, October 19, 2017 - link

    still not for sale anywhere. :(
