Rainbow Six: Vegas: A Performance Analysis

by Josh Venning on 12/25/2006 6:00 AM EST

  • kreacher - Monday, May 21, 2007 - link

    I would love to see an update on this article once the 2600 has been released.
  • SGTLindy - Saturday, December 30, 2006 - link

    It runs better on ATI and doesn't have many graphics options because it's an Xbox 360 port!!

    Runs great on the 360... runs slower on the PC... wow, that was tough to figure out.

    Gears of War looks better on the U3 engine because... the GoW team made the U3 engine. If anyone is going to know how to tweak a U3-based game, it would be them, especially since the engine just came out.

    None of this is rocket science.
  • Sharky974 - Friday, December 29, 2006 - link

    There is a user over at B3D saying his Rainbow Six Vegas box (he also provided a photo) says Unreal Engine 2, NOT Unreal Engine 3, and his photo backs that up. Apparently R6 might be a "UE2.5" game.

    Anand wouldn't be the only site to make that mistake, but you guys might want to look into it...
  • bisket - Wednesday, December 27, 2006 - link

    Exactly, Rocky.

    The heli rides do not tax my system at all. It's during levels that I get the *oh so very annoying* random fps drops from 60 to 20.

    I just hope this is not a growing trend in games. Enough said. AnandTech rocks! ;)
  • R0CKY - Wednesday, December 27, 2006 - link

    Was benchmarking the heli ride in these tests really the best way to test Vegas performance? What percentage of the game is actually spent flying in a heli? And is the part of the game where the player switches off and doesn't really care what's going on the best part of the game to test?

    I appreciate there was no easy way to benchmark, since there's no in-game system to replay the same scene more than once, but at the end of the day it's the game's performance during firefights and urban scenes that interests the gamer, not level-transition heli rides.

    Is it valid to assume that the engine's rendering performance is the same for detailed character models as it is for long-draw-distance, low-detail, high-altitude scenes?

    Rather than settling for an easily reproducible scene of little relevance, I'd personally have liked to see something a bit more relevant tested, even if it took some ingenuity to come up with. It is possible to get quite accurate comparisons, for example, by simply recording the FPS as a character runs the same path through a level several times (something like the quick sketch below) - at least that way we'd get a report showing FPS from scenes the player actually cares about, rather than unimportant heli rides.
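    A rough sketch of the averaging step (hypothetical - this assumes per-second FPS logs such as the CSV files FRAPS can record, one file per walkthrough):

        import csv, glob, statistics

        # One log per walkthrough of the same path, e.g. run1.csv, run2.csv, ...
        # Each row is assumed to hold a single per-second FPS reading.
        all_samples = []
        for path in sorted(glob.glob("run*.csv")):
            samples = []
            with open(path) as f:
                for row in csv.reader(f):
                    try:
                        samples.append(float(row[0]))
                    except (ValueError, IndexError):
                        continue  # skip headers and blank lines
            if not samples:
                continue  # nothing parsed from this log
            all_samples.extend(samples)
            print("%s: avg %.1f fps, min %.1f fps"
                  % (path, statistics.mean(samples), min(samples)))

        print("overall average: %.1f fps" % statistics.mean(all_samples))

    Reporting the minimum alongside the average would also surface exactly the kind of random dips people are complaining about in this thread.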

    That comes off as a bit of a rant, but it is meant as constructive comment, honest!

    :o)

  • mlambert890 - Wednesday, January 10, 2007 - link

    Weird, but to be honest, I actually do better in-game (even during firefights) than in that heli ride. My thinking is that the engine isn't particularly efficient at rendering the wide-open cityscape.

    With an FX-60 o/c'd to 2.8GHz, an X1900XTX @ 650/775, and 2GB of PC3200, I get 30-40fps on the heli ride, but I very rarely dip below 45fps in-game. A couple of the big fights dropped into the 20s, but it didn't really disrupt play that badly. Gameplay for 90% of the game ended up better than the heli ride bench would have implied.

    If you're interested, AMDZone did an R6:V bench using an avg of in-game framerates rather than the heli ride:

    http://www.amdzone.com/modules.php?op=modload&...
  • VooDooAddict - Monday, February 19, 2007 - link

    Thanks for the link to that review. I especially like the single core vs. dual core and dual core vs. quad comparisons.
  • anandtech02148 - Wednesday, December 27, 2006 - link

    Gears of War has excellent lighting and shadows;
    the worst Unreal 3 engine game... Red Orchestra.
    I like the first paragraph of this article, it hit the spot, considering I have downloaded 2 gigs of patches for BF2!!!
    Considering games now easily break $100 for a title.
  • bisket - Wednesday, December 27, 2006 - link

    I don't really see how this game can get that much praise.

    1. First off, no widescreen support for the PC except with a hack.

    2. IMO GRAW looked a heck of a lot better than this. I hate console-to-PC ports; they dumb them down too much.

    3. I'm running an 8800GTX with a C2D 6600 and 2 gigs of PC6400 RAM, and this game gave me a good 60 fps (1920x1200, everything maxed, with the widescreen hack) in some areas. In other areas my fps dropped to 20, which is unacceptable and just plain dumb. Why? Maybe because it's just a port and not optimized. I don't care if it's the Unreal 3 engine or not; I'm not impressed.

    4. Before I bash it too hard, I do have to say that despite its major flaws the game is fun, and it could be *tons* better.

    5. I took this over to a friend's house with the Dell 30" and the same setup as mine (8800GTX and whatnot), and we could not establish a framerate over 30fps, which is just ridiculous. I do not look forward to future PC games ported from consoles. I will be saving my money next time.

    6. Why all the low-res texture nonsense? And the low geometry? I just don't get it.

    7. Also, praise for the smoke? It looks bad (as in, not good), IMHO.

    I give this game a 5.5 out of 10.

    Summary: decent graphics with major glitches and major fps drops in random places. Fun gameplay. Have fun playing online when it doesn't crash. Very cool cover system and nice enemy AI.

  • 100proof - Thursday, December 28, 2006 - link

    8. In-game advertising ---> spyware.

    http://forums.ubi.com/eve/forums/a/tpc/f/380106502...


    My question is: why don't review sites like AnandTech hold game publishers like EA and Ubisoft accountable for this new trend of double dipping? And why aren't publishers held accountable for not putting information about the spyware anywhere on the packaging?

    Credit goes to SlipperyJim for the info/screencaps below.

    This shows traffic from when you double-click the game icon to when it says "Press any key to begin":
    http://www.mods4games.com/images/misc/Vegas1.gif

    Traffic from when you select "Multiplayer > Online":
    http://www.mods4games.com/images/misc/Vegas2.gif

    Traffic from when you log in with your username and password:
    http://www.mods4games.com/images/misc/Vegas3.gif

    Traffic when you get a list of games:
    http://www.mods4games.com/images/misc/Vegas4.gif


    The interesting destinations seem to be "locate.madserver.net" and DemonWare.

    "madserver.net" is Massive Incorporated server. This is the server for in-game adverts. If you add "locate.madserver.net" to your Windows host file it appears to block the in-game advertising. Below is a link to how it is blocked in Swat 4 (follow the same method but add "locate.madserver.net" to the list):

    http://nationalcheeseemporium.org/
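
    For reference, the hosts trick is just a one-line redirect. On Windows XP the file lives at C:\WINDOWS\system32\drivers\etc\hosts (the path may differ on other Windows versions); adding a line like this should do it:

        # Point the ad server at the local machine so the game can't reach it
        127.0.0.1    locate.madserver.net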


    DemonWare is a company that offers matchmaking services (probably just like GameSpy, in that they check your CD key and maintain a master server list of available games). It's also a company that does lobby advertising and offers something called "DemonWare DNA", which sounds a lot like spyware. :(

    http://www.demonware.net/

    quote:
    The most shocking part was next. The client contacted madserver to tell the advertisers how long the gamer spent with each advert in their view. This is mapped to the gamer id, so they know which player in the game saw the advert, and when, for how long, and from how far away (by virtue of the size attribute). Even the average viewing angle is passed back.
  • 100proof - Thursday, December 28, 2006 - link

    Matching statistics to the GamerID alone would be useless, so why include the GamerID at all? Is other information tied to a Ubisoft GamerID account being shared? Birthdate? Gender?

    AnandTech, will you investigate this?
  • BronxBartoni - Tuesday, December 26, 2006 - link

    I would really have loved to see the differences, if any, between single-core and multi-core setups.
  • poohbear - Tuesday, December 26, 2006 - link

    Thanks for the review, AnandTech. Many of us are interested in new graphics engines and how they perform with current hardware. :)
  • unclebud - Tuesday, December 26, 2006 - link

    "I think the point Anandtech was trying to make is that they hope the performance gap can be reduced somewhat with driver/game updates."

    Yeah, it hurts them so bad to admit it... just look at their past video reviews for the absolute proof.
    I bet if they had their way, AMD + ATI would never have happened; they probably have nightmares about it every night. Just my opinion/observation. The site owner needs to come back and review more! I miss his articles! Augh!
  • CrystalBay - Tuesday, December 26, 2006 - link

    Go Sierra, never give in. You rock forever. Keep on patchin'...
  • BikeDude - Monday, December 25, 2006 - link

    I don't care about 1600x1200 running full blast with all the settings enabled.

    Which cards will allow me to run this game at 2560x1600 using reasonable settings? (reasonable=good fps without tangos turning into stick figures)

    I have a 7800GTX now... Time to upgrade?
  • VooDooAddict - Tuesday, December 26, 2006 - link

    If you want to run at 2560x1600, expect to upgrade to the leading edge frequently. The 8800GTX would be a good buy for you if you really want to run at 2560x1600.

    However, if you run at 1280x800 you'll get perfect 2:1 scaling on that 2560x1600 monitor. (I'm assuming you have the lovely Dell 30") 1280x800 will still look great when it's running smoothly on your 7800GTX.
  • Spoelie - Monday, December 25, 2006 - link

    yes
  • Jodiuh - Monday, December 25, 2006 - link

    1. Instead of using the "suggested" scene for benching and telling us to expect worse performance, why not take a look at the most stressful scenarios?
    2. Would you say there might be more performance/better compatibility for the 8800s using the newer 97.02s... 97.44s?
    3. Are these "ports" running better on ATI because they were developed mainly for the 360? Thankfully the PS3 is out with NV inside, then?
  • ariafrost - Monday, December 25, 2006 - link

    Looks like with my X850XT overclocked I may be able to run RSV at 1440x900... albeit with medium settings and the widescreen hack from WSGF.

    Graphics performance can only improve as Unreal Engine 3 is tweaked/optimized. I wouldn't despair quite yet :P
  • ariafrost - Monday, December 25, 2006 - link

    Well, forget about running it on my X850XT; apparently RSV *requires* a Pixel Shader 3.0 video card. If anyone could confirm/deny that, it'd be great, but for now it looks like a lot of ill-informed customers may end up buying a game their "128MB/256MB" video cards can't run.
  • justly - Monday, December 25, 2006 - link

    quote:

    It's very evident looking at all of these tests how Rainbow Six: Vegas tends to favor ATI hardware, but again, keep in mind that because of patches and updates this may not (and hopefully won't) be the case for long.


    AnandTech always seems to have a problem whenever it can't recommend NVIDIA as the best solution in every scenario. What is so wrong with the idea that ATI hardware performs better than NVIDIA hardware of the same generation? Maybe I'm mistaken, but I thought even AnandTech expected ATI might do better in newer games.
    Personally I'm not much of a gamer, so it really doesn't matter to me, but for the sake of the people using your articles to choose hardware, why give them expectations that might not materialize?

    Maybe because I'm not engrossed in the gaming experience I have a different perspective, but considering a lot of games are ported over from consoles (or at least designed with consoles in mind), wouldn't it be reasonable to expect any game designed around a console using ATI graphics to favor ATI graphics on the PC? It wouldn't surprise me in the least to see games favoring (or at least being more competitive on) ATI-based hardware for the next year or two.
  • Jodiuh - Monday, December 25, 2006 - link

    Because it's happened before. Remember Oblivion?
  • munky - Monday, December 25, 2006 - link

    Nothing happened. The 7-series still performs much worse than equivalent ATI cards in Oblivion's outdoor scenes with foliage.
    http://www.firingsquad.com/hardware/nvidia_geforce...
  • Frumious1 - Monday, December 25, 2006 - link

    Try not to be so easily offended, Justly. I think the point AnandTech was trying to make is that they hope the performance gap can be reduced somewhat with driver/game updates. There are other games where NVIDIA outperforms ATI, but overall the 7900 GTX offers performance similar to the X1900 XT, and not too much worse than the X1950 XT/XTX cards (I think). Another way of looking at this is that perhaps they just hope SM3 support doesn't turn into another GeForce FX fiasco.

    So far, it looks to me like ATI has the better shader hardware. Ever read any of the stuff on the Folding@home forums by their programmers? Basically, they have stated that G70 has really poor performance on their SM3 code even with optimizations... and it doesn't even look like G80 will be all that great. All that said, I still don't like ATI's drivers. CCC(P) is so sluggish it's pathetic, and that's after the performance improvements since it first came out.
  • jediknight - Monday, December 25, 2006 - link

    I was hoping to see some of the last-gen cards (err... now that the 8800 is out, I guess two gens old) since that's what I'm running, with no hope of upgrading, as I'm on AGP right now.

    Specifically, if future reviews would consider the performance of the X800XL at 1280x1024, I'd be happy :->
  • Spoelie - Monday, December 25, 2006 - link

    You need an SM3 card to play this game; as such, it won't even start on your card.

    Not that I agree with that policy. They should have provided an SM2 path; not everybody has a card less than ~1-1.5 years old.
  • jkostans - Monday, December 25, 2006 - link

    I think it's pretty clear you'll need to run at 800x600 with medium graphics, or 1024x768 with low graphics settings, to get around 20 fps.
  • Tanclearas - Monday, December 25, 2006 - link

    quote:

    The X1950 XTX almost runs the game smoothly at the highest settings, and with some overclocking, Vegas has a good chance of running perfectly fine at maximum details and 1600x1200 with this card. The 7900 GTX, as powerful as it is, just can't manage acceptable performance in the game at 1600x1200, but at one resolution down it looks and plays fine.


    At 1600x1200, the 7900GTX runs at 19.8 FPS and the X1950XTX at 20.4 FPS. Given those numbers, the above quote doesn't really make much sense. Did I miss something?

    And just so people don't think I'm whining, or a fanboy, or whatever: I have an X1900XT (512MB). I'm just honestly confused by the conclusion that the X1950XTX could handle 1600x1200 while the 7900GTX could not.
  • Josh Venning - Monday, December 25, 2006 - link

    Thanks for the comment. The paragraph has been tweaked so that it's a little clearer. The fact is that both the X1950 XTX and the 7900 GTX at reference speeds experience a little choppiness in the game at the highest resolution and quality settings. With some overclocking, either of these cards could run the game smoothly at those settings. Sorry for the confusion.
  • fuhsiang - Monday, December 25, 2006 - link

    I was expecting my 7900GT to last a little longer. However, given the 8800's results in this game, I guess we can say goodbye to the 7900 if we want to play R6: Vegas at 1600x1200.
  • feraltoad - Wednesday, December 27, 2006 - link

    No kidding. I picked up a 1900XT (flashed to XTX) hoping to play at my monitor's full res (1680x1050) until Crysis and the other DX10 games arrived. However, I loaded up Vegas, and my only choices are scaling or a 1:1 window. I love the progress of technology, but it's sad when a $450 video card can't make it a full year at full res (even OC'd to a 688MHz core). Or is it sadder that a video card king now costs $600?
