Freesync and that jazz, OFF?!?

Discussion in 'Video Cards & Monitors' started by BobJustBob, May 9, 2019.

  1. BobJustBob

    BobJustBob Member

    Joined:
    Apr 30, 2009
    Messages:
    339
    Location:
    PERTH, W.A.
    hey. i am thinking about getting a new monitor and it seems most now have that nvidia/amd adaptive sync tech built in. i know what it is, but i must know: can that feature be disabled in hardware (via monitor settings/buttons)? yeah, sure, in 99% of cases you won't want to, but there could be some testing or whatever where i actually want tearing / sync off.

    part 2, bias aside, who makes the better tech? nvidia or amd?

    is there a chart that shows what they do side by side?

    much thanks for any valid replies.
     
  2. PsychoSmiley

    PsychoSmiley Member

    Joined:
    Dec 23, 2001
    Messages:
    6,747
    Location:
    Taranaki, New Zealand
    FreeSync can be toggled in the monitor's own device settings, I know that much, and through the GPU drivers as well. Can't speak for G-Sync monitors though.

    My understanding is G-Sync is more consistent, with lower limits on the effective sync brackets where it still works, but it comes at a cost premium.
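
    On the driver side, if you were on Linux with an AMD card, here's a rough sketch of checking whether the driver even reports the display as VRR-capable. It assumes Xorg plus xrandr, and a driver (e.g. amdgpu) that exposes a "vrr_capable" output property; treat the property name as an assumption for other drivers:

```python
# Hedged sketch: scan xrandr output properties for a VRR capability flag.
# Assumes Linux/Xorg and a driver (e.g. amdgpu) exposing "vrr_capable".
import subprocess

props = subprocess.run(
    ["xrandr", "--props"], capture_output=True, text=True
).stdout

output = None
for line in props.splitlines():
    if line and not line[0].isspace():
        # unindented lines name an output, e.g. "DP-1 connected 2560x1440+0+0"
        output = line.split()[0]
    elif "vrr_capable" in line and output:
        print(f"{output}: vrr_capable = {line.split(':', 1)[1].strip()}")
```

    The actual on/off switch lives in the monitor's OSD and in the vendor control panel (Radeon Settings / NVIDIA Control Panel); take this purely as a way to confirm what the driver sees.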
     
  3. OP
    OP
    BobJustBob

    BobJustBob Member

    Joined:
    Apr 30, 2009
    Messages:
    339
    Location:
    PERTH, W.A.
    wait. what cost? are you suggesting that using either gsync or freesync takes up system resources? (cpu/ram?) huh? that sounds illogical since both are hardware-bound via the monitor. explain what you mean.
     
  4. ^catalyst

    ^catalyst Member

    Joined:
    Jun 27, 2001
    Messages:
    11,875
    Location:
    melbourne
    G-Sync (the old, real shit) costs money in the monitor: there is a chunk of dedicated hardware to do it, and nvidia charges the manufacturer for it.

    OG G-Sync monitors sync across broad frame rate ranges and are all very similar, if not identical, in that aspect.

    FreeSync monitors vary greatly in their sync range.
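
    To put rough numbers on that range point: whether a panel can save you below its window (via low framerate compensation, i.e. frame doubling) depends on how wide the window is. A minimal sketch with made-up example ranges, using the common rule of thumb that LFC needs max ≥ 2x min (AMD's own guidance is closer to 2.5x):

```python
# Sketch: what an adaptive-sync window does with a given frame rate.
# The ranges below are illustrative examples, not any real monitor's specs.

def vrr_behaviour(fps, vrr_min, vrr_max):
    if vrr_min <= fps <= vrr_max:
        return f"{fps} fps: synced directly, refresh follows frame rate"
    if fps < vrr_min:
        # Low Framerate Compensation: repeat each frame n times so the
        # effective refresh lands back inside the window. Rule of thumb:
        # only possible when max >= 2 * min.
        if vrr_max >= 2 * vrr_min:
            n = 2
            while fps * n < vrr_min:
                n += 1
            return f"{fps} fps: LFC shows each frame {n}x at {fps * n} Hz"
        return f"{fps} fps: below window, no LFC -> tearing or v-sync judder"
    return f"{fps} fps: above window, capped (or tearing) at {vrr_max} Hz"

for fps in (35, 70, 160):
    print(vrr_behaviour(fps, 48, 75))   # narrow-range budget FreeSync panel
    print(vrr_behaviour(fps, 30, 144))  # wide-range panel, G-Sync-like
```

    The narrow 48-75 Hz window is where cheap FreeSync screens fall over: dip under 48 fps and there's nothing the monitor can do.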
     
  5. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    unless it's a very old 60/75hz screen i'd leave it on
     
  6. boneburner

    boneburner Member

    Joined:
    Sep 17, 2009
    Messages:
    2,001

    I will assume that question is aimed at video cards? That would depend on what you want to use it for, and on a strictly model-by-model basis.

    Most of the time it's nvidia for strictly gaming - but they also dump terrible products into the market because they can, with arcane numbering systems that purposefully try to lead the average consumer astray (see the current GTX 1650, for example).
     
  7. HobartTas

    HobartTas Member

    Joined:
    Jun 22, 2006
    Messages:
    766
    I have a standard 60 Hz monitor and a G-Sync 144 Hz one, both with Nvidia cards, and these are my observations. I play Battlefield and I like Fraps to show around 80-120 FPS at all times; if it dips below 60 I upgrade something. When playing on both monitors I can't tell the difference, whereas other people claim they can. The only thing I can spot is if I'm standing still, not shooting, and I throw a smoke grenade: on the 60 Hz monitor the smoke grows by appearing in new added chunks, whereas on the 144 Hz one it expands smoothly. That is the only difference I can discern, and only if I'm staring at the screen watching that one effect.

    I turn G-Sync on and off and still can't tell the difference on the G-Sync monitor when I'm getting around 100 FPS either way.

    My advice, based on what I see, is to just get a good 60 Hz monitor, and if you want 144 Hz, get a much cheaper FreeSync monitor just for the 144 Hz side of things. People probably swear by G-Sync when they're getting some crap 25 FPS on their machine and there's obvious tearing, like one half of the screen shifted over by as much as a tenth of the horizontal picture. I might get some tearing, but it's probably only a couple of pixels at most and I can't really see it either, especially on a screen refreshing at least 60 times a second while I'm shooting at people.
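
    For what it's worth, the smoke-grenade observation is just frame persistence arithmetic; a trivial illustrative sketch:

```python
# Why a steadily growing effect looks "chunky" at 60 Hz but smooth at 144 Hz:
# each displayed image simply persists ~2.4x longer at 60 Hz.
for hz in (60, 144):
    print(f"{hz:>3} Hz -> a new image every {1000 / hz:.1f} ms")
#  60 Hz -> a new image every 16.7 ms
# 144 Hz -> a new image every 6.9 ms
# so the cloud gets ~2.4x more intermediate steps at 144 Hz, and each
# visible jump in its size is correspondingly smaller.
```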
     
  8. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    what screen are you gonna do this with?
     
  9. havabeer

    havabeer Member

    Joined:
    Dec 12, 2010
    Messages:
    4,624
    I thought g-sync was really only worth it if you use dual cards... or are gaming under 60fps
     
  10. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    lolwot
     
    walker_2003 likes this.
  11. ^catalyst

    ^catalyst Member

    Joined:
    Jun 27, 2001
    Messages:
    11,875
    Location:
    melbourne
    Nup, legit g-sync is actually really good. I was meh about it until I figured out I didn't have it set up correctly. Dual cards are kinda naff as far as the current scene is concerned, for actual gaming.

    I concur.
     
    nope likes this.
  12. walker_2003

    walker_2003 Member

    Joined:
    Jan 15, 2003
    Messages:
    10,937
    Location:
    Canberra
    Hell na, G-Sync is amazing.. it fixes tearing (meh, so does adaptive v-sync) but its real benefit is the removal of stutter, something a lot of people (especially 100hz+ users) often just assume is low fps looking low.. nup..

    Pre-G-Sync I couldn't handle under 100fps after experiencing 144hz; now with G-Sync that's lowered to 80fps.

    This article explains it well.

    https://www.blurbusters.com/gsync/preview/
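
    To put rough numbers on the stutter point: at a steady 80 fps on a fixed 144 Hz refresh with v-sync, each finished frame waits for the next refresh boundary, so on-screen frame times come out uneven even though render times are perfectly even; VRR just shows every frame for exactly as long as it took to render. A small simulation sketch (assumes perfectly steady render times, which real games never have):

```python
# Sketch: frame display times at a steady 80 fps on a fixed 144 Hz panel
# with v-sync, versus G-Sync/FreeSync. Presentation quantizes to refresh
# boundaries in the fixed case, which is the stutter described above.
import math

refresh_ms = 1000 / 144   # fixed refresh period (~6.9 ms)
frame_ms = 1000 / 80      # steady 80 fps render time (12.5 ms)

t_ready, t_shown, durations = 0.0, 0.0, []
for _ in range(8):
    t_ready += frame_ms
    # next refresh boundary at/after the frame is ready (epsilon guards
    # against float rounding when t_ready lands exactly on a boundary)
    t_next = math.ceil(t_ready / refresh_ms - 1e-9) * refresh_ms
    durations.append(t_next - t_shown)
    t_shown = t_next

print([f"{d:.1f}" for d in durations])   # uneven mix of ~13.9 and ~6.9 ms
print(f"VRR: every frame shown for {frame_ms:.1f} ms")
```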

     
    nope likes this.
  13. OP
    OP
    BobJustBob

    BobJustBob Member

    Joined:
    Apr 30, 2009
    Messages:
    339
    Location:
    PERTH, W.A.
    so i just decided --soon after posting-- to, BAM, upgrade my main monitor. it's all gooooood.. i could babble on but there really is no need. old/new mains: 27" > 31.5" (32"), 120hz > 144hz (not "oc" b/s), 1920x1080 > 2560x1440. so the three main things all went up.. rock and fucking roll. oh, and this new monitor is slightly bent... /curved
    you would not believe this, but just by coincidence, i paid exactly the same as i did for my last one, in 2011. like, what are the odds of that? and it's not a rounded price like $500 or $999.. kinda feels cool to me.. (i did not start looking at the control panel to see if i should adjust things or anything, because as it runs now, just plug & play, all seems fine to me. i might check out what settings it has --later-- just so i know what's available.)

    ..

    not that it matters much, but some days later i also upgraded my tv. it's 4k and blahblah, and you know what about 4k? shove 4k up yer brownie! no, i mean it's lovely, as long as you get the samsung 4k demo off the net and view that to get hard. other than that, wtf is in 4k? absolutely nothing, and certainly not from tv stations. do they even transmit at 1080 yet/at all? maybe once in a blue moon one station may have something at 1080 for an hour or so, amiright? so 4k and all that is cloud dreams. a totally useless tv "feature" for 99.9% of users. (i count console gaming in the 0.1%, and smart tv or not, i'm talking about traditional tv viewing.)
     
    Last edited: May 18, 2019
    nope likes this.
  14. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    lmao what did u even get reee
     
  15. havabeer

    havabeer Member

    Joined:
    Dec 12, 2010
    Messages:
    4,624
    YouTube, Netflix, Foxtel and a few other services offer some 4k content, but you have to pay for them. I think some blu-rays are coming out in 4k as well?

    It's just free-to-air that's not jumping on board.
     
  16. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    basically 4k is beyond useless for media atm.
    there's barely any 4k native content, it's all ai meme upscaled, higher-bitrate re-encoded 2k masters with hdr slapped on.
    can't wait for blu-rays to die off, 4k was a fart
    https://4kmedia.org/real-or-fake-4k/
    dunno if this site is still trustworthy
    i run the screen in my sig for gaming and a 1080p tv, it's just pointless.
    bitrate/encode matters way more these days than raw resolution.
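
    a rough bits-per-pixel comparison backs that up (ballpark, commonly cited bitrates, ignoring codec differences like HEVC vs AVC):

```python
# Sketch: bits per pixel per frame for a few illustrative sources.
# Bitrates are ballpark figures, not measurements of any specific title.
sources = {
    "1080p Blu-ray (~25 Mbps)": (1920, 1080, 25e6),
    "4K stream (~15 Mbps)": (3840, 2160, 15e6),
    "4K UHD disc (~80 Mbps)": (3840, 2160, 80e6),
}

fps = 24  # typical film frame rate
for name, (w, h, bps) in sources.items():
    print(f"{name}: {bps / (w * h * fps):.3f} bits/pixel/frame")
# the 4K stream ends up with far fewer bits per pixel than a decent 1080p
# disc, which is why the encode can matter more than the pixel count.
```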
     
    Last edited: May 18, 2019
  17. OP
    OP
    BobJustBob

    BobJustBob Member

    Joined:
    Apr 30, 2009
    Messages:
    339
    Location:
    PERTH, W.A.
  18. nope

    nope Member

    Joined:
    May 8, 2012
    Messages:
    2,011
    Location:
    nsw meth coast
    wait, what's your pc? also, what hdr is that tv? 100hz 4k would be awesome if it was dp/freesync
     
  19. OP
    OP
    BobJustBob

    BobJustBob Member

    Joined:
    Apr 30, 2009
    Messages:
    339
    Location:
    PERTH, W.A.
    baby boi.. you're mixing a tv and a pc monitor. what's freesync got to do with 4k and such? you're confused. i linked to exactly what i got. one is a standalone tv to watch tv on, and the other a pc monitor to game on and do "internet".
     
  20. walker_2003

    walker_2003 Member

    Joined:
    Jan 15, 2003
    Messages:
    10,937
    Location:
    Canberra
    I don't think he is, btw you're a funny prick lol..

    nope, that's a very entry-level TV, the HDR wouldn't be that amazing and it's for sure 50hz native with 100hz interpolation.. highly doubt it has freesync either.
     
    nope likes this.
