G-Sync technology - Nvidia's solution to performance problems

  • 02.07.2020

Instruction

To change this setting, open your game's menu, find the "Options" or "Settings" section and, under "Video", look for the "Vertical Sync" item. If the menu is in English and the options are text-based, look for the Disabled or "Off" position. Then click the "Apply" button to save the setting. The change takes effect after restarting the game.

Another case is when the application has no such option. Then you will have to configure synchronization through the video card driver. The procedure differs for graphics cards made by AMD (Radeon) and Nvidia (GeForce).

If your video card belongs to the GeForce family, right-click on the desktop and select "Nvidia Control Panel" from the menu. Another option is to open it through the Windows Control Panel, where there is a launch icon with the same name. If you cannot find the right icon in the Control Panel or the desktop menu, look near the clock in the right corner of the screen: there will be a green Nvidia icon that looks like an eye - double-click it. Either way, the video card settings window will open.

The driver control panel window consists of two parts: categories of actions on the left, and options and information on the right. Select the bottom entry, "Manage 3D settings", on the left. On the "Global Settings" tab in the right part of the window, find the "Vertical Sync" option at the very top of the list. Next to it, the current setting is shown: "Enable", "Disable" or "Application settings". Select "Disable" from the drop-down list and confirm your choice by clicking the "Apply" button.

Owners of AMD Radeon video cards configure the driver through the Catalyst application. To launch it, right-click on the desktop and select Catalyst Control Center. Alternatively, open the Windows Control Panel and find the icon with the same name. The third way: in the notification area near the clock, in the lower right corner of the screen, look for a round red symbol and double-click it. The result of all these actions is the same - the control center for your video card's settings will open.

The principle is the same as in the Nvidia control panel. Categories of settings are on the left side of the window, and detailed settings with hints for them are on the right. Select "Games" or "Gaming" in the left column, then the "3D Application Settings" submenu. Settings for various video card parameters will appear on the right. Scroll down the page and find "Wait for vertical refresh"; below it is a slider with four positions. Move the slider to the leftmost position, labelled "Always off". Click the "Apply" button in the lower right corner of the window to save the changes.
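For readers who are also developers, here is a minimal sketch of how an application itself can request V-sync off on Windows, using the widely supported WGL_EXT_swap_control OpenGL extension. This is an illustration on our part, not part of the driver instructions above, and the "force on/off" overrides in those panels still take precedence over whatever the application asks for:

```cpp
#include <windows.h>

// Function pointer type for wglSwapIntervalEXT from WGL_EXT_swap_control.
typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

// Call with an OpenGL rendering context already current on this thread.
bool DisableVSync()
{
    auto wglSwapIntervalEXT = reinterpret_cast<PFNWGLSWAPINTERVALEXTPROC>(
        wglGetProcAddress("wglSwapIntervalEXT"));
    if (wglSwapIntervalEXT == nullptr)
        return false;                      // extension not exposed by the driver
    return wglSwapIntervalEXT(0) == TRUE;  // 0 = present immediately (V-sync off), 1 = wait for vblank
}
```

Direct3D games do the equivalent by presenting with a sync interval of 0.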

G-Sync technology overview | A Brief History of Fixed Refresh Rates

Once upon a time, monitors were bulky and contained cathode ray tubes and electron guns. The guns bombard the screen with electrons, lighting up the colored phosphor dots we call pixels. They draw each "scan" line from left to right, top to bottom. Varying the speed of the electron gun from one full refresh to the next was hardly ever done, and there was no particular need for it before the advent of three-dimensional games. Therefore, CRTs and the associated analog video standards were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the associations responsible for standardizing video signals (led by VESA) did not move away from fixed refresh rates. Movies and television still rely on input at a constant frame rate. Once again, switching to a variable refresh rate did not seem necessary.

Adjustable frame rates and fixed refresh rates do not match

Prior to the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem arose when we first encountered powerful GPUs: the rate at which a GPU renders individual frames (what we call frame rate, usually expressed in FPS, frames per second) is not constant. It changes over time. In a graphically heavy scene, the card may deliver 30 FPS, and when you look at an empty sky, 60 FPS.


Disabling sync causes tearing

It turns out that the variable frame rate of the GPU and the fixed refresh rate of the LCD panel do not work well together. In this configuration, we run into a graphical artifact called tearing. It occurs when two or more partial frames are rendered together during one monitor refresh cycle. They are usually offset from each other, which produces a very unpleasant effect in motion.

The image above shows two well-known artifacts that are often found but difficult to capture. Since these are display artifacts, you won't see them in normal game screenshots, but our screenshots show what you actually see during the game. To shoot them, you need a camera with a high-speed shooting mode. Or if you have a video capture card, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the way we use for FCAT tests. However, it is best to observe the described effect with your own eyes.

The tearing effect is visible in both images. The top one was taken with a camera, the bottom one through the video capture function. The bottom image is "sliced" horizontally and looks misaligned. In the top two images, the left shot was taken on a Sharp screen at 60 Hz, the right on an Asus display at 120 Hz. The tearing on the 120 Hz display is less pronounced since the refresh rate is twice as high. However, the effect is still visible and appears in the same way as in the left image. This type of artifact is a clear indication that the images were taken with vertical sync (V-sync) disabled.


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect seen in the BioShock: Infinite footage is called ghosting. It is especially visible at the bottom of the left image and is related to slow pixel response. In short, individual pixels don't change color fast enough, resulting in this kind of glow. A single frame cannot convey the effect ghosting has on the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will produce a blurry image with any movement on the screen. This is why these displays are generally not recommended for first-person shooters.

V-sync: "an sew on the soap"

Vertical sync, or V-sync, is a very old solution to tearing. When this feature is activated, the graphics card tries to match the screen's refresh rate, which removes tearing completely. The problem is that if your graphics card can't keep the frame rate above 60 FPS (on a 60 Hz display), the effective frame rate will jump between multiples of the screen refresh rate (60, 30, 20, 15 FPS, etc.), which in turn leads to noticeable stuttering.
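The 60/30/20/15 FPS staircase follows directly from that waiting: a finished frame can only appear on the next refresh, so its time on screen is the render time rounded up to a whole number of 16.7 ms periods. A minimal sketch of that arithmetic, using an idealized, double-buffered 60 Hz model of our own rather than anything measured:

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double refresh_ms = 1000.0 / 60.0;                      // one 60 Hz refresh period
    const double render_ms_samples[] = {15.0, 17.0, 34.0, 51.0};  // hypothetical render times

    for (double render_ms : render_ms_samples)
    {
        // With V-sync on, a frame stays on screen until the next refresh after it is ready.
        double shown_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        std::printf("render %5.1f ms -> shown for %5.1f ms -> %4.1f FPS\n",
                    render_ms, shown_ms, 1000.0 / shown_ms);
    }
    // The output steps through 60, 30, 20 and 15 FPS; the jumps between those
    // values are the stutter described above.
    return 0;
}
```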


When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, since V-sync makes the graphics card wait, and sometimes relies on a back buffer of invisible surfaces, it can add extra input latency to the render chain. Thus, V-sync can be both a salvation and a curse, solving some problems while creating other drawbacks. An informal survey of our staff found that gamers tend to leave V-sync off and turn it on only when tearing becomes unbearable.

Get Creative: Nvidia Introduces G-Sync

With the launch of the GeForce GTX 680, Nvidia included a driver mode called Adaptive V-sync, which attempts to mitigate these problems by enabling V-sync when the frame rate is above the monitor's refresh rate and quickly turning it off when performance falls sharply below it. While the technology did its job faithfully, it was only a workaround that did not prevent tearing when the frame rate fell below the monitor's refresh rate.

The G-Sync implementation is far more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable frequency.


The GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packetized data transfer mechanism of the DisplayPort connector has opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal and replacing the monitor's scaler with a module that handles variable blanking, the LCD panel can operate at a variable refresh rate tied to the frame rate the video card outputs (within the monitor's refresh rate range). In practice, Nvidia has made creative use of these features of the DisplayPort interface, killing two birds with one stone.
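Variable refresh over DisplayPort boils down to stretching the vertical blanking interval: the panel is refreshed once per "total" frame of active plus blanking lines, so holding the blanking period longer delays the next refresh until the GPU has a frame ready. A rough model with illustrative timing numbers of our own (not actual G-Sync parameters):

```cpp
#include <cstdio>

int main()
{
    const double pixel_clock_hz = 148.5e6;  // illustrative 1080p-class pixel clock
    const double h_total  = 2200.0;         // pixels per line, including horizontal blanking
    const double v_active = 1080.0;         // visible lines
    const double v_blank_samples[] = {45.0, 170.0, 1170.0};  // progressively longer blanking

    for (double v_blank : v_blank_samples)
    {
        // One refresh takes h_total * (v_active + v_blank) pixel clocks.
        double refresh_hz = pixel_clock_hz / (h_total * (v_active + v_blank));
        std::printf("v_blank = %6.0f lines -> refresh about %5.1f Hz\n", v_blank, refresh_hz);
    }
    return 0;
}
```

In this toy model, stretching the blanking from 45 to 1170 lines slides the refresh rate from 60 Hz down to 30 Hz, which is exactly the kind of range a variable-refresh panel has to cover.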

Even before the tests begin, I want to give the company credit for a creative approach to solving a real problem that affects PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor with the scaler replaced by a G-Sync module. We are already familiar with this display: it is the subject of our article "Asus VG248QE review: $400 24" 144Hz gaming monitor", in which it earned the Tom's Hardware Smart Buy award. Now it's time to find out how Nvidia's new technology affects the most popular games.

G-Sync technology overview | 3D LightBoost, built-in memory, standards and 4K

As we browsed Nvidia's press releases, we asked ourselves quite a few questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed is that Nvidia sent the Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, where it pulses the panel backlight to reduce ghosting (or motion blur). Naturally, we wondered whether that technique is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be ideal, today strobing the backlight at a variable refresh rate causes flickering and brightness problems. Solving them is incredibly difficult, since the brightness has to be adjusted and the pulses tracked. As a result, for now you have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer a need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does it take a frame to pass through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory performs other functions. Still, the processing time for G-Sync is noticeably less than one millisecond. In fact, this is almost the same delay we experience with V-sync turned off, and it depends on the game, video driver, mouse, and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync technology. However, we wanted to ask the developer directly and find out whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard that provides variable refresh rates. After all, Nvidia is a member of the VESA association.

However, no new specifications for DisplayPort, HDMI or DVI are planned. G-Sync already works over DisplayPort 1.2, so the standard does not need to change.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon have a different name). In addition, the company is looking for ways to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD Resolutions

Nvidia promises G-Sync monitors with resolutions up to 3840x2160 pixels. However, the Asus model we are reviewing today only supports 1920x1080. Ultra HD monitors currently use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered whether the G-Sync module supports an MST configuration.

Truth be told, 4K displays with variable refresh rates will have to wait. There is no standalone scaler supporting 4K yet; the nearest one should appear in the first quarter of 2014, with monitors equipped with it arriving only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear after that point. Fortunately, the module natively supports Ultra HD.

What happens before 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. This is because at very low refresh rates the image on an LCD panel begins to decay, which leads to visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel on its own, avoiding possible problems. This means a single image may be displayed more than once, but 30 Hz is the lower bound that keeps image quality at its best.
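Schematically, the behaviour described above can be modelled as a simple deadline rule on the module's side. The sketch below is our reconstruction of that rule for illustration, not Nvidia's firmware, and assumes a hypothetical 45 ms frame:

```cpp
#include <cstdio>

int main()
{
    const double max_hold_ms = 1000.0 / 30.0;  // the panel must be refreshed at least every 33.3 ms
    const double frame_ms    = 45.0;           // hypothetical slow frame (about 22 FPS from the GPU)

    double elapsed = 0.0;
    while (frame_ms - elapsed > max_hold_ms)   // new frame still not ready at the deadline
    {
        elapsed += max_hold_ms;
        std::printf("refresh at %5.1f ms: repeat the previous frame\n", elapsed);
    }
    std::printf("refresh at %5.1f ms: show the new frame\n", frame_ms);
    return 0;
}
```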

G-Sync technology overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to high refresh rate panels only?

You will notice that the first G-Sync monitor has a very high refresh rate (higher than the technology requires) and a resolution of 1920x1080 pixels. But the Asus display has its own limitations, such as a 6-bit TN panel. We were curious: is G-Sync planned only for high-refresh-rate displays, or will we also see it on the more common 60 Hz monitors? In addition, we want access to 2560x1440 resolution as soon as possible.

Nvidia reiterated that the best G-Sync experience comes when your video card keeps the frame rate in the 30 - 60 FPS range. So the technology can genuinely benefit conventional 60 Hz monitors fitted with a G-Sync module.

But why use a 144Hz monitor then? It seems that many monitor manufacturers have decided to implement a low motion blur (3D LightBoost) feature that requires a high refresh rate. But those who decide not to use this function (and why not, because it is not yet compatible with G-Sync) can create a panel with G-Sync for much less money.

Speaking of resolutions, it's shaping up like this: QHD screens with a refresh rate of more than 120Hz could start shipping as early as early 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

Now, of course, you don't need to combine two graphics adapters to display an image in 1080p quality. Even a mid-range Kepler-based graphics card will be able to provide the level of performance needed to comfortably play at this resolution. But there is also no way to run two cards in SLI on three G-Sync monitors in Surround mode.

This limitation comes from the current display outputs on Nvidia cards, which typically have two DVI ports, one HDMI and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect the three monitors in Surround mode to three cards, i.e. one card per monitor. Naturally, we assume that Nvidia's partners will start releasing "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Active triple buffering used to be required for comfortable play with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls, triple buffering actually hurts G-Sync, because it adds an extra frame of latency with no performance gain. Unfortunately, games often force triple buffering on their own, and it cannot be bypassed manually.

What about games that usually react badly when V-sync is disabled?

Games like Skyrim, which is part of our test suite, are designed to run with V-sync on a 60 Hz panel (although this sometimes makes our life difficult because of the input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with games based on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they limited to 60 FPS?

Second, you need a monitor with an Nvidia G-Sync module. This module replaces the screen's scaler, so it is impossible, for example, to add G-Sync to a tiled Ultra HD display. In today's review, we use a prototype with a resolution of 1920x1080 pixels and a refresh rate of up to 144 Hz. But even with it, you can get an idea of the impact G-Sync will have if manufacturers start installing it in cheaper 60 Hz panels.

Third, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term, this means that the only way to run G-Sync on three monitors in Surround mode is to connect them to a triple-SLI setup, since each card has only one DisplayPort connector and DVI-to-DisplayPort adapters do not work in this case. The same goes for MST hubs.

And finally, do not forget about driver support. The latest package version 331.93 beta is already compatible with G-Sync, and we anticipate that future WHQL-certified versions will feature it as well.

Test bench

Test bench configuration
CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base clock, 4.3 GHz overclock, LGA 2011, 15 MB shared L3 cache, Hyper-Threading enabled, power saving features enabled
Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V
Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s
Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
Power supply: Corsair AX860i 860 W

System software and drivers
OS: Windows 8 Professional 64-bit
DirectX: DirectX 11
Video driver: Nvidia GeForce 331.93 Beta

Now we need to figure out in which cases G-Sync has the biggest impact. Chances are good that you are already using a monitor with a 60 Hz refresh rate. 120 and 144 Hz models are growing in popularity among gamers, but Nvidia rightly assumes that the majority of enthusiasts on the market will still stick with 60 Hz.

With V-sync active on a 60Hz monitor, the most noticeable artifacts appear when the card can't deliver 60fps, resulting in annoying jumps between 30 and 60 FPS. There are noticeable slowdowns here. With V-sync disabled, the tearing effect will be most noticeable in scenes where you need to rotate the camera frequently or in which there is a lot of movement. For some players, this is so distracting that they simply turn on V-sync and endure stuttering and input lag.

With refresh rates of 120 and 144 Hz and higher frame rates, the display refreshes more frequently, reducing the time a single frame persists across multiple screen scans when performance dips. However, the problems of both active and inactive vertical sync remain. For this reason, we will test the Asus monitor in 60 Hz and 144 Hz modes with G-Sync both on and off.

G-Sync technology overview | Testing G-Sync with V-Sync enabled

It's time to start testing G-Sync. It remains only to install a video capture card, an array of several SSDs and proceed to the tests, right?

No, it's wrong.

Today we are measuring not performance but quality. In our case, benchmarks can show only one thing: the frame rate at a particular moment in time. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. Therefore, we will have to rely on our carefully considered and, we hope, eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let readers judge for themselves? The problem is that the camera records video at a fixed 60 Hz. Your monitor also plays video back at a constant 60 Hz refresh rate. Since G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-Sync on, G-Sync off, 60Hz, 120Hz, 144Hz, ... The list goes on and on. But we'll start with a 60Hz refresh rate and active vsync.

It's probably easiest to start with Nvidia's own demo utility, which swings a pendulum from side to side. The utility can simulate a frame rate of 60, 50 or 40 FPS, or let the frequency fluctuate between 40 and 60 FPS. You can then disable or enable V-sync and G-Sync. Although the test is artificial, it demonstrates the capabilities of the technology well. You can watch the scene at 50 FPS with V-sync on and think: "Everything is quite good, and the visible stuttering can be tolerated." But after activating G-Sync, you immediately want to say: "What was I thinking? The difference is obvious, like day and night. How could I live with this before?"

But let's not forget that this is a tech demo. I would like evidence based on real games. To do this, you need to run a game with high system requirements, such as Arma III.

For Arma III, we installed a GeForce GTX 770 in the test machine and set ultra settings. With V-sync disabled, the frame rate fluctuates between 40 and 50 FPS. But if you enable V-sync, it drops to 30 FPS. Performance is not high enough to see constant fluctuations between 30 and 60 FPS; instead, the graphics card's frame rate is simply reduced.

Since there was no stuttering to begin with, no significant difference was noticeable when activating G-Sync, except that the actual frame rate jumped 10 - 20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We find Arma generally less "jerky" than many other games, so no lag is felt.

In Metro: Last Light, on the other hand, the influence of G-Sync is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. In this case, you can step the SSAA setting from 1x to 2x to 3x to gradually reduce the frame rate.

In addition, the game's environment includes an antechamber where it is easy to strafe back and forth. We launched the level with V-sync active at 60 Hz and entered the city. Fraps showed that with 3x SSAA the frame rate was 30 FPS, and with anti-aliasing turned off it was 60 FPS. In the first case, stuttering and lag are noticeable; with SSAA disabled you get a completely smooth picture at 60 FPS. However, activating 2x SSAA causes fluctuations between 60 and 30 FPS, and every duplicated frame creates a noticeable annoyance. This is one of the games where we would definitely turn V-sync off and simply ignore the tearing. Many people have already developed that habit anyway.

But G-Sync removes all of the negative effects. You no longer have to watch the Fraps counter, waiting for drops below 60 FPS as a cue to lower yet another graphics setting. On the contrary, you can raise some of them, because even if performance drops to 50 - 40 FPS, there is no obvious stuttering. What happens if you turn off vertical sync? You will find out later.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical synchronization is and what drawbacks users have to put up with because of it. According to them, they resort to V-sync only when tearing, caused by a very large mismatch between the frame rate and the monitor's refresh rate, becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to mistake, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings. And because Crysis 3 is a first-person shooter with very dynamic gameplay, the tearing can be quite noticeable. In the example above, the FCAT output was captured between two frames. As you can see, the tree is cut clean in two.

On the other hand, when we force V-sync off in Skyrim, the tearing isn't as bad. Note that in this case the frame rate is very high, and several frames appear on the screen with each scan. Accordingly, the amount of movement from one frame to the next is relatively small. There are problems with playing Skyrim in this configuration, and it may not be the most optimal one. But it shows that even with V-sync turned off, the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a pretty clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when vsync is enabled.

The last chart shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with V-sync disabled. What the chart cannot show, however, is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate caps at 60 FPS, but there is no stuttering or lag.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over, have gotten used to them and can visually judge how good a particular result is. This is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active immediately catch the eye, because you get the smoothness of V-sync enabled without the tearing characteristic of V-sync disabled. Too bad we can't show the difference on video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games. Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 visited the test bench. All of them, except Skyrim, benefited from G-Sync. The effect depended on the specific game. But once you saw it, you would immediately admit that you had been ignoring the shortcomings that were there all along.

Artifacts can still appear. For example, the crawling effect associated with anti-aliasing is more noticeable with smooth motion. You will most likely want to set anti-aliasing as high as possible in order to remove jaggies that were not so noticeable before.

Skyrim: Special Case

The Creation graphics engine that Skyrim is based on activates vertical sync by default. To test the game at a frame rate above 60 FPS, add the iPresentInterval=0 line to one of the game's .ini files.
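As an illustration, the edit usually looks like the fragment below; the exact file (Skyrim.ini or SkyrimPrefs.ini) and section can differ between game versions, so treat the placement as an assumption rather than a recipe:

```ini
; Hypothetical placement - check which .ini your version of the game reads.
[Display]
iPresentInterval=0   ; 0 = V-sync off (uncapped frame rate), 1 = default V-sync on
```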

Thus, Skyrim can be tested in three ways: in its original state, letting the Nvidia driver "use the application settings"; with G-Sync enabled in the driver and the Skyrim settings left untouched; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the test monitor set to 60 Hz, delivered a stable 60 FPS at ultra settings with a GeForce GTX 770. Consequently, we got a smooth and pleasant picture. However, user input still suffers from latency. In addition, strafing from side to side revealed noticeable motion blur. However, this is how most people play on the PC. Of course, you can buy a screen with a 144 Hz refresh rate, and it will indeed eliminate the blur. But since the GeForce GTX 770 delivers a frame rate of around 90 - 100 FPS, there is noticeable stuttering when the engine fluctuates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture. This is probably due to the active vertical sync, even though the technology is supposed to work with V-sync disabled. Now strafing sideways (especially close to walls) leads to pronounced stuttering. This is a potential problem for 60 Hz panels with G-Sync, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and despite active V-sync, G-Sync works at that frame rate without any complaints.

Disabling vertical sync completely in Skyrim results in much "sharper" mouse control. However, this introduces tearing in the image (not to mention other artifacts such as shimmering water). Enabling G-Sync still leaves stuttering at 60 Hz, but at 144 Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we would not recommend playing that way.

For Skyrim, perhaps the best solution would be to disable G-Sync and play at 60Hz, which will give you a consistent 60fps on your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of the Asus monitor with G-Sync, we were encouraged by the fact that Nvidia is working on a very real problem affecting games that had yet to be addressed. Until now, you could turn V-sync on or off to your liking, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until image tearing becomes unbearable, you are merely choosing the lesser of two evils.

G-Sync solves the problem by allowing the monitor to scan out at a variable rate. Innovation like this is the only way we can continue to advance our industry while maintaining the technical edge that PCs have over consoles and other gaming platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt. However, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you've never seen in action can inspire anyone. But if your first experience with G-Sync is based on Nvidia's pendulum demo, you're bound to wonder whether such a huge difference is even possible, or whether the test represents a special scenario that's too good to be true.

Naturally, when testing the technology in real games, the effect is not so clear-cut. On the one hand there were exclamations of "Wow!" and "Crazy!", on the other - "I think I see the difference." The effect of activating G-Sync is most noticeable when switching the display refresh rate from 60 Hz to 144 Hz. But we also tried testing at 60 Hz with G-Sync to see what you will (hopefully) get with cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already earned our Smart Buy award for outstanding performance. But for us personally, the 6-bit TN panel is a drawback. We would really like to see 2560x1440 pixels on an IPS matrix. We would even settle for a 60 Hz refresh rate if it helped keep the price down.

Although we are expecting a whole raft of announcements at CES, we have not heard official comments from Nvidia about other displays with G-Sync modules or about their prices. We are also not sure about the company's plans for an upgrade module that should let you add G-Sync to an already purchased Asus VG248QE monitor in about 20 minutes.

Now we can say it was worth the wait. You will see that in some games the impact of the new technology is unmistakable, while in others it is less pronounced. But either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is another interesting thought. Now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on this possibility. What if it has something in the works? The end of 2013 and the beginning of 2014 bring us plenty of exciting news to discuss, including the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, AMD's xDMA engine with CrossFire support, and rumors of new dual-GPU graphics cards. Right now, there are not enough graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory that cost less than $1000...

Some things are not just hard to write about, they are very hard. Things you only need to see once, rather than hear about a hundred times or read about on the Internet. For example, it is impossible to describe certain natural wonders, such as the majestic Grand Canyon or the snow-capped Altai Mountains. You can look at beautiful pictures of them a hundred times and enjoy the videos, but none of it will replace seeing them in person.

The topic of smooth frame delivery to a monitor using Nvidia G-Sync technology belongs to the same category: from text descriptions the changes do not seem all that significant, but in the very first minutes of a 3D game on a system with an Nvidia GeForce video card connected to a G-Sync monitor, it becomes clear how big the qualitative leap is. And although more than a year has passed since the announcement, the technology has not lost its relevance, it still has no competitors among solutions that have actually reached the market, and the corresponding monitors continue to be produced.

Nvidia has been working for quite some time on improving the experience of GeForce GPU users in modern games by making rendering smoother. We can recall the Adaptive V-Sync technology, a hybrid that combines the V-Sync On and V-Sync Off modes. When the GPU renders at a frame rate below the monitor's refresh rate, synchronization is disabled, and when the FPS exceeds the refresh rate, it is enabled.
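As a policy, Adaptive V-Sync is easy to express; the snippet below is our own minimal sketch of the described heuristic, not Nvidia's driver code:

```cpp
// Keep V-sync on while the GPU can match the monitor, drop it as soon as the
// frame rate falls below the refresh rate, avoiding the 60 -> 30 FPS snap.
bool WantVSync(double current_fps, double refresh_hz)
{
    return current_fps >= refresh_hz;  // above refresh: sync on (no tearing)
                                       // below refresh: sync off (no stutter)
}
```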

Fluidity issues weren't all solved with adaptive sync, but it was an important step in the right direction. But why was it necessary at all to make some special synchronization modes and even release software and hardware solutions? What is wrong with technologies that have been around for decades? Today we're going to show you how Nvidia's G-Sync technology helps eliminate all known display artifacts, such as tearing, stuttering, and increased latency.

Jumping ahead, we can say that G-Sync synchronization lets you get smooth frame delivery with the highest possible performance and comfort, and this is very noticeable when playing on such a monitor. It is apparent even to the average home user, and for avid gamers it can mean improved reaction times and, with them, better in-game results.

Today, most PC gamers use monitors with a 60 Hz refresh rate - typical LCD screens, the most popular type right now. Accordingly, both with synchronization turned on (V-Sync On) and with it turned off, there are always shortcomings stemming from the basic problems of these old technologies, which we will discuss below: high latency and FPS jerks with V-Sync on, and unpleasant image tearing with it off.

And since lag and uneven frame rates are the more irritating of the two, hardly any players turn synchronization on at all. Even the 120 and 144 Hz monitors that have appeared on the market cannot eliminate the problems completely; they simply make them somewhat less noticeable by updating the screen contents twice as often, but the same artifacts remain: lag and the lack of that same comfortable smoothness.

And since G-Sync monitors paired with an appropriate Nvidia GeForce graphics card can not only provide a high refresh rate but also eliminate all these shortcomings, purchasing such a solution can be considered even more important than upgrading to a more powerful GPU. But let's first understand why something different from the long-known solutions was needed at all - what exactly is the problem?

Problems of Existing Video Output Methods

Technologies for displaying images on a screen with a fixed refresh rate have been around since the days when cathode ray tube (CRT) monitors were used. Most readers should remember them - pot-bellied ones, like ancient televisions. These technologies were originally developed for displaying television images at a fixed frame rate, but in the case of devices for outputting 3D images dynamically calculated on a PC, this solution causes big problems that have not been solved so far.

Even the most modern LCD monitors have a fixed image refresh rate on the screen, although technologically nothing prevents them from changing the picture at any time, at any frequency (within reasonable limits, of course). But PC gamers from the old days of CRT monitors have to put up with a decidedly imperfect solution to the problem of synchronizing 3D rendering frame rate and monitor refresh rate. So far, there have been very few options for displaying an image - two, and both of them have drawbacks.

The root of all problems lies in the fact that with a fixed refresh rate of the picture on the monitor, the video card renders each frame at a different time - this happens due to the constantly changing complexity of the scene and the load on the graphics processor. And the rendering time of each frame is not constant, it changes every frame. It is no wonder that when trying to display a number of frames on the monitor, synchronization problems arise, because some of them require much more time to draw than others. As a result, we get a different preparation time for each frame: sometimes 10 ms, sometimes 25 ms, for example. And the monitors that existed before the advent of G-Sync could only display frames after a certain period of time - not earlier, not later.

The matter is further complicated by the huge variety of hardware and software configurations of gaming PCs, combined with very different loads depending on the game, quality settings, video driver settings and so on. As a result, it is impossible to configure each gaming system so that frames are prepared in constant, or at least not too dissimilar, times across all 3D applications and conditions - as is possible on game consoles with their single hardware configuration.

Naturally, unlike consoles with their predictable frame render times, PC players are still severely limited in their ability to achieve a smooth gaming experience without noticeable drawdowns and lags. In the ideal (read - impossible in reality) case, the update of the image on the monitor should be carried out strictly after the next frame is calculated and prepared by the GPU:

As you can see, in this hypothetical example the GPU always finishes drawing a frame before it has to be transferred to the monitor - the frame time is always slightly less than the time between display updates, and in the gaps the GPU rests a bit. But in reality everything is quite different - the frame rendering time varies a lot. Imagine that the GPU does not manage to render a frame in the allotted time: then the frame must either be displayed later, skipping one image update on the monitor (vertical synchronization enabled - V-Sync On), or the frames must be output in parts with synchronization disabled, in which case the monitor will simultaneously show pieces of several adjacent frames.

Most users turn V-Sync off to get lower latency and smoother screen output, but this solution introduces visible tearing artifacts. And with synchronization turned on, there will be no picture tearing, since the frames are displayed exclusively in their entirety, but the delay between the player’s action and updating the image on the screen increases, and the frame rate is very uneven, since the GPU never draws frames in strict accordance with the refresh time of the picture on the monitor.

This problem has existed for many years and clearly interferes with the comfort of viewing the result of a 3D rendering, but until some time no one bothered to solve it. And the solution, in theory, is quite simple - you just need to display information on the screen strictly when the GPU finishes working on the next frame. But first, let's take a closer look at examples of how existing image output technologies work, and what solution Nvidia offers us in its G-Sync technology.

Disadvantages of output when synchronization is disabled

As we already mentioned, the vast majority of players prefer to keep synchronization turned off (V-Sync Off) in order to get the display of frames rendered by the GPU on the monitor as quickly as possible and with a minimum delay between the player's actions (keypresses, mouse commands) and their display. For serious players, this is necessary for victories, and for ordinary players, in this case, the sensations will be more pleasant. This is how working with V-Sync disabled looks schematically:

There are no problems or delays with frame output here. But although disabling V-sync solves the lag problem as far as possible, providing minimal delay, artifacts appear in the image - picture tearing, when the image on the screen consists of several pieces of adjacent frames rendered by the GPU. The lack of smoothness in the video sequence is also noticeable, due to the unevenness of the frames coming from the GPU to the screen - the image breaks in different places.

This tearing results from displaying two or more frames rendered on the GPU during one monitor refresh cycle: several frames when the frame rate exceeds the monitor's refresh rate, and two when it roughly matches it. Look at the diagram above: if the contents of the frame buffer are updated halfway between the moments when information is output to the monitor, the final picture will be distorted - part of the information belongs to the previous frame and the rest to the one currently being drawn.

With sync disabled, frames are sent to the monitor with absolutely no regard for its refresh rate or time, so they never match the monitor's refresh rate. In other words, with V-Sync turned off, monitors without G-Sync support will always experience such tearing in the picture.

The problem is not only that it is unpleasant for the player to watch stripes twitching all over the screen, but also that rendering parts of different frames at the same time can mislead the brain, which is especially noticeable with dynamic objects in the frame - the player sees parts of objects shifted relative to each other. You have to put up with this only because disabling V-Sync provides minimal output latency at the moment, at the cost of far-from-ideal dynamic image quality, as you can see in the following examples:

Using the examples above, captured with the FCAT software and hardware system, you can see that the real image on the screen can be composed of pieces of several neighboring frames - and sometimes unevenly, when a narrow strip is taken from one frame and the neighboring ones occupy the rest (a noticeably larger part) of the screen.

Tearing problems are even more visible in motion, as the accompanying videos show:

As you can see, even in dynamics, unpleasant artifacts in the form of picture breaks are easily noticeable. Let's see how it looks schematically - in a diagram that shows the output method with synchronization disabled. In this case, the frames arrive on the monitor immediately after the GPU finishes rendering them, and the image is displayed on the display even if the output of information from the current frame has not yet been completely completed - the rest of the buffer falls on the next screen update. That is why each frame of our example displayed on the monitor consists of two frames drawn on the GPU - with an image break in the place marked in red.

In this example, the first frame (Draw 1) is rendered by the GPU into the screen buffer faster than the 16.7 ms refresh time - and before the image has finished being scanned out to the monitor (Scan 0/1). The GPU immediately starts working on the next frame (Draw 2), and the buffer update tears the picture on the monitor, which still contains half of the previous frame.

As a result, in many cases, a clearly distinguishable band appears on the image - the border between the partial display of adjacent frames. In the future, this process is repeated, since the GPU works on each frame for a different amount of time, and without process synchronization, the frames from the GPU and those displayed on the monitor never match.
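Where exactly the band appears can be estimated with a very simple model: if the buffer swap happens part-way through the monitor's 16.7 ms scan, the break lands at the matching screen row. The numbers below are hypothetical and the model ignores blanking time:

```cpp
#include <cstdio>

int main()
{
    const double scan_ms = 1000.0 / 60.0;  // one full top-to-bottom scan at 60 Hz
    const int    rows    = 1080;           // vertical resolution of the panel

    double swap_ms = 9.0;                  // hypothetical buffer swap 9 ms into the scan
    int tear_row = static_cast<int>(rows * swap_ms / scan_ms);
    std::printf("tear line near row %d of %d\n", tear_row, rows);  // roughly row 583
    return 0;
}
```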

Pros and cons of vertical sync

When you turn on the traditional vertical synchronization (V-Sync On), the information on the monitor is updated only when the work on the frame is completely finished by the GPU, which eliminates tearing in the image, because the frames are displayed entirely on the screen. But, since the monitor updates the content only at certain intervals (depending on the characteristics of the output device), this binding brings other problems.

Most modern LCD monitors update information at a frequency of 60 Hz, that is, 60 times per second - approximately every 16 milliseconds. And with sync enabled, image output time is tightly tied to the monitor's refresh rate. But, as we know, GPU frame rate is always variable, and the rendering time of each frame differs depending on the constantly changing complexity of the 3D scene and quality settings.

It cannot always be equal to 16.7 ms, but will be either less than this value or more. With synchronization enabled, the work of the GPU on frames is again completed either earlier or later than the screen refresh time. If the frame was drawn faster than this moment, then there are no particular problems - the visual information just waits for the monitor refresh time to display the frame in its entirety, and the GPU is idle. But if the frame does not have time to render in the allotted time, then it has to wait for the next cycle of updating the image on the monitor, which causes an increase in the delay between the player's actions and their visual display on the screen. At the same time, the image of the previous “old” frame is displayed again on the screen.

Although all this happens quite quickly, the increase in latency is easy to notice visually, and not only by professional players. And since the frame rendering time is always variable, binding the output to the monitor's refresh rate causes jerks when displaying a dynamic image, because frames are shown either quickly (at the monitor's refresh rate) or two, three or four times slower. Consider a schematic example of this behaviour:

The illustration shows how frames are displayed on the monitor when vertical sync is on (V-Sync On). The first frame (Draw 1) is rendered by the GPU faster than 16.7 ms, so the GPU does not move on to work on rendering the next frame, and does not tear the image, as in the case of V-Sync Off, but waits for the first frame to be fully displayed on the monitor. And only after that it starts drawing the next frame (Draw 2).

But work on the second frame (Draw 2) takes more than 16.7 ms, so once that time expires, the visual information from the previous frame is shown on the screen for another 16.7 ms. And even after the GPU finishes the next frame, it is not displayed immediately, because the monitor has a fixed refresh rate. In total, you have to wait 33.3 ms for the second frame to appear, and all of this time is added to the delay between the player's action and the moment the frame reaches the monitor.

On top of the latency problem, there is also a loss of smoothness in the video sequence, noticeable as jerkiness in the 3D animation. The problem is shown very clearly in a short video:

But even the most powerful graphics processors in demanding modern games cannot always provide a sufficiently high frame rate that exceeds the typical monitor refresh rate of 60 Hz. And, accordingly, they will not give the possibility of a comfortable game with the synchronization turned on and the absence of problems such as tearing the picture. Especially when it comes to games such as the multiplayer Battlefield 4, the very demanding Far Cry 4 and Assassin's Creed Unity at high resolutions and maximum game settings.

That is, the modern player has little choice: either accept a lack of smoothness and increased delays, or put up with imperfect picture quality with torn pieces of frames. Of course, in reality everything does not look quite so dire - after all, we have somehow been playing all this time, right? But in an era when people strive for the ideal in both quality and comfort, you want more. Moreover, LCD displays are fundamentally, technologically capable of displaying a frame whenever the graphics processor says so. All that remains is to connect the GPU and the monitor - and such a solution already exists: Nvidia G-Sync technology.

G-Sync technology - Nvidia's solution to performance problems

So, with synchronization turned off, most modern games show picture tearing, and with it turned on, uneven frame delivery and increased latency. Even at high refresh rates, traditional monitors cannot get rid of these problems. Apparently, the choice between two far-from-ideal ways of displaying frames bothered Nvidia's engineers so much over the years that they decided to eliminate the problems by giving gamers a completely new approach to updating information on the display.

The difference between G-Sync technology and existing display methods is that the timing and frame rate in the case of the Nvidia variant is determined by the Geforce GPU, and it is dynamically changing, not fixed, as it was before. In other words, in this case, the GPU takes full control of the output of frames - as soon as it finishes working on the next frame, it is displayed on the monitor, without delays and image breaks.

Using this kind of connection between the GPU and the specially adapted monitor hardware gives players the best output method - just perfect, in terms of quality, eliminating all the problems we mentioned above. G-Sync provides a perfectly smooth frame change on the monitor, without any delays, jerks and artifacts caused by the display of visual information on the screen.

Naturally, G-Sync does not work magically, and for the technology to work on the monitor side, it requires the addition of special hardware logic in the form of a small board supplied by Nvidia.

The company is working with monitor manufacturers to include G-Sync cards in their gaming display models. For some models, there is even an upgrade option by the user himself, but this option is more expensive, and it doesn’t make sense, because it’s easier to immediately buy a G-Sync monitor. From a PC, it is enough to have any of the modern Nvidia Geforce video cards in its configuration, as well as an installed G-Sync-optimized video driver - any of the latest versions will do.

When Nvidia G-Sync technology is enabled, after processing the next frame of the 3D scene, the Geforce GPU sends a special signal to the G-Sync controller board built into the monitor, which tells the monitor when to update the image on the screen. This allows you to achieve just perfect smoothness and responsiveness when playing on a PC - you can see this by watching a short video (required at 60 frames per second!):

Let's see how the configuration with enabled G-Sync technology looks like, according to our scheme:

As you can see, everything is very simple. Enabling G-Sync ties the monitor's refresh rate to the end of rendering each frame on the GPU. The GPU completely controls the work: as soon as it finishes rendering the frame, the image is immediately displayed on a G-Sync-compatible monitor, and as a result, the display refresh rate is not fixed, but variable - exactly like the GPU frame rate. This eliminates image tearing issues (because it always contains information from a single frame), minimizes jitter in frame rate (the monitor doesn't wait longer than a frame is physically processed by the GPU), and reduces output latencies relative to the vsync-enabled method.
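The timing difference can be summarized in a few lines; this is an idealized, double-buffered model of ours (with an assumed 30-144 Hz panel range), not a measurement of the actual hardware:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

int main()
{
    const double refresh_ms = 1000.0 / 60.0;   // fixed 60 Hz panel for the V-Sync case
    const double min_ms     = 1000.0 / 144.0;  // fastest a G-Sync panel can scan out
    const double max_ms     = 1000.0 / 30.0;   // slowest before the module repeats a frame
    const double render_ms_samples[] = {12.0, 20.0, 28.0};  // hypothetical render times

    for (double render_ms : render_ms_samples)
    {
        // V-Sync On: wait for the next fixed refresh after the frame is ready.
        double vsync_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
        // G-Sync: refresh as soon as the frame is ready, within the panel's limits.
        double gsync_ms = std::min(std::max(render_ms, min_ms), max_ms);
        std::printf("render %4.1f ms: V-Sync shows it after %4.1f ms, G-Sync after %4.1f ms\n",
                    render_ms, vsync_ms, gsync_ms);
    }
    return 0;
}
```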

It must be said that players clearly lacked such a solution. The new method of synchronizing the GPU and the monitor in Nvidia G-Sync has a very strong effect on the comfort of playing on a PC - that almost perfect smoothness appears which was not there before, even in our era of super-powerful graphics cards! Since the announcement of G-Sync, the old methods have instantly come to feel like an anachronism, and upgrading to a G-Sync monitor capable of a variable refresh rate of up to 144 Hz seems a very attractive option for finally getting rid of the problems, lag and artifacts.

Does G-Sync have drawbacks? Of course, like any technology. For example, G-Sync has an annoying limitation: it provides smooth frame output only at 30 FPS and above. And the refresh rate selected for the monitor in G-Sync mode sets the upper bound on how fast the screen can be refreshed. That is, with the refresh rate set to 60 Hz, maximum smoothness is provided in the 30-60 FPS range, and at 144 Hz in the 30-144 FPS range, but not below the lower limit. With a variable frame rate (for example, from 20 to 40 FPS), the result will no longer be ideal, although noticeably better than traditional V-Sync.

But the biggest downside of G-Sync is that it is Nvidia's proprietary technology, to which competitors have no access. Therefore, at the beginning of the outgoing year, AMD announced the similar FreeSync technology, which likewise dynamically changes the monitor's refresh rate to match the delivery of frames from the GPU. An important difference is that AMD's development is open and does not require additional hardware in the form of specialized monitors, since FreeSync has been transformed into Adaptive-Sync, an optional part of the DisplayPort 1.2a standard from the well-known VESA (Video Electronics Standards Association). It turns out that AMD will skillfully use the theme developed by its competitor for its own benefit, since without the appearance and popularization of G-Sync, we believe, there would have been no FreeSync.

Interestingly, Adaptive-Sync technology is also part of the VESA embedded DisplayPort (eDP) standard, and is already used in many display components that use eDP for signal transmission. Another difference from G-Sync is that VESA members can use Adaptive-Sync without the need for any payment. However, it is very likely that Nvidia will also support Adaptive-Sync in the future as part of the DisplayPort 1.2a standard, since such support will not require much effort from them. But the company will not refuse G-Sync either, as it considers its own solutions to be a priority.

The first Adaptive-Sync-enabled monitors should arrive in the first quarter of 2015, not only with DisplayPort 1.2a ports, but also with dedicated Adaptive-Sync support (not all DisplayPort 1.2a-enabled monitors will be able to boast this). Thus, in March 2015, Samsung plans to launch the Samsung UD590 (23.6 and 28 inches) and UE850 (23.6, 27 and 31.5 inches) monitor lines with support for UltraHD resolution and Adaptive-Sync technology. AMD claims that monitors with this technology will be up to $100 cheaper than similar devices with G-Sync support, but it's difficult to compare them, since all monitors are different and come out at different times. In addition, there are already not so expensive G-Sync models on the market.

Visual difference and subjective impressions

Above we covered the theory; now it is time to show everything in practice and describe our impressions. We tested Nvidia's G-Sync technology in several 3D applications using an Inno3D iChill Geforce GTX 780 HerculeZ X3 Ultra graphics card and an Asus PG278Q monitor that supports G-Sync. Several G-Sync monitors from different manufacturers are on the market: Asus, Acer, BenQ, AOC and others, and for the Asus VG248QE monitor you can even buy a kit to upgrade it to G-Sync support yourself.

The lowest-end video card that supports G-Sync is the Geforce GTX 650 Ti, with the essential requirement of a DisplayPort connector on board. Other system requirements include at least the Microsoft Windows 7 operating system and a good DisplayPort 1.2 cable; a quality mouse with high sensitivity and polling rate is also recommended. G-Sync works with all full-screen 3D applications that use the OpenGL and Direct3D graphics APIs on Windows 7 and 8.1.

Any recent driver will do: G-Sync has been supported by all of the company's drivers for more than a year. If you have all the required components, you only need to enable G-Sync in the drivers, if that has not already been done, and the technology will work in all full-screen applications - and only in them, which follows from the very principle of the technology.

To enable G-Sync for full-screen applications and get the best possible experience, you need to enable the 144 Hz refresh rate in the Nvidia control panel or in the operating system's display settings. Then make sure that the technology is allowed on the corresponding "Set up G-Sync" page.

You also need to select the appropriate item in the "Vertical sync" parameter on the "Manage 3D settings" page of the global 3D settings. There you can likewise disable G-Sync for testing purposes or if any problems appear (looking ahead: we found none during our testing).

G-Sync works at all resolutions supported by the monitors, up to UltraHD, but in our case we used the native resolution of 2560×1440 pixels at 144 Hz. For comparisons with the current state of affairs, we used a 60 Hz mode with G-Sync disabled to emulate the behavior of the typical non-G-Sync monitors most gamers own - most of them use Full HD monitors limited to 60 Hz.

It should be mentioned that even though with G-Sync enabled the screen refreshes at the ideal moment - when the GPU "wants" it - the optimal mode is still rendering at roughly 40-60 FPS: this is the most suitable frame rate for modern games, not so low as to hit the 30 FPS lower limit, yet not requiring a reduction in settings. Incidentally, this is exactly the frequency Nvidia's Geforce Experience program aims for, providing suitable settings for popular games in the software of the same name supplied with the drivers.

In addition to games, we also tried a specialized test application from Nvidia - the Pendulum Demo. It shows a 3D pendulum scene whose smoothness and quality are easy to judge, lets you simulate different frame rates and select the display mode: V-Sync Off/On or G-Sync. With this test software it is very easy to show the difference between the synchronization modes - for example, between V-Sync On and G-Sync:

The Pendulum Demo app allows you to test different sync methods under different conditions. It can simulate a fixed frame rate of 60 FPS to compare V-Sync and G-Sync under conditions ideal for the legacy sync method - in this mode there should simply be no difference between them. But the 40-50 FPS mode puts V-Sync On in an awkward position: lags and jerky frame changes are visible to the naked eye, since the frame rendering time exceeds the refresh period at 60 Hz. When G-Sync is turned on, everything becomes perfect.

As for comparing V-Sync Off with G-Sync enabled, here the Nvidia application also helps to see the difference: at frame rates between 40 and 60 FPS, image tearing is clearly visible, although there is less lag than with V-Sync On. And the video sequence also looks less smooth than in G-Sync mode, even though in theory it should not - perhaps this is how the brain perceives the "torn" frames.

With G-Sync enabled, any mode of the test application (constant or variable frame rate - it doesn't matter) always produces the smoothest possible video. In games, the problems of the traditional approach to updating the image on a fixed-refresh monitor are sometimes even more noticeable; here you can clearly appreciate the difference between all three modes using StarCraft II as an example (viewing a previously saved replay):

If your system and browser support playback of MP4/H.264 video at 60 FPS, you will clearly see that in the no-sync mode obvious tearing is visible, while with V-Sync enabled jerks and an uneven video sequence are observed. All of this disappears when Nvidia G-Sync is turned on: no artifacts in the image, no increase in delays, no "ragged" frame rate.

Of course, G-Sync is not a magic wand, and the technology will not get rid of delays and slowdowns whose cause lies outside the process of displaying frames on a fixed-refresh monitor. If the game itself has problems with smooth frame delivery and large jerks in FPS caused by texture loading, CPU-side processing, suboptimal use of video memory, lack of code optimization and so on, they will remain in place. Moreover, they will become even more noticeable, since the output of the remaining frames will be perfectly smooth. In practice, however, such problems do not occur too often on powerful systems, and G-Sync really does improve the perception of dynamic video.

Since Nvidia's new technology affects the entire output pipeline, it could in theory cause artifacts and frame-rate ripples of its own, especially if the game artificially caps FPS at some value. Such cases, if they exist at all, are probably so rare that we did not notice a single one. What we did note was a clear improvement in comfort: playing on a monitor with G-Sync enabled, it feels as if the PC has become powerful enough to sustain a constant frame rate of at least 60 FPS without any drops.

The feeling you get when playing on a G-Sync monitor is very difficult to put into words. The difference is especially noticeable at 40-60 FPS, a frame rate very common in demanding modern games. The difference compared with conventional monitors is simply amazing, and we will try not only to describe it in words and show it in video examples, but also to present frame-rate graphs captured in the different display modes.

In games of genres such as real-time strategy and the like - StarCraft II, League of Legends, DotA 2 and so on - the benefits of G-Sync are clearly visible, as the video above shows. These games demand fast-paced action and do not tolerate delays or an uneven frame rate, and smooth scrolling plays a rather important role in comfort; it is badly hindered by tearing with V-Sync Off and by delays and lags with V-Sync On. So G-Sync is ideal for games of this type.

First-person shooters like Crysis 3 and Far Cry 4 are even more common; they are also very demanding on computing resources, and at high quality settings players often get a frame rate of only around 30-60 FPS - ideal conditions for G-Sync, which genuinely improves comfort when playing under such conditions. The traditional vertical sync method will very often force frames to be output at only 30 FPS, increasing lags and jerks.

The same goes for third-person games like the Batman, Assassin's Creed and Tomb Raider series. These games also use the latest graphics technology and require fairly powerful GPUs to achieve high frame rates. At maximum settings with V-Sync off they often produce 30-90 FPS, which causes unpleasant tearing. Enabling V-Sync only helps in some scenes with lower resource requirements, and the frame rate jumps between 30 and 60 in steps, which causes slowdowns and jerks. Enabling G-Sync solves all these problems, and this is very noticeable in practice.

Practice test results

In this section we'll look at the impact of G-Sync and V-Sync on frame rates - the performance graphs give a good idea of how the different technologies work. During testing we tried several games, but not all of them are convenient for showing the difference between V-Sync and G-Sync: some game benchmarks do not allow forcing V-Sync, other games lack a convenient tool for replaying the exact same game sequence (most modern games, unfortunately), and still others run too fast or within narrow frame-rate limits on our test system.

So we ended up with Just Cause 2 at max settings and a couple of benchmarks: Unigine Heaven and Unigine Valley - also at max quality settings. The frame rate in these applications varies quite widely, which is convenient for our purpose - to show what happens to the output of frames in various conditions.

Unfortunately, we do not have the FCAT hardware and software system at our disposal at the moment, so we will not be able to show real FPS graphs and recorded videos in the different modes. Instead, we measured per-second average and instantaneous frame rates using a well-known utility, at 60 and 120 Hz monitor refresh rates with the V-Sync On, V-Sync Off and Adaptive V-Sync methods, and with G-Sync at 144 Hz, to show the difference between the new technology and today's typical 60 Hz monitors with traditional vertical sync.

G-Sync vs. V-Sync On

We will start our study by comparing V-Sync On with G-Sync - this is the most revealing comparison, since both methods are free of image tearing. First, let's look at the Heaven test application at maximum quality settings at a resolution of 2560×1440 pixels (clicking on the thumbnails opens the charts in full resolution):

As you can see in the graph, the frame rate with G-Sync enabled and with sync disabled is almost the same, except at frequencies above 60 FPS. But the FPS with vertical synchronization enabled differs noticeably, because there the frame rate can only be 60 FPS divided by an integer (1, 2, 3, 4, 5, 6, ...), since the monitor sometimes has to show the same frame over several refresh periods (two, three, four and so on). The possible frame-rate "steps" with V-Sync On at 60 Hz are therefore 60, 30, 20, 15, 12, 10, ... FPS.

This stepping is clearly visible on the red line of the graph: during this test run the frame rate was often 20 or 30 FPS and much more rarely 60 FPS, whereas with G-Sync and V-Sync Off (No Sync) it often sat in a wider range of 35-50 FPS. With V-Sync enabled such output rates are impossible, so in those cases the monitor always shows 30 FPS, limiting performance and adding lag to the overall output time.
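
As a quick illustration of this quantization (our own sketch, not part of the original test methodology), the snippet below lists the frame-rate "steps" available at a given refresh rate and the step a given per-frame render time ends up on; calling it with refresh_hz=120 produces the extra steps discussed in the 120 Hz comparison further below.

```python
# Illustration of V-Sync frame-rate quantization (hypothetical numbers).

def vsync_steps(refresh_hz, count=6):
    """First few frame-rate 'steps' reachable with V-Sync at a given refresh rate."""
    return [round(refresh_hz / n, 1) for n in range(1, count + 1)]

def effective_fps(render_time_ms, refresh_hz=60):
    """Frame rate actually shown when every frame takes render_time_ms to render."""
    period = 1000.0 / refresh_hz
    ticks_per_frame = -(-render_time_ms // period)    # ceiling: refresh intervals per frame
    return refresh_hz / ticks_per_frame

print(vsync_steps(60))     # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
print(vsync_steps(120))    # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0]
print(effective_fps(22))   # a 22 ms frame (~45 FPS from the GPU) is shown at 30.0 FPS
```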

It should be noted that the graph above shows not the instantaneous frame rate but values averaged over a second; in reality FPS can "jump" far more, almost every frame, which causes unpleasant unevenness and lag. To see this visually, here are a couple of graphs of instantaneous FPS - more precisely, of the rendering time of each frame in milliseconds. The first example (the lines are slightly shifted relative to each other; only representative behavior in each mode is shown):

As you can see, in this example the frame time changes more or less smoothly with G-Sync and in steps with V-Sync On (there are isolated spikes in rendering time in both cases - this is normal). With V-Sync enabled, render and output times can only be 16.7 ms, 33.3 ms or 50 ms, which in FPS terms corresponds to 60, 30 and 20 frames per second. Beyond that, there is no great difference in the behavior of the two lines; there are peaks in both cases. Now let's look at another representative stretch of time:

Here there are obvious spikes in frame rendering time, and with them in FPS, in the case of vertical synchronization. Look: with V-Sync On the frame rendering time jumps from 16.7 ms (60 FPS) to 33.3 ms (30 FPS) and back, and in reality this causes exactly that uncomfortable unevenness and clearly visible jerks in the video. The smoothness of frame delivery with G-Sync is much higher, and playing in this mode is noticeably more comfortable.

Consider the FPS graph in the second test application - Unigine Valley:

In this benchmark we see roughly the same thing as in Heaven. The frame rates in the G-Sync and V-Sync Off modes are almost identical (except for a peak above 60 FPS), while enabling V-Sync causes a clearly stepped change in FPS, most often showing 30 FPS, sometimes dropping to 20 FPS and rising to 60 FPS - the typical behavior of this method, causing lags, jerks and an uneven video sequence.

In this subsection, it remains for us to consider a segment from the built-in test of the Just Cause 2 game:

This game perfectly shows the inferiority of the outdated V-Sync On synchronization method! With the frame rate varying from 40 to 60-70 FPS, the G-Sync and V-Sync Off lines are almost identical, but with V-Sync On the frame rate reaches 60 FPS only in short stretches. That is, with the GPU genuinely capable of 40-55 FPS, the player has to make do with only 30 FPS.

Moreover, in the section of the graph where the red line jumps between 30 and 40 FPS, the image in reality shows a clearly uneven frame rate - it flips between 60 and 30 almost every frame, which certainly does not add smoothness or comfort. But perhaps vertical sync copes better at a 120 Hz refresh rate?

G-Sync vs. V-Sync 60/120Hz

Let's take a look at the V-Sync On mode at 60 Hz and 120 Hz refresh rates and compare both with V-Sync Off (as we established earlier, this line is almost identical to G-Sync). At a 120 Hz refresh rate, more values are added to the FPS "steps" we already know: 120, 40, 24, 17 FPS and so on, which can make the graph less stepped. Let's look at the frame rate in the Heaven benchmark:

Noticeably, the 120Hz refresh rate helps V-Sync On achieve better performance and smoother framerates. In cases where 20 FPS is observed at 60 Hz on the graph, 120 Hz mode gives an intermediate value of at least 24 FPS. And 40 FPS instead of 30 FPS is clearly visible on the graph. But there are not fewer steps, but even more, so the frame rate at 120 Hz refresh, although it changes by a smaller amount, does it more often, which also adversely affects the overall smoothness.

There are fewer changes in the Valley benchmark, as the average frame rate is closest to the 30 FPS step available for both modes: 60 and 120 Hz refresh rates. Disabled sync provides smoother frame transitions, but with visual artifacts, and V-Sync On modes again show stepped lines. In this subsection, it remains for us to look at the Just Cause 2 game.

And again, we clearly see how flawed the vertical synchronization is, which does not provide a smooth frame change. Even going to 120Hz refresh rate gives V-Sync On just a few extra FPS "steps" - frame rate jumps back and forth from one step to another - all this is very unpleasant when watching animated 3D scenes, you can take our word for it or watch the video examples above again.

Effect of output method on average frame rate

So what happens to the average frame rate when all these synchronization modes are enabled - how does turning on V-Sync or G-Sync affect average performance? You can roughly estimate the speed loss from the FPS graphs shown above, but we will also give the average frame rates we obtained during testing. First up again is Unigine Heaven:

The figures in the Adaptive V-Sync and V-Sync Off modes are almost the same, since the speed hardly ever rises above 60 FPS. Logically, enabling V-Sync lowers the average frame rate, because this mode forces FPS onto the steps. At 60 Hz the average frame rate dropped by more than a quarter, and switching to 120 Hz recovered only half of that loss.

What interests us most is how much the average frame rate drops in G-Sync mode. For some reason the speed above 60 FPS was cut, even though the monitor was set to 144 Hz, so the speed with G-Sync enabled turned out to be slightly lower than with sync disabled. Overall, though, we can say there are essentially no losses, and they certainly cannot be compared with the loss of speed under V-Sync On. Let's consider the second benchmark, Valley.

In this case, the drop in average rendering speed in modes with V-Sync turned on decreased, since the frame rate was close to 30 FPS throughout the test - one of the "steps" of frequency for V-Sync in both modes: 60 and 120 Hz. Well, for obvious reasons, the losses in the second case turned out to be slightly lower.

With G-Sync enabled, the average frame rate again turned out to be lower than in the no-sync mode, for the same reason: enabling G-Sync cut off FPS values above 60. But the difference is small, and Nvidia's new mode provides noticeably higher speed than with vertical sync enabled. Let's look at the last chart - the average frame rate in Just Cause 2:

In the case of this game, the V-Sync On mode suffered significantly more than in the test applications on the Unigine engine. The average frame rate in this mode at 60 Hz is more than one and a half times lower than when synchronization is disabled at all! Enabling a refresh rate of 120 Hz greatly improves the situation, but still G-Sync allows you to achieve noticeably better performance even in average FPS numbers, not to mention the comfort of the game, which is now not assessed by numbers alone - you have to see it with your own eyes.

So, in this section we found that G-Sync technology provides a frame rate close to the mode with sync disabled, and enabling it has almost no effect on performance. This is in contrast to V-Sync, which forces the frame rate onto steps and often makes it jump from one step to the next, causing jerky motion during animated sequences and harming the comfort of 3D games.

In other words, both our subjective impressions and the test results indicate that Nvidia's G-Sync technology really does change the visual comfort of 3D games for the better. The new method is free of the graphical artifacts seen with V-Sync disabled - image tears composed of several adjacent frames - and also of the problems with smooth frame delivery and the increased output latency of V-Sync On.

Conclusion

Given all the difficulty of objectively measuring the smoothness of video output, we would first like to give a subjective assessment. We were quite impressed with the gaming experience on an Nvidia Geforce card and the G-Sync-enabled Asus monitor. Even a single "live" demonstration of G-Sync makes a strong impression with the smoothness of frame changes, and after an extended trial of the technology it becomes very dispiriting to go on playing on a monitor with the old methods of putting an image on screen.

G-Sync can perhaps be considered the biggest change in the process of displaying visual information on screen in a long time. We have finally seen something genuinely new in the link between displays and GPUs, something that directly - and very noticeably - affects the comfort of viewing 3D graphics. Before the announcement of Nvidia's G-Sync technology, we were stuck for years with outdated display standards rooted in the requirements of the TV and film industries.

Of course, we would have liked to get such capabilities earlier, but now is a good time to introduce them, since in many demanding 3D games at maximum settings modern top-end video cards deliver frame rates at which the benefits of G-Sync are greatest. Before this technology from Nvidia appeared, the realism achieved in games was simply "killed" by far-from-ideal ways of updating the picture on the monitor, causing tearing, increased delays and jerky frame rates. G-Sync lets you get rid of these problems by matching the screen's refresh rate to the GPU's rendering speed (albeit with some limitations) - the GPU itself is now in charge of this process.

We have not met a single person who tried G-Sync in action and remained dissatisfied with it. The reviews of the first lucky ones who tested the technology at the Nvidia event last autumn were entirely enthusiastic. Journalists from the specialist press and game developers (John Carmack, Tim Sweeney and Johan Andersson) supported them and also gave extremely positive feedback on the new output method. We now join them: after several days of using a monitor with G-Sync, we have no desire to return to old devices with long-outdated synchronization methods. If only there were more G-Sync monitors, and if only they were not equipped exclusively with TN panels...

As for the drawbacks of Nvidia's technology, it works only at frame rates of 30 FPS and above, which can be considered an unfortunate limitation - it would be better if the image were displayed cleanly even at 20-25 FPS, as soon as it was prepared on the GPU. But the main disadvantage is that G-Sync is the company's own solution, which other GPU makers, AMD and Intel, do not use. Nvidia can be understood: it spent resources developing and implementing the technology and negotiated support with monitor manufacturers precisely with the intention of making money. In fact, it once again acted as the engine of technical progress, despite the greed for profit that many attribute to the company. Let's reveal the big "secret": profit is the main goal of any commercial company, and Nvidia is no exception.

And yet the future more likely belongs to more universal open standards similar in essence to G-Sync, such as Adaptive-Sync, an optional feature of DisplayPort 1.2a. The appearance and spread of monitors with such support will take some more time - probably until the middle of next year - whereas G-Sync monitors from various companies (Asus, Acer, BenQ, AOC and others) have already been on sale for several months, though not exactly cheap. Nothing prevents Nvidia from supporting Adaptive-Sync in the future, although it has not officially commented on the subject. Let's hope that Geforce fans not only have a working solution now in the form of G-Sync, but will in the future also be able to use dynamic refresh rates within the generally accepted standard.

Among the other disadvantages of Nvidia G-Sync for users, we note that supporting it costs the monitor manufacturer a certain amount, which translates into a higher retail price relative to standard monitors. However, among G-Sync monitors there are models at different prices, including some that are not too expensive. The main thing is that they are already on sale, and every player can get maximum comfort right now - so far only with Nvidia Geforce video cards, but the company stands behind the technology.

What is vertical sync in games? This function is responsible for correct display of games on standard LCD monitors with a 60 Hz refresh rate. When it is enabled, the frame rate is capped at 60 FPS and no tearing appears on the screen. Disabling it raises the frame rate, but at the cost of a screen-tearing effect.

What is vertical sync in games for?

V-sync is a rather controversial topic in games. On the one hand, for a visually comfortable gaming experience, it seems to be very necessary, provided that you have a standard LCD monitor.

Thanks to it, no artifacts appear on screen during the game; the picture is stable and free of tears. The downside is that the frame rate is capped at 60 FPS, so more demanding players may experience what is called input lag - a slight delay when moving the mouse in the game (it can be likened to artificially smoothed mouse movement).
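
A back-of-the-envelope way to see where part of that input lag comes from (a simplified sketch under basic double-buffering assumptions, not a measurement of any particular setup): a frame that misses a refresh tick has to wait for the next one, so at 60 Hz up to one full 16.7 ms refresh period can be added on top of the render time.

```python
# Simplified estimate of the extra wait V-Sync adds to a finished frame
# (hypothetical numbers; real input lag also includes other pipeline stages).

def added_vsync_wait_ms(render_time_ms, refresh_hz=60):
    """Time a finished frame waits for the next refresh tick; worst case = one period."""
    period = 1000.0 / refresh_hz
    return period - (render_time_ms % period)

print(round(added_vsync_wait_ms(20, 60), 1))    # a 20 ms frame waits ~13.3 ms for the 33.3 ms tick
print(round(added_vsync_wait_ms(20, 120), 1))   # at 120 Hz the same frame waits only ~5.0 ms
```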

Disabling vertical sync also has its pros and cons. First of all, the frame rate is no longer capped, which completely removes the input lag mentioned above. This is useful in games like Counter-Strike, where reaction and accuracy matter: movement and aiming are very crisp and dynamic, and every mouse movement registers with high precision. In some cases we can also get a slightly higher FPS, since V-Sync, depending on the video card, can slightly reduce performance (the difference is about 3-5 FPS). Unfortunately, the drawback is that without vertical sync we get screen tearing: when turning or changing direction in the game, the image appears torn into two or three horizontal parts.

Enable or disable V-Sync?

Is vertical sync necessary? It all depends on our individual preferences and what we want to get. In multiplayer FPS games, it is recommended to turn off vertical sync to improve aim accuracy. The screen tearing effect, as a rule, is not so noticeable, and when we get used to it, we will not even notice it.

In turn, in story-driven games you can safely turn V-Sync on. Here high precision is not so important; the environment and visual comfort come first, so it is worth opting for image quality.

Vertical sync can usually be turned on or off in the game's graphics settings. But if there is no such option there, you can set it manually in the video card driver settings - either globally or for selected applications only.

Vertical sync on NVIDIA graphics cards

On GeForce graphics cards, the feature is located in the Nvidia Control Panel. Right-click on the Windows 10 desktop and then select Nvidia Control Panel.

In the sidebar, select "Manage 3D settings" under the 3D Settings section. The available settings will be displayed on the right.

The settings are divided into two tabs: global and program settings. On the first tab you set options for all games, including, for example, whether vertical sync is enabled or disabled. On the second tab you can set the same parameters individually for each game.

Select the global or program tab, then look for the "Vertical sync" option in the list. Next to it is a drop-down field where you choose to force vertical synchronization off or on.

V-Sync on AMD graphics

With AMD graphics cards it looks almost exactly the same as with Nvidia. Right-click on the desktop and open the Catalyst Control Center.

Then open the "Games" tab on the left and select "Settings for 3D applications". On the right, a list of available options will be displayed that can be forced to be enabled from the position of the AMD Radeon graphics settings. When we are on the "System Settings" tab, we select for everyone.

If you need to set parameters individually for a particular game, click the "Add" button and point to its EXE file. It will be added to the list as a new tab, and when you switch to it you can set parameters for that game only.

When you have selected the tab with the added application or system parameters (general), then find the option "Wait for vertical update" in the list. A selection box will appear where we can forcibly enable or disable this option.

V-Sync on integrated Intel HD Graphics

If using an integrated Intel HD Graphics chip, a control panel is also available. It should be available by right-clicking on the desktop or via the Ctrl+Alt+F12 key combination.

In the Intel panel, go to the settings mode (Control Panel - 3D Graphics), and then to the custom settings.

Here we find the "Vertical Sync" option. You can force it on by selecting "Enabled", or leave it at "Application Settings". Unfortunately, there is no force-off option for Intel HD graphics - V-Sync can only be forced on. Since vertical synchronization cannot be disabled in the video card driver, this can only be done in the settings of the game itself.


Windows 10 small FPS and floating mouse :: Counter-Strike: Global Offensive General Discussions


Windows 10, small FPS, floating mouse

Good day everyone. I recently upgraded my system to Win10 (previously Win7). After the update I ran into several problems. The first is relatively low FPS compared to what I had before: after installing Win10 I seem to have lost FPS. To explain the essence of the problem: I have an average system, and on the "seven" I had a good 200-300 FPS. On the "ten" my FPS does not rise above 60, either in the menu or in the game itself. I looked almost all over the Internet and did not find a solution. The second problem is a slight mouse floating, which is barely felt but greatly interferes with accurate aiming. P.S. this problem did not exist before installing 10. My system: GPU: GeForce GTX 660 Ti; CPU: Intel Core i3-3220 3.3 GHz; RAM: 8 GB; hard drive (on which CS:GO is installed): 2 TB; monitor: ASUS VK278 60 Hz; mouse: Razer DeathAdder 2013; pad: Razer Goliathus Speed; keyboard: Razer BlackWidow Ultimate 2013.

Please share your thoughts on this topic. I will be very happy)



Windows 10 update allows you to turn off v-sync and unlock max fps

11.05.2016 02:22

Game projects optimized for the Universal Windows Platform (UWP) with DirectX 12 support can be run without activating the V-Sync option. The update also adds support for NVIDIA G-SYNC and AMD FreeSync technologies.

The update will help to avoid twitching and delays on the screen, as well as improve the visual quality of the image.

Microsoft has stated that Gears of War: Ultimate Edition and Forza Motorsport 6: Apex will receive patches with this option in the very near future.

Automatic updating will gradually come to all computers with the "tenth" version of Windows.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical synchronization is and what compromises users have to accept with it. According to them, they resort to V-sync only when tearing caused by the large mismatch between frame rate and monitor refresh rate becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to mistake, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings, and because it is a first-person shooter with very dynamic gameplay, the tearing can be quite noticeable. In the example above, the FCAT output was captured between two frames; as you can see, the tree is completely cut in two.

On the other hand, when we force V-sync off in Skyrim, the tearing isn't that bad. Note that in this case the frame rate is very high and several frames appear on screen with each scan, so the amount of movement per frame is relatively low. There are problems when playing Skyrim in this configuration, and it may not be the most optimal one, but it shows that even with V-sync turned off, the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a pretty clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when vsync is enabled.

The last graph shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with V-sync disabled. What the graph cannot show, however, is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate will cap at 60 FPS, but there will be no stuttering or added delay.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over again, have learned to judge visually how good a particular result is - this is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active immediately catch the eye, because it has the smoothness of V-sync On but without the tearing characteristic of V-sync Off. Too bad we cannot show the difference in video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games: Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 all visited the test bench. All of them, except Skyrim, benefited from G-Sync. The effect varied from game to game, but once you saw it, you would immediately admit that you had been ignoring shortcomings that were there all along.

Artifacts can still appear. For example, the crawling effect associated with anti-aliasing is more noticeable with smooth motion, so you will most likely want to set anti-aliasing as high as possible to remove unpleasant jaggies that were not so noticeable before.

Skyrim: Special Case

The Creation graphics engine that Skyrim is based on activates vertical sync by default. To test the game at a frame rate above 60 FPS, add the iPresentInterval=0 line to one of the game's .ini files.
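
As a hedged example of what that tweak looks like (the exact .ini file varies between installations - commonly Skyrim.ini or SkyrimPrefs.ini under Documents\My Games\Skyrim - so check your own setup), the key is placed under the [Display] section:

```ini
[Display]
; 0 disables the engine's built-in vertical sync; the default value of 1 keeps it on
iPresentInterval=0
```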

Thus, Skyrim can be tested in three ways: in its original state, letting the Nvidia driver "use the application settings"; with G-Sync enabled in the driver and the Skyrim settings left intact; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the test monitor set to 60 Hz, delivered a stable 60 FPS at ultra settings with a GeForce GTX 770 video card, so we got a smooth and pleasant picture. However, user input still suffers from latency, and strafing from side to side revealed noticeable motion blur. Still, this is how most people play on PC. Of course, you can buy a screen with a 144 Hz refresh rate and it really will eliminate the blur, but since the GeForce GTX 770 delivers a frame rate of around 90-100 FPS, there will be noticeable stuttering as the engine fluctuates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture; this is probably due to vertical sync remaining active, even though the technology is supposed to work with V-sync disabled. Now lateral strafing (especially close to walls) produces pronounced stuttering. This is a potential problem for 60 Hz panels with G-Sync, at least in games like Skyrim. Fortunately, on the Asus VG248QE monitor you can switch to 144 Hz mode, and despite active V-sync, G-Sync works at that frame rate without any complaints.

Disabling vertical sync completely in Skyrim results in much "sharper" mouse control, but it introduces tearing in the image (not to mention other artifacts such as shimmering water). Enabling G-Sync leaves the stuttering in place at 60 Hz, but at 144 Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we would not recommend playing that way.

For Skyrim, perhaps the best solution would be to disable G-Sync and play at 60Hz, which will give you a consistent 60fps on your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of an Asus monitor with G-Sync technology, we were encouraged by the fact that Nvidia is working on a very real problem affecting games that had yet to be addressed. Until now you could turn V-sync on or off as you pleased, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until image tearing becomes unbearable, you could say you are choosing the lesser of two evils.

G-Sync solves the problem by allowing the monitor to scan out the screen at a variable rate. Innovation like this is the only way we can keep advancing our industry while maintaining the technical edge PCs have over gaming consoles and other platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt; however, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after the technology was announced, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you have never seen in action can inspire anyone. But if your first experience with G-Sync is based on Nvidia's pendulum demo, you are bound to wonder whether such a huge difference is even possible, or whether the demo represents a special scenario that is too good to be true.

Naturally, when testing the technology in real games, the effect is not so unambiguous. On the one hand there were exclamations of "Wow!" and "Incredible!", on the other - "I think I see a difference." The effect of enabling G-Sync is most noticeable when switching the display refresh rate from 60 Hz to 144 Hz. But we also tried testing at 60 Hz with G-Sync to see what you would (hopefully) get with cheaper displays in the future. In some cases simply going from 60 to 144 Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for about $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already received our Smart Buy award for outstanding performance, but for us personally the 6-bit TN panel is a drawback. We would really like to see 2560x1440 pixels on an IPS panel; we would even settle for a 60 Hz refresh rate if it helped keep the price down.

Although we are expecting a whole batch of announcements at CES, we have not heard official comments from Nvidia about other displays with G-Sync modules or their prices. We are also not sure what the company's plans are for the upgrade module, which should let you add G-Sync to an already-purchased Asus VG248QE monitor in about 20 minutes.

Now we can say it was worth the wait. You will see that in some games the impact of the new technology is unmistakable, while in others it is less pronounced. But either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is another interesting thought. After we have tested G-Sync, how much longer will AMD be able to avoid commenting? The company teased our readers in an interview (in English), noting that it would soon decide on such a capability. What if it has something in mind? The end of 2013 and the beginning of 2014 bring us plenty of exciting news to discuss: the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, AMD's XDMA engine with CrossFire support, and rumors of new dual-GPU graphics cards. Right now we don't have enough graphics cards with more than 3 GB (Nvidia) and 4 GB (AMD) of GDDR5 memory, but they cost less than $1000...