
RTX is finally here

Underseer

https://www.guru3d.com/news-story/battlefield-v-raytracing-features-are-now-enabled.html

[YOUTUBE]t02BA1wa3qU[/YOUTUBE]

Microsoft's October update got seriously delayed by a bug. Check your Windows computer: it should be bugging you to install the new and improved October update.

Anyway, one of the features of this update is the addition of DXR to DirectX, so now those overpriced RTX cards from nVidia can finally do ray tracing in games. Make that game. Singular. At the moment, Battlefield V is the only game that implements ray tracing, and the performance numbers are disappointing, to say the least.


The RTX 2080 Ti is a $1300+ video card, and if you want to do ray tracing in Battlefield V at the highest settings with it, you're looking at framerates that are barely above 20 FPS at 4K resolution and don't quite hit 60 FPS at HD resolution.
 
That's disappointing. I was holding off on buying Battlefield V until I could get a card that does ray tracing.
 
OHH NO, only 60 FPS at HD resolution!!!!! My nVidia quadro 150m sneers at the weakness of the new 2080ti.
 
OHH NO, only 60 FPS at HD resolution!!!!! My nVidia quadro 150m sneers at the weakness of the new 2080ti.

The point isn't that the 2080 Ti is weak. It's the most powerful and expensive gaming GPU you can buy (without getting into the really exotic stuff that isn't intended for gaming), and even it struggles once you turn ray tracing on.

It's possible that nVidia and game developers will figure out how to optimize the software for ray tracing, but right now it looks like you're trading a lot of performance for what amounts to a small improvement in image quality.

And ray-tracing was supposed to be the justification for the high expense of the new generation of cards.
 
On a semi-related note, yet another bug was found with the October update.

If you have certain Intel drivers, the October update will cause your sound to stop working. Everyone not using those particular drivers can install the new and improved October update just fine, but Microsoft has already pulled the update a second time for anyone with those drivers.
 

There's going to be a performance hit when trading shading tricks that conserve resources for actual ray tracing (I'm assuming multiple-hit light paths).


Maybe turn down the number of hits to get better FPS - unless they're using "ray tracing" to mean something other than what I think it means.

If you do n-1 hits, depending on how they calculate normals, it's potentially exponentially fewer calculations than n hits.

In other words, each hit adds something like:

(k^2)^hits or (k^3)^hits surface-normal calcs [with k being an integer > 1]

+ angle calcs*hits + intensity/distance calcs*hits for solids


and you basically have to double the number of calculations for a ray going through a transparent or translucent material.

So you at least have an exponential increase in normal calcs per hit, unless they're doing some special type of normal calc that I'm not familiar with (I'm uneducated... so).
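
To put rough numbers on that intuition, here's a toy sketch (my own; it assumes a classic Whitted-style recursive tracer where every hit spawns k secondary rays, which real engines cap and cull far more cleverly):

[CODE=python]
# Toy model, not any engine's real implementation: if every hit spawns
# k secondary rays (say one reflection + one refraction, k = 2), the ray
# count -- and with it the normal/angle/intensity calculations -- grows
# like k^depth, so cutting the bounce limit from n to n-1 saves a factor of k.

def rays_traced(k: int, depth: int) -> int:
    """Total rays for one primary ray when each hit spawns k new rays."""
    return sum(k ** d for d in range(depth + 1))

for depth in range(1, 6):
    print(depth, rays_traced(2, depth))  # k = 2 -> 3, 7, 15, 31, 63
[/CODE]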
 
It's almost certainly not the same thing Hollywood CGI people mean by "ray tracing," but whatever it is, Nvidia has been touting this as the Next Great Thing for some time now. We were even shown graphics images that we were meant to think were rendered by these new GPUs rather than pre-rendered like any other Hollywood CGI, as if the new cards would be able to make games look that good.

The reality has been losing half to two thirds of your frame-rate in exchange for prettier reflections in the puddles and being able to see the muzzle flash reflected on your gun barrel.

If you're in a frantic shooter, you're not going to notice those fine details in the reflections, but you are going to notice the massive drop in framerate.
 
Graphics driver 417.22 was just released and claims to offer a 50% increase in ray tracing performance.

Release notes PDF:
https://us.download.nvidia.com/Windows/417.22/417.22-win10-win8-win7-desktop-release-notes.pdf

Even if true, the lowest quality ray-tracing settings would almost get you to 60 frames per second in 4K UHD. Almost. Mind you, this is on a card that almost no one can afford to buy.
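
(For a sense of scale, with a made-up round number: a 50% uplift turns a 40 FPS baseline into 40 × 1.5 = 60 FPS, so the pre-driver framerate at the lowest DXR setting would have to be sitting around 40 FPS at 4K already.)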

Granted, most PC gamers are using HD or QHD monitors, so the performance hit is probably tolerable even on a 2080 (non-TI) or 2070 card, but even so, the image quality improvement is barely noticeable, so why take a performance hit at all?
 
I might be interested if someone could make an external version of such a card.

How to use an external graphics card with a laptop | PCWorld
Many do-it-yourselfers using Thunderbolt 3 or going the ExpressCard/mPCIe route end up with a plug-and-play experience requiring little to no modification—though it takes some research first. When it’s done, however, you’ll be left with a console-toppling PC gaming setup for about the same price as a new Xbox One S, depending on which graphics card you choose. That’s far cheaper than building a whole new gaming desktop, and you can still take advantage of your laptop’s portability by disconnecting the eGPU hardware.
I've also found this: Use an external graphics processor with your Mac - Apple Support

I've found this description: NVIDIA RTX Ray Tracing | NVIDIA Developer
Conventional 3D rendering has used a process called rasterization since the 1990’s. Rasterization uses objects created from a mesh of triangles or polygons to represent a 3D model of an object. The rendering pipeline then converts each triangle of the 3D models into pixels on a 2D screen. These pixels may then be further processed or “shaded” before final display on the screen.

Ray tracing, which has long been used for non-real-time rendering, provides realistic lighting by simulating the physical behavior of light. Ray tracing calculates the color of pixels by tracing the path that light would take if it were to travel from the eye of the viewer through the virtual 3D scene. As it traverses the scene, the light may reflect from one object to another (causing reflections), be blocked by objects (causing shadows), or pass through transparent or semi-transparent objects (causing refractions). All of these interactions are combined to produce the final color of a pixel that is then displayed on the screen.
It links to What's the Difference Between Ray Tracing, Rasterization? | NVIDIA Blog which goes into more detail.
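
For the flavor of it in code, here's a bare-bones toy tracer (my own sketch of the textbook loop, not the actual DXR/RTX API): one sphere, one light, primary rays only, shaded from the surface normal and printed as ASCII art.

[CODE=python]
import math

# Bare-bones version of the quoted description: shoot a ray from the eye
# through each pixel, find what it hits, and shade from the light direction.
# Real DXR/RTX pipelines are vastly more elaborate; this is just the
# textbook loop. Scene: one unit sphere at the origin, one directional light.

def hit_sphere(origin, direction, center, radius):
    """Distance t along the ray to the sphere, or None on a miss."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c          # direction is unit length, so a = 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

WIDTH = HEIGHT = 24
EYE = (0.0, 0.0, -3.0)
LIGHT = (0.577, 0.577, -0.577)    # unit vector pointing toward the light
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Ray through this pixel on a virtual screen in front of the eye.
        px, py = (x / WIDTH) * 2 - 1, 1 - (y / HEIGHT) * 2
        d = (px, py, 1.5)
        norm = math.sqrt(sum(v * v for v in d))
        d = tuple(v / norm for v in d)
        t = hit_sphere(EYE, d, (0, 0, 0), 1.0)
        if t is None:
            row += " "            # ray escaped: background
        else:
            p = tuple(e + t * v for e, v in zip(EYE, d))
            # Unit sphere at the origin: the hit point is its own normal.
            shade = max(0.0, sum(a * b for a, b in zip(p, LIGHT)))
            row += " .:-=+*#%@"[min(9, int(shade * 10))]
    print(row)
[/CODE]

What RTX adds on top of this is secondary rays at each hit (shadows, reflections, refractions), which is exactly where the costs discussed above pile up.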
 
If you have the right kind of USB-C ports, yes, you can use an external GPU, but wouldn't that be kind of cumbersome? One way or the other, gaming on a laptop is generally more expensive at a given performance point.

I'm becoming a curmudgeon as I get older, and when I actually do game, I mostly play the kind of games for which the latest and greatest graphics technology is kind of moot (e.g. strategy games, and certainly not Battlefield V), but I still find the tech fascinating.

I am perhaps being too harsh. New tech like this often has a rough start.

Then again, we're reaching the end of Moore's law, and the only way they can keep increasing performance is adding more cores, so maybe they'll never have enough power to make real-time ray tracing work.

I still reserve the right to laugh at Nvidia after the way they hyped up this ray-tracing technology, though.
 
Real Time Ray-Tracing May Replace GPU Rasterization | CdrInfo.com -- showing raytracing doing specular reflections much better than rasterization.


Here's another wrinkle in high-performance gaming video: High-dynamic-range rendering and High-dynamic-range imaging.

What is HDR gaming? All you need to know | Trusted Reviews explains how it can capture contrast details that plain old 8-bit color depth can't. HDR is usually done with floating point: 16-bit or 32-bit. Here are the bit allocations of the various IEEE-754 floating-point formats:
  • 16 = 1 + 5 + 10 (half)
  • 32 = 1 + 8 + 23 (single)
  • 64 = 1 + 11 + 52 (double)
  • 128 = 1 + 15 + 112 (quadruple)
  • 256 = 1 + 19 + 236 (octuple)
In that standard (IEEE 754), the first bit is the sign bit, the second group of bits is the exponent, and the third group is the fractional part. The sign bit is 0 for positive numbers, 1 for negative numbers. The exponent is stored with a bias added, 2^((number of exponent bits) - 1) - 1, making the encoding of exponent zero 0111... . The fractional part holds the digits after the binary point, because the leading digit is always 1 and is left implicit. The only exception is the smallest (all-zeros) exponent value: there the implicit leading digit is 0 instead of 1 (the subnormal numbers), and the value is multiplied by an extra factor of 2 to compensate.

Half-precision floating-point (16-bit) can exactly represent 11-bit integers, so it should be plenty for imaging and rendering.
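
If you have Python and numpy handy, you can peek at that layout directly (a quick sketch of my own, just illustrating the half-precision format from the list above):

[CODE=python]
import numpy as np

# Half precision: 1 sign + 5 exponent + 10 fraction bits; the exponent
# bias is 2^(5-1) - 1 = 15, so a stored exponent field of 01111 means 2^0.
def half_bits(x: float) -> str:
    bits = int(np.array(x, np.float16).view(np.uint16))
    s = f"{bits:016b}"
    return f"{s[0]} {s[1:6]} {s[6:]}"   # sign | exponent | fraction

print(half_bits(1.0))     # 0 01111 0000000000  (exponent field 15 = the bias)
print(half_bits(-2.0))    # 1 10000 0000000000  (sign 1; exponent 16 - 15 = 1)
print(half_bits(0.5))     # 0 01110 0000000000  (exponent 14 - 15 = -1)
print(half_bits(2048.0))  # 0 11010 0000000000  (2^11: integers exact up to here)
[/CODE]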


The next advance might be doing colors better, by subdividing the visual spectrum further than into our eyes' three primary-color channels. (PDF) 3D Graphics Techniques for Capturing and Inspecting Hyperspectral Appearance discusses some work in that direction. One needs hyperspectral imaging to supply data for such an approach: taking pictures with a special camera that produces many more than three images, in bands of the light spectrum much narrower than red or green or blue. A typical color camera effectively takes just three, one each for red, green, and blue.

The motivation: real-world light comes in a continuum of wavelengths, and (for example), a blue-green light will illuminate a yellow-green surface less than a blue-green surface, while in typical CG, with RGB color, the light and both surfaces would be just plain green.

It's been possible to get away with RGB color because most ordinary light sources and objects have rather broad spectra, making RGB color a usually-good approximation.
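
A toy example of what gets lost (all numbers made up purely for illustration; real spectral renderers use many more bands and measured reflectance curves):

[CODE=python]
# Five wavelength bands instead of three RGB channels. Reflected light per
# band = illumination * reflectance; a display pipeline would then project
# the result back to RGB. With plain RGB, the light and both surfaces below
# could all collapse into the same "green" channel, and the difference is lost.

BANDS = ["450nm", "500nm", "550nm", "600nm", "650nm"]   # blue .. red
light_bluegreen  = [0.6, 1.0, 0.4, 0.1, 0.0]            # peaks near 500 nm
surf_bluegreen   = [0.5, 0.9, 0.5, 0.1, 0.1]            # reflectance, 0..1
surf_yellowgreen = [0.1, 0.4, 0.9, 0.6, 0.2]            # peaks near 550 nm

def reflected(light, surface):
    """Band-by-band product of illumination and reflectance."""
    return [l * s for l, s in zip(light, surface)]

print(sum(reflected(light_bluegreen, surf_bluegreen)))    # 1.41 -> bright
print(sum(reflected(light_bluegreen, surf_yellowgreen)))  # 0.88 -> dimmer
[/CODE]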
 
Uh, correct me if I'm wrong, but most "HDR" on home theaters, consoles, and PCs is just 10-bit integers per color channel instead of the usual 8-bit.

Just getting 10 bits per color and a 60Hz framerate is a difficult trick unless you have a pretty beefy GPU, an expensive monitor, and the right kind of connector between the two. Worse, most PC gamers only care about getting the highest possible refresh rates (e.g. 120 Hz, 144 Hz), and they're willing to trade both resolution and color depth to get it.
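
Some back-of-the-envelope arithmetic on why (my own rough numbers: uncompressed pixel data only, ignoring blanking intervals and line-encoding overhead, so real links need headroom beyond this):

[CODE=python]
# Raw video bandwidth: width * height * refresh * bits-per-channel * 3 channels.
def gbps(width, height, hz, bits_per_channel):
    return width * height * hz * bits_per_channel * 3 / 1e9

print(gbps(3840, 2160, 60, 8))    # ~11.9 Gbit/s: 8-bit 4K at 60 Hz
print(gbps(3840, 2160, 60, 10))   # ~14.9 Gbit/s: 10-bit 4K at 60 Hz
print(gbps(2560, 1440, 144, 10))  # ~15.9 Gbit/s: 10-bit QHD at 144 Hz
[/CODE]

Those last two are already pushing what an HDMI 2.0-class link can carry, which is why the monitor and the connector matter as much as the GPU.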

But that is fascinating. Thanks for the reading material.
 
The Trusted Reviews article suggests using TVs for HDR gaming? Aren't most TVs running at 30Hz?
 
Bargle. I should be in bed already. That PDF looks interesting, but I'm already cross-eyed. Wouldn't that require very different monitors and other display technologies?
 