Commit

Described Whitted-style ray tracing demo
victorcoda committed Apr 19, 2021
1 parent 7b34d6e commit 737a34a
Showing 2 changed files with 10 additions and 0 deletions.
10 changes: 10 additions & 0 deletions README.md
@@ -104,3 +104,13 @@ The image space Gaussian filter is an NxN-tap convolution filter that weights th
<img src="./screenshots/edge-detection.jpg" height="140px" align="left">

The Sobel operator, sometimes called the Sobel–Feldman operator or Sobel filter, is used in image processing and computer vision, particularly within edge detection algorithms, where it creates an image emphasising edges. Normally the Sobel filter requires eight texture reads, but we can optimize it under certain circumstances. If it is enough to read a single value per location without texture filtering, we can use the *textureGatherOffsets()* function to gather four scalar values from different locations within a limited offset range (this range is restricted by the size of the hardware texture cache). This demo implements two approaches to find depth discontinuities in the depth buffer: 1) a Sobel filter with two texture gather reads; 2) a single *textureGather()* read followed by screen-space derivatives to compute deltas between adjacent fragments. The resulting gradient magnitude is passed to the *step()* function with some edge threshold to emphasise the silhouettes of the objects. While screen-space derivatives may be cheaper, the Sobel filter gives a smoother result.
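
As a rough illustration of the gradient test described above (not the demo's shader code), here is a minimal CPU-side C++ sketch of a 3x3 Sobel filter over a depth buffer followed by a *step()*-style threshold; the function and parameter names are assumptions for illustration only.

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Minimal CPU sketch of a 3x3 Sobel filter over a depth buffer: the gradient
// magnitude at each texel is thresholded, mimicking step(edgeThreshold, magnitude).
std::vector<float> detectDepthEdges(const std::vector<float>& depth,
                                    int width, int height, float edgeThreshold)
{
    auto at = [&](int x, int y) {
        x = std::clamp(x, 0, width - 1);   // clamp reads at the border
        y = std::clamp(y, 0, height - 1);
        return depth[y * width + x];
    };
    std::vector<float> edges(depth.size(), 0.f);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
        {
            // horizontal and vertical Sobel kernels
            float gx = -at(x - 1, y - 1) - 2.f * at(x - 1, y) - at(x - 1, y + 1)
                       + at(x + 1, y - 1) + 2.f * at(x + 1, y) + at(x + 1, y + 1);
            float gy = -at(x - 1, y - 1) - 2.f * at(x, y - 1) - at(x + 1, y - 1)
                       + at(x - 1, y + 1) + 2.f * at(x, y + 1) + at(x + 1, y + 1);
            float magnitude = std::sqrt(gx * gx + gy * gy);
            edges[y * width + x] = (magnitude > edgeThreshold) ? 1.f : 0.f;
        }
    return edges;
}
```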

### [Ray tracing](ray-tracing/)
<img src="./screenshots/ray-tracing.jpg" height="200px" align="left">

Whitted-style ray tracing was first introduced by Turner Whitted in [An Improved Illumination Model for Shaded Display](https://www.cs.drexel.edu/~david/Classes/Papers/p343-whitted.pdf) (*Graphics and Image Processing, June 1980*). In that paper the illumination returned to the viewer is described as a "tree of rays", and the ray tracing algorithm was programmed in C; for the scenes shown in the paper, image generation times were around an hour or two. Hardware-accelerated ray tracing was first introduced by NVIDIA with their Turing GPU architecture in 2018, see [Turing Ray Tracing Technology](https://www.nvidia.com/content/dam/en-zz/Solutions/design-visualization/technologies/turing-architecture/NVIDIA-Turing-Architecture-Whitepaper.pdf), p. 25. The new hardware RT Cores accelerate Bounding Volume Hierarchy (BVH) traversal and ray/triangle intersection testing. This demo uses the [VK_NV_ray_tracing](https://www.khronos.org/registry/vulkan/specs/1.2-extensions/man/html/VK_NV_ray_tracing.html) extension, which allows the application to utilize the RT Cores of the GPU. From a shading perspective, it implements quite a simple algorithm (a minimal sketch follows the list):

* Generate an image by sending one ray per pixel.
* Check for shadows by sending a ray to the light.
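
To make the two steps above concrete, here is a self-contained CPU sketch in C++ (not the demo's GLSL shaders): a primary ray is traced per pixel, and a secondary ray toward the light decides whether the hit point is shadowed. The sphere scene, pinhole camera and Lambert shading are illustrative assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  normalize(Vec3 v) { return v * (1.f / std::sqrt(dot(v, v))); }

struct Sphere { Vec3 center; float radius; };

// Nearest ray/sphere hit distance along a normalized ray direction, if any
static std::optional<float> intersect(Vec3 origin, Vec3 dir, const Sphere& s)
{
    Vec3 oc = origin - s.center;
    float b = dot(oc, dir), c = dot(oc, oc) - s.radius * s.radius;
    float d = b * b - c;
    if (d < 0.f) return std::nullopt;
    float t = -b - std::sqrt(d);
    return (t > 1e-3f) ? std::optional<float>(t) : std::nullopt;
}

// One primary ray per pixel; one shadow ray toward the light at each hit point
std::vector<float> render(const std::vector<Sphere>& scene, Vec3 eye, Vec3 light,
                          int width, int height)
{
    std::vector<float> image(width * height, 0.f);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x)
        {
            // primary ray through the pixel (simple pinhole camera looking down +Z)
            Vec3 dir = normalize({(x + 0.5f) / width - 0.5f, (y + 0.5f) / height - 0.5f, 1.f});
            float nearest = 1e30f;
            const Sphere* hit = nullptr;
            for (const Sphere& s : scene)
                if (auto t = intersect(eye, dir, s); t && *t < nearest) { nearest = *t; hit = &s; }
            if (!hit) continue; // miss: keep the background color
            Vec3 p = eye + dir * nearest;            // hit point
            Vec3 n = normalize(p - hit->center);     // surface normal
            Vec3 toLight = light - p;
            float distToLight = std::sqrt(dot(toLight, toLight));
            toLight = toLight * (1.f / distToLight);
            // shadow ray: is there any occluder between the hit point and the light?
            bool shadowed = false;
            for (const Sphere& s : scene)
                if (auto t = intersect(p + n * 1e-3f, toLight, s); t && *t < distToLight)
                    { shadowed = true; break; }
            image[y * width + x] = shadowed ? 0.f : std::max(0.f, dot(n, toLight));
        }
    return image;
}
```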

At the API level we need to build top- and bottom-level acceleration structures, which speed up BVH traversal and ray-triangle intersection testing. A BLAS contains the AABBs or triangle geometry to be intersected, whereas a TLAS contains instance data referring to BLASes. If scene objects update their transforms, the TLAS should be rebuilt. Vertex and index data of the scene are provided as storage buffers bound to an indexed descriptor array. The ray-tracing pipeline introduces new shader stages: ray generation, miss, intersection, any-hit and closest-hit. The entire ray-tracing pipeline is executed by a single *vkCmdTraceRaysNV* call.
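
Assuming the acceleration structures, ray-tracing pipeline and shader binding table buffer have already been created, the dispatch itself is short; the SBT layout below (ray-gen group first, then miss, then hit groups at *groupSize*-aligned offsets) is an assumption for illustration, and in real code *vkCmdTraceRaysNV* must be loaded through *vkGetDeviceProcAddr*.

```cpp
#include <vulkan/vulkan.h>

// Hedged sketch of the dispatch, assuming the ray-tracing pipeline, descriptor set
// and a shader binding table buffer laid out as [ray-gen | miss | hit] groups were
// created beforehand; groupSize stands for the aligned shader group handle size.
void traceScene(VkCommandBuffer cmdBuffer, VkPipeline rtPipeline, VkPipelineLayout layout,
                VkDescriptorSet descriptorSet, VkBuffer sbtBuffer, VkDeviceSize groupSize,
                uint32_t width, uint32_t height)
{
    vkCmdBindPipeline(cmdBuffer, VK_PIPELINE_BIND_POINT_RAY_TRACING_NV, rtPipeline);
    vkCmdBindDescriptorSets(cmdBuffer, VK_PIPELINE_BIND_POINT_RAY_TRACING_NV,
                            layout, 0, 1, &descriptorSet, 0, nullptr);
    // A single call launches the whole ray-tracing pipeline: the ray-gen shader
    // runs once per pixel and traces rays that invoke the miss/closest-hit shaders.
    vkCmdTraceRaysNV(cmdBuffer,
        sbtBuffer, 0,                           // ray generation group
        sbtBuffer, groupSize, groupSize,        // miss group(s)
        sbtBuffer, 2 * groupSize, groupSize,    // hit group(s)
        VK_NULL_HANDLE, 0, 0,                   // no callable shaders
        width, height, 1);                      // one ray per pixel
}
```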
Binary file added screenshots/ray-tracing.jpg
