In this assignment you will add a materials system to the core you built in A1. You will need to modify your mesh handling to include new attributes for texture coordinates and tangents; add material parameter, texture map, and lighting environment loading to your .s72 loader (and get these textures uploaded to the GPU); update your drawing code to handle multiple shaders and binding sets; and write shader code that implements tangent-space normal maps.
Scoring: this assignment is worth 15 points, total. Thanks to a convenient and intentional construction, each "point" is worth exactly 1% of your final course grade. Points are allocated as follows: A2-env is worth 2 points; A2-tone is worth 1 point; A2-diffuse is worth 3 points; A2-normal is worth 2 points; A2-pbr is worth 3 points; A2x-displacement is worth up to 1 extra point; and A2-create is worth 4 points.
Points will be awarded for each section based on both the code and the sections of a report demonstrating and benchmarking the code.
Reminder: this class takes a strict didactic, ethical, and legal view on copying code.
Write code that can:
- Render "environment" and "mirror" materials.
- Render a "lambertian" material, including albedo from a texture map. Build a utility to process lighting environment cube maps to produce lambertian look-up-table cube maps.
- Render a "pbr" material. Add support for GGX-specular-importance-sampled mip-map generation to your cube map processing utility.

Use your creativity (and your code) to make something beautiful: a textured, normal-mapped model in a scene that shows it off.
Demonstrate and test your code; write a report which includes screen recordings, images, timings, examples, and graphs. This is non-trivial. See the report template to understand what is required.
Turn in your code in /afs/cs.cmu.edu/academic/class/15472-f24/<andrewid>/A2/.
Your turn-in directory should include:
- report/ -- report describing your code and illustrating that it works.
  - report/report.html -- start with the report template and replace the "placeholder" sections.
  - report/*.s72, *.b72 -- benchmarking scenes and data mentioned in your report.
  - report/* -- other files (images, animations, cubemaps) needed by your report.
- code/ -- the code you wrote for this assignment.
  - code/.git -- your code must be stored in a git repository with a commit history showing your development process.
  - code/Maekfile.js -- build script. When run as node Maekfile.js, produces bin/viewer (bin/viewer.exe on Windows), your compiled scene viewer; and bin/cube (bin/cube.exe on Windows), your compiled cube-map utility. (Benchmarking scene files belong in the report/ folder; the scene file for your model should be in the model/ folder.)
- model/ -- your created model.
  - model.s72 -- your main model Scene'72 file.
  - *.b72 -- any data files needed by your scene.
  - *.png -- any texture or lighting data files needed by your scene.
  - model.mp4 -- a screen recording (H.264 in MP4 container) of your model shown in your viewer. You may wish to rotate the view, the lighting, or otherwise demonstrate the model.
We expect your Maekfile.js to properly build your viewer on at least one of { Linux/g++; Windows/cl.exe; macOS/clang++ }.
We will not penalize you for minor cross-platform compile problems, though we would appreciate it if you tested on Linux.
We will compile and run your code in an environment set up to build the nakluV tutorial -- GLFW 3.4 and the Vulkan SDK will be available.
We expect your report to be viewable in Firefox on Linux. You may wish to consult MDN to determine format compatibility for any embedded videos.
Same as A1.
This assignment uses the scene'72 (.s72) format.
In this assignment, you may make the following simplifying assumptions about scene'72 files:
- Meshes will not have an "indices" property.
- Meshes will have a "topology" of "TRIANGLE_LIST".
- Meshes will have exactly the following "attributes" (though, likely, with a different "src" file):
"attributes":{
"POSITION": { "src":"cube.b72", "offset":0, "stride":48, "format":"R32G32B32_SFLOAT" },
"NORMAL": { "src":"cube.b72", "offset":12, "stride":48, "format":"R32G32B32_SFLOAT" },
"TANGENT": { "src":"cube.b72", "offset":24, "stride":48, "format":"R32G32B32A32_SFLOAT" },
"TEXCOORD": { "src":"cube.b72", "offset":40, "stride":48, "format":"R32G32_SFLOAT" },
}
"type":"cube"
textures have format "format":"rgbe"
.
Update your code to support the new "ENVIRONMENT" type in the Scene'72 specification; and add support for the "environment" and "mirror" materials to show it off.
Note that both "environment"
and "mirror"
both use the lighting environment as a look-up table, but with a different vector.
"environment"
uses the normal directly, while "mirror"
uses the reflection vector (note, also, the GLSL reflect
function).
Be mindful of coordinate systems in your shader code -- your shader needs to deal with object-local coordinates (vertices), environment coordinates (when looking up lighting directions), and clip coordinates. I tend to do lighting in world space and pass a vec3 EYE (camera) point uniform; but with some careful noodling you can do lighting in other coordinate systems.
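For concreteness, here is a minimal fragment shader sketch of both lookups, written assuming world-space position and normal varyings and an environment aligned with world space (all of the names here are illustrative, not prescribed):

#version 450
layout(set=0, binding=1) uniform samplerCube ENVIRONMENT;
layout(push_constant) uniform Push { vec3 EYE; }; //world-space camera position
layout(location=0) in vec3 position; //world-space fragment position
layout(location=1) in vec3 normal; //world-space surface normal
layout(location=0) out vec4 outColor;

void main() {
    vec3 n = normalize(normal);
    //"environment" looks up along the normal:
    vec3 dir = n;
    //"mirror" instead reflects the view direction about the normal:
    //vec3 dir = reflect(normalize(position - EYE), n);
    outColor = vec4(texture(ENVIRONMENT, dir).rgb, 1.0); //(tone-map before display; see A2-tone)
}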
Our lighting environments will often have a very high dynamic range (the brightest direction is many many times brighter than the darkest direction). In order to deal with this high dynamic range while still maintaining relatively compact files we will use an RGBE (RGB + shared exponent) encoding inspired by the color format in the Radiance renderer's .hdr image format. Note that we are just using the color encoding part of this spec, and will actually store the data in some other image format (probably .png)!
To convert from a stored value (\(rgbe\)) to a radiance value (\( rgb' \)), multiply as follows:
\[ rgb' \gets 2^{e-128} \cdot \frac{rgb + 0.5}{256} \]
With one particular quirk, which is that we will map \( (0,0,0,0) \to (0,0,0) \) so that true black is supported.
You will likely find frexp and ldexp useful in converting to/from RGBE format.
Note also that Radiance's color handling code is available for inspiration, as is some old RGBE-handling code from 15-466.
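As a sketch of the arithmetic (written here as GLSL helpers -- e.g., for decoding RGBE textures sampled as RGBA8_UNORM directly in a shader; the C++ in your cube utility is analogous, using std::frexp and std::ldexp):

//Decode an RGBE texel; components arrive as byte/255 when sampled from an
//RGBA8_UNORM texture. Maps (0,0,0,0) to true black, per the quirk above:
vec3 rgbe_to_rgb(vec4 rgbe) {
    if (rgbe == vec4(0.0)) return vec3(0.0);
    float e = rgbe.a * 255.0; //stored exponent byte
    return (rgbe.rgb * 255.0 + 0.5) / 256.0 * exp2(e - 128.0);
}

//Encode linear radiance as RGBE (returned in [0,1] for storage in an 8-bit image):
vec4 rgb_to_rgbe(vec3 rgb) {
    float m = max(rgb.r, max(rgb.g, rgb.b));
    if (m < 1e-32) return vec4(0.0);
    int e;
    frexp(m, e); //m == f * 2^e with f in [0.5, 1)
    vec3 bytes = floor(rgb * exp2(float(-e)) * 256.0); //max component lands in [128, 255]
    return vec4(bytes, float(e + 128)) / 255.0;
}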
Now that we're working with environment probes that operate in real-world radiance units, your viewer code should take care to adopt a "linear light" + "tone mapping" rendering flow.
In other words, your code (probably in a fragment shader) should first compute a fragment radiance, and then convert this radiance to the displayed color value by using a "tone mapping" operator.
I suggest that you investigate glslc's #include functionality to allow all of your material shaders to use the same tone mapping function.
I leave the choice of tone mapping operator to you, but do require you to support something more sophisticated than linear.
This might also be an interesting time to consider supporting an HDR output surface. Though, for the purposes of the exercise, your code should still have an LDR output mode with tone mapping available.
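For example, a shared tonemap.glsl that every material shader #includes might contain something like the following minimal sketch (Reinhard with an exposure factor -- one possible operator among many, with illustrative names):

//tonemap.glsl -- shared across material shaders via glslc's #include (illustrative)
vec3 tonemap(vec3 radiance, float exposure) {
    vec3 c = radiance * exposure;
    return c / (1.0 + c); //Reinhard: compresses [0, inf) into [0, 1)
    //(if your swapchain format is not *_SRGB, also gamma-encode here)
}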
You already have a "lambertian"
material;
to light it with an environment you need to develop some code to "pre-convolve" the environment cubemap with a cos-weighted hemisphere to make a lambertian look-up table cube map;
and modify your lambertian shader to support looking up into an environment cube map.
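Concretely, each texel of the lookup cube map (indexed by normal direction \( n \)) stores a cosine-weighted integral of the environment; if you draw sample directions \( \omega_i \) with probability density \( \cos\theta_i / \pi \) over the hemisphere \( H(n) \), the Monte Carlo estimate reduces to a plain average (this convention folds the lambertian \( 1/\pi \) into the table, so the shader just multiplies the lookup by albedo):
\[ E(n) = \int_{H(n)} L(\omega) \frac{\cos\theta}{\pi} \, d\omega \approx \frac{1}{N} \sum_{i=1}^{N} L(\omega_i) \]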
You should produce a utility cube that, when run with the command cube in.png --lambertian out.png, reads a cubemap from in.png, samples it to produce a lambertian lookup table cube map, and stores this in out.png.
Both files should be in rgbe encoding (RGB with a shared exponent in the A component).
Your engine should use Vulkan's support for cube maps when creating the cube images and accessing them in shaders. This will require you to add extra parameters to the image creation and uploading functions (or make special versions for layered images).
Tip: look into VK_FORMAT_E5B9G9R9_UFLOAT_PACK32 for storing HDR images compactly on the GPU.
Tip: the diffuse lookup cubemap can be very small because it is pretty darn low frequency. E.g., having an edge length of 16 pixels is reasonable.
Tip: the s72 specification does provide guidance for how to name your lambertian cubemaps, relative to the source lighting environment.
Add support for normal maps. This will require carrying a tangent frame through the vertex shader and into the fragment shader.
Be aware that normal maps in s72 are stored as 2D textures, scaled and offset as \( n * 0.5 + 0.5 \). Forgetting to scale and bias to expand these normals will result in strange apparent normal directions.
Accidentally storing normal maps in _SRGB-format textures will do even stranger things!
The caution about being clear about coordinate systems in A2-env definitely also applies here.
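A minimal sketch of the lookup in a fragment shader, assuming world-space normal and tangent varyings and (as in glTF) a fourth tangent component holding the bitangent sign -- names and conventions here are illustrative:

layout(set=1, binding=1) uniform sampler2D NORMAL_MAP;
layout(location=1) in vec3 normal; //world-space geometric normal
layout(location=2) in vec4 tangent; //world-space tangent; .w = bitangent sign
layout(location=3) in vec2 texcoord;

vec3 shading_normal() {
    //expand from [0,1] storage back to [-1,1] (inverse of n * 0.5 + 0.5):
    vec3 n_tan = texture(NORMAL_MAP, texcoord).rgb * 2.0 - 1.0;
    //re-orthonormalize the interpolated tangent frame:
    vec3 N = normalize(normal);
    vec3 T = normalize(tangent.xyz - dot(tangent.xyz, N) * N);
    vec3 B = cross(N, T) * tangent.w;
    return normalize(n_tan.x * T + n_tan.y * B + n_tan.z * N);
}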
Add support for the "pbr"
material type,
using the split-sum approximation with precomputed specular mip-maps and a look-up table as described in Epic's 2013 SIGGRAPH course talk (and notes -- with code!).
You may also find the 2012 Disney Talk (and notes) useful, especially for describing the BRDF parameters more clearly.
The glTF 2.0 specification adopts a similar BRDF and includes some implementation information that might come in handy -- I especially appreciated the description of how to handle metalness. (Though their implementation doesn't deal with image-based lighting and the split-sum approximation.)
In order to "pre-integrate" the convolution of the GGX specular lobe and a given lightmap, expand your cube map utility to support
the command line option cube in.png --ggx out.png
, which will read a cubemap from in.png
, sample it to produce an stack of importance-sampled-with-GGX-at-different-roughnesses lookup table cube maps, and stores them in out.1.png
(first mip level after the base level; lowest non-zero roughness; half the side length of the input cube) through out.N.png
(smallest mip level / highest roughness).
Tip: the s72 specification does provide guidance for how to name your ggx mip-level cubemaps, relative to the source lighting environment.
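The heart of the pre-integration is importance sampling of GGX-distributed half-vectors; adapted to GLSL from the code in the Epic course notes, it looks like the sketch below (your cube utility runs the equivalent C++ over every output texel, with roughness chosen per mip level):

//Map a uniform point Xi in [0,1)^2 to a GGX-distributed half-vector about N
//(alpha = roughness^2, following the Epic notes):
vec3 importance_sample_ggx(vec2 Xi, float roughness, vec3 N) {
    float a = roughness * roughness;
    float phi = 2.0 * 3.14159265359 * Xi.x;
    float cosTheta = sqrt((1.0 - Xi.y) / (1.0 + (a * a - 1.0) * Xi.y));
    float sinTheta = sqrt(1.0 - cosTheta * cosTheta);
    vec3 H = vec3(sinTheta * cos(phi), sinTheta * sin(phi), cosTheta);
    //rotate from the local frame into the frame around N:
    vec3 up = abs(N.z) < 0.999 ? vec3(0.0, 0.0, 1.0) : vec3(1.0, 0.0, 0.0);
    vec3 tangentX = normalize(cross(up, N));
    vec3 tangentY = cross(N, tangentX);
    return tangentX * H.x + tangentY * H.y + N * H.z;
}

For each output texel you would then reflect the lookup direction about sampled half-vectors, accumulate environment lookups weighted by \( n \cdot l \), and normalize -- see PrefilterEnvMap in the Epic notes for the exact weighting.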
Add displacement map support to your materials.
Use parallax occlusion mapping or a similar technique to add view-dependent displacement to all material types.
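The core of the technique is a short ray-march through a height field in tangent space. Below is a minimal steep-parallax sketch (interpolating between the last two samples refines this into full parallax occlusion mapping); HEIGHT_MAP, the step count, and the scale parameter are illustrative assumptions:

layout(set=1, binding=2) uniform sampler2D HEIGHT_MAP;

//March the tangent-space view ray (surface toward eye) through the height
//field; returns offset texcoords to use for all later material lookups:
vec2 parallax_uv(vec2 uv, vec3 view_tan, float scale) {
    const int STEPS = 16;
    float layer = 1.0 / float(STEPS);
    vec2 delta = (view_tan.xy / view_tan.z) * scale * layer;
    float depth = 0.0;
    float h = 1.0 - textureLod(HEIGHT_MAP, uv, 0.0).r; //stored height, as depth below the surface
    for (int i = 0; i < STEPS && depth < h; ++i) {
        uv -= delta; //step the texcoord along the view ray...
        depth += layer; //...and the ray down through depth layers
        h = 1.0 - textureLod(HEIGHT_MAP, uv, 0.0).r;
    }
    return uv;
}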
Your creative exercise in A2 is to build a textured, normal-mapped model. (And put it in a nice scene to show it off.)
When building the model, I suggest first building a high-detail model (either by hand or by using photogrammetry techniques) and then transferring that detail to a lower-resolution model by "baking" it (e.g., in Blender). Note that you must create this model yourself, including the textures!
In deciding on what to model or capture, think about what objects might show off features of the "pbr" material (like variable roughness and metalness), and what has enough detail to benefit from a normal map.
Please keep your model content to a "PG" level. This is not the time to show off your collection of drug paraphernalia.
When building out a scene to show off the model, I suggest finding a suitable environment (polyhaven has many), and considering adding a camera or model animation to show how the light interacts with your model's textured detail.
Don't forget to write the report.