A1 Scene Viewer

Out: Monday, September 16th
Due: Friday, October 4th (end of day)

In this assignment you will build the core of a real-time scene viewer that renders scenes using the Vulkan API. This code serves as a foundation that you will modify, extend, and improve for the rest of the semester.

Scoring: this assignment is worth 15 points, total. Thanks to a convenient and intentional construction, each "point" is worth exactly 1% of your final course grade. Points are allocated as follows: A1-load is worth 2 points; A1-show is worth 3 points; A1-cull is worth 2 points; A1-move is worth 2 points; A1-hide is worth 3 points; A1x-fast is worth 1 point + up to 1 extra point; and A1-create is worth 2 points.

Points will be awarded for each section based on both the code and the sections of a report demonstrating and benchmarking the code.

Collaboration, Plagiarism, and Copyright

Reminder: this class takes a strict didactic, ethical, and legal view on copying code.

What to Do

Starting with your codebase from the nakluV tutorial, write code that can:

Use your creativity (and your code) to make something beautiful:

Demonstrate and test your code; write a report which includes screen recordings, images, timings, examples, and graphs. This is non-trivial. See the report template to understand what is required.

What to Turn In

Turn in your code in /afs/cs.cmu.edu/academic/class/15472-f24/<andrewid>/A1/. (If you are accessing AFS via an andrew machine, you may need to aklog cs.cmu.edu to acquire cross-realm tokens.)
Your turn-in directory should include:

We expect your Maekfile.js to properly build your viewer on at least one of { Linux/g++; Windows/cl.exe; macOS/clang++ }. We will not penalize you for minor cross-platform compile problems, though we would appreciate it if you tested on Linux.

We will compile and run your code in an environment set up to build the nakluV tutorial -- GLFW 3.4 and the Vulkan SDK will be available.

We expect your report to be viewable in Firefox on Linux. You may wish to consult MDN to determine format compatibility for any embedded videos.

Allowed Libraries

Although one of this class's principles is "No Magic," we cannot open every black box, so you are allowed to use a few approved libraries for specific tasks in your code. (These libraries are in addition to GLFW and the Vulkan SDK which -- as we note above -- will be available in our build environment.)

If you use any of these libraries, you will need to include the relevant code in your turn-in repository. We suggest a git submodule.

If you would like to use a different library for one of these tasks, you will need approval from course staff (there will be a Zulip thread). Approval may not be given for certain libraries, and may take some time.

Command-Line Arguments

Some of the features you implement in this assignment are controlled by command-line arguments. Many of these are documented in more detail in the sections below. The "optional" and "required" tags indicate whether the argument is required on the command line, not whether it is required that you implement it.

You may add your own command line arguments; indeed, there is a section in the report template to document them.

Note that the tutorial codebase already includes some simple command-line parsing code in RTG.cpp -- specifically, the RTG::Configuration::parse function.
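If you extend the parsing rather than modify `RTG::Configuration::parse` directly, a hand-rolled scan over the arguments is enough. The following sketch is illustrative only -- the `ViewerConfig` struct and `parse_args` helper are hypothetical names, not part of the tutorial code:

```cpp
#include <cstddef>
#include <stdexcept>
#include <string>
#include <vector>

// Hypothetical container for this assignment's extra flags; the tutorial's
// RTG::Configuration continues to handle its own options separately.
struct ViewerConfig {
	std::string scene;            // --scene filename.s72
	std::string camera;           // --camera name (optional)
	std::string culling = "none"; // --culling mode (optional)
	std::string headless;         // --headless events (optional)
};

// Scan argv-style arguments, consuming the flags we recognize.
ViewerConfig parse_args(std::vector<std::string> const &args) {
	ViewerConfig config;
	for (size_t i = 0; i < args.size(); ++i) {
		// Fetch the value following a flag, erroring if it is missing:
		auto need_value = [&](char const *flag) -> std::string const & {
			if (i + 1 >= args.size()) throw std::runtime_error(std::string(flag) + " requires a value.");
			return args[++i];
		};
		if (args[i] == "--scene") config.scene = need_value("--scene");
		else if (args[i] == "--camera") config.camera = need_value("--camera");
		else if (args[i] == "--culling") config.culling = need_value("--culling");
		else if (args[i] == "--headless") config.headless = need_value("--headless");
		else throw std::runtime_error("Unknown argument: " + args[i]);
	}
	return config;
}
```

Throwing on unknown arguments (rather than silently ignoring them) makes typos in benchmark scripts much easier to catch.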

Scene Format

This assignment (and subsequent assignments) will use the Scene'72 (.s72) format.

In this assignment, you may make the following simplifying assumptions about scene'72 files:

A1-load Scene Data Structures and Loading

When launched with the command-line option --scene filename.s72, your viewer should load a scene in Scene'72 format from the file filename.s72. To complete this part of the assignment, you will both need to design a scene data structure to load the scene into and write parsing code to handle the scene format itself.

Suggestions:

Warning: consider the Scene code from 15-466 as an anti-example! It only handles tree-structured scenes, isn't built for efficient traversal, and is pretty closely tied to OpenGL.
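One traversal-friendly alternative is to store nodes in one flat array and refer to children by index. The sketch below illustrates the idea under a deliberate simplification: nodes carry only a translation, so "composing transforms" is just vector addition. A real implementation would store full translation/rotation/scale and compose 4x4 matrices (`world = parent_world * local`), plus mesh/camera/light references. All names here are hypothetical:

```cpp
#include <array>
#include <cstdint>
#include <vector>

// Sketch of a flat, index-based scene graph node. Translation-only, to keep
// the composition step readable; a real node would hold full TRS data.
struct Node {
	std::array<float,3> translation{0.0f, 0.0f, 0.0f};
	std::vector<uint32_t> children; // indices into Scene::nodes
};

struct Scene {
	std::vector<Node> nodes;
	std::vector<uint32_t> roots; // scene'72 scenes may have multiple roots
};

// Top-down traversal accumulating local-to-world transforms. With full
// matrices the accumulation line would be: world = parent_world * local.
void gather_world_translations(
		Scene const &scene, uint32_t index, std::array<float,3> parent,
		std::vector<std::array<float,3>> &out) {
	Node const &node = scene.nodes[index];
	std::array<float,3> world{
		parent[0] + node.translation[0],
		parent[1] + node.translation[1],
		parent[2] + node.translation[2]};
	out[index] = world;
	for (uint32_t child : node.children) {
		gather_world_translations(scene, child, world, out);
	}
}
```

Because nodes live in one contiguous array, a per-frame traversal touches memory predictably, and world transforms can be written into a flat output array indexed the same way.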

A1-show Scene Display

Now that you've loaded the scene, it is time to show it.

This will involve traversing the scene and sending each mesh + transformation to the GPU using the Vulkan API. You should be able to start with your code from the nakluV Tutorial, which is already set up for drawing object instances.

Cameras

Your viewer should have three camera modes:

If started with the command-line option --camera name, your viewer should launch in scene camera mode with the camera named name active. If the named camera does not exist in the scene, your viewer should print an error message and exit.

Otherwise, it is up to you to determine the details of how to activate and switch between scene, user, and debug camera modes. (Please document how in your report.) Some handy controls might include warping the user camera to the current scene camera and/or setting the debug camera to a position that can see the whole scene.

Materials / Lighting

As per the assumptions above, all materials in your scene will be lambertian, and the only lights in the scene will be a distant directional light and/or a hemisphere light.

Therefore, your fragment shader will probably do something like this:

	vec3 energy =
		  SKY_ENERGY * (dot(n, SKY_DIRECTION) * 0.5 + 0.5)
		+ SUN_ENERGY * max(dot(n, SUN_DIRECTION), 0.0);
	outColor = vec4(texture(ALBEDO, texCoord).rgb * energy, 1.0);
Basic hemispherical lighting equation in glsl syntax, where: n is the per-pixel normal (remember to normalize after interpolation!); texCoord is the interpolated texture coordinate; *_DIRECTION are uniforms giving the light directions; *_ENERGY are uniforms giving the light energy in appropriate units; ALBEDO is the albedo texture; and outColor is the value that gets written to the framebuffer.

A1-cull Culling

The fastest triangle to render is the triangle you don't need to render. It's time to update your viewer to perform view frustum culling. When frustum culling is active, your code should check whether a mesh instance is visible before sending it to the GPU for drawing.

In order to perform a fast visibility test you should build a bounding volume for each mesh and check this volume against the viewing frustum. Use bounding boxes for your first implementation; you may test other shapes later for potential performance improvement.
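A common way to test an axis-aligned bounding box against the frustum is the "positive vertex" test: for each frustum plane, check only the box corner farthest along the plane's normal. The sketch below assumes planes are stored as (a,b,c,d) with inside points satisfying a*x + b*y + c*z + d >= 0 (such planes can be extracted from the clip matrix); all names are hypothetical:

```cpp
#include <array>
#include <cstddef>

// A plane (a,b,c,d): points p inside the frustum satisfy dot(n,p) + d >= 0.
using Plane = std::array<float,4>;

struct AABB {
	std::array<float,3> min, max;
};

// Conservative frustum test: for each plane, take the box corner farthest
// along the plane normal (the "positive vertex"). If even that corner is
// outside some plane, the whole box is outside the frustum. Boxes near
// plane intersections may be kept despite being invisible -- that is fine,
// since culling must only never reject a visible box.
bool aabb_outside_frustum(AABB const &box, std::array<Plane,6> const &planes) {
	for (Plane const &p : planes) {
		std::array<float,3> positive;
		for (size_t axis = 0; axis < 3; ++axis) {
			positive[axis] = (p[axis] >= 0.0f ? box.max[axis] : box.min[axis]);
		}
		float dist = p[0]*positive[0] + p[1]*positive[1] + p[2]*positive[2] + p[3];
		if (dist < 0.0f) return true; // fully outside this plane => cull
	}
	return false; // at least partially inside every plane => draw
}
```

Note that the test is per-plane conservative: it never rejects a visible box, which is the correctness property culling must preserve.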

When your viewer is run with the --culling mode command line option, it will start with the given culling mode selected. The modes are as follows:

We are using a culling flag because, in your report, you will need to demonstrate that culling is working, and show scenes where it has both a positive and negative impact on frame rate.

Note: as mentioned above, when rendering with the debug camera, do culling as if rendering for the previously-selected (user or scene) camera. This can be very handy for checking to see if culling is actually working! (And demonstrating that it is working in your report.) You may also want to add some code to visualize the view frustum and/or bounding volumes for the meshes.

A1-move Animation

It is finally time to stop deferring animation. Update your parsing code to read "DRIVER" objects; add code to your main loop to track and update the current playback position for the animations; and patch in keyboard control to pause/unpause/restart the animations.

Note: the animation playback position for the first rendered frame should be 0, and should be advanced for subsequent frames by measuring the elapsed time from the first frame. (Or, in headless mode, using the times from the events file.)

Note 2: your viewer should, by default, start with any scene animations playing and with the playback position at time 0. For the purposes of debugging/testing, you may wish to add command-line options to start paused, set the starting time, and/or loop animations.
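The timing rules above (first frame at position 0, subsequent frames advanced by measured elapsed time, pause support) can be gathered into one small clock object. This sketch is hypothetical; times are microseconds since the first frame, matching the events-file convention, and the per-frame timestamp can come from either a wall clock or the events file:

```cpp
#include <cstdint>

// Playback clock driven by externally-supplied frame timestamps (wall-clock
// in windowed mode, events-file 'ts' values in headless mode).
struct PlaybackClock {
	int64_t position_us = 0;   // current animation playback position
	int64_t last_frame_us = 0; // timestamp of the previous frame
	bool have_first_frame = false;
	bool paused = false;

	// Called once per rendered frame; returns the playback position (in
	// seconds) that DRIVER objects should be evaluated at for this frame.
	double tick(int64_t frame_time_us) {
		if (!have_first_frame) {
			have_first_frame = true; // first frame plays at position 0
		} else if (!paused) {
			position_us += frame_time_us - last_frame_us;
		}
		last_frame_us = frame_time_us;
		return position_us * 1.0e-6;
	}

	void restart() { position_us = 0; }
};
```

Keeping the clock in integer microseconds avoids the drift that accumulating small floating-point deltas can introduce over long benchmark runs.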

A1-hide Headless Operation

For benchmarking, it will be very useful to have your application run exactly the same workload, and to run it as fast as possible in the background.

Add support for the --headless events command line flag.

In headless mode, your code should not create a window (i.e., a GLFWwindow); it should not create a surface (i.e., a VkSurfaceKHR); it should not create a swapchain (i.e., a VkSwapchainKHR); and it should not use any of the WSI extension functions (e.g., vkAcquireNextImageKHR or vkQueuePresentKHR). Instead, it should emulate these functions by creating a list of VkImages itself, and both signalling and waiting on the right semaphores and fences to keep rendering and "presentation" separate.

Images in your application's fake swapchain will be made available for rendering by events in the events file (which also includes the frame times that should be reported to scene update code).
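The image-recycling part of this emulation can be modeled as a queue of indices. The sketch below shows only that bookkeeping -- it is not a complete emulation: a real implementation would allocate actual VkImages, signal the acquire semaphore/fence itself (e.g., via an empty queue submission), and gate re-acquisition on the events file. The struct name is hypothetical:

```cpp
#include <cstdint>
#include <deque>
#include <stdexcept>

// Bookkeeping sketch for a fake swapchain with FIFO-like behavior:
// presented images return to the back of the available queue in
// presentation order, and acquisition hands them out front-first.
struct FakeSwapchain {
	std::deque<uint32_t> available; // indices ready to be acquired

	explicit FakeSwapchain(uint32_t image_count) {
		for (uint32_t i = 0; i < image_count; ++i) available.push_back(i);
	}

	// Stand-in for vkAcquireNextImageKHR (index logic only).
	uint32_t acquire() {
		if (available.empty()) throw std::runtime_error("all images in flight");
		uint32_t index = available.front();
		available.pop_front();
		return index;
	}

	// Stand-in for vkQueuePresentKHR: the image becomes re-acquirable.
	void present(uint32_t index) {
		available.push_back(index);
	}
};
```

In the real viewer, `present` would not immediately recycle the image; the events file decides when a "presented" image becomes available again.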

Important: events files contain timestamps to allow (e.g.) running animations at a repeatable rate; your code shouldn't try to delay to make these match wall-clock time.

Note: For reasons of stable timing, and to avoid mid-run failures, I suggest parsing the events file at startup and storing it into an events structure in memory.

Note: your swapchain should behave as if it were associated with a surface whose present mode is VK_PRESENT_MODE_FIFO_KHR.

Note: There is a headless surface extension supported on some platforms. You may not use it. Part of this exercise is to get you to understand what a presentation layer must be doing to properly hand out and retrieve rendered images.

Events File Format

The events file passed to --headless will be a UTF-8-encoded text file with '\n' (unix-style) line endings. Each line follows the same general format:

ts EVENT param1 ... paramN

Where ts is the time since the first frame as an integer number of microseconds (i.e., millionths of a second); EVENT is the event type (a string in all caps); and param1 through paramN are parameters (defined per event type).

Note: ts values will be nondecreasing; i.e., events are listed in chronological order.
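A whole-file-at-startup parser for this format (as the headless notes below suggest) only needs to split each line into a timestamp, an event type, and parameters. This sketch uses hypothetical names and keeps event types as strings; a real viewer would likely map them to an enum while parsing:

```cpp
#include <cstdint>
#include <sstream>
#include <stdexcept>
#include <string>
#include <vector>

// One parsed line of an events file: "ts EVENT param1 ... paramN".
struct Event {
	int64_t ts;               // microseconds since the first frame
	std::string type;         // event type string, per the list above
	std::vector<std::string> params;
};

// Parse the entire events text up front, so no file I/O happens mid-run.
std::vector<Event> parse_events(std::string const &text) {
	std::vector<Event> events;
	std::istringstream file(text);
	std::string line;
	while (std::getline(file, line)) {
		if (line.empty()) continue;
		std::istringstream tokens(line);
		Event event;
		if (!(tokens >> event.ts >> event.type)) {
			throw std::runtime_error("Malformed event line: " + line);
		}
		std::string param;
		while (tokens >> param) event.params.emplace_back(param);
		// Enforce the chronological-order guarantee stated above:
		if (!events.empty() && event.ts < events.back().ts) {
			throw std::runtime_error("Event timestamps must be nondecreasing.");
		}
		events.emplace_back(std::move(event));
	}
	return events;
}
```

Validating the nondecreasing-timestamp guarantee while parsing catches malformed benchmark scripts before a run starts rather than partway through it.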

Your code should support the following events:

Example Events File

Here is an example events file and the script that generated it. (Usage: node events.mjs > example.events.) You may find yourself making similar files when doing the performance testing part of your report.

A1x-fast Extra: Go Fast

Note: this section is worth 1 point + up to 1 extra point. I tagged it as "extra" because you can skip it entirely and still receive 14/15 = ~93% on the assignment. On the other hand, if you try (and document) several substantial approaches to improving your viewer's rendering speed, you can earn some extra credit here.

Within the framework of this basic viewer there are a ton of ways to (attempt to) make your code go faster. The final segment of this assignment is an open-ended exploration of these possibilities.

To receive one point on this segment, you may add any of the following to your system (and document the impact in your report):

To receive up to one extra point, go beyond by attempting more items from the list above or developing and documenting other substantial improvements of your own devising.

Don't forget the report

Don't forget to write the report.