Wow, it has been just about a month now! I’ve been very busy recently, and so haven’t had as much time to work on my Rust Renderer as I wanted.

My main aim recently has been getting my renderer to an initial “Mesh Viewer” stage. I moved the camera controller implementation over from the WGPU tutorial and decided to use that. I set up a simple descriptor set that contains the projection and view matrices, so now I am able to draw my cube to the screen!
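
The camera data itself is tiny; roughly something like this (a sketch, with glam standing in for whatever matrix type is actually used):

```rust
use glam::Mat4;

// CPU-side mirror of the uniform buffer that the descriptor set
// points at. repr(C) keeps the field layout predictable so it can
// be copied straight into the buffer.
#[repr(C)]
#[derive(Clone, Copy)]
struct CameraUniform {
    view: Mat4,
    projection: Mat4,
}
```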

*(Image: RenderDoc preview)*

Then, I set about being able to load meshes from outside of the renderer. I created a function that returns a MeshHandle, so that meshes can be referenced outside of the renderer with an opaque handle. The meshes are stored in a slotmap in the renderer, and the MeshHandle points to the mesh that has been loaded. I chose this approach because it is the same solution I used in my C++ projects, and I liked being able to pass the handle around without worrying about the data being modified unless it was explicitly passed to the places it needed to be used. I’m not 100% sure this is the best solution, but I’m going to stick with it for now.
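
Roughly, the idea looks like this (a sketch using the slotmap crate; the Mesh struct and the names here are simplified stand-ins, not my actual code):

```rust
use slotmap::{new_key_type, SlotMap};

new_key_type! {
    // Opaque handle that can be passed around outside the renderer.
    pub struct MeshHandle;
}

struct Mesh {
    // vertex/index buffers etc. would live here
}

struct Renderer {
    meshes: SlotMap<MeshHandle, Mesh>,
}

impl Renderer {
    // Loading a mesh hands back a handle rather than the mesh itself.
    fn load_mesh(&mut self, mesh: Mesh) -> MeshHandle {
        self.meshes.insert(mesh)
    }
}
```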

I implemented this and then hit a strange problem: I could not get my cube to render correctly. All I had done was separate the loading of the mesh into a different function. However, I realised that I had swapped around the order of loading meshes and setting up the other buffers. When I swapped it back round, it worked like it did before. This baffled me for quite a few hours, and I was running around and looking in all the wrong places.

It turns out that the vertex buffer I had created was using the same memory allocation parameters as my staging buffer (AllocationCreateFlags::MAPPED and vk_mem_alloc::AllocationCreateFlags::HOST_ACCESS_SEQUENTIAL_WRITE). I removed these parameters, and that fixed the problem immediately (the difference is sketched below). Honestly, I know too little about GPU memory to say exactly why this happens, but my guess is that it is something to do with the memory not being flushed and visible by the time the GPU started using it. Once this was fixed, I was able to load meshes from outside of the renderer.
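
For reference, the difference boiled down to something like this (a sketch; the exact struct and constant names depend on the vk_mem_alloc crate version, so treat the details beyond the two flags as assumptions):

```rust
use vk_mem_alloc::{AllocationCreateFlags, AllocationCreateInfo, MemoryUsage};

// Staging buffer: persistently mapped and CPU-writable, so the
// vertex data can be copied into it from the host.
let staging_alloc_info = AllocationCreateInfo {
    flags: AllocationCreateFlags::MAPPED
        | AllocationCreateFlags::HOST_ACCESS_SEQUENTIAL_WRITE,
    usage: MemoryUsage::AUTO,
    ..Default::default()
};

// Vertex buffer: no host-access flags, which lets the allocator
// pick device-local memory that the GPU can read directly.
let vertex_alloc_info = AllocationCreateInfo {
    usage: MemoryUsage::AUTO,
    ..Default::default()
};
```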

The next task was to be able to choose which meshes were drawn, using the new MeshHandles. I decided to pass a Vec to the renderer, which then takes ownership of it. In the future I will need to look into lifetimes: when passing other information such as the objects’ transforms, I would prefer not to move the Vec into the renderer each time, and instead just give it a reference (both options are sketched below). With this working, I loaded two meshes and, on input, swapped which mesh was being rendered. This worked great, and showed me that the MeshHandles were working correctly.
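
In other words, the two options look roughly like this (a hypothetical sketch building on the MeshHandle type from earlier, not my actual code):

```rust
struct Renderer {
    draw_list: Vec<MeshHandle>,
}

impl Renderer {
    // What I do now: the renderer takes ownership of the Vec.
    fn set_draw_list(&mut self, handles: Vec<MeshHandle>) {
        self.draw_list = handles;
    }

    // What I might do later: borrow the handles for the frame instead,
    // which avoids moving a Vec into the renderer on every update.
    fn draw(&mut self, handles: &[MeshHandle]) {
        for _handle in handles {
            // look up the mesh in the slotmap and record a draw call
        }
    }
}
```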

I plan to use this week to work on the engine by creating a simple game with my renderer, hoping to get some ideas on its structure/design from actually using it to make something. I decided to implement model matrices for objects, so that I can start rendering objects at different positions, scales and rotations. This will be important for making any game that revolves around more than one object ;)
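
Building the model matrix itself is a one-liner; here’s a quick sketch using glam for illustration:

```rust
use glam::{Mat4, Quat, Vec3};

// Compose a model matrix from the usual translation/rotation/scale.
// glam applies them in the order translation * rotation * scale.
fn model_matrix(translation: Vec3, rotation: Quat, scale: Vec3) -> Mat4 {
    Mat4::from_scale_rotation_translation(scale, rotation, translation)
}
```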

*(Image: RenderDoc preview)*

I got that working, and I’m happy with how quickly I was able to do it. I’m using a large storage buffer to store an array of transforms, and push constants to send the index of the transform that should be used for each draw. I’m not sure this is how it should be done, but I’m happy with how it works so far; there’s a sketch of the idea below. I’m also not sure whether I want to introduce textures at this point, and I may leave my objects untextured for the time being. That will let me focus on other parts of the renderer, so I can get it to a point where I can begin creating a simple game.
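
Each draw ends up looking something like this (a sketch assuming the ash crate; the function shape and parameter names are illustrative placeholders, not my exact draw code):

```rust
use ash::vk;

// Record one draw, telling the vertex shader which transform in the
// storage buffer to use via a push constant.
unsafe fn draw_mesh(
    device: &ash::Device,
    cmd: vk::CommandBuffer,
    pipeline_layout: vk::PipelineLayout,
    transform_index: u32,
    index_count: u32,
) {
    device.cmd_push_constants(
        cmd,
        pipeline_layout,
        vk::ShaderStageFlags::VERTEX,
        0,
        &transform_index.to_ne_bytes(),
    );
    device.cmd_draw_indexed(cmd, index_count, 1, 0, 0, 0);
}
```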