Hah, we've had this since the beginning in CUDA (circa 2008)!
It's baffling how little has been invested in shader language tooling; it probably set graphics programming in the game industry back by about a decade. I just wish there were a proper CUDA-like shading language with a simple API, like what Sebastian Aaltonen suggested in https://www.sebastianaaltonen.com/blog/no-graphics-api.
Using printf in shaders is awesome, it makes a huge difference when writing and debugging shaders. Vulkan and GLSL (and Slang) have a usable printf out of the box, but HLSL and D3D do not.
Afaik the way it works in Vulkan is that all the string formatting is actually done on the CPU: the GPU only writes the raw argument data to buffers, laid out according to the format string.
All the shader prints are captured by tools such as RenderDoc, so you can easily find the vertex or pixel that printed something and then replay that shader invocation in a debugger.
I only wish we'd had this 20 years ago; it would have saved me so much time, effort, and frustration.
Maybe because Pix is quite good?
Using printf and using a debugger are complementary.
Finding which pixel to debug, or just dumping some info from the pixel under mouse cursor (for example) is better done with a simple printf. Then you can pick up the offending pixel/vertex/mesh/compute in the debugger if you still need it.
In RenderDoc you get both a debugger and printf-related tooling, and the combination is better than either alone.
I've been writing a lot of GPU code over the past few years (and the few decades before it) and shader printf has been a huge productivity booster.
Not putting the strings on the GPU looks very much like how some systems approach semihosting.