Accessing undefined stage_in Metal shader argument

Time:03-22

I am building a minimalistic 3D engine in Metal, and I want my vertex and fragment shader code to be as reusable as possible, so that, for instance, a vertex shader can be used unchanged regardless of the vertex data layout of its input mesh.

An issue I have is that I can't guarantee all meshes will have the same attributes: for instance, one mesh may contain only position and normal data, while another may additionally have UV coordinates attached.

Now my first issue is that if I define my vertex shader input structure like this:

struct VertexIn {
    float3 position [[ attribute(0) ]];
    float3 normal [[ attribute(1) ]];
    float2 textureCoordinate [[ attribute(2) ]];
};

I wonder what the consequences are if no attribute 2 is specified in my Metal vertex descriptor. My tests seem to indicate there is no crash (at least from merely declaring such a member in the input structure), but is this just undefined behavior, or is it actually safe to do?
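To make the setup concrete, here is a minimal sketch of the kind of vertex descriptor I mean, for a mesh that only provides attributes 0 and 1 (the indices, formats, and interleaving are just my own convention):

```swift
import Metal

// Hypothetical descriptor for a mesh with only position + normal data,
// even though the shader's VertexIn also declares attribute(2).
let descriptor = MTLVertexDescriptor()

descriptor.attributes[0].format = .float3   // position
descriptor.attributes[0].offset = 0
descriptor.attributes[0].bufferIndex = 0

descriptor.attributes[1].format = .float3   // normal
descriptor.attributes[1].offset = MemoryLayout<Float>.stride * 3
descriptor.attributes[1].bufferIndex = 0

// Interleaved position + normal: 6 floats per vertex.
descriptor.layouts[0].stride = MemoryLayout<Float>.stride * 6
```

Attribute 2 (`textureCoordinate` in the struct above) is simply never described here, which is exactly the situation I am asking about.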

Another issue I have is that I might want to pass the UV data to the fragment shader (i.e., return it from my vertex shader), but what happens if it is missing? It feels like, unless Metal was specifically designed to allow this, it would be undefined behavior to read textureCoordinate in order to assign it to a property of the VertexOut structure I return from my vertex shader.

Additionally, I notice that Apple's RealityKit framework must have found some way around this issue: it lets users provide "shader modifier" functions that receive the data of both vertex and fragment shaders so that they can act on it. What surprises me is that the structures passed to these user functions define many properties which I am not sure are always defined for all meshes (for instance, a second set of UV coordinates). This seems very similar to the problem I am trying to solve.

Am I missing some obvious way to fix this issue?

Thank you

CodePudding user response:

I think the intended way to deal with this is function constants. This is an example of how I deal with this in my vertex shaders.

constant bool HasColor0 [[ function_constant(FunctionConstantHasColor0) ]];
constant bool HasNormal [[ function_constant(FunctionConstantHasNormal) ]];
constant bool HasTangent [[ function_constant(FunctionConstantHasTangent) ]];
constant bool HasTexCoord0 [[ function_constant(FunctionConstantHasTexCoord0) ]];
constant bool AlphaMask [[ function_constant(FunctionConstantAlphaMask) ]];

// ... 

struct VertexIn
{
    float3 position [[ attribute(AttributeBindingPosition) ]];
    float3 normal [[ attribute(AttributeBindingNormal), function_constant(HasNormal) ]];
    float4 tangent [[ attribute(AttributeBindingTangent), function_constant(HasTangent) ]];
    float4 color [[ attribute(AttributeBindingColor0), function_constant(HasColor0) ]];
    float2 texCoord [[ attribute(AttributeBindingTexcoord0), function_constant(HasTexCoord0) ]];
};

struct VertexOut
{
    float4 positionCS [[ position ]];
    float4 tangentVS = float4();
    float3 positionVS = float3();
    float3 normalVS = float3();
    float2 texCoord = float2();
    half4 color = half4();
};

// Vertex is a plain struct (defined elsewhere) with the same fields as
// VertexIn, but without the attribute bindings.
static VertexOut ForwardVertexImpl(Vertex in, constant CameraUniform& camera, constant MeshUniform& meshUniform)
{
    VertexOut out;

    float4x4 viewModel = camera.view * meshUniform.model;
    float4 positionVS = viewModel * float4(in.position.xyz, 1.0);
    out.positionCS = camera.projection * positionVS;
    out.positionVS = positionVS.xyz;

    float4x4 normalMatrix;
    if(HasNormal || HasTangent)
    {
        normalMatrix = transpose(meshUniform.inverseModel * camera.inverseView);
    }

    if(HasNormal)
    {
        out.normalVS = (normalMatrix * float4(in.normal, 0.0)).xyz;
    }

    if(HasTexCoord0)
    {
        out.texCoord = in.texCoord;
    }

    if(HasColor0)
    {
        out.color = half4(in.color);
    }
    else
    {
        out.color = half4(1.0);
    }

    if(HasTangent)
    {
        // Normal matrix or viewmodel matrix?
        out.tangentVS.xyz = (normalMatrix * float4(in.tangent.xyz, 0.0)).xyz;
        out.tangentVS.w = in.tangent.w;
    }

    return out;
}

vertex VertexOut ForwardVertex(
    VertexIn in [[ stage_in ]],
    constant CameraUniform& camera [[ buffer(BufferBindingCamera) ]],
    constant MeshUniform& meshUniform [[ buffer(BufferBindingMesh) ]])
{
    Vertex v
    {
        .color = in.color,
        .tangent = in.tangent,
        .position = in.position,
        .normal = in.normal,
        .texCoord = in.texCoord,
    };

    return ForwardVertexImpl(v, camera, meshUniform);
}

And in the host application I fill out the MTLFunctionConstantValues object based on the semantics the geometry actually has:

func addVertexDescriptorFunctionConstants(toConstantValues values: MTLFunctionConstantValues) {
    var unusedSemantics = Set<AttributeSemantic>(AttributeSemantic.allCases)
    // setConstantValue(_:type:index:) takes a pointer, so the flags need
    // to live in mutable variables.
    var isUsed = true
    var isUnused = false

    for attribute in attributes.compactMap({ $0 }) {
        unusedSemantics.remove(attribute.semantic)

        if let constant = attribute.semantic.functionConstant {
            values.setConstantValue(&isUsed, type: .bool, index: constant)
        }
    }

    for unusedSemantic in unusedSemantics {
        if let constant = unusedSemantic.functionConstant {
            values.setConstantValue(&isUnused, type: .bool, index: constant)
        }
    }
}

A good thing about this approach is that the compiler turns those function-constant ifs into branch-free code when the constants are resolved, so there should be no real cost at runtime, and it lets you compile your shaders offline without resorting to runtime compilation and preprocessor defines.
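With the constant values filled in, you then create a specialized variant of the offline-compiled function per mesh layout. A minimal sketch, assuming the vertex function above was compiled into the library under the name "ForwardVertex":

```swift
import Metal

// Hypothetical: specialize the offline-compiled "ForwardVertex" for one
// particular mesh layout. Metal resolves the function constants at this
// point, so members guarded by false constants are stripped and the
// if(HasNormal)-style checks are folded away.
func makeSpecializedVertexFunction(library: MTLLibrary,
                                   values: MTLFunctionConstantValues) throws -> MTLFunction {
    try library.makeFunction(name: "ForwardVertex", constantValues: values)
}
```

The returned MTLFunction is what you hand to the render pipeline descriptor; each distinct combination of constants yields its own specialized pipeline.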
