MeshData GetVertexData has the incorrect length


I'm trying to optimize some mesh generation using MeshData and the Job System, but when I pass two VertexAttributeDescriptor params to meshData.SetVertexBufferParams, the array returned by meshData.GetVertexData is half the length it should be (I set the vertex count to 5120, but the resulting VertexData NativeArray is only 2560 items long).
When I force it to be double the length (SetVertexBufferParams(numVerts * 2, ...)), it creates a mesh that appears to treat the normals and vertex positions all as position data, and it also makes the screen go black, so no screenshot.

Here's my code:

// generate 256 height values
int[] arr = new int[256];
for (int i = 0; i < arr.Length; i++)
{
    arr[i] = (int) (Mathf.PerlinNoise(i / 16 / 16f, i % 16 / 16f) * 5);
}
// put it in a NativeArray
NativeArray<int> heights = new NativeArray<int>(arr, Allocator.TempJob);

// 4 verts per face * 5 faces = 20
int numVerts = heights.Length * 20; // this value is always 5120
// 2 tris per face * 5 faces * 3 indices = 30
int indices = heights.Length * 30;

// MeshData setup
Mesh.MeshDataArray meshDataArray = Mesh.AllocateWritableMeshData(1);
Mesh.MeshData meshData = meshDataArray[0];
meshData.SetVertexBufferParams(numVerts,
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, stream:0),
    new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3, stream:1)
);
meshData.SetIndexBufferParams(indices, IndexFormat.UInt16);

// Create job
Job job = new Job
{
    Heights = heights,
    MeshData = meshData
};
// run job
job.Schedule().Complete();

// struct I'm using for vertex data
[System.Runtime.InteropServices.StructLayout(System.Runtime.InteropServices.LayoutKind.Sequential)]
public struct VData
{
    public float3 Vert;
    public float3 Norm;
}

// Here's some parts of the job
public struct Job : IJob
{
    [ReadOnly]
    public NativeArray<int> Heights;
    public Mesh.MeshData MeshData;

    public void Execute()
    {
        NativeArray<VData> Verts = MeshData.GetVertexData<VData>();
        NativeArray<ushort> Tris = MeshData.GetIndexData<ushort>();

        // loops from 0 to 255
        for (int i = 0; i < Heights.Length; i++)
        {
            ushort t1 = (ushort)(w1 + 16); // w1 comes from code omitted in this excerpt

            // This indicates that Verts.Length is 2560 when it should be 5120
            Debug.Log(Verts.Length);

            int t = i * 30; // tris
            int height = Heights[i];
            // x and y coordinate in chunk
            int x = i / 16;
            int y = i % 16;

            float3 up = new float3(0, 1, 0);

            // This throws an IndexOutOfRangeException because t1 becomes larger than Verts.Length
            Verts[t1] = new VData { Vert = new float3(x + 1, height, y + 1), Norm = up };

            // ...
        }
    }
}

Answer:

meshData.SetVertexBufferParams(numVerts,
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, stream:0),
    new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3, stream:1)
);

Your SetVertexBufferParams here places VertexAttribute.Position and VertexAttribute.Normal on separate streams. Each stream becomes its own vertex buffer holding only that stream's attribute, so each buffer is half the size of an interleaved one, and reading one stream back with a struct that describes both attributes halves the reported length.

This is how the documentation explains streams:

Vertex data is laid out in separate "streams" (each stream goes into a separate vertex buffer in the underlying graphics API). While Unity supports up to 4 vertex streams, most meshes use just one. Separate streams are most useful when some vertex attributes don't need to be processed, for example skinned meshes often use two vertex streams (one containing all the skinned data: positions, normals, tangents; while the other stream contains all the non-skinned data: colors and texture coordinates).
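
Applied to the setup in the question, those two descriptors create two separate vertex buffers, each holding numVerts float3 entries. A minimal sketch of reading such a two-stream layout back correctly (one array per stream; the variable names are illustrative):

NativeArray<float3> positions = meshData.GetVertexData<float3>(0); // stream 0, Length == numVerts
NativeArray<float3> normals = meshData.GetVertexData<float3>(1);   // stream 1, Length == numVerts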

But why does it end up reinterpreted as half the length? Because of this line:

NativeArray<VData> Verts = MeshData.GetVertexData<VData>();

How? Because there is an implicit stream parameter there (see the documentation):

public NativeArray<T> GetVertexData<T>(int stream = 0);

and it defaults to 0. So what happens here is this:

var Verts = Positions_Only.Reinterpret<Position_And_Normals>();

or in other words:

var Verts = NativeArray<float3>().Reinterpret<float3x2>();
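
The same halving can be reproduced with NativeArray.Reinterpret directly. A minimal standalone sketch of the effect (not the question's code):

using Unity.Collections;
using Unity.Collections.LowLevel.Unsafe;
using Unity.Mathematics;
using UnityEngine;

// 8 float3 elements = 96 bytes; viewed as 24-byte float3x2 elements, that's only 4 entries
NativeArray<float3> positions = new NativeArray<float3>(8, Allocator.Temp);
NativeArray<float3x2> pairs = positions.Reinterpret<float3x2>(UnsafeUtility.SizeOf<float3>());
Debug.Log(pairs.Length); // prints 4: same bytes, half the element count
positions.Dispose();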


case solved :T


TL;DR:

  • Change stream:1 to stream:0 so both vertex attributes end up on the same stream (see the sketch after this list).
  • Or read each stream with its own array: var Positions = MeshData.GetVertexData<float3>(0); and var Normals = MeshData.GetVertexData<float3>(1);
  • Or create a dedicated struct per stream: var Stream0 = MeshData.GetVertexData<VStream0>(0); and var Stream1 = MeshData.GetVertexData<VStream1>(1);
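
For completeness, here is the first fix applied to the question's setup: both attributes declared on stream 0, so the buffer layout matches VData and GetVertexData<VData>() returns all 5120 elements:

meshData.SetVertexBufferParams(numVerts,
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, stream: 0),
    new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3, stream: 0)
);

// stream 0 now interleaves { position, normal } per vertex, matching VData's layout
NativeArray<VData> verts = meshData.GetVertexData<VData>(); // Length == numVerts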