Two gob encoders produce different results

Time:12-26

... and it's driving me nuts trying to understand what I'm doing wrong!

Playground: https://go.dev/play/p/ZQP8Y-gwihQ

The example looks contrived, but it's drawn from real code of mine where the error arose. In my code I'm hashing the bytes buffer and want the process to be predictable.

package main

import (
    "bytes"
    "encoding/gob"
    "fmt"
    "log"
)

type Foo struct {
    Bar string
    Baz string
}

func (f *Foo) X() string {
    var b bytes.Buffer
    s := struct {
        Bar string
        Baz string
    }{
        f.Bar,
        f.Baz,
    }
    log.Printf("%v", s)
    gob.NewEncoder(&b).Encode(s)
    return fmt.Sprintf("%x", b)
}

func (f *Foo) Y(x string) string {
    var b bytes.Buffer
    s := struct {
        Bar string
        Baz string
        S   string
    }{
        f.Bar,
        f.Baz,
        x,
    }
    log.Printf("%v", s)
    gob.NewEncoder(&b).Encode(s)
    return fmt.Sprintf("%x", b)
}

func main() {
    a := &Foo{
        Bar: "bar",
        Baz: "baz",
    }

    log.Println(a.X())
    log.Println(a.Y("something"))
}

Running yields:

{bar baz}
{1cff81030102ff820001020103426172010c00010342617a010c0000000dff820103626172010362617a00 0 0}
{bar baz something}
{22ff83030102ff840001030103426172010c00010342617a010c00010153010c00000018ff840103626172010362617a0109736f6d657468696e6700 0 0}

Commenting out log.Println(a.X()) yields:

{bar baz something}
{22ff81030102ff820001030103426172010c00010342617a010c00010153010c00000018ff820103626172010362617a0109736f6d657468696e6700 0 0}

I expect the two encodings to be the same, but they differ (predictably) at locations that I assume correspond to field boundaries:

22
ff83 # 81
030102

ff84 # 82
0001030103426172010c00010342617a010c00010153010c00000018

ff84 # 82
0103626172010362617a0109736f6d657468696e6700

Even though the details differ, the behavior is consistent with my code.

I'm creating a new bytes.Buffer and gob.NewEncoder in each method and so it's unclear why invoking X changes the result of Y.

CodePudding user response:

What you're missing is that the stream of bytes produced by an Encoder instance depends on global (program-wide) state in addition to the per-Encoder state. That global state is the set of types that have been registered and sent so far.

When you send a typed value, if its type has not yet been registered, it is registered for you, in that global state. This assigns an internal numeric value to the type. See Register (and its companion RegisterName). When you call your X, that registers the anonymous struct type of s in X; when you call your Y, that registers the anonymous struct type of s in Y. These get different internal type numbers. If you never call X, its type is never registered, and Y's type is registered under the first available number.
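To see that process-wide bookkeeping at work, here is a small sketch (the types A and B and the encodeHex helper are illustrative, not from the question). The first type encoded anywhere in the process claims the first free type numbers; every later Encoder, even a brand-new one, reuses the numbers already assigned:

```go
package main

import (
	"bytes"
	"encoding/gob"
	"fmt"
)

type A struct{ N int }
type B struct{ M int }

// encodeHex gob-encodes v with a brand-new Encoder and returns the hex bytes.
func encodeHex(v interface{}) string {
	var b bytes.Buffer
	gob.NewEncoder(&b).Encode(v)
	return fmt.Sprintf("%x", b.Bytes())
}

func main() {
	first := encodeHex(A{1}) // A is encoded first: it claims the first free type numbers
	other := encodeHex(B{2}) // B gets the next numbers, shifting its wire bytes
	again := encodeHex(A{1}) // a fresh Encoder reuses A's already-assigned numbers

	fmt.Println(first == again) // the two A encodings are byte-identical
	fmt.Println(first == other)
}
```

The equality of the first and third encodings, despite each using its own Encoder, is the global state the answer describes: type numbers stick once assigned, so what varies between runs is only the order in which types are first encoded.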

In my code I'm hashing the bytes buffer ...

That's not a great idea, for reasons that are probably obvious by now. However, if you explicitly register each type in a known order, you'll be safe enough here, unless some future version changes the wire format for some (presumably good) reason. Oops: testing this shows it doesn't help either. That's because even if a type is registered, it doesn't get a transmit number until the first time a value of that type is encoded. So you need to encode (and then simply discard) a value of each type.

Here is a functioning example of carefully discard-encoding the two types, so that commenting out the call to log.Println(a.X()) has no effect on the encoding of the second value.
