How to design RestAPI for too many tables in Golang


I think that if I keep using the method below, I'll have to write too much code.


I declared structs for all the tables, and I used the validator package (go-playground/validator) for validation.

[types.go]

type TableA struct {
    Field1 string `json:"field1" validate:"required,max=10"`
    Field2 int    `json:"field2" validate:"number"`
}

type TableB struct {
    ...
}

And I registered routes for each method and connected the handlers.

[tableA.go]

router.Get("/table-a", r.Get_tableA_Handler),
router.Post("/table-a", r.Post_tableA_Handler),
router.Patch("/table-a", r.Patch_tableA_Handler),
router.Delete("/table-a", r.Delete_tableA_Handler)
...

Each handler parses the JSON in the request body, validates the data and calls the DB function.

[tableA_router.go]

func (rt *tableARouter) Post_tableA_Handler(w http.ResponseWriter, r *http.Request) error {

    //Json to Struct
    req := new(types.TableA)
    if err := httputils.DecodeJsonBody(r, req); err != nil {
        return err
    }

    // Validation 
    if err := validCheck(req); err != nil { 
        return err
    }

    // DB function
    err := rt.insert_tableA_DB(r.Context(), req) 
    if err != nil {
        return err
    }

    return rt.rd.JSON(w, http.StatusCreated, "Create Success")
}

...

func validCheck(data interface{}) error {
    validate := validator.New()
    err := validate.Struct(data)
    return err
}

This is the DB function called from the handler above (using GORM):

[tableA_db.go]

func (rt *tableARouter) insert_tableA_DB(ctx context.Context, data *types.TableA) error {
    // DB Connect
    db, err := db.Open(rt.dbcfg)
    if err != nil {
        return err
    }
    defer db.Close()

    tx := db.Begin()
    defer tx.Rollback()


    // == INSERT ==
    query := `INSERT INTO table_a
        (field1, field2, ...)
        VALUES (?, ?, ...)`
    result := tx.WithContext(ctx).Exec(query,
        data.Field1, data.Field2, ...)


    //Result
    if result.Error != nil {
    ...
}

There are too many tables now... If there are 100 tables, I have to write 100 handlers and 100 DB functions. Is there any way to use something like /tables/{tableName}? Please give me any advice. Thank you.

CodePudding user response:

You can use an ORM package like GORM to make your work easier: instead of writing the INSERT statements and placeholders by hand, let GORM build the queries from your structs.
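
For example (a minimal sketch: the MySQL DSN, the TableName override and the insertTableA name are assumptions, not code from the question), the hand-written INSERT can become a one-line Create:

package main

import (
    "context"

    "gorm.io/driver/mysql"
    "gorm.io/gorm"
)

// TableA mirrors the struct from types.go.
type TableA struct {
    Field1 string `json:"field1"`
    Field2 int    `json:"field2"`
}

// TableName overrides GORM's pluralised default ("table_as") so the
// struct maps to the existing table_a table.
func (TableA) TableName() string { return "table_a" }

// insertTableA lets GORM build the INSERT from the struct fields
// instead of a hand-written SQL string with one placeholder per column.
func insertTableA(ctx context.Context, dsn string, row *TableA) error {
    gdb, err := gorm.Open(mysql.Open(dsn), &gorm.Config{})
    if err != nil {
        return err
    }
    return gdb.WithContext(ctx).Create(row).Error
}

(Opening the *gorm.DB once at startup and reusing it would also avoid the per-request db.Open in the question.)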

Or you can write a universal handler and, with the reflect package, analyze your defined structs and build every SQL query dynamically. But it's not the best solution if any of your structs contain inner slices or embedded structs, and if you need joined tables you also have to handle those manually. I maintain servers with more than 200 endpoints, 300-400 methods and 200 SQL tables, and the whole server was written by hand. I can say it's very rare that a handler and its DB func can be reused without modification.
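
Below is a minimal sketch of that universal-handler idea, using Go 1.18 generics and letting GORM do the reflection instead of hand-rolled reflect code. CreateHandler and the package-level validate are hypothetical names, and the error handling is reduced to plain http.Error calls:

package handlers

import (
    "encoding/json"
    "net/http"

    "github.com/go-playground/validator/v10"
    "gorm.io/gorm"
)

var validate = validator.New()

// CreateHandler returns a POST handler for any table struct T:
// decode the JSON body, validate it, then insert it with GORM.
// One function covers every table instead of one handler per table.
func CreateHandler[T any](db *gorm.DB) http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        row := new(T)
        if err := json.NewDecoder(r.Body).Decode(row); err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        if err := validate.Struct(row); err != nil {
            http.Error(w, err.Error(), http.StatusUnprocessableEntity)
            return
        }
        if err := db.WithContext(r.Context()).Create(row).Error; err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        w.WriteHeader(http.StatusCreated)
        json.NewEncoder(w).Encode("Create Success")
    }
}

With a net/http or chi router, registration then shrinks to one line per table and method, e.g. router.Post("/table-a", handlers.CreateHandler[types.TableA](db)), and the same pattern extends to list/update/delete handlers.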

Maybe you can wrap the error handling, the rollback/commit, the JSON parsing and the response writing in a helper func, then use that to call the DB methods.
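
As a sketch of that (again with made-up names in a hypothetical helpers package), the JSON decode plus validation and the begin/rollback/commit dance can each live in one small helper; note that GORM v2 also ships db.Transaction, which does the same wrapping for you:

package helpers

import (
    "encoding/json"
    "net/http"

    "github.com/go-playground/validator/v10"
    "gorm.io/gorm"
)

var validate = validator.New()

// DecodeAndValidate parses the JSON body into dst and runs the
// validator tags, so handlers do not repeat these two steps.
func DecodeAndValidate(r *http.Request, dst interface{}) error {
    if err := json.NewDecoder(r.Body).Decode(dst); err != nil {
        return err
    }
    return validate.Struct(dst)
}

// WithTx begins a transaction, rolls back if fn returns an error or
// panics, and commits otherwise.
func WithTx(db *gorm.DB, fn func(tx *gorm.DB) error) error {
    tx := db.Begin()
    if tx.Error != nil {
        return tx.Error
    }
    defer func() {
        if r := recover(); r != nil {
            tx.Rollback()
            panic(r)
        }
    }()
    if err := fn(tx); err != nil {
        tx.Rollback()
        return err
    }
    return tx.Commit().Error
}

Each handler body then reduces to a DecodeAndValidate call, a WithTx (or db.Transaction) call around the actual insert/update, and the response write.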
