Create a generic procedure, which inserts data into any table

I'm currently working on a .NET application and want to make it as modular as possible. I've already created a basic SELECT procedure that returns data by checking the input parameters on the SQL Server side.

I want to create a procedure that takes structured data as a string, parses it, and inserts its contents into the corresponding table in the database.

For example, I have a table as

CREATE TABLE ExampleTable ( 
  id_exampleTable int    IDENTITY (1, 1)  NOT NULL,
  exampleColumn1  nvarchar(200)           NOT NULL,
  exampleColumn2  int                     NULL,
  exampleColumn3  int                     NOT NULL,

  CONSTRAINT pk_exampleTable PRIMARY KEY  ( id_exampleTable ) 
)

And my procedure starts as

CREATE PROCEDURE InsertDataIntoCorrespondingTable
  @dataTable nvarchar(max), --name of Table in my DB
  @data nvarchar(max) --normalized string parameter as 'column1, column2, column3, etc.'
AS

BEGIN

  IF @dataTable = 'table'

    BEGIN
      /**Parse this string and execute insert command**/
    END

   ELSE IF /**Other statements**/
END

TL;DR

So basically, I'm looking for a solution that can help me achieve something like this

EXEC InsertDataIntoCorrespondingTable
  @dataTable = 'ExampleTable', 
  @data = '''exampleColumn1'', 2, 3'

Which should be equal to just

INSERT INTO ExampleTable (exampleColumn1, exampleColumn2, exampleColumn3) SELECT 'exampleColumn1', 2, 3

Sure, I could generate INSERT statements inside the app (for each and every one of the 14 tables in the DB...), but I want to conquer T-SQL :)

CodePudding user response:

This might be reasonable (to some degree) on an RDBMS that supports structured data like JSON or XML natively, but doing it the way you are planning is going to cause some real pain-in-the-rear support issues and, more importantly, open a SQL injection attack vector. I would leave this to the realm of the web backend server, where it belongs.

  1. You are likely going to end up inventing your own structured data markup language (SDML) and a parser for it on the SQL Server side. That's a wheel that doesn't need to be reinvented. If you do end up building this, strongly consider going with JSON instead, assuming your version of SQL Server supports JSON parsing/packaging, to avoid all the issues a homemade format inherently brings with it.

  2. The front end that packages your data into your SDML is going to have to assume column ordinals, but column ordinal is not something you should rely on in a database. SQL amateurs often do; I know this from years in the industry, dealing with end users who get upset when a new column is introduced in a position they don't want. Adding a column to a table shouldn't break an application, and if it does, that application has bad code.

  3. Regarding the SQL injection attack vector, your SP code is going to get ugly. You'll need to parse each item in @data out into a variable of its own in order to properly parameterize the dynamic SQL you are building (see here under the "working with parameters" section for what that will look like, and the sketch after this list for the general shape). Failing to do this means that values passed in via the @data SDML could become executable SQL instead of literals, and that would be very bad. This is not easy to solve in SP language. Where it IS easy to solve is in the backend server code: every database library on the planet supports parameterized query building/execution natively.

    Once you have this built, you will be dynamically generating an INSERT statement and dynamically generating variables, or an array, or some other data structure to pass parameters into that INSERT statement to avoid SQL injection attacks. It's going to be dynamic on top of dynamic on top of dynamic, which leads to:

  4. From a support context, imagine that your application just totally throws up one day. You have to dive in to investigate. You track down the SDML that your front end created that caused the failure, and you open up your SP code to troubleshoot. Imagine what this code ends up looking like:

    1. It has to determine whether the table exists.
    2. It has to parse the SDML to get each literal.
    3. It has to read DB metadata to get the column list.
    4. It has to dynamically write the INSERT statement, listing the columns from metadata and dynamically creating SQL parameters for the VALUES() list.
    5. It has to execute it, sending a dynamic number of variables into the dynamically generated SQL.

    My support staff would hang me out to dry if they had to deal with that, and I'm the one paying them.
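
For what it's worth, here is a minimal sketch of what points 1 and 3 might look like if you go ahead anyway: @data carried as JSON (this assumes SQL Server 2016+ for OPENJSON) and the values handed to sp_executesql as parameters rather than concatenated into the statement. The procedure and parameter names are taken from the question; the column types and the one hard-coded branch per table are assumptions, and a truly generic version would still need all the dynamic-on-dynamic machinery from point 4.

CREATE PROCEDURE InsertDataIntoCorrespondingTable
  @dataTable sysname,        -- name of the target table
  @data      nvarchar(max)   -- JSON instead of a home-made format, e.g.
                             -- N'{"exampleColumn1":"abc","exampleColumn2":2,"exampleColumn3":3}'
AS
BEGIN
  SET NOCOUNT ON;

  IF @dataTable = N'ExampleTable'
  BEGIN
    DECLARE @c1 nvarchar(200), @c2 int, @c3 int;

    -- parse the JSON payload into typed local variables
    SELECT @c1 = exampleColumn1,
           @c2 = exampleColumn2,
           @c3 = exampleColumn3
    FROM OPENJSON(@data)
         WITH (exampleColumn1 nvarchar(200),
               exampleColumn2 int,
               exampleColumn3 int);

    -- parameterized dynamic SQL: the values travel as sp_executesql parameters,
    -- never as text concatenated into the statement
    EXEC sp_executesql
         N'INSERT INTO dbo.ExampleTable (exampleColumn1, exampleColumn2, exampleColumn3)
           VALUES (@p1, @p2, @p3);',
         N'@p1 nvarchar(200), @p2 int, @p3 int',
         @p1 = @c1, @p2 = @c2, @p3 = @c3;
  END
  /* ELSE IF ... one branch per table, which is exactly the maintenance burden described above */
END

Within a hard-coded branch like this a plain INSERT would of course do; sp_executesql is shown only to illustrate the parameter-binding shape you would need once the statement itself becomes dynamic.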

All of this is solved by using a proper backend to handle communication, deeper validation, sql parameter binding, error catching and handling, and all the other things that backend servers are meant to do.

I believe that your back end web server should be VERY aware of the underlying data model. It should be the connection between your view, your data, and your model. Leave the database to the things it's good at (reading and writing data). Leave your front end to the things that it's good at (presenting a UI for the end user).

CodePudding user response:

I suppose you could do something like this (may need a little extra work)

declare @columns nvarchar(max);
select @columns = string_agg(name, ', ') within group (order by column_id)
from sys.all_columns
where object_id = object_id(@dataTable)
  and is_identity = 0; -- skip the IDENTITY column, it can't take an explicit value

declare @sql nvarchar(max) = concat('INSERT INTO ', @dataTable, ' (', @columns, ') VALUES (', @data, ')');

exec sp_executesql @sql;
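
For the ExampleTable from the question, the statement this builds would come out roughly as

INSERT INTO ExampleTable (exampleColumn1, exampleColumn2, exampleColumn3) VALUES ('exampleColumn1', 2, 3)

with @data still concatenated in as raw text, which is exactly the injection vector described in the first answer.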

But please don't. If this were a good idea, there would be tons of examples of how to do it. There aren't, so it's probably not a good idea.

There are, however, tons of examples of using ORMs or auto-generated code instead, because that way your code is maintainable, debuggable, and performant.
