I'm using SQL Server 2019. I have data in a table like this:
| Resource | Tags | Cost |
| --- | --- | --- |
| Prod | project:kentico, tenant:abc, environment:production | 1000 |
| Prod | project:App services, tenant:abc, environment:production, Provider:abc | 2000 |
I want the Tags column converted to JSON format, so the result looks like this:

| Resource | Tags | Cost |
| --- | --- | --- |
| Prod | {"project":"kentico", "tenant":"abc", "environment":"production"} | 1000 |
| Prod | {"project":"App services", "tenant":"abc", "environment":"production", "Provider":"abc"} | 2000 |
CodePudding user response:
Assuming that neither commas (`,`) nor colons (`:`) can appear in the values, you could just use `REPLACE`:
```sql
SELECT Resource,
       CONCAT('{"', REPLACE(REPLACE(Tags, ', ', '","'), ':', '":"'), '"}') AS Tags,
       Cost
FROM (VALUES ('Prod', 'project:kentico, tenant:abc, environment:production', 1000),
             ('Prod', 'project:App services, tenant:abc, environment:production, Provider:abc', 2000)) V (Resource, Tags, Cost);
```
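If you want a quick sanity check that the rewritten strings actually parse, `ISJSON` (available since SQL Server 2016) can flag any row that didn't convert cleanly. A minimal sketch wrapping the query above:

```sql
-- Sketch: flag rows whose rewritten Tags string isn't valid JSON.
WITH converted AS
(
    SELECT Resource,
           CONCAT('{"', REPLACE(REPLACE(Tags, ', ', '","'), ':', '":"'), '"}') AS Tags,
           Cost
    FROM (VALUES ('Prod', 'project:kentico, tenant:abc, environment:production', 1000),
                 ('Prod', 'project:App services, tenant:abc, environment:production, Provider:abc', 2000)) V (Resource, Tags, Cost)
)
SELECT Resource, Tags, Cost,
       ISJSON(Tags) AS IsValidJson   -- 1 = valid JSON, 0 = something slipped through
FROM converted;
```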
If your data isn't as well formed, I would suggest not doing this in SQL Server, and instead using something with good string manipulation functionality. Or, better yet, fix the process that is inserting the data in the first place so that it provides normalised data or proper JSON (this is the real solution).
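For illustration, "normalised" here could mean a child table with one row per tag key/value pair; the table and column names below are assumptions, not anything from the question:

```sql
-- Hypothetical normalised shape: one row per tag key/value pair.
CREATE TABLE ResourceTag
(
    Resource varchar(50)  NOT NULL,
    TagKey   varchar(100) NOT NULL,
    TagValue varchar(400) NOT NULL
);

-- The JSON representation can then be built on demand;
-- STRING_ESCAPE guards against quotes or control characters in the values.
SELECT Resource,
       '{' + STRING_AGG('"' + STRING_ESCAPE(TagKey, 'json') + '":"' + STRING_ESCAPE(TagValue, 'json') + '"', ',') + '}' AS TagsJson
FROM ResourceTag
GROUP BY Resource;
```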
CodePudding user response:
Another option here is to use `String_Split()` and some prayers.
```sql
SELECT Resource,
       Tags,
       '{' + STRING_AGG('"' + REPLACE(TRIM(value), ':', '":"') + '"', ',') + '}' AS jsonvalue
FROM test
CROSS APPLY STRING_SPLIT(Tags, ',')
GROUP BY Resource, Tags;
```
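If the end goal is to persist the converted string back into the table, a sketch of how that could look (assuming the table really is called `test`, and that Resource plus the original Tags string is enough to match rows back up):

```sql
-- Sketch: overwrite Tags with the rebuilt JSON string (table name 'test' is assumed).
;WITH converted AS
(
    SELECT Resource,
           Tags,
           '{' + STRING_AGG('"' + REPLACE(TRIM(value), ':', '":"') + '"', ',') + '}' AS jsonvalue
    FROM test
    CROSS APPLY STRING_SPLIT(Tags, ',')
    GROUP BY Resource, Tags
)
UPDATE t
SET Tags = c.jsonvalue
FROM test AS t
JOIN converted AS c
    ON c.Resource = t.Resource
   AND c.Tags = t.Tags;
```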
I agree with Larnu, though, that SQL Server is a terrible platform for doing this work. It definitely feels more appropriate to pull this out into Python and monkey around in there, where a proper JSON library can be used to get the final result (and deal with error handling for all the edge cases that are likely to pop up).