I have 3 tables in a database (Records, Works and People) that I need to query and return as JSON. The data looks something like:
ID | Record | Work   | People
_____________________________
1  | Rec 1  | Work 1 | Mr A
1  | Rec 1  | Work 1 | Mrs A
1  | Rec 1  | Work 1 | Mr B
2  | Rec 2  | Work 2 | Mr C
3  | Rec 3  | Work 2 | Mr C
So a Record is linked to a Work, and that combination can have multiple People against it. The resulting JSON should look something like:
[ { Record: "Rec 1"
, Work: "Work 1"
, People: [ { Name: Mr A },
{ Name: Mrs A },
{ Name: Mr B }
]
},
{ Record: "Rec 2"
, Work: "Work 2"
, People: [ { Name: Mr C } ]
} ,
{ Record: "Rec 3"
, Work: "Work 2"
, People: [ { Name: Mr C } ]
}
]
My question is: in MuleSoft, what would be the best way to achieve this?
- Have 3 query components and then use a Scatter/Gather to stitch the results together.
- Have 1 stored procedure call that returns the data as shown above and then use DataWeave to unpick it.
- Have 1 stored procedure call that does all the querying and builds the JSON payload for me.
This is mainly about performance, with reduced complexity as a secondary concern. The number of records could run into the millions, so I need the most efficient way to do it. I'm leaning towards letting the database do the work and return the pre-formatted JSON, but I'm not sure if there are any downsides to doing that.
CodePudding user response:
Assuming this is an API, you should have only one query component: update your procedure (or query) to gather all of this data with a SQL JOIN, then use DataWeave to group the rows and build your response structure.
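As a rough sketch, assuming the joined query returns flat rows shaped like the table in the question (ID, Record, Work, People), a DataWeave script along these lines would group them into the nested output; the field names are taken from the sample data and may need adjusting to your real column names:

%dw 2.0
output application/json
---
// payload is assumed to be the flat result of the single joined query,
// e.g. [{ID: 1, Record: "Rec 1", Work: "Work 1", People: "Mr A"}, ...]
payload groupBy ((row) -> row.ID)          // one group of rows per Record
        pluck ((rows, recordId) -> {
            Record: rows[0].Record,
            Work:   rows[0].Work,
            People: rows map ((row) -> { Name: row.People })
        })

groupBy keys the rows by ID and pluck turns each group back into a single object, so the People array is built from however many rows share that ID.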
CodePudding user response:
There is no predetermined best way. Test the different methods and compare the results in speed and resource usage, and tune configurations to find possible improvements. In particular, with Mule 4 database queries over high volumes you'll want to disable repeatable streams for performance.
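For reference, a minimal sketch of what that can look like on a Database connector select operation in the Mule 4 XML DSL; the config name and SQL here are placeholders, so check the connector documentation for the exact element names in your connector version:

<db:select config-ref="Database_Config">
    <!-- disable repeatable streaming: rows are read once, not buffered -->
    <non-repeatable-iterable />
    <db:sql>SELECT ... your joined query ...</db:sql>
</db:select>

With a non-repeatable stream Mule does not buffer the result set in memory or on disk, which is cheaper when the query returns millions of rows, but the payload can only be consumed once in the flow.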