I was asked in an interview to process two multidimensional arrays into one object with a format similar to this:
{
  'sally': {
    id: 1,
    metrics: {
      sales: 6000,
      missed: 0
    }
  },
  'bob': {
    id: 2,
    metrics: {
      sales: 1000,
      missed: 0
    }
  },
  ...
}
However, I was unable to figure it out. I tried nested for-loops, but I'm worried about execution time: according to the interviewer, the solution should run in O(n), and if I understand Big O correctly, a nested for-loop would be roughly O(n^2). Here is the code I have so far:
// returns a dictionary
let testIAteShitOn = (requested_user = false, requested_metric = false) => {
    // is the array always organized the same?
    let users = [
        [1, "Sally"],
        [2, "Bob"],
        [3, "George"],
    ];
    // same here, is this always organized the same?
    // are there only ever 2 metrics?
    let metrics = [
        [1, "sales", 5000],
        [1, "sales", 1000],
        [3, "missed", 1000],
        [2, "sales", 1000],
    ];
    for (let i = users.length - 1; i >= 0; i--) {
        let id = users[i][0];
        // this is where I'm stuck: `data` is never added to an output object
        let data = findDiamond(id, metrics);
    }
    // `obj` is never defined above, so this doesn't work yet
    return obj;
}

// returns every metric row belonging to one user id
let findDiamond = (id, array) => {
    return array.filter((item) => item[0] == id);
}
I would appreciate any help on this and advice on where and how to best learn algorithms, data structures, and Big O.
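For reference on the complexity concern: findDiamond re-filters the whole metrics array once per user, so the draft above is O(n·m) rather than linear. A minimal sketch, using the same sample data, of grouping the metric rows by user id once so that every later lookup is O(1):

// Sketch: index the metric rows by user id in one pass (O(m)),
// so each user lookup afterwards is O(1) instead of a fresh filter
const metricsById = new Map();
for (const [id, metric, value] of metrics) {
    if (!metricsById.has(id)) metricsById.set(id, []);
    metricsById.get(id).push([metric, value]);
}
// users can then be walked once, reading metricsById.get(id) directly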
CodePudding user response:
You could use two separate loops to get the wanted result, with a helper object that keeps a reference to each user by id.
const
    users = [[1, "Sally"], [2, "Bob"], [3, "George"]],
    metrics = [[1, "sales", 5000], [1, "sales", 1000], [3, "missed", 1000], [2, "sales", 1000]],
    references = {},  // id -> user object, for O(1) lookups in the second loop
    result = {};

// first pass: one object per user, reachable both by name and by id
for (const [id, name] of users) {
    references[id] = result[name] = { id, metrics: { sales: 0, missed: 0 } };
}

// second pass: accumulate each metric row onto its user
for (const [id, metric, value] of metrics) {
    references[id].metrics[metric] += value;
}

console.log(result);
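With the sample data, and assuming duplicate metric rows are meant to be summed (as in the question's expected output), this logs roughly:

{
  Sally: { id: 1, metrics: { sales: 6000, missed: 0 } },
  Bob: { id: 2, metrics: { sales: 1000, missed: 0 } },
  George: { id: 3, metrics: { sales: 0, missed: 1000 } }
}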
CodePudding user response:
You can do this in a more functional way using Array.reduce and Object.fromEntries.
const users = [[1, "Sally"], [2, "Bob"], [3, "George"]];
const metrics = [[1, "sales", 5000], [1, "sales", 1000], [3, "missed", 1000], [2, "sales", 1000]];

// id -> name lookup, e.g. { 1: "Sally", 2: "Bob", 3: "George" }
const usersDictionary = Object.fromEntries(users);

const output = metrics.reduce((res, [id, key, value]) => {
    const user = usersDictionary[id];
    const existing = res[user] || { id, metrics: { sales: 0, missed: 0 } };
    existing.metrics[key] = value + existing.metrics[key];
    return {
        ...res,
        [user]: existing
    };
}, {});

console.log(output);
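One caveat on complexity: spreading `...res` builds a new accumulator object on every iteration, so this version does extra copying per metric row. A minimal variant (assuming it's acceptable to mutate the accumulator, and reusing the usersDictionary lookup from above) that stays linear:

const output = metrics.reduce((res, [id, key, value]) => {
    const name = usersDictionary[id];
    res[name] ??= { id, metrics: { sales: 0, missed: 0 } };  // create the user entry on first sight
    res[name].metrics[key] += value;                         // accumulate duplicate rows
    return res;
}, {});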
CodePudding user response:
If you don't know in advance which metrics exist, you could collect a set of metric names and create a property in the output object for each one.
const users = [[1, "Sally"], [2, "Bob"], [3, "George"]],
    metrics = [[1, "sales", 5000], [1, "sales", 1000], [3, "missed", 1000], [2, "sales", 1000]],
    output = {},
    allMetrics = new Set(),   // every metric name seen in the data
    metricMap = {};           // id -> { metricName: total }

for (const [id, metric, value] of metrics) {
    allMetrics.add(metric);
    metricMap[id] ??= {};
    metricMap[id][metric] = (metricMap[id][metric] ?? 0) + value;
}

for (const [id, name] of users) {
    output[name] = { id, metrics: {} };
    for (const m of allMetrics) {
        output[name].metrics[m] = metricMap[id]?.[m] ?? 0;
    }
}

console.log(output);
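Because the metric names are collected from the data, a new metric name is picked up automatically. For example, with a hypothetical extra row (not in the original data):

const extraMetrics = [...metrics, [2, "returns", 3]];  // hypothetical row for illustration
// running the same two loops over extraMetrics would give every user a
// "returns" property: 3 for Bob, 0 for Sally and George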
CodePudding user response:
My approach would be to make several easy transformations until we reach our target state.
const combine = (users, metrics, types = [...new Set(metrics .map (([_, t]) => t))]) =>
  Object .fromEntries (Object .entries (metrics .reduce (
    (a, [id, type, value]) => {a [id] .metrics [type] += value; return a},
    users .reduce ((a, [id, name]) => {
      a [id] = {name, metrics: Object .fromEntries (types .map (t => [t, 0]))}
      return a
    }, {})
  )) .map (([id, {name, ...rest}]) => [name, {id: Number (id), ...rest}]))

const users = [[1, "Sally"], [2, "Bob"], [3, "George"]]
const metrics = [[1, "sales", 5000], [1, "sales", 1000], [3, "missed", 1000], [2, "sales", 1000]]

console .log (combine (users, metrics))
We can think about this as a sequence of transformations, where we see the following steps:
1. Use a reduce call to collect users into an object keyed by id, including name and blank metrics:

   { "1": {name: "Sally", metrics: {sales: 0, missed: 0}}, "2": {name: "Bob", metrics: {sales: 0, missed: 0}}, "3": {name: "George", metrics: {sales: 0, missed: 0}} }

2. Use another reduce call to add the real metric data from metrics:

   { "1": {name: "Sally", metrics: {sales: 6000, missed: 0}}, "2": {name: "Bob", metrics: {sales: 1000, missed: 0}}, "3": {name: "George", metrics: {sales: 0, missed: 1000}} }

3. Turn that into array entries, with Object .entries:

   [ ["1", {name: "Sally", metrics: {sales: 6000, missed: 0}}], ["2", {name: "Bob", metrics: {sales: 1000, missed: 0}}], ["3", {name: "George", metrics: {sales: 0, missed: 1000}}] ]

4. map these to reformat them to look more like our target format:

   [ ["Sally", {id: 1, metrics: {sales: 6000, missed: 0}}], ["Bob", {id: 2, metrics: {sales: 1000, missed: 0}}], ["George", {id: 3, metrics: {sales: 0, missed: 1000}}] ]

5. Use Object .fromEntries to turn these back into an object:

   { Sally: {id: 1, metrics: {sales: 6000, missed: 0}}, Bob: {id: 2, metrics: {sales: 1000, missed: 0}}, George: {id: 3, metrics: {sales: 0, missed: 1000}} }
We should note that the types are extracted from the actual metrics, so we don't hard-code sales and missed. There is a minor performance optimization that we didn't do here: the types parameter is just an array of strings (['sales', 'missed']), but when we use it to create our metrics object, we first map it (types .map (t => [t, 0])) and then call Object .fromEntries on the result. We aren't using the original type names anywhere else, so we could do the mapping up front and only call fromEntries on the individual items. I doubt it would be a serious problem, but if you like, just move that mapping call to the end of the types definition. (Note that we can't move the fromEntries call there, because then the records would all share the same metrics property by reference.)
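A minimal sketch of that tweak, assuming everything else in combine stays the same:

const combine = (users, metrics,
                 types = [...new Set (metrics .map (([_, t]) => t))] .map (t => [t, 0])) =>
  Object .fromEntries (Object .entries (metrics .reduce (
    (a, [id, type, value]) => {a [id] .metrics [type] += value; return a},
    users .reduce ((a, [id, name]) => {
      // fromEntries still runs once per user, so the metrics objects are not shared
      a [id] = {name, metrics: Object .fromEntries (types)}
      return a
    }, {})
  )) .map (([id, {name, ...rest}]) => [name, {id: Number (id), ...rest}]))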
Also note that we do make the assumption that every metrics entry has a corresponding user entry. If that might fail, we'd have to do something a little more complex to avoid writing to a non-existent user's metrics, or do something else to report the data mismatch. We leave that as an exercise for the reader.
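As one possible sketch of that guard (skipping orphaned rows and logging them, which may or may not be the right reporting policy), the inner reducer could become:

(a, [id, type, value]) => {
  if (a [id]) { a [id] .metrics [type] += value }
  else { console .warn (`metric row for unknown user id ${id} was skipped`) }
  return a
}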