Manipulating arrays to extract unique objects and count occurrences
The data describes cycling activities (key1) and chunks of them (key2). In total there are around 1,000 key1 objects, and the arrays under key1 are 1-100 items long.
- Is there a simpler, more semantic, faster, or otherwise better way than using two nested map calls in step 1?
- I realise that step 3 is longer than it should be and there has to be a better way to do it. How can step 3 be improved?
// Original array
const arr = [
  {key1: [{key2: {id: 1, name: 'a'}}]},
  {key1: [{key2: {id: 2, name: 'b'}}]},
  {key1: [{key2: {id: 2, name: 'b'}}, {key2: {id: 3, name: 'c'}}]}
];
console.log(arr);

// Step 1: Extract meaningful information from the original array
const arrOfArrOfObj = arr
  .map(category => category.key1
    .map(subCategory => (
      {
        id: subCategory.key2.id,
        name: subCategory.key2.name
      }
    )
  )
);
console.log(arrOfArrOfObj);

// Step 2: Make the array one-dimensional
const arrOfObj = [].concat(...arrOfArrOfObj);
console.log(arrOfObj);

// Step 3: Remove duplicates and count object occurrences.
let dedupedArrWithCount = [];
l = arrOfObj.length;
for (let i = 0; i < l; i++) {
  let objExists = false;
  for (let j = 0; j < dedupedArrWithCount.length; j++) {
    // Two objects are identical if their ids are identical.
    if (arrOfObj[i].id === dedupedArrWithCount[j].id) {
      objExists = true;
      dedupedArrWithCount[j].count += 1;
    }
  }
  if (!objExists) {
    dedupedArrWithCount.push({
      id: arrOfObj[i].id,
      name: arrOfObj[i].name,
      count: 1
    });
  }
}
console.log(dedupedArrWithCount);
javascript beginner array functional-programming ecmascript-6
asked Oct 24 at 21:24 by Jelefra
edited Nov 1 at 16:38 by Sᴀᴍ Onᴇᴌᴀ
what does this data describe? will there always be two levels or might there ever be 3 or more? – Sᴀᴍ Onᴇᴌᴀ, Oct 24 at 21:42
The data describes cycling activities (key1) and chunks of them (key2). In total there are around 1,000 key1 objects and arrays under key1 are 1-100 long. – Jelefra, Oct 24 at 21:48
so does that mean that there will only ever be two levels of keys? – Sᴀᴍ Onᴇᴌᴀ, Oct 26 at 20:11
Yes that's the case. – Jelefra, Oct 27 at 14:10
1 Answer
Responding to your questions
“Is there a simpler, more semantic, faster, or otherwise better way than using two nested map calls in step 1?”
While I like the benefits of functional approaches, they are often slower because of the extra function calls.
Steps 1 and 2 could be simplified by using two for...of loops (since ecmascript-6 features like const are also used). Also, instead of constructing a new object to return in the nested map callback, you can just return subCategory.key2 (though perhaps you simplified the data and the original data contains more properties that aren't needed in the end).
const arrOfObj = [];
for (let category of arr) {
  for (let subCategory of category.key1) {
    arrOfObj.push(subCategory.key2);
  }
}
This generally runs faster, at least for the small dataset supplied (see this jsPerf test for a comparison).
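If you prefer to keep the map-based style, a minimal sketch that simply returns subCategory.key2 (assuming the key2 objects already hold everything you need) could look like this:

// Same extraction as steps 1 and 2, but returning the existing key2 objects directly
const arrOfArrOfObj = arr.map(category =>
  category.key1.map(subCategory => subCategory.key2)
);
const arrOfObj = [].concat(...arrOfArrOfObj);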
“I realise that step 3 is longer than it should be and there has to be a better way to do it. How can step 3 be improved?”
I was thinking of using JSON.stringify() in step 1 to build the keys of a counts object, and then JSON.parse() in step 3, so lookups happen in constant time, but apparently that was slower, possibly because the original data set only has one collision. Maybe for a larger dataset that would be faster. See this jsPerf for a comparison.
const arr = [
  {key1: [{key2: {id: 1, name: 'a'}}]},
  {key1: [{key2: {id: 2, name: 'b'}}]},
  {key1: [{key2: {id: 2, name: 'b'}}, {key2: {id: 3, name: 'c'}}]}
];
const counts = {};
for (let category of arr) {
  for (let subCategory of category.key1) {
    const countKey = JSON.stringify(subCategory.key2);
    counts[countKey] = (counts[countKey] || 0) + 1;
  }
}
const dedupedArrWithCount = [];
for (let key in counts) {
  const obj = JSON.parse(key);
  dedupedArrWithCount.push(Object.assign(obj, {
    count: counts[key]
  }));
}
console.log(dedupedArrWithCount);
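With the sample arr above, this would be expected to log [{id: 1, name: 'a', count: 1}, {id: 2, name: 'b', count: 2}, {id: 3, name: 'c', count: 1}].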
Other feedback
The variable l is declared without any keyword:
let dedupedArrWithCount = [];
l = arrOfObj.length;
Unless this code is wrapped in an IIFE, l becomes a global variable, which can lead to unintended consequences if that name is used later. It is advisable to use const (or let if there was a need to re-assign it).
In that same vein, dedupedArrWithCount could be declared with the const keyword, since it is never re-assigned, just mutated using the push method.
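For illustration, a minimal sketch of those two declarations with the suggested keywords (the rest of step 3 is assumed to stay unchanged) might be:

// const keeps l out of the global scope
const l = arrOfObj.length;
// const is fine here: the array is only mutated via push(), never re-assigned
const dedupedArrWithCount = [];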
answered Nov 1 at 16:40 by Sᴀᴍ Onᴇᴌᴀ, edited 7 hours ago