Sorting with low memory size
What is the best way to sort a 1 GB dictionary (up to 255 characters per word) with 2 GB of RAM?
I have already tried quicksort and didn't get an acceptable result.
This is the quicksort code:
#include <iostream>
#include <fstream>
#include <cstring>
#define MAXL 4000000
using namespace std;

void swap(char *&ch1, char *&ch2)
{
    char *temp = ch1;
    ch1 = ch2;
    ch2 = temp;
}

int partition(char **arr, int low, int high)
{
    char *pivot = arr[high]; // compare the raw C string; building a
                             // std::string here would heap-allocate on every call
    int i = low - 1;         // index of the last element <= pivot
    for (int j = low; j <= high - 1; j++)
    {
        // If the current element is smaller than or equal to the pivot
        if (strcmp(arr[j], pivot) <= 0)
        {
            i++; // advance the boundary of the smaller elements
            swap(arr[i], arr[j]);
        }
    }
    swap(arr[i + 1], arr[high]);
    return i + 1;
}

void quickSort(char **arr, int low, int high)
{
    if (low < high)
    {
        int pi = partition(arr, low, high);
        // Sort the elements on each side of the partition point
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}

int main()
{
    ifstream file("input.txt"); // read-only; ios::out|ios::app is not needed
    ofstream o("output.txt");
    char **arr = new char*[MAXL];
    for (int i = 0; i < MAXL; i++)
        arr[i] = new char[256]; // 255 characters plus the terminating '\0'
    long long i = 0;
    // Words are separated by spaces
    while (i < MAXL && file.getline(arr[i], 256, ' '))
        i++;
    file.close();
    quickSort(arr, 0, i - 1); // i is the word count, so the last index is i - 1
    for (long long j = 0; j < i; j++)
    {
        o << arr[j] << "\n"; // "\n", not the literal letter "n"
    }
}
It takes more than 10 minutes to sort the mentioned list, but it shouldn't take more than 20 seconds.
(MAXL is the number of words in the 1 GB file; the input words are stored in a text file.)
algorithm performance sorting time quicksort
What was wrong with quicksort?
– MBo
Nov 21 '18 at 18:59
Any decent sorting algorithm is in-place, which means that 1GB is enough. Overhead will be insignificant.
– Yves Daoust
Nov 21 '18 at 19:12
@YvesDaoust ANY decent sorting algorithm? The best ones for large datasets tend to be based on mergesort, which is not in-place.
– btilly
Nov 21 '18 at 19:21
@YvesDaoust The point that you're ignoring is that, while inappropriate for some purposes, it is a decent sorting algorithm that is not in place. As another example look at Timsort, which is not only decent, but so good on real-world data that it has beaten everything else to become the default in Python, Java, and many other languages. Again, not an in place algorithm. (In part because it can resort to mergesort as a fallback.)
– btilly
Nov 21 '18 at 20:04
The point being that you can't just go to a standard library, take the recommended algorithm, and assume that it is in place. Because there is a very good chance that it isn't.
– btilly
Nov 21 '18 at 20:06
asked Nov 21 '18 at 18:47 by pedram, edited Nov 22 '18 at 11:00
2 Answers
If you can't fit it all in memory, a file-based merge sort will work well.
– Jaco Van Niekerk, answered Nov 21 '18 at 19:22
How can 1GB data fail to fit in 2GB memory ?
– Yves Daoust
Nov 21 '18 at 19:29
@YvesDaoust Because there is overhead in the in-memory representation of raw data. For example, if you're using Python, per stackoverflow.com/questions/18596342/… every string has 37 bytes of overhead. So a 1 GB file of words will actually take several GB to load into memory.
– btilly
Nov 21 '18 at 20:15
@btilly: the question says 1 GB size. I take it literally.
– Yves Daoust
Nov 21 '18 at 20:18
it fits in the memory
– pedram
Nov 21 '18 at 20:18
thank you for the answer
– pedram
Nov 21 '18 at 20:18
In-place algorithms are your solution. Find more here:
As another example, many sorting algorithms rearrange arrays into sorted order in-place, including bubble sort, comb sort, selection sort, insertion sort, heapsort, and Shell sort. These algorithms require only a few pointers, so their space complexity is O(log n).
– OmG, answered Nov 21 '18 at 18:57
QuickSort also requires O(log n) space when it is properly implemented
– MBo
Nov 21 '18 at 18:58
@MBo Curious about where the log n here comes from. Is it the description of the size of the input?
– mrmcgreg
Nov 21 '18 at 19:03
@mrmcgreg Yes. log(n) is the size of the stack for quicksort. But log(n) space for the mentioned algorithms is a (too) formal definition based on the bit size of the variables needed for the range of n.
– MBo
Nov 21 '18 at 19:11
Several of the cited algorithms require O(1) space. I can't adhere to the last sentence.
– Yves Daoust
Nov 21 '18 at 19:31
time matters in this sort
– pedram
Nov 22 '18 at 10:02