I have the following piece of code:
#include <cstdio> // for scanf/printf
#include <iostream>
using namespace std;
int main() {
    // Number of inputs
    int N;
    scanf("%d", &N);
    // Takes in input and simply outputs the step it's on
    for (int i = 0; i < N; i++) {
        int Temp;
        scanf("%d", &Temp);
        printf("%d ", i);
    }
}
When taking in a large amount of integer input, the program stops printing output at a certain point, seemingly waiting for more input to come. Given an input of 2049 1's, the program stops after printing 2048 integers (0 up to 2047), and does not print the final 2048 (the 2049th integer). 2048 looks suspicious, being a power of 2. It also seems that the larger the input values, the sooner the program stops, and then after what looks like a random number of steps. For example, I gave it 991 integers (up to the ten thousands), and the program stopped outputting after iteration 724. Note that I copied and pasted the numbers as a whole block, rather than typing and entering them one by one, but I doubt this plays a role. I also tried cin and cout, but they did not help. Could someone please explain the reasons behind this phenomenon?
CodePudding user response:
I have found the answer to my question. The failure is indeed due to copying and pasting large chunks of input, as many have suggested, and I thank everyone for their help. There were no incorrect characters, though; the cause of the problem is instead the 4096-character limit imposed by the terminal's canonical mode.
In canonical mode, the terminal lets the user edit the input, using arrow keys, backspace, etc. It sends the text to the reading process only when there is a newline or when the buffer is full. The size of this buffer being 4096 characters, it becomes clear why the code fails to parse more input than that: 2049 "1 "s is 4098 characters. One can switch to noncanonical mode, which allows larger input at the expense of not being able to edit it, using stty -icanon. Entering stty icanon takes it back to canonical mode.
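For completeness, the same switch can be made from inside the program with the POSIX termios interface, so the user doesn't have to remember to run stty. This is just a minimal sketch, assuming a POSIX system; it only changes anything when stdin is actually a terminal:
#include <cstdio>
#include <termios.h>
#include <unistd.h>

int main() {
    // Save the current terminal settings so they can be restored on exit.
    termios oldSettings;
    bool isTty = (tcgetattr(STDIN_FILENO, &oldSettings) == 0);
    if (isTty) {
        termios raw = oldSettings;
        raw.c_lflag &= ~ICANON;   // same effect as "stty -icanon"
        raw.c_cc[VMIN] = 1;       // block until at least one byte arrives
        raw.c_cc[VTIME] = 0;      // no inter-byte timeout
        tcsetattr(STDIN_FILENO, TCSANOW, &raw);
    }

    int N;
    scanf("%d", &N);
    for (int i = 0; i < N; i++) {
        int temp;
        scanf("%d", &temp);
        printf("%d ", i);
    }

    if (isTty) {
        // Back to canonical mode, like "stty icanon".
        tcsetattr(STDIN_FILENO, TCSANOW, &oldSettings);
    }
}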
Practically speaking, entering the input with newlines separating the numbers seems like the easiest fix.
This source was quite helpful to me: http://blog.chaitanya.im/4096-limit. This post on unix stack exchange is similar to my problem: https://unix.stackexchange.com/questions/131105/how-to-read-over-4k-input-without-new-lines-on-a-terminal.
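Relatedly, the limit only applies when stdin is a terminal, so feeding the input from a file sidesteps it entirely: run the program as ./a.out < input.txt, or reopen stdin in code. A tiny sketch (the filename is just an example):
#include <cstdio>

int main() {
    // Reopen stdin on a file; the 4096-character canonical-mode
    // buffer only applies to terminal input, so this avoids it.
    if (freopen("input.txt", "r", stdin) == nullptr) {
        perror("freopen");
        return 1;
    }
    // ... read the input exactly as before ...
}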
CodePudding user response:
My first thought was that you're reaching some sort of barrier on the input side...
The limit for the length of a command line is not [typically] imposed by the shell, but by the operating system.
Bash Command Line and Input Limit - SO
However, this is probably not the case.
First, focus on your data: make sure it is what you think it is (no unexpected characters). Then try to debug your reads and make sure the values are making it into memory like you intend.
Try separating your reads and writes into two loops; this might make your debugging a little easier depending on your skill level, and again, it helps confirm nothing funky is going on with your reads. Suspicion is high with the reads on this one... (a read-checking sketch is at the bottom of this answer).
Here's a couple of cracks at it below... haven't tested. Hope this helps!
#include <iostream>

int main() {
    int N;
    std::cin >> N;
    // Read N integers, storing to array
    int* numbers = new int[N];
    for (int i = 0; i < N; i++) {
        std::cin >> numbers[i];
    }
    // Print
    for (int i = 0; i < N; i++) {
        std::cout << numbers[i] << " ";
    }
    std::cout << std::endl;
    // Free the dynamically allocated memory
    delete[] numbers;
    return 0;
}
Okay... maybe a little more optimized...
#include <iostream>
#include <vector>

int main() {
    // cin.tie(nullptr) and ios::sync_with_stdio(false) might improve perf;
    // call them before the first read so they take effect reliably
    std::ios::sync_with_stdio(false);
    std::cin.tie(nullptr);
    int N;
    std::cin >> N;
    // N isn't known until runtime, so use a vector
    // (int numbers[N] would be a nonstandard variable-length array)
    std::vector<int> numbers(N);
    // Read N integers, storing to array
    for (int i = 0; i < N; i++) {
        std::cin >> numbers[i];
    }
    // Print
    for (int i = 0; i < N; i++) {
        std::cout << numbers[i] << " ";
    }
    std::cout << std::endl;
    return 0;
}
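And if you do want to verify the reads as suggested above, checking the stream state after each extraction pinpoints where input stops arriving. An untested sketch along those lines; note it catches malformed or truncated input, while a blocked terminal read will simply hang, which is itself a clue:
#include <iostream>
#include <vector>

int main() {
    int N;
    if (!(std::cin >> N) || N <= 0) {
        std::cerr << "Failed to read a positive count\n";
        return 1;
    }

    std::vector<int> numbers(N);
    for (int i = 0; i < N; i++) {
        if (!(std::cin >> numbers[i])) {
            // Report exactly which read failed, i.e. where input dried up.
            std::cerr << "Read failed at index " << i << "\n";
            return 1;
        }
    }

    for (int i = 0; i < N; i++) {
        std::cout << numbers[i] << " ";
    }
    std::cout << "\n";
}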