Is there a data structure for implementing a function equivalent to 'tail -n' command in C


I want to write a function equivalent to the Linux `tail -n` command in C++. Currently, I parse the file line by line, incrementing a line count as I go; if the file is really big (~gigabytes), this takes a lot of time. Is there a better approach, or a data structure suited to implementing this function?

Here are my two methods:

int File::countlines()
{
    int lineCount = 0;
    string str;

    if (file)
    {
        while (getline(file, str))
        {
            ++lineCount;
        }
    }
    return lineCount;
}

void File::printlines()
{
    int lineCount = 0;
    string line;

    if (file)
    {
        lineCount = countlines();

        file.clear();
        file.seekg(ios::beg);

        if (lineCount <= 10)
        {
            while (getline(file, line))
            {
                cout << line << endl;
            }
        }
        else
        {
            int position = lineCount - 10;
            while (position--)
            {
                getline(file, line);
            }
            while (getline(file, line))
            {
                cout << line << endl;
            }
        }
    }
}

This method gets slower as the file size grows, so I want to either replace it with a better data structure or write more efficient code.

CodePudding user response:

One thing slowing down your program is that it reads the file twice. Instead, you could keep the last n end-of-line positions (n = 10 in your program) in a single pass. The most convenient data structure for this is a circular buffer; as far as I know the standard library doesn't provide one (Boost does), but it can be implemented with an array and an index that is incremented modulo n.

With that circular buffer, you can jump straight to the lowest stored offset in the file (the slot after the current index, if the buffer is full) and print the lines from there.

CodePudding user response:

When I've done this, I've done a generous estimate of the maximum length of a line (e.g., one kilobyte), seeked to that distance from the end, and started reading lines into a circular buffer until the end of the file.

In nearly every case you get more than n lines, so you just print out the contents of the circular buffer and you're done. Note, however, that you need to ensure you read more than n lines, not just n: the first line you read will usually be only a partial one, so if you read exactly n lines, one of them would probably be incomplete.

On rare occasion, you haven't gotten the required number of lines, so you seek back twice as far (or other factor of your choice), and restart. If you want to get really fancy, you can extrapolate the number of lines you'll need based on the average length of the lines you did read (but honestly, this is such a rare situation it's not worth a lot of work to optimize it).

This normally works essentially instantly, regardless of file size. I suppose (in theory) for a file with incredibly long lines, it would get slower, but if that's the case, the user has probably made a mistake, and tried to tail something that isn't a text file (which is generally useless anyway).
