Read consecutive tabs as empty fields with fscanf

花落未央 2021-01-24 23:06

I have a file that has fields separated by tabs. There will always be 12 tabs in a line, and some tabs are consecutive, which indicates an empty field. I want to use fscanf to read these fields.

2 Answers
  •  天命终不由人
    2021-01-24 23:41

    fscanf is a non-starter. The only way to read empty fields would be to use "%c" to read the delimiters themselves (and that would require you to know beforehand which fields were empty -- not very useful). Otherwise, depending on the format specifier used, fscanf will simply consume the tabs as leading whitespace or experience a matching failure or input failure.
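
    To see the problem concretely, here is a minimal sketch (using sscanf on a string for brevity) showing that a "%[^\t]" conversion must match at least one character, so the first empty field causes a matching failure and the remaining conversions never happen:

    #include <stdio.h>
    
    int main (void) {
    
        const char *line = "a\t\tb";    /* two consecutive tabs -- middle field empty */
        char f1[32], f2[32], f3[32];
    
        /* "%31[^\t]" must match at least one non-tab character, so the
         * empty second field causes a matching failure -- rc is 1, not 3.
         */
        int rc = sscanf (line, "%31[^\t]\t%31[^\t]\t%31[^\t]", f1, f2, f3);
    
        printf ("converted %d of 3 fields\n", rc);
    }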

    Continuing from the comment, in order to tokenize based on delimiters that may separate empty fields, you will need to use strsep, since strtok treats consecutive delimiters as a single delimiter.
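
    As a quick contrast, a minimal sketch of strtok on the same kind of input shows it collapsing the run of tabs, so the empty fields simply disappear from the token count:

    #include <stdio.h>
    #include <string.h>
    
    int main (void) {
    
        char s[] = "a\tb\t\t\tc";   /* fields 3 and 4 are empty */
        int n = 0;
    
        /* strtok skips over runs of delimiters, so only "a", "b" and "c"
         * come back -- the two empty fields are silently lost.
         */
        for (char *p = strtok (s, "\t"); p; p = strtok (NULL, "\t"))
            printf ("token[%d]: '%s'\n", ++n, p);
    }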

    While it is a bit unclear from your string where the tabs are located, a short example of tokenizing with strsep could be as follows. Note that strsep takes a pointer-to-pointer as its first argument, e.g.

    #include <stdio.h>
    #include <stdlib.h>     /* free */
    #include <string.h>     /* strdup, strsep */
    
    int main (void) {
    
        int n = 0;
        const char *delim = "\t\n";
        char *s = strdup ("usrid\tUser Id 0\t15\tstring\td\tk\ty\ty\t\t\t0\t0"),
            *toks = s,   /* tokenize with separate pointer to preserve s */
            *p;
    
        while ((p = strsep (&toks, delim)))
            printf ("token[%2d]: '%s'\n", n++ + 1, p);
    
        free (s);
    }
    

    (note: since strsep will modify the address held by the string pointer, you need to preserve a pointer to the beginning of s so it can be freed when no longer needed -- thanks JL)

    Example Use/Output

    $ ./bin/strtok_tab
    token[ 1]: 'usrid'
    token[ 2]: 'User Id 0'
    token[ 3]: '15'
    token[ 4]: 'string'
    token[ 5]: 'd'
    token[ 6]: 'k'
    token[ 7]: 'y'
    token[ 8]: 'y'
    token[ 9]: ''
    token[10]: ''
    token[11]: '0'
    token[12]: '0'
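
    Since the question is about reading from a file, a minimal sketch of the same approach applied line by line would read each record with fgets and hand it to strsep (assumptions: a hypothetical filename data.tsv, and that each line fits in the buffer):

    #define _DEFAULT_SOURCE     /* expose strsep on glibc */
    #include <stdio.h>
    #include <string.h>
    
    int main (void) {
    
        char buf[1024];
        FILE *fp = fopen ("data.tsv", "r");     /* hypothetical filename */
    
        if (!fp) {
            perror ("fopen");
            return 1;
        }
    
        while (fgets (buf, sizeof buf, fp)) {   /* read one record per line */
            char *toks = buf, *p;
            int n = 0;
            buf[strcspn (buf, "\n")] = 0;       /* trim the trailing newline */
            while ((p = strsep (&toks, "\t")))  /* empty fields come back as "" */
                printf ("field[%2d]: '%s'\n", ++n, p);
            putchar ('\n');
        }
        fclose (fp);
    }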
    

    Look things over and let me know if you have further questions.
