Convert []string to []byte

梦谈多话 2021-02-07 01:36

I am looking to convert a string array to a byte array in Go so I can write it to disk. What is an optimal solution to encode and decode a string array ([]string) to a byte array ([]byte) and back?

6 Answers
  •  眼角桃花
    2021-02-07 02:35

    Let's ignore the fact that this is Go for a second. The first thing you need is a serialization format to marshal the []string into.

    There are many options here. You could build your own or use a library. I am going to assume you don't want to build your own and will jump to the serialization formats Go supports.

    In all examples, data is the []string and fp is the file you are reading from or writing to. Errors are ignored for brevity; check the return values of these functions to handle them properly.

    Gob

    Gob is a Go-only binary format. It should be relatively space efficient as the number of strings increases.

    enc := gob.NewEncoder(fp)
    enc.Encode(data)
    

    Reading is also simple

    var data []string
    dec := gob.NewDecoder(fp)
    dec.Decode(&data)
    

    Gob is simple and to the point. However, the format is only readable with other Go code.
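
    For completeness, here is a minimal, self-contained sketch of the full gob round trip, including the file handling the snippets above leave out (the file name strings.gob is just an example):

    package main

    import (
        "encoding/gob"
        "fmt"
        "log"
        "os"
    )

    func main() {
        data := []string{"foo", "bar", "baz"}

        // Write: create the file and gob-encode the slice into it.
        out, err := os.Create("strings.gob")
        if err != nil {
            log.Fatal(err)
        }
        if err := gob.NewEncoder(out).Encode(data); err != nil {
            log.Fatal(err)
        }
        out.Close()

        // Read: open the file and decode back into a []string.
        in, err := os.Open("strings.gob")
        if err != nil {
            log.Fatal(err)
        }
        var decoded []string
        if err := gob.NewDecoder(in).Decode(&decoded); err != nil {
            log.Fatal(err)
        }
        in.Close()

        fmt.Println(decoded) // [foo bar baz]
    }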

    JSON

    Next is JSON, a format used just about everywhere. It is just as easy to use.

    enc := json.NewEncoder(fp)
    enc.Encode(data)
    

    And for reading:

    var data []string
    dec := json.NewDecoder(fp)
    dec.Decode(&data)
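
    Since the question asks for a []byte, note that encoding/json can also marshal to a byte slice in memory rather than streaming to a file:

    raw, err := json.Marshal(data) // raw is a []byte, e.g. ["a","b","c"]
    if err != nil {
        // handle error
    }

    var decoded []string
    if err := json.Unmarshal(raw, &decoded); err != nil {
        // handle error
    }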
    

    XML

    XML is another common format. However, it has pretty high overhead and is not as easy to use. While you could do the same as you did for gob and JSON, proper XML requires a root tag. In this case, we are using the root tag "Strings" and each string is wrapped in an "S" tag.

    type Strings struct {
        S []string
    }
    
    enc := xml.NewEncoder(fp)
    enc.Encode(Strings{data})
    
    var x Strings
    dec := xml.NewDecoder(fp)
    dec.Decode(&x)
    data := x.S
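
    For reference, with this wrapper the encoded output for []string{"a", "b"} looks roughly like this (the element names come from the struct type and field name):

    <Strings><S>a</S><S>b</S></Strings>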
    

    CSV

    CSV is different from the others. You have two options: use one record with n fields, or n records with one field each. The following example uses n records, one per string; using a single record would look too much like the other examples. Keep in mind that CSV can ONLY hold strings.

    enc := csv.NewWriter(fp)
    for _, v := range data {
        enc.Write([]string{v})
    }
    enc.Flush()
    

    To read:

    var err error
    var data []string
    dec := csv.NewReader(fp)
    for err == nil {        // reading ends when an error is returned (io.EOF at end of file)
        var s []string

        s, err = dec.Read()
        if len(s) > 0 {
            data = append(data, s[0]) // each record holds a single field
        }
    }
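
    If you would rather not manage the loop yourself, csv.Reader also has a ReadAll method that returns every remaining record at once; a sketch of the same read using it:

    records, err := csv.NewReader(fp).ReadAll()
    if err != nil {
        // handle error
    }
    var data []string
    for _, rec := range records {
        data = append(data, rec[0]) // one field per record in this layout
    }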
    

    Which format you use is a matter of preference. There are many other possible encodings that I have not mentioned. For example, there are external libraries for bencode. I don't personally like bencode, but it works. It is the same encoding used by BitTorrent metadata files.

    If you want to make your own encoding, encoding/binary is a good place to start. That would allow you to make the most compact file possible, but I hardly think it is worth the effort.
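
    If you do go that route, here is a minimal sketch of one possible hand-rolled layout (an assumption, not any standard format): a uint32 count followed by length-prefixed strings. It needs the "encoding/binary" and "io" imports, and writeStrings/readStrings are just illustrative names:

    func writeStrings(w io.Writer, data []string) error {
        // number of strings, then each string as a uint32 byte length plus raw bytes
        if err := binary.Write(w, binary.LittleEndian, uint32(len(data))); err != nil {
            return err
        }
        for _, s := range data {
            if err := binary.Write(w, binary.LittleEndian, uint32(len(s))); err != nil {
                return err
            }
            if _, err := w.Write([]byte(s)); err != nil {
                return err
            }
        }
        return nil
    }

    func readStrings(r io.Reader) ([]string, error) {
        var n uint32
        if err := binary.Read(r, binary.LittleEndian, &n); err != nil {
            return nil, err
        }
        data := make([]string, 0, n)
        for i := uint32(0); i < n; i++ {
            var l uint32
            if err := binary.Read(r, binary.LittleEndian, &l); err != nil {
                return nil, err
            }
            buf := make([]byte, l)
            if _, err := io.ReadFull(r, buf); err != nil {
                return nil, err
            }
            data = append(data, string(buf))
        }
        return data, nil
    }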
