Serialise and deserialise vector in binary

时光说笑 · 2020-12-11 21:42

I am having problems trying to serialise a vector (std::vector) into a binary format and then correctly deserialise it and be able to read the data. This is my first time using binary serialisation.

4 Answers
  • 2020-12-11 21:53

    You can't deserialise a non-POD class by overwriting an existing instance with raw bytes, as you seem to be trying to do. Instead, give the class a constructor that reads the data from the stream and constructs a new instance from it.

    In outline, given something like this:

    #include <iostream>
    #include <vector>
    using namespace std;

    class A {
    public:
        A() {}                            // default constructor
        A( istream & is );                // deserialising constructor
        void serialise( ostream & os );
        vector <int> v;
    };
    

    then serialise() would write the length of the vector followed by the vector contents. The constructor would read the vector length, resize the vector using the length, then read the vector contents:

    void A :: serialise( ostream & os ) {
        size_t vsize = v.size();
        // write the element count first, then the raw element bytes
        os.write( (const char*)&vsize, sizeof(vsize) );
        if ( vsize )
            os.write( (const char*)&v[0], vsize * sizeof(int) );
    }

    A :: A( istream & is ) {
        size_t vsize;
        // read the count back, size the vector, then fill it
        is.read( (char*)&vsize, sizeof(vsize) );
        v.resize( vsize );
        if ( vsize )
            is.read( (char*)&v[0], vsize * sizeof(int) );
    }
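
    In use, a round trip might look like this (a sketch I've added, not part of the original answer):

    #include <fstream>

    int main() {
        A a;
        a.v = { 1, 2, 3 };                       // sample data
        {
            ofstream ofs( "a.bin", ios::binary );
            a.serialise( ofs );                  // count first, then raw ints
        }
        ifstream ifs( "a.bin", ios::binary );
        A b( ifs );                              // rebuilt from the stream
        return 0;
    }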
    
  • 2020-12-11 22:06

    You're using the address of the vector. What you need/want is the address of the data being held by the vector. Writing, for example, would be something like:

    size = example.size();
    file.write((char *)&size, sizeof(size));                      // element count
    file.write((char *)&example[0], sizeof(example[0]) * size);   // raw element bytes
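
    Reading it back mirrors the write (a sketch I've added, assuming example is a std::vector<int> and file is an open binary fstream):

    size_t size = 0;
    file.read((char *)&size, sizeof(size));
    example.resize(size);                     // allocate before reading into the buffer
    if (size)
        file.read((char *)&example[0], sizeof(example[0]) * size);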
    
  • 2020-12-11 22:11

    I would write in network byte order to ensure the file can be written and read on any platform. So:

    #include <fstream>
    #include <iostream>
    #include <vector>
    
    #include <arpa/inet.h>   // htonl/ntohl (POSIX; on Windows use <winsock2.h>)
    
    int main(void) {
    
      // sample values to round-trip through the file
      std::vector<int32_t> v = { 111, 222, 333 };
    
    
      {
        std::ofstream ofs;
        ofs.open("vecdmp.bin", std::ios::out | std::ios::binary);
    
        // element count as a fixed-width, big-endian 32-bit value
        uint32_t sz = htonl((uint32_t)v.size());
        ofs.write((const char*)&sz, sizeof(uint32_t));
        for (uint32_t i = 0, end_i = v.size(); i < end_i; ++i) {
          int32_t val = htonl(v[i]);
          ofs.write((const char*)&val, sizeof(int32_t));
        }
    
        ofs.close();
      }
    
      {
        std::ifstream ifs;
        ifs.open("vecdmp.bin", std::ios::in | std::ios::binary);
    
        uint32_t sz = 0;
        ifs.read((char*)&sz, sizeof(uint32_t));
        sz = ntohl(sz);
    
        for (uint32_t i = 0; i < sz; ++i) {
          int32_t val = 0;
          ifs.read((char*)&val, sizeof(int32_t));
          val = ntohl(val);
          std::cout << i << '=' << val << '\n';
        }
      }
    
      return 0;
    }
    
  • 2020-12-11 22:13

    Read the other answers to see how to read/write a binary structure.

    I'm adding this one because I believe your motivation for using a binary format is mistaken. A binary format won't be easier than an ASCII one; usually it's the other way around.

    You have many options for saving/reading data for long-term use (ORM, databases, structured formats, configuration files, etc.). A flat binary file is usually the worst and the hardest to maintain, except for very simple structures.
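
    For contrast, here is a plain-text version of the same round trip (a sketch I've added; it is not from the original answers):

    #include <fstream>
    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v = { 111, 222, 333 };

        {   // write: count on one line, then whitespace-separated values
            std::ofstream ofs("vec.txt");
            ofs << v.size() << '\n';
            for (int x : v) ofs << x << ' ';
        }

        // read it back
        std::ifstream ifs("vec.txt");
        std::size_t n = 0;
        ifs >> n;
        std::vector<int> w(n);
        for (std::size_t i = 0; i < n; ++i) ifs >> w[i];

        for (std::size_t i = 0; i < n; ++i) std::cout << i << '=' << w[i] << '\n';
        return 0;
    }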
