C# serialization ASCII confusion


Question


Here's the code.

using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class HostedGame
{
    public int ID { get; set; }

    public int UID { get; set; }

    public String Name { get; set; }

    public Boolean Available { get; set; }

    public String Description { get; set; }

    public List<int> Users { get; set; }

    public int Port { get; set; }

    public HostedGame(int uid, String name, String description, int port)
    {
        UID = uid;
        Name = name;
        Description = description;
        Available = true;
        Port = port;
        Users = new List<int>();
    }

    public int CompareTo(Object obj)
    {
        int result = 1;
        if(obj != null && obj is HostedGame)
        {
            HostedGame w = obj as HostedGame;
            result = this.ID.CompareTo(w.ID);
        }
        return result;
    }

    static public int Compare(HostedGame x, HostedGame y)
    {
        int result = 1;
        if(x != null && y != null)
        {
            result = x.CompareTo(y);
        }
        return result;
    }

    public static HostedGame DeSerialize(byte[] data)
    {
        MemoryStream ms = new MemoryStream(data);
        BinaryFormatter bff = new BinaryFormatter();
        return (HostedGame)bff.Deserialize(ms);
    }

    public static byte[] Serialize(HostedGame obj)
    {
        BinaryFormatter bff = new BinaryFormatter();
        MemoryStream ms = new MemoryStream();
        bff.Serialize(ms, obj);
        return ms.ToArray();
    }
}

The code below doesn't seem to work right:

HostedGame hs = new HostedGame(12,"Name", "Description", 8088);
String s = Encoding.ASCII.GetString(HostedGame.Serialize(hs));
HostedGame HACK = HostedGame.DeSerialize(Encoding.ASCII.GetBytes(s));

HACK.Port for some reason comes out as 7999?

When I just do this...

HostedGame HACK = HostedGame.DeSerialize(HostedGame.Serialize(hs));

It works fine.

So, what I'm asking is

  1. Why am I getting a wrong value?
  2. Is there a better way to convert the bytes to a string and back again?

Answer 1:


You cannot use Encoding.ASCII.GetString to round-trip an arbitrary byte array through a string; any byte outside the 7-bit ASCII range is lost in the conversion. Use Convert.ToBase64String instead. It produces a string from any byte sequence without losing data.

HostedGame hs = new HostedGame(12,"Name", "Description", 8088);
String s = Convert.ToBase64String(HostedGame.Serialize(hs));
HostedGame HACK = HostedGame.DeSerialize(Convert.FromBase64String(s));

Here is an example that shows how using Encoding.ASCII loses data.

var testBytes = new byte[] { 250, 251, 252 };
var text = Encoding.ASCII.GetString(testBytes);
var bytes = Encoding.ASCII.GetBytes(text); // will be 63, 63, 63



Answer 2:


Binary serialization generates a byte array which is not (necessarily) a valid string in any encoding.

When you try to read it as ASCII text, the ASCII decoder replaces every invalid byte (any value above 127) with a ? character.
Therefore, when you turn the string back into ASCII bytes, you end up with a different set of bytes.

In short, don't treat binary data as ASCII, or as any other text encoding.

If you need to send binary data as plain text, use Base64 to safely convert it to text.
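
As a quick illustration of the difference, here is a minimal sketch (not part of the original answers; it uses an arbitrary byte payload standing in for the BinaryFormatter output) that contrasts the ASCII round-trip with the Base64 round-trip:

using System;
using System.Linq;
using System.Text;

class Base64RoundTripDemo
{
    static void Main()
    {
        // Arbitrary "binary" payload containing bytes above 127,
        // standing in for serialized data.
        byte[] original = { 0x01, 0x80, 0xFA, 0xFF, 0x42 };

        // ASCII round-trip: bytes above 127 are replaced with '?' (63),
        // so the payload comes back corrupted.
        string asciiText = Encoding.ASCII.GetString(original);
        byte[] asciiRoundTrip = Encoding.ASCII.GetBytes(asciiText);
        Console.WriteLine(original.SequenceEqual(asciiRoundTrip)); // False

        // Base64 round-trip: every byte value is representable,
        // so the payload survives unchanged.
        string base64Text = Convert.ToBase64String(original);
        byte[] base64RoundTrip = Convert.FromBase64String(base64Text);
        Console.WriteLine(original.SequenceEqual(base64RoundTrip)); // True
    }
}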



Source: https://stackoverflow.com/questions/6403352/c-sharp-serialization-ascii-confusion
