deserialization

How do I use Serde to (de)serialize arrays greater than 32 elements, such as [u8; 128]?

不羁的心 submitted on 2020-07-03 05:55:07
Question: I have a struct containing a byte array that I would like to serialize and deserialize to and from binary, but it only works for arrays of up to 32 elements. Here is my minimal example code, main.rs: #[macro_use] extern crate serde_derive; extern crate serde; extern crate bincode; use bincode::{serialize, deserialize, Infinite}; const BYTECOUNT: usize = 32; // 33 and more does not work, I need 128 type DataArr = [u8; BYTECOUNT]; #[derive(Serialize, Deserialize, Debug)] struct Entry { number: i64
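
A minimal sketch of one common fix, assuming the serde-big-array crate and the bincode 1.x API (neither is named in the excerpt, and the `data` field only completes the truncated struct for illustration): serde derives array support only up to 32 elements, so `#[serde(with = "BigArray")]` supplies the (de)serialization for longer fixed-size arrays.

```rust
// Sketch only: assumes serde (with the "derive" feature), serde_big_array and bincode 1.x.
use serde::{Deserialize, Serialize};
use serde_big_array::BigArray;

const BYTECOUNT: usize = 128;
type DataArr = [u8; BYTECOUNT];

#[derive(Serialize, Deserialize, Debug)]
struct Entry {
    number: i64,
    #[serde(with = "BigArray")] // hypothetical field name; provides impls for [u8; 128]
    data: DataArr,
}

fn main() {
    let entry = Entry { number: 42, data: [0u8; BYTECOUNT] };
    let bytes = bincode::serialize(&entry).expect("serialize");
    let decoded: Entry = bincode::deserialize(&bytes).expect("deserialize");
    println!("{:?}", decoded.number);
}
```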

C# Enum deserialization with Json.Net: Error converting value to type

老子叫甜甜 submitted on 2020-07-03 01:54:09
Question: I'm using Json.NET to serialize/deserialize some JSON APIs. The API response has some integer values that map to an enum defined in the application. The enum is like this: public enum MyEnum { Type1, Type2, Type3 } and the JSON API response has the following: { "Name": "abc", "MyEnumValue":"Type1" } Sometimes the API returns a value for the MyEnumValue field that's not defined in my enum, like this: { "Name": "abc", "MyEnumValue":"Type4" } That throws an exception: Error converting value
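
A minimal sketch of one way to tolerate unknown values with Json.NET; the SafeMyEnumConverter class and the Unknown fallback member are illustrative additions, not part of the original question. The converter parses known names and maps anything else (such as "Type4") to a default instead of throwing.

```csharp
using System;
using Newtonsoft.Json;

public enum MyEnum { Unknown, Type1, Type2, Type3 }

// Hypothetical converter: falls back to MyEnum.Unknown for unrecognized strings.
public class SafeMyEnumConverter : JsonConverter
{
    public override bool CanConvert(Type objectType) => objectType == typeof(MyEnum);

    public override object ReadJson(JsonReader reader, Type objectType,
                                    object existingValue, JsonSerializer serializer)
    {
        // Enum.TryParse returns false for names the enum does not define,
        // so "Type4" becomes MyEnum.Unknown instead of raising an error.
        var text = reader.Value?.ToString();
        return Enum.TryParse(text, ignoreCase: true, out MyEnum parsed) ? parsed : MyEnum.Unknown;
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        => writer.WriteValue(value.ToString());
}
```

The converter can then be applied with `[JsonConverter(typeof(SafeMyEnumConverter))]` on the enum property or registered in JsonSerializerSettings.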

Invalid cast exception Xunit deserialization error

守給你的承諾、 submitted on 2020-06-29 05:13:09
Question: When trying to run a test case that uses the xunit framework through Visual Studio, I am currently getting the following error: System.InvalidCastException HResult=0x80004002 Message=Specified cast is not valid. Source=xunit.execution.desktop StackTrace: at Xunit.Serialization.XunitSerializationInfo.GetValue[T](String key) in C:\Dev\xunit\xunit\src\common\XunitSerializationInfo.cs:line 40 at Xunit.Sdk.XunitTestCase.Deserialize(IXunitSerializationInfo data) in C:\Dev\xunit\xunit\src\xunit
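
A minimal sketch of the usual workaround, assuming the cast fails on serialized theory data (the PersonData type below is illustrative, not from the original post): custom objects passed to theories can implement xunit's IXunitSerializable so that GetValue<T> reads back exactly the types that were stored, avoiding the invalid cast.

```csharp
using Xunit.Abstractions;

// Hypothetical theory-data type; xunit also requires a public parameterless constructor.
public class PersonData : IXunitSerializable
{
    public string Name { get; set; }
    public int Age { get; set; }

    public void Serialize(IXunitSerializationInfo info)
    {
        info.AddValue(nameof(Name), Name);
        info.AddValue(nameof(Age), Age);
    }

    public void Deserialize(IXunitSerializationInfo info)
    {
        // Read each value back with the same type it was stored as.
        Name = info.GetValue<string>(nameof(Name));
        Age = info.GetValue<int>(nameof(Age));
    }
}
```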

Another failure at deserializing data with discriminated unions, in F#

我怕爱的太早我们不能终老 submitted on 2020-06-29 03:47:31
Question: Following a question where the answer provided a working solution to serialize/deserialize discriminated unions (IgnoreMissingMember setting doesn't seem to work with FSharpLu.Json deserializer), I now have a practical case where this fails (although it works in simpler cases). Here is the test code: open System.Collections.Generic open Microsoft.FSharpLu.Json open Newtonsoft.Json open Newtonsoft.Json.Serialization // set up the serialization / deserialization based on answer from: // https:
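
For reference, a minimal sketch of the FSharpLu.Json round-trip this setup is built on; the Shape union and the choice of the Compact serializer are illustrative, not the poster's failing case.

```fsharp
open Microsoft.FSharpLu.Json

// Hypothetical discriminated union used only to show the round-trip.
type Shape =
    | Circle of radius: float
    | Rectangle of width: float * height: float

let json = Compact.serialize (Circle 2.0)       // compact single-field representation
let shape = Compact.deserialize<Shape> json     // back to Circle 2.0
printfn "%s -> %A" json shape
```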

Binary stream '0' does not contain a valid BinaryHeader error on deserialization

十年热恋 submitted on 2020-06-25 20:21:12
Question: After searching for an answer to this issue for the last 2 days, I'm hoping someone here can help. I have written a program in C# using VS2012 that saves the user's project data by using BinaryFormatter to serialize a serializable class to a Stream before saving it to a file. The program has been in use for some time; however, recently a user couldn't open a file he saved the day before. He sent me the file, and the error I get in the debugger is: "Binary stream '0' does not contain a valid
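
A minimal sketch of the save/load pattern described above, assuming C# with BinaryFormatter (the ProjectData type and file handling are illustrative; note that Microsoft now flags BinaryFormatter as insecure and obsolete). The point of interest is disposing the FileStream so the header and payload are fully flushed to disk: a truncated file, or one transferred in text mode, is a classic cause of the "does not contain a valid BinaryHeader" error.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class ProjectData   // hypothetical stand-in for the real project class
{
    public string Name { get; set; }
}

public static class ProjectFile
{
    public static void Save(ProjectData data, string path)
    {
        using (var fs = new FileStream(path, FileMode.Create, FileAccess.Write))
        {
            new BinaryFormatter().Serialize(fs, data);
        }   // disposing the stream flushes the header and payload to disk
    }

    public static ProjectData Load(string path)
    {
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
        {
            return (ProjectData)new BinaryFormatter().Deserialize(fs);
        }
    }
}
```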

Deserialization of large numpy arrays using pickle is an order of magnitude slower than using numpy

淺唱寂寞╮ submitted on 2020-06-16 05:44:29
Question: I am deserializing large numpy arrays (500 MB in this example) and I find the results vary by orders of magnitude between approaches. Below are the 3 approaches I've timed. I'm receiving the data from the multiprocessing.shared_memory package, so the data comes to me as a memoryview object, but in these simple examples I just pre-create a byte array to run the test. I wonder if there are any mistakes in these approaches, or if there are other techniques I didn't try. Deserialization in Python
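
A minimal sketch of the kind of comparison described (the array size and names are illustrative, not the poster's exact benchmark): rebuilding the array with pickle copies and reconstructs the object, while numpy.frombuffer creates a zero-copy view over the existing bytes, which is one reason the timings can differ so widely.

```python
import pickle
import time

import numpy as np

arr = np.random.rand(64_000_000)            # ~500 MB of float64
raw = arr.tobytes()                         # plain bytes, standing in for a shared-memory buffer
pickled = pickle.dumps(arr, protocol=pickle.HIGHEST_PROTOCOL)

t0 = time.perf_counter()
from_pickle = pickle.loads(pickled)         # copies the data and rebuilds the ndarray
t1 = time.perf_counter()
from_buffer = np.frombuffer(raw, dtype=np.float64)  # zero-copy view over raw
t2 = time.perf_counter()

print(f"pickle.loads:  {t1 - t0:.3f} s")
print(f"np.frombuffer: {t2 - t1:.3f} s")
```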

Newtonsoft Json.NET JsonConverter attribute preserve references issue when deserializing

岁酱吖の submitted on 2020-06-12 18:34:14
Question: In the models of a project, I am using a JsonConverter attribute to help with the (de)serialization of those models. The converter currently looks like this: public class CustomJsonConverter : Newtonsoft.Json.JsonConverter { bool _canWrite = true; public override bool CanWrite { get { return _canWrite; } } public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer) { serializer.PreserveReferencesHandling = PreserveReferencesHandling.Objects; serializer
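
A minimal sketch of the alternative usually suggested for this situation, assuming Newtonsoft.Json (the Model type and cycle are illustrative): configure PreserveReferencesHandling once on the JsonSerializerSettings instead of mutating the serializer inside a converter's WriteJson, so the $id/$ref metadata is emitted and consumed consistently on both the serialize and deserialize passes.

```csharp
using Newtonsoft.Json;

public class Model   // hypothetical model with a reference cycle
{
    public string Name { get; set; }
    public Model Parent { get; set; }
}

public static class Example
{
    public static void Main()
    {
        var settings = new JsonSerializerSettings
        {
            PreserveReferencesHandling = PreserveReferencesHandling.Objects,
            ReferenceLoopHandling = ReferenceLoopHandling.Serialize
        };

        var parent = new Model { Name = "parent" };
        var child = new Model { Name = "child", Parent = parent };
        parent.Parent = child;                                   // cycle: parent <-> child

        string json = JsonConvert.SerializeObject(child, settings);          // emits $id / $ref
        var roundTripped = JsonConvert.DeserializeObject<Model>(json, settings);
    }
}
```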