Question
I know there are other posts which have addressed this fact but I have still been unable to find a solution to my problem.
I am receiving large chunks of data from a serial port, and they include 0x1A bytes. The data arrives in discrete chunks (at this point in chunks of 10, 18, or 528, and I know when to expect each size). The problem is that whenever a 0x1A arrives, it appears to be dropped.
I have tried:
if (e.EventType == SerialPort.Eof) return;
but I get the error:
System.IO.Ports.SerialPort does not contain a definition for 'Eof'.
What I would like to do is simply disable this EOF behavior. Is it possible to make this character no longer special?
Thanks in advance and sorry if this has been answered elsewhere but I haven't been able to find a straight answer anywhere...
Answer 1:
Of course immediately after posting a question I figure it out.
I needed to use SerialData instead of SerialPort. My bad. I hope this helps other people out there!
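A minimal sketch of the corrected handler, for anyone hitting the same compile error. The `Eof` member lives on the `SerialData` enum in `System.IO.Ports`, not on the `SerialPort` class; the port name, baud rate, and handler body below are illustrative placeholders, not from the original post:

```csharp
using System;
using System.IO.Ports;

class SerialEofExample
{
    static void Main()
    {
        // "COM3" and 9600 baud are placeholder settings for illustration.
        var port = new SerialPort("COM3", 9600, Parity.None, 8, StopBits.One);
        port.DataReceived += OnDataReceived;
        port.Open();
        Console.ReadLine(); // keep the process alive while data arrives
    }

    static void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // SerialData.Eof is the correct comparison: e.EventType is of type
        // SerialData. Comparing against SerialPort.Eof fails to compile
        // because SerialPort has no such member.
        if (e.EventType == SerialData.Eof)
            return; // an EOF (0x1A) notification, not a normal Chars event

        var port = (SerialPort)sender;
        int count = port.BytesToRead;
        var buffer = new byte[count];
        port.Read(buffer, 0, count); // read the raw bytes from the buffer
    }
}
```

Note that `SerialData` has only two values, `Chars` and `Eof`; receiving the 0x1A byte raises `DataReceived` with `SerialData.Eof`, so handlers that only act on `Chars` can make the byte seem to vanish.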
Source: https://stackoverflow.com/questions/13711594/disabling-eof-0x1a-using-serialport