byte

Issues with Bytes from a Microcontroller in Python

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-06 14:36:40
I am using Python to read microcontroller values in a Windows-based program. The encodings, byte decodings, and values have begun to confuse me. Here is my situation: in the software, I am allowed to call a receive function once per byte received by the Python interpreter, once per line (I am not quite sure what that means), or once per message, which I assume is the entire transmission from the microcontroller. I am struggling with the best way to decode these values. The microcontroller is putting out specific values that correspond to a protocol. For example, calling a function that is supposed to…
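A minimal sketch of the two receive granularities mentioned above, assuming the board streams newline-terminated frames over a serial port; the use of pyserial, the port name, and the baud rate are illustrative assumptions, not the asker's actual setup:

```python
# Minimal sketch, assuming newline-terminated frames over a serial port.
# Port name and baud rate are placeholders.
import serial  # pyserial

with serial.Serial("COM3", 115200, timeout=1) as ser:
    # Per-byte handling: each read(1) returns a bytes object of length 1.
    single = ser.read(1)                      # e.g. b'\x41'
    value = single[0] if single else None     # indexing bytes yields an int (65)

    # Per-line handling: readline() collects bytes up to b'\n'.
    line = ser.readline()                     # e.g. b'TEMP:23\n'
    text = line.decode("ascii", errors="replace").strip()
    print(value, text)
```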

Scaling An Array (Matrix)

别来无恙 submitted on 2019-12-06 13:45:45
The intention of this program is to create a larger array of bytes, scaled up by a factor of 10 from the original array. For example, the 1 at [0][0] should become a 10x10 square of 1's in the new array. I provide the code and the output, which seems to work properly while populating the larger array but then prints different values. I'm currently experimenting with just the rows in order to limit the number of variables I'm dealing with during testing. Can anyone think of a reason why this happens? public class Test { static byte[][] byteArray = {{1, 0}, {0, 1}}; public static void main…
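The asker's full code is cut off above, so the following is only a sketch of the intended result, not their program: a nearest-neighbour upscale in which every source cell becomes a 10x10 block (class and method names are made up for illustration):

```java
// Sketch of the scaling idea: every source cell [r][c] maps to a 10x10 block.
public class ScaleDemo {
    static final int SCALE = 10;

    static byte[][] scale(byte[][] src) {
        byte[][] dst = new byte[src.length * SCALE][src[0].length * SCALE];
        for (int r = 0; r < dst.length; r++) {
            for (int c = 0; c < dst[r].length; c++) {
                dst[r][c] = src[r / SCALE][c / SCALE];
            }
        }
        return dst;
    }

    public static void main(String[] args) {
        byte[][] byteArray = {{1, 0}, {0, 1}};
        byte[][] big = scale(byteArray);
        System.out.println(big[0][0] + " " + big[0][10]); // 1 0
    }
}
```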

How can I convert bytes object to decimal or binary representation in python?

廉价感情. submitted on 2019-12-06 12:08:24
I want to convert an object of type bytes to its binary representation in Python 3.x. For example, I want to convert the bytes object b'\x11' to the binary representation 00010001 (17 in decimal). I tried this: print(struct.unpack("h","\x11")) but I'm getting struct.error: unpack requires a bytes object of length 2. Starting from Python 3.2, you can use int.from_bytes. The second argument, byteorder, specifies the endianness of your bytestring; it can be either 'big' or 'little'. You can also use sys.byteorder to get your host machine's native byte order. import sys int.from_bytes(b…
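A short, self-contained sketch of the int.from_bytes approach described in the answer, using the b'\x11' value from the question:

```python
# int.from_bytes turns a bytes object into an int; format() renders the bits.
import sys

b = b'\x11'
n = int.from_bytes(b, byteorder='big')   # 17
print(n, format(n, '08b'))               # 17 00010001

# byteorder only matters for multi-byte input; sys.byteorder is the host's order.
print(int.from_bytes(b'\x01\x00', byteorder=sys.byteorder))
```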

Hex to Byte Array in C# and Java Gives Different Results

谁说胖子不能爱 submitted on 2019-12-06 11:58:46
Question: First of all, sorry for the long post; I want to include all my thoughts so it's easier for you to find what's wrong with my code. I want to transfer a hex string from a C# application to a Java application. But when I convert the same hex value to a byte array in both languages, the output is different. For instance, the same hex value gives [101, 247, 11, 173, 46, 74, 56, 137, 185, 38, 40, 191, 204, 104, 83, 154] in C# and [101, -9, 11, -83, 46, 74, 56, -119, -71, 38, 40, -65, -52,…
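The difference almost certainly comes from Java's byte type being signed while C#'s byte is unsigned; a small sketch of that interpretation (not the asker's conversion code), using the 247 / -9 pair from the output above:

```java
// Same bit pattern, two interpretations: 0xF7 is 247 as an unsigned byte (C#)
// and -9 as a signed Java byte. Masking with 0xFF recovers the unsigned view.
public class SignedByteDemo {
    public static void main(String[] args) {
        byte b = (byte) 0xF7;
        System.out.println(b);         // -9  (Java's signed byte)
        System.out.println(b & 0xFF);  // 247 (unsigned view, matches C#)
    }
}
```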

String to binary and vice versa: extended ASCII

泪湿孤枕 submitted on 2019-12-06 11:54:50
I want to convert a String to binary by putting it in a byte array (String.getBytes()) and then storing the binary string for each byte (Integer.toBinaryString(bytearray)) in a String[]. Then I want to convert back to a normal String via Byte.parseByte(stringarray[i], 2). This works great for the standard ASCII table, but not for the extended one. For example, an A gives me 1000001, but an Ä returns 11111111111111111111111111000011 11111111111111111111111110000100. Any ideas how to manage this? public class BinString { public static void main(String args[]) { String s = "ä"; System.out.println…
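A hedged sketch of the usual fix: mask each byte with 0xFF before calling Integer.toBinaryString, so negative (non-ASCII) bytes are not sign-extended to 32 bits, and state the charset explicitly (UTF-8 is assumed here, giving two bytes for 'ä'):

```java
// Masking with 0xFF keeps each value in 0..255, so toBinaryString
// does not sign-extend negative bytes; %8s + replace pads to 8 bits.
import java.nio.charset.StandardCharsets;

public class BinString {
    public static void main(String[] args) {
        String s = "ä";
        for (byte b : s.getBytes(StandardCharsets.UTF_8)) {
            String bits = Integer.toBinaryString(b & 0xFF);
            System.out.println(String.format("%8s", bits).replace(' ', '0'));
            // prints 11000011 and 10100100 for UTF-8 'ä'
        }
    }
}
```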

Typing Python sequence to Cython array (and back)

生来就可爱ヽ(ⅴ<●) submitted on 2019-12-06 11:38:02
I have successfully used Cython for the first time to significantly speed up packing nibbles from one list of integers ( bytes ) into another (see Faster bit-level data packing ), e.g. packing the two sequential bytes 0x0A and 0x0B into 0xAB . def pack(it): """Cythonize python nibble packing loop, typed""" cdef unsigned int n = len(it)//2 cdef unsigned int i return [ (it[i*2]//16)<<4 | it[i*2+1]//16 for i in range(n) ] While the resulting speed is satisfactory, I am curious whether this can be taken further by making better use of the input and output lists. cython3 -a pack.cyx generates a
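For reference, a plain-Python equivalent of the packing loop above; this is not the Cython version and says nothing about the typed-memoryview question itself:

```python
# Pure-Python reference: take the high nibble of each pair of input bytes
# and pack them into one output byte.
def pack_py(it):
    out = bytearray(len(it) // 2)
    for i in range(len(out)):
        out[i] = ((it[2 * i] // 16) << 4) | (it[2 * i + 1] // 16)
    return bytes(out)

print(pack_py(bytes([0xA0, 0xB0])).hex())  # 'ab'
```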

Web - Video : bytes range to time

本小妞迷上赌 submitted on 2019-12-06 10:13:30
Question: I have a PHP script that streams a video from a URL, and I want to get the time in order to control the flow. Browsers make HTTP requests with a byte range when jumping to a point in the video. Request headers: Accept: */* Accept-Encoding: identity;q=1, *;q=0 Accept-Language: fr-FR,fr;q=0.8,en-US;q=0.6,en;q=0.4 Connection: keep-alive Host: h.com If-Range: Tue, 20 Oct 2015 23:38:00 GMT Range: bytes=560855038-583155711 Referer: http://h.com/7743a76d2911cdd90354bc42be302c6946c6e5b4 User-Agent: Mozilla/5.0 (X11…
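One rough way to turn such a Range header into a playback time is to assume an approximately constant bitrate and interpolate; a sketch with made-up file-size and duration values (real containers with variable bitrate need the index/moov data for an exact answer):

```php
<?php
// Rough seek-time estimate from a byte-range request, assuming constant bitrate.
// File size and duration below are illustrative placeholders.
$totalBytes   = 583155712;   // full file size in bytes
$durationSecs = 3600.0;      // total video duration in seconds
$rangeStart   = 560855038;   // from "Range: bytes=560855038-583155711"

$approxTime = ($rangeStart / $totalBytes) * $durationSecs;
echo "Approximate seek time: " . round($approxTime, 1) . " s\n";
```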

Convert Bytes to Image ASP.NET c# and use it in Image1.Url

别来无恙 submitted on 2019-12-06 10:06:17
I have a web app, a webcam app that takes images and stores them in a database as bytes. That said, I don't want to save the captured images in any kind of folder; right now the only way to show a captured image is for me to save it and view it again. To do that, I have an input stream that fires when the capture-image button is clicked. using (StreamReader reader = new StreamReader(Request.InputStream)) { hexString = Server.UrlEncode(reader.ReadLine()); string imageName = DateTime.Now.ToString("dd-MM-yy hh-mm-ss"); string imagePath = string.Format("~…
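If the goal is simply to show the stored bytes in the Image control without writing a file, one possibility is a base64 data URI; a sketch that assumes the bytes are a JPEG and uses a hypothetical GetImageBytesFromDatabase helper:

```csharp
// Sketch (ASP.NET WebForms): show database bytes in the Image control as a
// base64 data URI, so nothing is written to disk.
// GetImageBytesFromDatabase is a hypothetical helper; JPEG format is assumed.
protected void ShowCapturedImage()
{
    byte[] imageBytes = GetImageBytesFromDatabase();
    string base64 = Convert.ToBase64String(imageBytes);
    Image1.ImageUrl = "data:image/jpeg;base64," + base64;
}
```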

Write one single bit to binary file using BinaryWriter

左心房为你撑大大i submitted on 2019-12-06 09:59:07
I want to write one single bit to a binary file. using (FileStream fileStream = new FileStream(@"myfile.bin", FileMode.Create)) using (BinaryWriter binaryWriter = new BinaryWriter(fileStream)) { binaryWriter.Write((bool)10); } Something like binaryWriter.Write((bit)1); When I use binaryWriter.Write((bool)1), the file has one byte, but I want to write one single bit. Is this possible? You cannot store only 1 bit in a file. Almost all modern filesystems and hardware store data in segments of 8 bits, aka bytes or octets. If you want to store a bit value in a file, store either 1 or 0 as a byte…
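Since files are byte-addressed, the usual workaround is to pack individual bits into a byte buffer and write whole bytes; a sketch of that idea (the bit values below are arbitrary examples):

```csharp
// Collect bits into a byte and write it once eight bits have been gathered.
using System.IO;

class BitPackDemo
{
    static void Main()
    {
        bool[] bits = { true, false, true, true, false, false, true, false };
        byte buffer = 0;
        int bitCount = 0;

        using (var writer = new BinaryWriter(File.Create("myfile.bin")))
        {
            foreach (bool bit in bits)
            {
                buffer = (byte)((buffer << 1) | (bit ? 1 : 0));
                if (++bitCount == 8)
                {
                    writer.Write(buffer);   // 0b10110010 written as one byte
                    buffer = 0;
                    bitCount = 0;
                }
            }
        }
    }
}
```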

Swift - Convert UInt8 byte to array of bits

六眼飞鱼酱① submitted on 2019-12-06 08:16:22
I'm trying to decode a protobuf-encoded message, so I need to convert the first byte (the key) of the protobuf message into bits so I can find the field number. How do I convert a UInt8 (the byte) into an array of bits? Pseudo code: private func findFieldNum(from byte: UInt8) -> Int { //Byte is 0001 1010 var fieldNumBits = byte[1] ++ byte[2] ++ byte[3] ++ byte[4] //concatenates bits to get 0011 getFieldNum(from: fieldNumBits) //Converts 0011 to field number, 2^1 + 2^0 = 3 } I saw this question, which converts an array of bits into an array of bytes. Here's a basic function to get a Bit array…
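A small sketch of one way to get the bits of a UInt8 in Swift and read the protobuf field number from the key byte (function and variable names are illustrative, not the asker's):

```swift
// Bits of a UInt8, most-significant first, plus the protobuf key split:
// key = (fieldNumber << 3) | wireType.
func bits(of byte: UInt8) -> [UInt8] {
    return (0..<8).map { (byte >> (7 - $0)) & 1 }
}

let key: UInt8 = 0b0001_1010
let b = bits(of: key)              // [0, 0, 0, 1, 1, 0, 1, 0]
let fieldNumber = Int(key >> 3)    // 3
let wireType = Int(key & 0x07)     // 2
print(b, fieldNumber, wireType)
```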