microcontroller

Is it possible to write to the console without stdlibs? (C/C++)

跟風遠走 submitted on 2019-12-22 04:04:36
Question: I am programming on an ARM microprocessor and am trying to debug using print statements over UART. I do not want to add the standard libraries just for debugging. Is there a way to print to the console without stdio.h / iostream.h? Is it possible for me to write my own printf()? Alternatively, I could do this using a DMA controller and writing to the UART directly; however, I would like to avoid that if possible. Using the built-in test functions "echo" and "remote loop-back", I know I have the UART configured …
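A rough sketch of the idea, assuming only a hardware-specific "send one byte" primitive: everything above that primitive needs no stdio at all. The register names in the comment and the capture buffer are illustrative, not from the original post; on real hardware `uart_putc` would poll a TX-ready flag and write the data register instead.

```cpp
#include <cstddef>
#include <cstdint>

// Lowest-level primitive: send one byte out of the UART. On real hardware
// this would poll a TX-ready flag and write the data register, e.g.
//   while (!(UART0_FLAGS & TX_READY)) { }   // register names hypothetical
//   UART0_DATA = c;
// For demonstration off-target, bytes are captured into a buffer instead.
static char tx_capture[64];
static std::size_t tx_len = 0;

static void uart_putc(char c) {
    if (tx_len < sizeof tx_capture - 1) tx_capture[tx_len++] = c;
}

// Everything below is hardware-independent and pulls in no stdio machinery.
static void print_str(const char* s) {
    while (*s) uart_putc(*s++);
}

static void print_uint(std::uint32_t v) {
    char digits[10];                  // 2^32 - 1 has at most 10 decimal digits
    std::size_t n = 0;
    do {
        digits[n++] = static_cast<char>('0' + v % 10);
        v /= 10;
    } while (v != 0);
    while (n) uart_putc(digits[--n]); // emit most-significant digit first
}
```

A fuller printf-style wrapper would just walk a format string and dispatch to these helpers; several single-file "tiny printf" implementations exist that need nothing more than a `putc` hook.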

Call tree for embedded software [closed]

别等时光非礼了梦想. submitted on 2019-12-21 09:17:20
Question (closed 7 months ago as off-topic): Does anyone know some tools to create a call tree for a C application that will run on a microcontroller (Cortex-M3)? It could be generated from source code (not ideal), object code (preferred solution), or at runtime (acceptable). I've looked at gprof, but there's still a lot missing to get it to work on an …

Tic Tac Toe and Minimax - Creating an imperfect AI on a microcontroller

元气小坏坏 submitted on 2019-12-21 04:01:50
Question: I have created a Tic-Tac-Toe game on a microcontroller, including a perfect AI (perfect meaning that it doesn't lose). I did not use a minimax algorithm for that, just a little state machine with all possible and optimal moves. My problem now is that I want to implement different difficulty levels (Easy, Medium and Hard); the AI so far would be the hard one. I've thought about how best to do this and ended up wanting to use the minimax algorithm, but in a way that it calculates all the …
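One common way to get graded difficulty out of minimax is to cap the search depth: a shallow cap only spots immediate wins, while a full-depth search plays perfectly. A minimal sketch of that idea (cell values, scoring, and function names are illustrative, not from the original post; +1 is the AI, -1 the opponent, 0 empty):

```cpp
#include <array>

using Board = std::array<int, 9>;

// Returns +1 or -1 if that side has three in a row, else 0.
static int winner(const Board& b) {
    static const int lines[8][3] = {{0,1,2},{3,4,5},{6,7,8},
                                    {0,3,6},{1,4,7},{2,5,8},
                                    {0,4,8},{2,4,6}};
    for (const auto& l : lines)
        if (b[l[0]] != 0 && b[l[0]] == b[l[1]] && b[l[1]] == b[l[2]])
            return b[l[0]];
    return 0;
}

static int minimax(Board& b, int player, int depth, int max_depth) {
    if (int w = winner(b)) return w * (10 - depth); // prefer quicker wins
    if (depth == max_depth) return 0;               // cutoff => imperfect play
    int best = -100 * player;
    bool moved = false;
    for (int i = 0; i < 9; ++i) {
        if (b[i] != 0) continue;
        moved = true;
        b[i] = player;
        int s = minimax(b, -player, depth + 1, max_depth);
        b[i] = 0;
        if (player > 0 ? s > best : s < best) best = s;
    }
    return moved ? best : 0;                        // board full: draw
}

// Index (0-8) the AI should play, searching max_depth plies ahead.
// "Easy" might use max_depth = 1, "Hard" the full 9.
static int best_move(Board& b, int max_depth) {
    int best = -100, move = -1;
    for (int i = 0; i < 9; ++i) {
        if (b[i] != 0) continue;
        b[i] = 1;
        int s = minimax(b, -1, 1, max_depth);
        b[i] = 0;
        if (s > best) { best = s; move = i; }
    }
    return move;
}
```

With `max_depth = 1` the AI still takes an immediate win (the winner check runs before the cutoff) but fails to block the opponent's threats, which is a plausible "Easy" behavior; intermediate depths give the "Medium" tier.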

Using Microchip's MLDP data streaming from Android or iOS

♀尐吖头ヾ submitted on 2019-12-21 02:26:11
Question: Microchip defined a way to stream data over Bluetooth Low Energy (BLE) and called it MLDP (Microchip Low-energy Data Profile). They built it into their RN4020 chip, and there is even a sample Android app. However, I can't find any specification of how the protocol works, or source for the app. I'd like to be able to use it to debug an embedded device from Android and/or iOS. Does anyone know the specification for this protocol, or software that implements it? Answer 1: Hi, I had the same problem, …

What's the best resource to learn assembly language for PIC microcontrollers? [closed]

牧云@^-^@ submitted on 2019-12-20 23:27:01
Question (closed 8 years ago as likely to solicit debate rather than factual answers): I'm going to start working on a project where I need to have a decent understanding of assembly language for the PIC microcontroller …

Serializing object to byte-array in C++

℡╲_俬逩灬. submitted on 2019-12-19 08:58:56
Question: I am working on an embedded device (a microcontroller), and I want to save objects to permanent storage (an EEPROM). Most of the serialization solutions I can find use the file system in some way, but my target has no file system. My question therefore is: how can I serialize an object to a byte array, so I can save that byte array to the EEPROM afterwards? Here is an example of what I am trying to do: class Person { // Constructor, getters and setters are omitted void save() { char buffer[sizeof …
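A minimal sketch of one filesystem-free approach: copy each member into a caller-supplied byte buffer with std::memcpy and hand that buffer to the EEPROM driver. Copying field by field (rather than memcpy-ing the whole object) keeps struct padding and any vtable pointer out of the stored image. The fields here are illustrative, not from the original class:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Works for trivially copyable members only; pointers and containers would
// need deeper treatment, and byte order matters if another device reads
// the same EEPROM.
struct Person {
    std::uint8_t  age;
    std::uint16_t id;

    // Serialize into buf; returns the number of bytes written.
    std::size_t save(std::uint8_t* buf) const {
        std::size_t off = 0;
        std::memcpy(buf + off, &age, sizeof age); off += sizeof age;
        std::memcpy(buf + off, &id,  sizeof id);  off += sizeof id;
        return off;
    }

    // Read the same layout back; the order must match save() exactly.
    std::size_t load(const std::uint8_t* buf) {
        std::size_t off = 0;
        std::memcpy(&age, buf + off, sizeof age); off += sizeof age;
        std::memcpy(&id,  buf + off, sizeof id);  off += sizeof id;
        return off;
    }
};
```

Note that the serialized size here is 3 bytes, even though `sizeof(Person)` is typically 4 due to alignment padding, which is exactly why the field-by-field copy is preferable for storage.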

Converting to ASCII in C

 ̄綄美尐妖づ submitted on 2019-12-19 02:02:51
Question: Using a microcontroller (PIC18F4580), I need to collect data and send it to an SD card for later analysis. The data it collects will have values between 0 and 1023, or 0x0 and 0x3FF. So what I need to do is convert 1023 into a base-10 string of literal ASCII values (0x31, 0x30, 0x32, 0x33, …). My problem is that the only way I can think of to split the digits apart requires a lot of division: char temp[4]; temp[0] = 1023 % 10; temp[1] = (1023 % 100) / 10; temp[2] = (1023 % 1000) / 100; temp …

How to force an unused memory read in C that won't be optimized away?

a 夏天 submitted on 2019-12-18 12:52:28
Question: Microcontrollers often require a register to be read to clear certain status conditions. Is there a portable way in C to ensure that a read is not optimized away if the data is not used? Is it sufficient that the pointer to the memory-mapped register is declared as volatile? In other words, would the following always work on standards-compliant compilers? void func(void) { volatile unsigned int *REGISTER = (volatile unsigned int *) 0x12345678; *REGISTER; } I understand that dealing with …
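For what it's worth, accesses through volatile lvalues count as observable behavior in both C and C++, so a conforming compiler may not elide the read even though the value is discarded. A small sketch of the pattern (function names are illustrative; on target the pointer would come from a fixed address like the 0x12345678 in the question, while here it can point at an ordinary variable for testing):

```cpp
#include <cstdint>

// The read itself is the side effect we want. The (void) cast only
// silences "expression result unused" warnings; it does not change
// the semantics of the volatile access.
static void ack_status(volatile std::uint32_t* reg) {
    (void)*reg;
}

// Returning the value instead makes the read useful to callers too.
static std::uint32_t read_status(volatile std::uint32_t* reg) {
    return *reg;
}
```

On hardware this would be invoked as `ack_status(reinterpret_cast<volatile std::uint32_t*>(0x12345678));` note that conjuring a pointer from an integer address is implementation-defined, which is the usual (and in practice well-supported) caveat for memory-mapped I/O.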

CRC16 checksum: HCS08 vs. Kermit vs. XMODEM

情到浓时终转凉″ submitted on 2019-12-18 12:36:25
Question: I'm trying to add CRC16 error detection to a Motorola HCS08 microcontroller application. My checksums don't match, though. One online CRC calculator produces both the result I see in my PC program and the result I see on the micro; it calls the micro's result "XModem" and the PC's result "Kermit". What is the difference between the way those two ancient protocols specify the use of CRC16? Answer 1: You can implement 16-bit IBM, CCITT, XModem, Kermit, and CCITT 1D0F using the same basic code base.
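The short version of the difference: both variants use the CCITT polynomial 0x1021 with a zero initial value, but Kermit processes bits in reflected (LSB-first) order, so its bit loop shifts the other way and uses the bit-reversed polynomial 0x8408. A bit-by-bit sketch of both (the standard check values over the ASCII string "123456789" are 0x31C3 for XModem and 0x2189 for Kermit, which makes a handy self-test):

```cpp
#include <cstddef>
#include <cstdint>

// CRC-16/XMODEM: poly 0x1021, init 0x0000, no reflection, no final XOR.
static std::uint16_t crc16_xmodem(const std::uint8_t* data, std::size_t len) {
    std::uint16_t crc = 0;
    for (std::size_t i = 0; i < len; ++i) {
        crc ^= static_cast<std::uint16_t>(data[i]) << 8;
        for (int b = 0; b < 8; ++b)
            crc = (crc & 0x8000)
                ? static_cast<std::uint16_t>((crc << 1) ^ 0x1021)
                : static_cast<std::uint16_t>(crc << 1);
    }
    return crc;
}

// CRC-16/KERMIT: same polynomial, but input and output bit-reflected,
// so the shift direction flips and 0x8408 (0x1021 reversed) is used.
static std::uint16_t crc16_kermit(const std::uint8_t* data, std::size_t len) {
    std::uint16_t crc = 0;
    for (std::size_t i = 0; i < len; ++i) {
        crc ^= data[i];
        for (int b = 0; b < 8; ++b)
            crc = (crc & 1)
                ? static_cast<std::uint16_t>((crc >> 1) ^ 0x8408)
                : static_cast<std::uint16_t>(crc >> 1);
    }
    return crc;
}
```

A further wrinkle when comparing against other tools: some Kermit implementations additionally swap the two result bytes before transmission, which can make otherwise-matching CRCs look different on the wire.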