
Transition from PIC

AThor.2
Associate II

Hello All

Noob question here - seeking advice.

I've been using PICs for many years now, writing only in assembler. I've reached the end of my tether with Microchip and the lottery of which IDE will work with which programming tool and which device, plus many other glitches and bugs in their hardware and software.

It seems like time to migrate to ARM but I still want to mostly code in ASM. Could you please advise what IDE (if any) would be best for dipping my toe in.

TIA

Andy


It's really going to come down to a reliable/comfortable debugger. I'd say take Keil for a spin; it might take a while to hit the 32KB ceiling of the evaluation linker/debugger, and if you're comfortable with just raw disassembly, perhaps you'll go beyond it.

Building with GNU/GCC (Make) is also a viable path; assembler syntax differences between Keil, IAR and GNU are perhaps the biggest potential annoyance.

To play, one can take an example project, use its startup_stmxyz.s, and start adding functions called from Reset_Handler; the whole vector table is already in place, and the CPU starts from the internal HSI or MSI clock.
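To get a feel for what that skeleton contains, here is a rough C analogue of the start of a startup file: a vector table followed by a Reset_Handler that calls user code. The names, the RAM-top address and the hypothetical `my_app` function are purely illustrative, not taken from any particular device's startup file:

```c
#include <stdint.h>

/* Illustrative top-of-RAM value for the initial stack pointer; the real
   number comes from the device datasheet / linker script. */
#define SRAM_TOP 0x20002000u

void Default_Handler(void) { for (;;) ; }
void Reset_Handler(void);

/* First entries of a Cortex-M vector table: initial SP, then the reset
   vector, then exception handlers - the same table startup_stmxyz.s lays
   out in asm. In a real build this would carry
   __attribute__((section(".isr_vector"))) so the linker places it at the
   start of flash. */
const uintptr_t vector_table[] = {
    SRAM_TOP,                    /* 0: initial stack pointer */
    (uintptr_t)Reset_Handler,    /* 1: Reset                 */
    (uintptr_t)Default_Handler,  /* 2: NMI                   */
    (uintptr_t)Default_Handler,  /* 3: HardFault             */
};

static volatile int app_ran;

/* Stand-in for "your function called from Reset_Handler". */
static void my_app(void) { app_ran = 1; }

void Reset_Handler(void)
{
    /* A real startup file would first copy .data and zero .bss. */
    my_app();
}
```

On hardware, the core fetches the initial SP and the reset vector from this table at power-up; there is no C runtime until the startup code creates one.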

Tips, buy me a coffee, or three.. PayPal Venmo Up vote any posts that you find helpful, it shows what's working..

> It seems like time to migrate to ARM but I still want to mostly code in ASM.

I wouldn't recommend that. And, believe me, I have logged quite some years of asm on various platforms (not PICs, though; it's not that I've never written anything for them, but I hate them and avoid them... 🙂 ). But when it came to RISCs - and it started with the "babyRISC", i.e. the AVR - I swallowed the bitter pill and learned C. And I have hated C ever since...

The main reason is that RISCs are inhuman. In detail:

  • there are too many registers (16 in ARM, although some of them are dedicated, e.g. the PC), and one usually loses track of register allocation across longer pieces of asm code
  • usage of registers is asymmetric, i.e. you can't use just any register in any individual instruction
  • instructions are picky about literals: only a subset of possible numbers can be used as a literal (e.g. powers of 2, zero, or 16-bit literals if the instruction set allows them); arbitrary numbers have to be loaded into registers in a cumbersome way, either PC-relative from a literal pool (for which most assemblers do have support, but still) or in a two-step procedure assembling the literal from parts
  • the load-store architecture leads to longer code than you've been used to, which again means it's harder to track
  • the Thumb-2 instruction set used in the Cortex-M, i.e. the mixed 16/32-bit descendant of the original ARM instruction set, introduced loads of asymmetries in the mentioned register usability and available literals, and also in the usability of certain features (e.g. in if-then sequences, only a subset of instructions can set flags); some instructions have several variants depending on what sort of literal or register combination you want to use, and they may have slightly different syntax - all of which makes it hard to write code and sometimes leads to frustration when porting/moving/modifying code or data placement/structures
  • some 32-bit instruction sets other than ARM's, in the name of simple architecture and fast execution, include unwieldy things like jump (delay) slots, which are hard to utilize manually
  • it can sometimes be challenging to maintain the required alignment of data and/or jump targets (the latter again is not a problem on ARM, but the former is on M0/M0+, less so on M3/4/7)
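The literal restriction is worth seeing concretely. Below is a host-testable sketch of the Thumb-2 "modified immediate" rules (an 8-bit value, possibly replicated across bytes/halfwords, or rotated with its top bit set); it is a simplification for illustration, not a full assembler's check, and the function name is my own:

```c
#include <stdbool.h>
#include <stdint.h>

/* Sketch of the Thumb-2 "modified immediate" forms: 0x000000XY,
   0x00XY00XY, 0xXY00XY00, 0xXYXYXYXY, or an 8-bit value with its top
   bit set rotated right by 8..31 positions. */
bool thumb2_mod_imm_ok(uint32_t v)
{
    uint32_t b = v & 0xFFu;

    if (v <= 0xFFu)
        return true;                      /* 0x000000XY */
    if ((v & 0xFF00FF00u) == 0 && (v >> 16) == (v & 0xFFFFu))
        return true;                      /* 0x00XY00XY */
    if ((v & 0x00FF00FFu) == 0 && (v >> 16) == (v & 0xFFFFu))
        return true;                      /* 0xXY00XY00 */
    if (v == b * 0x01010101u)
        return true;                      /* 0xXYXYXYXY */

    /* Rotated forms: rotating v LEFT by rot undoes a ROR by rot; the
       un-rotated value must fit in 8 bits with bit 7 set. */
    for (int rot = 8; rot < 32; rot++) {
        uint32_t base = (v << rot) | (v >> (32 - rot));
        if (base <= 0xFFu && (base & 0x80u))
            return true;
    }
    return false;
}
```

For a constant that fails this test, the usual workarounds are the two-instruction MOVW/MOVT pair or a PC-relative load from a literal pool (`LDR Rd, =const` in most assemblers) - exactly the cumbersome two-step loading mentioned above.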

To this, add the "infrastructural" obstacles:

  • syntax is not very well unified
  • as asm is nowadays primarily used to process compiler output, assemblers are often poorly documented
  • there are virtually no textbooks or other learning resources
  • there are very few users who can help you out in case of troubles (in particular, I know of no regular on this forum who would actively use any form of asm)

I'm not saying it's not viable; I'm just saying it's probably too problematic.

JW

AThor.2
Associate II

Thanks for the info and for taking the time to go into such detail - that is very useful. I thought that there might be a fairly steep learning curve but it sounds perhaps steeper than hoped.

Most of what I develop is probably quite straightforward compared with what I suspect many proper developers on this forum do day to day. The most complex thing I've done is a bit of GSM module control/data passing over TTL with some character LCD driving and a bit of I2C shuttling around. Bits of AtoD for battery monitoring and the usual switch/LED control, all in assembler. Perhaps ARM is a sledgehammer cracking a nut in my case.

Learning C might be a step too far for me. I've done bits of high-level code in the past but not enough to hit the ground running. I'd hoped to begin using mixed C/ASM so I could use manufacturers' libraries in places rather than reinvent the wheel; perhaps I've been over-optimistic.

If Microchip weren't so all-over-the-place with their products, I would probably persevere, but I'm getting tired of it now.

Just this afternoon, I was updating some old products with the latest firmware and I spent some considerable time finding a combination of IDE/IPE/programmer that would work. No logic to it - just frustration and time-wasting.

Old 8-bit MCUs were honestly more brutal.

ARM and 68K are both pretty orthogonal in register usage, and there are quite a few books on ARM assembler, though the evolving and sometimes shrinking ISA on certain Cortex-Mx parts can take some adapting too.

Keil handles the literal/immediate management quite well. It can probably handle stack frames / parameters too; I really haven't probed too deeply, as the bulk of my assembler usage was with x86 MASM. My goal with disassemblers is to produce clearly readable code for static analysis, and a relatively short walk to reassembling portions of interest for dynamic analysis.

Using assembler on STM32s here for flashing and bootstrap code. Some serious consideration is being given to refactoring some designs into assembler to cram more functionality into small L0 parts. Nobody here wants to build with wafer-scale parts.


> I was updating some old products with the latest firmware and I spent some considerable time finding a combination of IDE/IPE/programmer that would work.

As I've said, I hate PICs - emotionally, not rationally (although there are certainly rational bits to it too, as is so often the case with emotions); but at the same time, I wouldn't really be confident in saying that the STM32 world, or the ARM world as such, is much better in this respect.

As an example, current ST software does not support the first version of their primary programmer/debugger, the STLink-V1, while they are still actively selling https://www.st.com/en/evaluation-tools/stm32vldiscovery.html which contains it.

(My strategy has always been to start development with fairly new tools, and then maintain the same tools (physically) at all costs for as long as the product's lifetime lasts. That's a non-trivial effort, too.)

> I thought that there might be a fairly steep learning curve but it sounds perhaps steeper than hoped.

I didn't mean to deter you, sorry. Please proceed as Clive outlined. For your purposes, the 'F0 family will probably suffice. Get a Nucleo or one of the simpler Disco boards, and experiment. Blinky is the way to go. Observing the disassembly or raw compiler output of some simple code may be a useful starting point, too.

Most of the regulars here do have some ARM asm experience, even if mostly "read-only" - from observing the code's behaviour, and sometimes writing short snippets for optimization or control reasons - so your questions probably won't remain unanswered, as long as they don't get into too intimate detail.

JW

Jack Peacock_2
Senior III

From long embedded experience (I started with the Intel 8080 after coding in assembler on a CDC 6000) I urge you to reconsider going the ASM route with ARM controllers. The PIC12 and PIC16 architecture dates back to the old General Instrument controllers from the 70s, with the emphasis on minimal gate count and geometries in the hundreds of nm. None of those restrictions apply to 32-bit embedded. Gate count at the 40-60nm geometry level for an M-series controller is a concern only as far as flash array size. Memory and CPU cycles are orders of magnitude larger while power is orders of magnitude smaller. That shifts the development focus away from hardware and towards software. Of greater concern is code reuse and time to market, where only an HLL like C can fill the bill.

My last major PIC project was a 3-phase controller using a dsPIC33, mandated by the engineering manager. The code was in C, along with ASM for the DSP instructions (the Microchip DSP library didn't work - so badly that Microchip itself did not recommend using it). The same board with an M4 controller would have taken half the time to develop and would require far less expertise to maintain. After that I gave up on PICs. But the point is that the C code could have been ported to an M4 with minimal effort. With assembler you don't have that code reuse; every project starts over when the architecture changes.

Over the last 15 years I've built up a large library of code for M-series, mostly for ST, though there's no inherent design flaw preventing porting to other vendors after adapting to the low-level HAL library most vendors supply (ST used to do this too, with the SPL; the replacement LL library is a joke). I'm sure the other commercial developers here do the same thing in some form or other. I don't recommend using ST's Cube/HAL/LL other than as an introduction. To borrow a cliche from TV, it's not ready for prime time (i.e. large-scale commercial production).

Cross-vendor support is a hassle, and ST goes out of their way to make it worse, though to be fair they aren't the only vendor who does that. They all want a lock-in for their parts. If you foresee working across several vendors with M-series controllers then you might want to look at using a Segger J-Link JTAG pod. They are very good at supporting a large variety of controllers, in some cases up to A- and R-series as well as M.

Don't be intimidated at the prospect of learning C. Once you start you'll recognize the equivalent constructs from assembler. I came to C from Algol and PL/I (yeah that really dates me) as well as assembler from quite a few architectures (S360, 8080/Z80, CDC6000, DEC20). My obstacles were getting away from mainframe support into the bare iron, but C does work well for peripheral drivers, especially on ARM with GCC.

If you plan on a career in embedded development there's no way around learning C. You'll see some recommendations to start with C++ to avoid bad habits; don't believe it. Object programming takes more experience than basic C programming, so stick with GCC and Eclipse IDE or a vendor package to start.

And yes, it is time to migrate to ARM. It all but dominates the controller market, from the low end (M series) to reliability (R series) and smartphones (A series). Even Microchip finally gave in and bought Atmel after all but giving up on their own PIC32 line, which will spell the eventual end of the PIC line aside from legacy support. You can see already how the Atmel parts are diverting most of Microchip's resources away from PICs.

Jack Peacock

Jack,

> With assembler you don't have that code reuse; every project starts over when the architecture changes.

This is an often repeated argument against asm (together with "it's hard to do math in asm" which is obviously nonsense), and I beg to disagree.

Besides a couple of smaller projects, I also ported an extensive project from asm to asm between platforms which had little in common. I found that most of the code - which is usually "algorithmic" in nature and relatively well structurable - was easy to port, almost a mechanical job, although I wouldn't go so far as to automate it. What took more time was adjusting to the hardware-dependent stuff. But hey, that would be exactly the same in C!

Truth be told, both the source and destination assemblers were written in-house, so they shared features which made the porting easier. We went even so far as to ignore the rather cumbersome mnemonics of the target, inventing our own that matched the source more closely. But the point here is that writing assemblers is really easy, so this is simply part of the game.

Generally speaking, code reuse is IMO mostly a myth, at least in the truly micro*controller* world. What gets reused are mostly concepts and experience. The technique I use most is Copy-Paste-Modify, almost never getting away without the third part; and I see the same around me.

Jan

> Old 8-bit MCU were honestly more brutal.

Maybe it's a matter of background. I am from the cheap side of the world. I could only dream of x86 and 68k in our past designs. Maybe this also explains why I stick to the "dark side".

> Some serious consideration is being made into refactoring some designs into assembler to cram more functionality into small L0 parts.

I would love to hear how that went. I was told Keil produces pretty dense code.

Jan

TDK
Guru

> Could you please advise what IDE (if any) would be best for dipping my toe in.

STM32CubeIDE has gotten progressively better over the past few years and has reached the point where it's quite usable and works on virtually every chip made in the past decade or more. However, if you want to do ASM, you don't need a lot of the features.

I use CLion for the majority of my development and quite enjoy it, but it was a PITA to set up. It's way more responsive than Eclipse-based IDEs, which is my main complaint about STM32CubeIDE.

Also agree with the others regarding assembly. Programming in C is going to be much more productive and will free your mental resources from keeping track of individual registers. Skip HAL and use direct register programming if that's what you're most comfortable with.
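For reference, "direct register programming" in C looks like the sketch below: a trimmed GPIO register layout following the order in the STM32 reference manuals, with the port passed as a parameter so the logic can be exercised off-target. The struct subset, function names and the base-address comment are illustrative, not any vendor's header:

```c
#include <stdint.h>

/* Subset of an STM32 GPIO port's registers, in reference-manual order
   (MODER at offset 0x00 ... ODR at 0x14). Trimmed for illustration. */
typedef struct {
    volatile uint32_t MODER;   /* mode: 2 bits per pin, 01 = output */
    volatile uint32_t OTYPER;
    volatile uint32_t OSPEEDR;
    volatile uint32_t PUPDR;
    volatile uint32_t IDR;
    volatile uint32_t ODR;     /* output data */
} GPIO_Regs;

/* On target you'd point the struct at the port's actual address, e.g.
     #define GPIOA ((GPIO_Regs *)0x48000000u)   // base per the datasheet
   Taking it as a parameter keeps the logic testable off-target. */
void gpio_init_output(GPIO_Regs *g, unsigned pin)
{
    /* Clear the pin's 2-bit mode field, then set it to 01 (output). */
    g->MODER = (g->MODER & ~(3u << (pin * 2))) | (1u << (pin * 2));
}

void gpio_toggle(GPIO_Regs *g, unsigned pin)
{
    g->ODR ^= 1u << pin;
}
```

The appeal over HAL is that every write maps one-to-one onto a register described in the reference manual - much the same mental model as assembler, with C doing the bookkeeping.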

If you feel a post has answered your question, please click "Accept as Solution".