PC Services


PC Services (Electronics)

Perils of GPIO

Tel: 0118 946 3634



Perils of GPIO - The embedded micro GPIO and peripherals nightmare

The perils of GPIO, documentation and libraries

Handling GPIO on embedded microcontrollers is getting more and more difficult, mainly due to -

  1. Hardware documentation
  2. Register, Library documentation and implementation
  3. GPIO Implementation at pin level
  4. GPIO selection possibilities at register level.
  5. Expensive methods using IP processors in FPGAs

It is time for a major rethink from the chip manufacturers and the tool vendors.

In most cases you have a design and a PCB locked to having GPIOs on set pins of the micro, set to ONE configuration, so WHY do we have to write blocks of code to initialise?

In days gone by on dedicated hardware this would be a ROM, or these days a Flash/NVRAM configuration as for FPGAs. What we are creating for any project is a FIXED design for that version of software on THAT PCB. Once done we then make 100 to 100,000 or more of the design.

About the only case I have seen of needing to change GPIO mode on an embedded design (NOT a general purpose board like a Pi, Arduino or Evaluation Board) is the poor man's bi-directional data line, as used by PS/2 keyboards, SD Cards and the like. Mainly because the idea of a bi-directional GPIO at run time has always been a kludge in GPIO terms. A bit like quite a few implementations of the SPI interface and how they provide the Slave Select (SS) pins.

By kludge I mean we read and write a GPIO in fixed directions, so we have to fudge changing the direction of the GPIO (in some cases hoping the timing is right) and then make different calls. All while remembering, on some architectures, what to do if the pin is also a GPIO interrupt, so we handle things properly and do not end up in never never land.
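The direction-flipping dance can be sketched as below. This is a minimal simulation on plain variables, not any real part's registers: the names GPIO_DIR, GPIO_OUT, GPIO_IN and the bit position are invented for illustration, and every vendor differs.

```c
#include <stdint.h>

/* Hypothetical register names, simulated as plain variables.
   DIR bit = 1 means output on this imaginary part. */
static uint32_t GPIO_DIR;   /* direction register (simulated) */
static uint32_t GPIO_OUT;   /* output latch       (simulated) */
static uint32_t GPIO_IN;    /* input pin state    (simulated) */

#define DATA_PIN  (1u << 5)

/* Drive the shared data line low ("write 0" on an open-drain style bus). */
void data_drive_low(void)
{
    GPIO_OUT &= ~DATA_PIN;   /* latch 0 BEFORE enabling the driver,   */
    GPIO_DIR |=  DATA_PIN;   /* or we may glitch the bus high first   */
}

/* Release the line and read it back.  The timing hole lives between
   these two statements: the pin floats, and an interrupt configured
   on this pin may fire spuriously on some architectures. */
int data_release_and_read(void)
{
    GPIO_DIR &= ~DATA_PIN;             /* back to input */
    return (GPIO_IN & DATA_PIN) != 0;
}
```

Two calls, two registers, and an ordering constraint that the datasheet rarely spells out - that is the kludge in miniature.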

Interfaces like I2C or SPI are generally set to be either Master or Slave for THAT design. Rarely do they change between Master or Slave in use.

Poor Hardware Documentation

Missing mode definitions, and loose descriptions of register bit settings with no documentation on how to use them.

Hidden catches like these I have seen -

  • Programming pin can withstand 13V (footnote 300 pages later - only when the 5V supply is stable)
  • The most common SPI mode only supports single byte transfers, because the SS signal is deasserted between bytes despite a 16 byte FIFO; so anything requiring 16 to 32 bit transfers, or blocks of data from RAM/EEPROM, has to use another GPIO pin and extra software.

Then there are the often poorly defined methods of using a GPIO -

  • Pull-up/Pull-Down/Open-Drain/'Normal' output
  • What interrupt modes if any
  • What other functions use that pin and how the override is done
  • What happens on direction/mode switching (some do strange things)

Do not even start me on the so called "able to drive an LED" port descriptions, which usually mean either only some pins (which, depending on model, may not even be brought out to package pins), or worse still one bit per bank of GPIO and you have to guess which one. Then 'driving an LED' means a current of 5 to 8mA, not 10 to 20mA type levels.


Register, Library documentation and implementation

Often for GPIO (and other peripherals) you have 29 modes listed but only 3 documented (even if the other 26 are subsets, no mention is made of this), no mention of what can be used with what, and other limitations usually listed elsewhere or in a separate manual, if at all.

Having to reverse engineer a device or board support library's source code to find out what the API does, or worse still which register bits do what for any mode, is a right pain. Often I find mistakes like clock setting code for SPI that forgets integer division errors, so instead of making the clock slower than requested it always errs upwards. 'So what?' you say - well, if I have a 10MHz SPI device, and my choices are 8MHz or 16MHz, I would rather have 8MHz and working than 16MHz and not working. I would also like the ability to be told the actual frequency achieved.
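Getting this right is a one-line fix: round the divider UP so the actual clock never exceeds the requested maximum, and report what was really achieved. A sketch, with invented function and parameter names (real parts usually also constrain the divider to even values or a minimum, so check the datasheet):

```c
#include <stdint.h>

/* Pick an SPI clock divider that never overshoots the requested rate.
   pclk_hz is the peripheral clock, max_hz the device's rated maximum.
   Returns the divider and writes the achieved frequency to *actual_hz. */
uint32_t spi_pick_clock(uint32_t pclk_hz, uint32_t max_hz,
                        uint32_t *actual_hz)
{
    /* Ceiling division: rounding the divider UP makes actual <= max. */
    uint32_t div = (pclk_hz + max_hz - 1u) / max_hz;
    if (div == 0u)
        div = 1u;
    *actual_hz = pclk_hz / div;   /* report what we really got */
    return div;
}
```

With a 16MHz peripheral clock and a 10MHz device this picks divide-by-2, giving the working 8MHz, not the broken 16MHz.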

The WORST failure is an API that INSISTS that EACH pin has SEPARATE calls to set -

  • Mode
  • Direction
  • Initial Value

Especially when it is a 32 bit register that, at initialisation, can in most cases have as many bits as possible set at once. Why create extra code layers? For the sake of abstraction, as if this were a general purpose system, our dedicated system ends up using lots of lines of code to replace potentially ONE line of code!

So we now have the situation of wasting CPU cycles and extending startup times due to clumsy hardware and bloated abstraction layers. Let alone the onerous amount of software needed to change modes, or even to toggle more than one bit at a time, during run time.
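The whole-port alternative really can be this short. A sketch against simulated registers (GPIO_DIR and GPIO_OUT are made-up names): two writes configure the entire 32 bit port, versus three API calls per pin.

```c
#include <stdint.h>

/* Simulated whole-port registers; names invented for the sketch. */
static uint32_t GPIO_DIR;   /* 1 = output, per bit */
static uint32_t GPIO_OUT;   /* output latch        */

/* One-shot port init: direction and initial value for all 32 pins
   in two register writes, instead of 3 calls x N pins. */
void port_init(uint32_t dir_mask, uint32_t initial)
{
    GPIO_OUT = initial;    /* latch the values first so outputs come up right */
    GPIO_DIR = dir_mask;   /* then enable all the drivers in one write        */
}
```

Writing the latch before the direction register matters: the pins take their correct initial levels the instant the drivers turn on.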

Changing an individual pin's mode during run time is a different matter, but I wish I did not have to do it at all.

What happens to interrupt settings on changing direction often differs across architectures.

Now comes the issue of combinations of bits: modern GPIO tends to have 32 bit register sets which have one or more of the following -

  • Set register
  • Clear register
  • Enable/mask registers

So just taking 8 bits that need to be set and cleared can be a nightmare.

In the old days, at worst, to set a multi-bit pattern it was -

  1. Read register or saved value
  2. Clear bits to clear
  3. Set bits to set
  4. Write whole pattern at once
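The four steps above fit in one small function. A sketch against a simulated data register (PORT_REG is a made-up name): everything lands on the pins in a single write, so external logic never sees a half-updated pattern.

```c
#include <stdint.h>

static uint32_t PORT_REG;   /* simulated whole-port data register */

/* Classic read-modify-write: clear then set, one atomic-looking
   write to the pins at the end. */
void port_update(uint32_t clear_mask, uint32_t set_mask)
{
    uint32_t v = PORT_REG;   /* 1. read register (or a saved copy) */
    v &= ~clear_mask;        /* 2. clear the bits to clear         */
    v |=  set_mask;          /* 3. set the bits to set             */
    PORT_REG = v;            /* 4. write the whole pattern at once */
}
```

On a real part the read-modify-write needs interrupt protection if an ISR touches the same port, but the pin-side timing is clean: one write, one edge.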

Now, if you are lucky, the API gives you the ability to -

  1. Clear bits all at once
  2. Set bits all at once

Just hope you do NOT have external logic or devices that need one bit cleared and one bit set at the same time, not an instruction cycle or more later.
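The hazard is easy to demonstrate. This sketch models write-only SET/CLR registers as two writes to a simulated pin state, and takes a snapshot between them standing in for whatever the external logic samples (all names invented):

```c
#include <stdint.h>

static uint32_t port;             /* simulated pin state              */
static uint32_t glitch_snapshot;  /* what external logic "saw" midway */

/* SET/CLR register style update: two separate writes, so the port
   passes through an intermediate state for at least one cycle. */
void port_set_then_clear(uint32_t set_mask, uint32_t clear_mask)
{
    port |= set_mask;         /* write to the SET register            */
    glitch_snapshot = port;   /* intermediate state, visible on pins  */
    port &= ~clear_mask;      /* write to the CLR register, later     */
}
```

If the two bits were chip-select lines for two devices, that intermediate state is a moment with BOTH selected - exactly the glitch the single-write read-modify-write avoids.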

In the WORST case you have an API that needs up to 32 calls to set/clear each bit individually, and you hope there are no external timing issues with bits being set or cleared MANY instruction cycles apart. Such a GPIO is best suited to just attaching LEDs and switches, and nothing very fast.

Sometimes you have to add external latching/buffers to control timing.

GPIO configurations and modes seem to be becoming complex in the wrong ways, not ways that simplify. Because you have a slow interface like a character LCD, you have to put in lots of software to get around the inadequacies of the GPIO port configurations, which means using a faster processor and more power.

The worst mistake is libraries that include every possible interface no matter what model you are actually using, then forgetting, in the standard build IDE interface or instructions, gcc flags like '-ffunction-sections' and the linker's '--gc-sections' to remove unused code when linking. It is better to know up front what code is being included, to save on host operations, and NOT to RELY on compiler tricks that may change so you cannot build the same binary again.


GPIO Implementation at pin level

The worst mistake is NOT defining WHICH pins, when used as inputs, have hysteresis (Schmitt triggering), so when a pin is also used as an interrupt you then find you have to add an external Schmitt trigger or similar for a switch input.
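Where the pin has no built-in hysteresis and you cannot add the external gate, you end up faking it in software by polling. A sketch of the dead-band logic on sampled levels - the threshold values here are arbitrary example ADC counts, not from any datasheet:

```c
#include <stdint.h>
#include <stdbool.h>

/* Example thresholds with a dead band between them; a noisy input
   sitting between the two cannot chatter the output. */
#define THRESH_HIGH 600u
#define THRESH_LOW  400u

/* Software Schmitt trigger: returns the new logical state given the
   previous state and a fresh sample. */
bool schmitt_update(bool state, uint32_t sample)
{
    if (!state && sample > THRESH_HIGH) return true;   /* rising edge  */
    if ( state && sample < THRESH_LOW)  return false;  /* falling edge */
    return state;   /* inside the dead band: hold the old state */
}
```

Of course this costs polling, CPU cycles and latency - which is why the hysteresis belongs in the pin, and why the datasheet should say which pins have it.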

Often multi-function choices on pins actually mean you do not have as many peripherals or GPIOs as you might have first thought, so you have to change the design or change the micro. This stage of design is not a 5 minute job and can hold up even outline proposals.

The power on default levels are often not documented well enough, which causes problems when changing modes or peripherals on pins, as you must ensure you do not get strange pulses confusing other connected devices. I have seen cases where an external device determines its I2C/SPI connection mode at power up from the levels on pins used later for transfers, so power up pulses from the micro confuse the external device.

Poor GPIO bit groupings from a 32 bit register, often with random bits taken out to the pinsel mux, rarely give 4 or more adjacent bits of a port, even on the same side or row of pins. This means EXTRA software to do GPIO bit shuffling to drive things like a character LCD or SD Card.
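That "bit shuffling" looks like this in practice: scattering a 4-bit LCD nibble onto non-adjacent port bits. The bit positions here are invented for the example, though real parts are often this bad or worse:

```c
#include <stdint.h>

/* Which port bit carries each LCD data bit D0..D3 - made-up
   positions standing in for a typical scattered pinout. */
static const uint8_t nibble_pos[4] = { 2, 5, 11, 19 };

/* Spread a 4-bit value across the scattered port bits. */
uint32_t lcd_nibble_to_port(uint8_t nibble)
{
    uint32_t out = 0u;
    for (int i = 0; i < 4; i++)
        if (nibble & (1u << i))
            out |= 1u << nibble_pos[i];
    return out;
}
```

With 4 adjacent bits this whole function collapses to one shift-and-mask; with scattered bits you pay this loop (or a 16-entry lookup table) on every nibble written.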

Configurations of timers and analogue sections are my biggest nightmare: having to feed counter outputs back in, sometimes with minimal gating, or synchronising A/D sampling to a counter, is at best peculiar if not insane, especially with burst or gated clocks.

Finding out after a week of examining configurations and all the modes required, or worse still part way through PCB layout, that because of a peripheral combination and/or mode -

  • Another peripheral is now lost
  • Straight GPIO pins are all over the device (if not lost)

Because of some modes you may have spare pins you cannot use, as a pin is reserved for that mode even though you do not need that signal or input. For example, setting up for SPI may require forcing a MISO (receive) pin even though you only have write-only devices like a DAC. Another example is simplex UART connections: because I am using TX does NOT mean I need RX at all, or vice versa for a lot of GPS devices.


GPIO selection possibilities at register level.

The VAST majority of GPIO settings are ONE TIME ONLY at power up, such as

  • Pull-up/pull-down/open drain
  • Interrupt
  • Which pins are used
  • Which peripherals are used
  • DEFAULT state of pin

As soon as a mode changes, something else goes awry or gets an illegal pinmux setting.

Not all manufacturers list the pinsel mux priorities of which mode overrides which.


Expensive methods using IP processors in FPGAs

Yes, you can get an IP core for a CPU, add your own Flash and RAM, and hope you have enough logic left in the FPGA for any other functions you require.

Whilst this is good for specialised and usually high speed DSP type work, you suffer problems of -

  • More devices and cost
  • Lock in to specific devices, compilers and programming tools
  • Hoping you don't have future support problems with the tools and devices
  • Longevity of the Flash and RAM devices you may now require

If you can withstand all the costs - your product is either sufficiently expensive, or high enough volume over a short time period - then this can be your solution. For most cases it is not viable.



In most cases we have a pinout requirement that is then set in stone, because we have a PCB layout; if we really want to save on costs, a few days rewriting code is way cheaper than re-spinning PCBs and binning the old revision.

So it would be good to have a HARDWARE way of setting pin modes at startup: the code is not going to change until a PCB revision, and code execution will always be slower than hardware. Let alone the other issues listed above.

Personally I wish there was a global 'all pins pulled up or pulled down' state at reset UNTIL a hardware configuration has completed, then a simultaneous switch to the GPIO and peripheral settings at their default levels.

Most microcontrollers these days have some form of JTAG-like capability, which could be used for this even internally.

Minimum - Flash configuration or configuration bits read as part of the reset/power recovery mechanism for -
  • First bit: pull up or pull down all pins
  • Peripheral enable/disable etc.
  • Modes
  • Default state
  • PLL config, and at least the start of stabilisation
  • Ability to bring out a 'config done' signal

Better - the above plus a SMALL pin routing matrix to select which pins on the whole chip -
  • Have the signals, INSTEAD of pinmux settings, read as part of configuration
  • Ability to route timers for cascading and gating
  • Ability to sort timer, DMA, ADC, DAC triggering
  • Ability to do byte wide groups on fairly close pins (LCD or external A/D interface)

Best - all the above with a small gate array for adding things like -
  • Write/read to a group of pins produces a suitable strobe pulse or timer pulse
  • As above, but from a timer or timer gate, to afterwards also trigger an interrupt, DMA or A/D
  • Combinations of things like counter outputs or GPIO to reduce external logic
  • Byte wide groups on fairly close pins
  • Bi-directional support from strobe pins and the like, for easier interfacing to SD cards and similar
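To make the 'Minimum' level concrete, here is one way such a boot configuration record might be laid out and packed into a word a reset-time loader could read. This is entirely hypothetical - no real part reads anything like this at reset, and every name and field width is invented:

```c
#include <stdint.h>

/* Hypothetical reset-time configuration record for the scheme
   wished for above. */
typedef struct {
    uint8_t pull_up_all;     /* 1 = pull all pins up at reset      */
    uint8_t periph_enable;   /* peripheral enable bits             */
    uint8_t mode;            /* pin mode selector                  */
    uint8_t default_state;   /* default output level per pin group */
} boot_cfg_t;

/* Pack the record into one 32-bit word, most significant field first,
   as a config loader might read it from Flash. */
uint32_t boot_cfg_pack(const boot_cfg_t *c)
{
    return ((uint32_t)c->pull_up_all   << 24) |
           ((uint32_t)c->periph_enable << 16) |
           ((uint32_t)c->mode          <<  8) |
            (uint32_t)c->default_state;
}
```

The point is how little state is actually needed: a handful of bytes read before the core runs a single instruction would replace whole files of init code.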

I am reminded of a project I did in 2003, on a 20MHz 16 bit (32 bit internal) micro, reading a 2048 pixel CCD Line Scan by using binary voltage comparators externally and timers to detect the edges of the pulse. Yes folks, high resolution scanning cameras have been around for a long time, and much higher resolutions were available even then. These were used in all sorts of places, including probes to map planets, and other applications like saw mills, long before their common usage in desktop flat bed scanners.

However, driving and synchronising the signals to the Line Scan was complicated, as the pixel clock (1 to 10MHz) was a gated signal and I needed a selection of synchronised counters to drive the signals as shown in the picture. Obviously the pixel clock was fed back into the pulse measuring counter to determine the position of the edges.

This involved all sorts of external gates, and feeding counter outputs out of the micro and back in again as the clock to another counter. So much of this should have been configurable one time in the device, but was not.

Line Scan CCD timing diagram


Why might this happen (or, more likely, not)

First of all, using something like this requires a tool where you define what peripherals, interrupts and GPIOs you need, and see if the arrangement is possible or movable, so you can get groupings of GPIO and easy PCB layout options.

More importantly this can tell you sooner if you need the model up or someone else's micro.

This also means, if using the 'better' or 'best' methods listed above, you do not have the problem of 'because I need 3 UARTs, I cannot have an I2C controller'.

Obviously some pins are ALWAYS fixed

  • Power
  • Non maskable interrupts (RESET, NMI..)
  • Programming pins (mode select, JTAG or similar)
  • Analogue inputs, outputs and references (if not possible to use as GPIO as well; anything above about a 1kHz sampling rate can suffer a reduced Effective Number Of Bits)

Some functions should be on special pins, some need not be - like USB, Ethernet PHY and I2C. I2C should be, as it really should have a mid-rail threshold level, not 0.4V or similar for VIL.

However the rest should be able to be placed anywhere.


Why won't the manufacturers do it ?

Well -

  • Cross over with FPGA business
  • Cannot upsell a larger model after the customer has bought the original model
  • Gives more information to engineers before initial purchase
  • Easier for competitors to make pin compatible drop-in replacements
  • Reduces lock-in
  • Not Invented Here

Much as I would like to see it, I doubt it will be mainstream.

© 2017 onwards by PC Services, Reading UK Last Updated: 18th January 2017