
USB Explained

By: Jos Hartog

Cape Hazard


Let's start with a history lesson:

Around 1995, USB 1 was introduced to the PC world. The idea was to create a simple, universal interface for computer peripherals (keyboard, mouse, printers, etc.). It used a communication method similar to the old serial ports, but with only 4 wires. It also allowed for expansion through hubs (i.e. one port on the PC can feed a multi-port hub).

Over the years the USB specification has been updated to versions 1.1, 2.0 and 3.0, to name a few. The changes relate mostly to data speed and power availability from the port. Providing power for low-consumption devices was always part of the design, but charging devices (phones, etc.) was not part of the original concept; it was only brought into consideration from USB 2.0 onwards.

USB 1 (and its sub-versions) and USB 2.0 had a power limit of 0.5A (500mA); this was upped to 0.9A (900mA) from USB 3.0 onwards.

On to real life today:

So you plug the latest iPad (which wants a 2.1A charge rate) into your PC's USB port (which can deliver 0.9A at best) and it blows the port on your PC.

Except that it doesn't blow the port. Why? Furthermore, some devices will charge (albeit slowly) and others not at all. Why?

The peripheral device manufacturer must make sure that, when the device is connected to a charge port, it can detect what the port is capable of. One way to do this is to communicate with the port (PC) - easy to do, but not practical, because you would then have to build the same ability into the AC charger as well, and that's expensive.

The easy (and cheap) way to determine the port's power capability is to...

But first, how does a USB charger port work?
The USB port has 4 wires: +5V, Data+ (D+), Data- (D-) and 0V (ground or negative). When the port is used for charging, the D+ and D- lines are not needed for data, so the manufacturers apply a fixed, predetermined voltage to these lines which the device "sees".
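
For reference, here is that wiring laid out as a small C table (the pin numbering is the standard USB Type-A assignment; the program just prints it):

#include <stdio.h>

/* The four wires of a standard USB Type-A connector (USB 1.x / 2.0). */
struct usb_pin { int pin; const char *name; const char *role; };

static const struct usb_pin usb_a_pins[] = {
    { 1, "VBUS", "+5 V supply"         },
    { 2, "D-",   "data minus"          },
    { 3, "D+",   "data plus"           },
    { 4, "GND",  "0 V return (ground)" },
};

int main(void)
{
    for (size_t i = 0; i < sizeof usb_a_pins / sizeof usb_a_pins[0]; i++)
        printf("pin %d: %-4s %s\n",
               usb_a_pins[i].pin, usb_a_pins[i].name, usb_a_pins[i].role);
    return 0;
}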

If the device does not see the correct voltages on these lines (i.e. this is not the OEM charger), it will either not charge at all (typical of later Apple models, though Samsung and others are also starting this nasty trend) or it will back off to a 0.5A charge rate to protect the port.

Also, the b@stard manufacturers (esp. Apple [AGAIN]) cannot agree on an industry standard because they want to protect their sales of OEM chargers etc. It would be very easy to standardise - for example, D+ at 3V and D- at 1V for all 2A chargers. Every device that wants 2A, or less, would then see this charger as a 2A charger, irrespective of manufacturer.
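
To make that concrete, here is a minimal device-side sketch in C of how firmware could act on such a signature. The 3V/1V "2A" signature is the hypothetical one from the example above, the two read functions and the tolerance window are placeholders for whatever the real hardware's ADC provides, and the fallback to 0.5A is the back-off behaviour described earlier:

#include <stdio.h>
#include <math.h>

/* Placeholders: on real hardware these would be ADC reads of the pins. */
static double read_dplus_volts(void)  { return 3.02; }
static double read_dminus_volts(void) { return 0.97; }

/* Returns the charge current (in mA) the device decides to draw. */
static int negotiate_charge_current(void)
{
    const double tol = 0.25;          /* allow +/- 0.25 V around each target */
    double dp = read_dplus_volts();
    double dm = read_dminus_volts();

    /* Hypothetical "2A charger" signature from the article: D+ = 3 V, D- = 1 V. */
    if (fabs(dp - 3.0) < tol && fabs(dm - 1.0) < tol)
        return 2000;

    /* Unknown signature: back off to the safe USB default of 500 mA. */
    return 500;
}

int main(void)
{
    printf("charging at %d mA\n", negotiate_charge_current());
    return 0;
}

On real hardware the device samples D+ and D- like this before it enables its charging circuit, which is why an unrecognised charger gets the slow rate (or nothing at all).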

But no, each manufacturer does their own thing, not just by brand but by model too. So a 2A charger that works fine on a Samsung S6 won't work on a Note 7 or an iPhone 6, even though all of them use a 2A charger (not sure if that's actually true for these particular models, but you get my drift).

The only way out is to build a USB port with its own 5V supply capable of 3+ amps (for future-proofing) and to add an interface that creates the D+ and D- voltages suited to your particular device.
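
Electrically, that "interface" can be as simple as a resistor divider from the port's own 5V rail onto each data line. The sketch below just does the divider arithmetic; the resistor values are illustrative ones chosen to hit the hypothetical 3V/1V signature used earlier, not values for any particular tablet:

#include <stdio.h>

/* Output of a two-resistor divider fed from the 5 V rail:
 * Vout = Vsupply * r_low / (r_high + r_low)                */
static double divider_out(double vsupply, double r_high, double r_low)
{
    return vsupply * r_low / (r_high + r_low);
}

int main(void)
{
    const double vbus = 5.0;   /* the port's own 5 V supply */

    /* Illustrative values for the hypothetical 3 V / 1 V signature:
     * D+ : 20k over 30k  -> 5 * 30/(20+30) = 3.0 V
     * D- : 40k over 10k  -> 5 * 10/(40+10) = 1.0 V             */
    printf("D+ = %.2f V\n", divider_out(vbus, 20e3, 30e3));
    printf("D- = %.2f V\n", divider_out(vbus, 40e3, 10e3));
    return 0;
}

In practice you pick the dividers for the voltages your specific device expects, which is exactly why the interfaces in the next paragraph had to change whenever the tablets did.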

We did this a few years ago for the W Cape Metro ambulances, which are each fitted with 2 tablets (7" in front and 10" in the back). The tablets are enclosed in a steel cradle (anti-theft), so it was easy to build the interfaces and install them within the cradle. Over the years the original tablets were replaced with newer models, which required a different interface.

So, there you have it: easy to do if we all play together nicely.
