27 votes

USB and the myth of 500 milliamps

13 comments

  1. [11]
    JCPhoenix
    Link

    I can't say I know much about electrical engineering, but having searched for compatible USB-C cables, wall warts, and power banks for my devices, I've spent way, way, way more time than I'd've liked looking into stuff like this. I even bought a USB digital power tester so I could see how many watts devices were getting with various cables and chargers. So that I could know for sure I was getting what I was paying for.

    So maybe on the engineering side, USB-C has "demolished this complexity," but as a consumer -- and a techy one at that -- I'm not so sure!

    13 votes
    1. [10]
      Greg
      Link Parent

      I feel this one! I think the USB-IF’s utterly insane approach to branding, combined with Amazon et al’s Wild West of product claims, means there’s not a lot of hope for most users if they need something specific.

      The one saving grace for me is that we’ve also passed the point that even mid-range is now way overkill for 95% of actual use, so it somewhat balances out. Does that cable fully comply with 100W PD like the listing says? Probably not, but it was £7 for a three pack and my phone only supports 30W anyway. Is it 3.2 2x4 or 3.1 4x2 or some other essentially arbitrary combination of numbers that denotes >10Gbps speed? Well it’s going into a portable hard drive that’ll do 2Gbps on a good day, so it’s going to be fine regardless.

      Frustrating to have to fall back on that, but at least I can generally put it out of my mind. Display cables (either USB-C or HDMI) are the notable exception: pushing 48Gbps over one of those is pretty common even with an upper mid range modern display, and whether that’ll work without dropouts is an absolute crapshoot in my experience. That and tunnelling PCIe gen 4 over USB4, but that’s one I accept is at the edge of the spec and I’m one of like six people actually using.

      7 votes
      1. [2]
        papasquat
        Link Parent

        I honestly detest the marketing terms that get applied to standards.

        When you only have a couple of different flavors of a standard, it's not so bad: you can say FireWire 400 vs FireWire 800, or Cat 6 vs Cat 6a, and people know what those standards mean, they're pretty well adhered to, and there aren't many issues with just referring to the cable or device by the standard.

        When you start getting a ton of different capabilities all lumped into a marketing term like USB-C, it just becomes an absolute mess, and no new marketing term like "high speed" or "ultra speed" will save you.

        Manufacturers should just be required to print the capabilities on the connector or cable, state them clearly in the advertising, and then consumers just have to figure it out themselves. The giant question-mark quagmire that exists right now is a disaster.

        6 votes
        1. JCPhoenix
          Link Parent

          What they did to MicroSD cards is a crime against humanity. I still don't know what half of those symbols and classes and stuff mean.

          Though on the other hand, I usually just buy whatever card, and they work in my dashcams, security cameras, etc. So maybe, like @Greg said with regard to USB-C, none of it will matter to most.

          2 votes
      2. [7]
        bitwaba
        Link Parent

        I think what has surprised me the most is the slow adoption rate of USB-C. We're 8+ years into it at this point, and I can STILL find newly created devices from manufacturers that have micro USB. Like, the move from USB-A and/or B to mini to micro happened in a decade. But I can buy a brand new Logitech mouse today with a micro USB charger port.

        When USB mice started to become a thing, every one you bought came with a USB A -> PS/2 adapter. There's no reason why every mouse and keyboard manufacturer today can't do so with a USB-C interface and package a C->A converter with it. Speaking of that, I don't think I've EVER seen a keyboard with a USB-C connector on the non-keyboard end (I'm not doubting they exist, but I've literally never seen one in the wild). The best I've seen is my mechanical keyboard has a USB C interface on it, and it shipped with a C to A cable which I could presumably replace with a C to C cable.

        But even if I did.... what am I going to plug it into? My $200 AM4 motherboard I bought in 2022 has ONE USB C interface on it. The AM5 motherboard I'm about to buy for my girlfriend has ONE USB C interface. But it has FIVE USB A connectors?!

        Pretty much the only place I can reliably count on having a USB C port is my phone, and at almost a decade in, that's just absurd. (I guess you can almost guarantee USB C for power on laptops too? I don't have enough experience with the various manufacturers to know what they're shipping with across the market)

        I'd really like an explanation from the USB gods on this one. I feel like they owe humanity an explanation.

        4 votes
        1. skybrian
          Link Parent

          My theory is that USB-C ports are relatively expensive due to the power and bandwidth needed, and USB-2 is cheap. So, you don’t see a lot of USB-C ports in designs.

          For another example, MacBook Air laptops tend to have only two USB-C ports, and they brought back the MagSafe connector for power. It frees a USB-C port without having to add another USB-C port, which presumably was hard for some reason. (Or they could have added a USB-C port that’s less capable than the others, but that would be an annoyance.) Fortunately, charging via a USB-C port still works, so I don’t need to bring the MagSafe cable with me.

          On a low-end Mac Mini, they also have only two USB-C ports, but also two USB-A (USB 3) ports. It’s a cost-cutting measure, I presume, and also reasonable since USB-2 mice and keyboards are easy to find.

          So I suspect that the USB-A connector is going to stick around at the low end because USB-C is so capable that it’s too expensive to use everywhere.

          7 votes
        2. [3]
          papasquat
          Link Parent

          Most PC peripherals are still USB-A, and I don't see that changing any time soon. Frankly, most peripherals wouldn't benefit from the increased speed of USB 3.0 or the power delivery capabilities of USB-C, so it doesn't make sense for manufacturers to devote resources to swapping over.

          Motherboard manufacturers are just reacting to what's out there in the market, and the PC peripheral market says that USB-A is still the standard.

          I don't personally really have an issue with it. Port real estate usually isn't a big concern, and I rarely connect and disconnect those devices, so reversibility isn't a huge plus for me either.

          5 votes
          1. [2]
            bitwaba
            Link Parent

            Yeah, that's the problem I have with it in the first place. Motherboard manufacturers won't move to USB-C because peripherals haven't moved to USB-C. Peripheral manufacturers haven't moved to USB-C because motherboard manufacturers haven't. It's a stalemate.

            If I'm still here in 2034 bitching about USB A still existing, this is why.

            We're better than this.

            1. papasquat
              Link Parent

              Honestly, I don't really see a compelling reason for them to switch to USB-C. It would just be more expense and confusion for no real benefit.

              2 votes
        3. Carrow
          Link Parent

          > There's no reason why every mouse and keyboard manufacturer today can't do so with a USB-C interface and package a C->A converter with it.

          There's actually a good reason -- such adapters are not compliant. The same author has an article on non-compliant USB-C adapters. In short, one could use them to fashion an A-to-A cable, which would cause a current fight between the two 5 V supplies, and in some cases that could fry your stuff. All in all, it's an edge case though.

          Nonetheless, reputable manufacturers don't want to sell non-compliant products, even if the risk is zilch when used correctly. And non-reputable manufacturers likely don't want to spend extra on a non-compliant adapter when they can just use the cheap standard plug, especially since that standard is still ubiquitous.

          4 votes
        4. Weldawadyathink
          Link Parent

          I am absolutely there with you. I have been on a personal mission for years (ever since I got my first Android phone with USB-C) to excise all other USB from my life, and for everything except USB-A, I am getting so close. For years, I refused to buy any device with micro or mini USB. The only exceptions were devices where there was no alternative, but I recently sold off those devices to grab a Steam Deck (New 3DS, PSP Go, PS Vita). Now that I have sold my desktop, I am a step further on the route towards removing USB-A from my life. The last holdouts are my battery bank (I have one on order with USB-C only, but it was a crowdfunding thing and it’s gotten delayed) and a few USB-C docks I got for my Steam Deck.

          Manufacturers, if you are listening: I will pay MORE for devices that just don’t have USB-A on them. Ideally you would replace those ports with USB-C, but you don’t even have to do that. Just don’t solder the USB-A plugs, make a new case without those holes, and I would be happy to buy it. Anker is doing a poor job of this right now: they have a ton of great wall chargers with a bunch of USB-C ports, but often throw a USB-A port in there too. Just give me USB-C only, please!

          As for an explanation, for chargers it’s very simple to add more USB-A and more complex to add USB-C. Have a 5 V rail and you can put as many USB-A ports on it as it can support. But we expect USB-C to have PD, which means that each port on the device has to be able to output 5 V, 9 V, 12 V (although 12 V is optional), 15 V, and 20 V. On a multi-port device, each port might have to output a different voltage. Add in PPS, and each port has to output an arbitrary voltage between 5 and 20 V. That makes multi-port chargers much more complex, as sketched below. I don’t really have an explanation for the lack of USB-C on motherboards, though. It’s really annoying.
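
          To make that asymmetry concrete, here is a tiny illustrative sketch in C (not real charger firmware; the voltage levels are the PD spec's fixed levels) of what one shared 5 V rail covers versus what every individual PD port must be able to produce:

              /* Illustrative only: one 5 V rail can feed every USB-A port,   */
              /* while each USB-C PD port needs its own wide-range converter. */
              struct supply { float volts; float max_amps; };

              static const struct supply usb_a_shared_rail = { 5.0f, 2.4f };

              static const struct supply pd_fixed_levels_per_port[] = {
                  { 5.0f, 3.0f }, { 9.0f, 3.0f },
                  { 12.0f, 3.0f },                 /* optional in the spec */
                  { 15.0f, 3.0f }, { 20.0f, 3.0f },
              };
              /* ...and with PPS, the same port must also hit arbitrary
                 voltages (the SPR PPS range is 3.3-21 V) in 20 mV steps. */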

          3 votes
  2. skybrian
    Link

    From the article:

    The specification originally stated – you aren’t supposed to consume more than 500mA from a USB port. At some points, you’re not even supposed to consume more than 100mA! It talked unit loads, current consumption rates, and a good few other restrictions you would want to apply to a power rail. Naturally, that meant enforcement of some kind, and you would see this limit enforced – occasionally.

    On the host side, current limiting had to be resettable, of course, and, at the time, that meant either PTC fuses or digital current limiting – both with their flaws, and a notable price increase – per port. Some bothered (mostly, laptops), but many didn’t, either ganging groups of ports together onto a single limited 5 V rail, or just expecting the board’s entire 5 V regulator to take the fall.

    Portable HDDs wanted just a little more than 2.5 W to spin up, 3G modem USB sticks wanted a 2 A peak when connecting to a network, phones wanted more than 500 mA to charge, and coffee warmers, well, you don’t want to sell a 2.5 W coffee warmer when your competitor boasts 7.5 W. This led to Y-cables, but it also led to hosts effectively not being compatible with users’ devices, and customer dissatisfaction. And who wants complaints when a fix is simple?

    It was also the complexity. Let’s say you’re designing a USB hub with four ports. At its core, there’s a USB hub IC. Do you add current consumption measurement and switching on your outputs to make sure you don’t drain too much from the input? Will your users like having their devices randomly shut down, something that cheaper hubs won’t have a problem with? Will you be limiting yourself to way below what the upstream port can actually offer? Most importantly, do users care enough to buy an overly compliant hub, as opposed to one that costs way less and works just as well save for some edge cases?

    USB ports, purely mechanically, could very well handle more than 0.5 A all throughout, and soon, having an allowance of 1 A or even 1.5 A became the norm. Manufacturers would have some current limits of their own in mind, but 500 mA was long gone – and forget about the 100 mA figure. Perhaps the only place where you could commonly encounter 500 mA was step-ups inside mobile phones, simply because there’s neither much space on a motherboard nor a lot of power budget to spend.

    Smartphone manufacturers were in a bind – how do you distinguish a port able to provide 500 mA from a port able to provide 1000 mA, or even 2 A outright? That’s how D+/D- shenanigans on phone chargers came to be – that, and manufacturers’ greed. For Android, you were expected to short data lines with a 200 Ohm resistor, for Apple, you had to put 2.2 V or 2.7 V on the data pins, and if you tried hard enough, you could sometimes use three resistors to do both at once.
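
    For a concrete (illustrative) instance of that resistor trick: a charger can put roughly 2.7 V on a data pin with a plain divider from its 5 V rail, e.g. 5 V × 56 kΩ / (47 kΩ + 56 kΩ) ≈ 2.7 V, and the phone simply reads the pins with an ADC to decide how much current it may pull. The exact resistor values here are assumed for the example, not taken from the article.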

    Further on, USB3 took the chance to raise the 500 mA limit to 900 mA. The idea was simple – if you’re connected over USB2, you may consume 500 mA, but if you’re a USB3 device, you may take 900 mA, an increased power budget that is indeed useful for higher-speed USB3 devices more likely to try and do a lot of computation at once. In practice, I’ve never seen any laptop implement the USB2-vs-USB3 current limit checking part. However, as more and more devices adopted USB3, it certainly raised the bar on what you could be guaranteed to expect from any port.

    USB-C PD (Power Delivery) has completely, utterly demolished this complexity, as you might notice if you’ve followed my USB-C series. That’s because a device can check the port’s current capability with an ADC connected to each of the two CC pins on the USB-C connector. Three current levels are defined – 3 A, 1.5 A and “Default” (500 mA for USB2 devices and 900 mA for USB3). Your phone likely signals the Default level, your charger signals 3 A, and your laptop either signals 3 A or 1.5 A. Want to get higher voltages? You can do pretty simple digital communications to get that.

    Want to consume 3 A from a port? Check the CC lines with an ADC, use something like a WUSB3801, or just do the same “check the PSU label” thing. Want to consume less than 500 mA? Don’t even need to bother checking the CCs, if you’ve got 5 V, it will work. And because 5 V / 3 A is a defined option in the standard, myriad laptops will effortlessly give you 15 W of power from a single port.
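
    As a minimal sketch of that CC check in C (the millivolt thresholds are the usual Type-C sink detection bands; how you obtain the ADC reading is hardware-specific and assumed here):

        #include <stdint.h>

        enum cc_current { CC_NONE, CC_DEFAULT, CC_1A5, CC_3A0 };

        /* Classify the source's advertised current from the voltage the
           sink's Rd pull-down sees on a CC pin. */
        enum cc_current cc_advertised(uint16_t cc_mv)
        {
            if (cc_mv > 1230) return CC_3A0;     /* source Rp advertises 3.0 A */
            if (cc_mv > 660)  return CC_1A5;     /* source Rp advertises 1.5 A */
            if (cc_mv > 200)  return CC_DEFAULT; /* 500 mA (USB2) / 900 mA (USB3) */
            return CC_NONE;                      /* no source attached */
        }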

    12 votes
  3. FlippantGod
    Link

    Around 2011 we got Battery Charging 1.2, which had a few standout features:

    • Dedicated Charging Ports (DCPs) may output more than 1.5 A, which lets Portable Devices (PDs) with switch-mode chargers draw more power (not sure about the PD part, the spec seemed to contradict itself).
    • Minimum Charging Downstream Port (CDP) current is now 1.5 A.
    • PDs may draw up to 1.5 A during High-Speed traffic.
    • CDPs must support 1.5 A during High-Speed traffic.
    • Secondary Detection is now optional for PDs.
    • Resistive detection is removed.
    • Any downstream port may act as an SDP, CDP, or DCP, and can switch roles.

    Even if this wasn't Power Delivery's "Clean Slate" design, I do think this was roughly everything required for straightforward ~25W charging, power and data combined operation, and a consistent, predictable user experience.
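
    For a sense of how that detection works on the wire, here is a minimal C sketch of the portable-device side of BC1.2 (the phy_* helpers are assumed, not a real API; 600 mV is the spec's VDP_SRC/VDM_SRC level and 325 mV sits in the VDAT_REF window):

        /* Hypothetical PHY helpers for driving and sensing the data lines. */
        void phy_drive_dp_mv(int mv);
        void phy_drive_dm_mv(int mv);
        int  phy_read_dp_mv(void);
        int  phy_read_dm_mv(void);

        typedef enum { PORT_SDP, PORT_CDP, PORT_DCP } port_type;

        port_type detect_port(void)
        {
            /* Primary detection: drive 0.6 V onto D+ and watch D-. On an
               SDP, D- stays low; on a CDP or DCP it gets pulled up. */
            phy_drive_dp_mv(600);
            if (phy_read_dm_mv() < 325)
                return PORT_SDP;

            /* Secondary detection (optional since BC1.2): drive 0.6 V onto
               D-. Only a DCP, with D+/D- shorted, reflects it back on D+. */
            phy_drive_dm_mv(600);
            if (phy_read_dp_mv() < 325)
                return PORT_CDP;

            return PORT_DCP;
        }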

    3 votes