Nintendo Switch is not USB-C compliant (plus.google.com)
520 points by lambada on Mar 29, 2018 | 216 comments

In watching the evolution of USB-C over the last few years, it seems like it's extremely hard to implement correctly with the huge number of modes, alternate modes, and power delivery in the spec.

When you connect two USB-C devices today, you have almost no idea what is actually going to happen, which device is the master, and which way power will flow.

While having one connector and cable type for everything seems like it would be a good idea, in practice it's turning out to be a giant mess. Maybe it'll clear up in a few years, but given the race to the bottom in price and quality in the accessory market, this seems doubtful.

I partly agree that it's a bit harder than before. But a lot of the faulty devices we're seeing are just straight up using an incorrect resistor value or similar, and apparently released to the public without proper testing.

Maybe that's on USB spec people for not having good material, but on the other hand maybe that's on the manufacturers for not hiring EEs who can actually read a damn spec sheet properly...

>EEs who can actually read a damn spec sheet properly...

I think this is the hardware equivalent of expecting programmers to write bug free code on the first try.

Most of the time when a programmer wants to implement something a little complex and a little outside their expertise, they use an external library. Likewise, EEs will often buy a chip produced by a third-party manufacturer and use that to handle it. But the abstractions available to EEs are often a bit more leaky than those available to us programmers, since they are more constrained by the laws of physics, and as a result that third-party chip is still harder to use than a software library.

That isn't the way EEs are trained, in my experience. When new hardware is designed that is outside the EE's experience, there is usually an evaluation board available from the manufacturer. They get one of those in house, read the application notes and data sheets, and use their test equipment to observe the 'proper' signal levels and waveforms on a correctly implemented system.

Then they add this part to their design, and lay out the schematic and the printed circuit board according to the manufacturer's best practices. Once the first boards return, they measure the signal levels and waveforms in their circuit to verify they are within the specifications and that they match the ones on the eval board. Then they will 'corner' test the circuit (corners are the low/high temperature range and the low/high voltage range) and verify it continues to work according to specification at all the 'corners' (if it does, then you are generally OK assuming it will work at all points "inside" those four corners.)

There are people who are either in a hurry or don't care who wire something up according to the application note, power it up once and call it 'good'. I've seen a number of cost reduced 'clone' equivalents that meet that description. @kens has done a number of blog posts that show this sort of mentality in detail.

Your process outlined here takes time and is likely part of why the quality ones aren't to market yet. :(

On the first try? I would hope that hardware manufacturers don't take the engineers' first attempt and immediately go into mass production.

That is still an avoidable error. You shouldn't be adding parts without understanding how they work or how they will affect your circuit as a whole.

Is it really that simple? Just changing a resistor value? I got an XPS 13 for work, it’s all USB-C, and the versatility of the port is fantastic (doesn’t matter which port I plug into, it still charges, so I don’t have to wrap the power cable around the back of the machine or sit in a weird position to keep some slack in the wire). However, I took it to the office and naturally I plugged my power adapter into one port. I then plugged in the monitor (also USB-C, delivering power), and the two ports cancelled themselves out and confused the OS. Windows thought I was charging the laptop yet it also said the battery was depleting. They were negating each other and the battery loss continued as it would with no power connected.

It makes sense for a monitor to power the laptop because it’s mains connected. But then you have one of those classic programming problems: if there are multiple inputs providing the same thing which do you consider the source of truth?

USB PD requires communication over the CC pins, via setting different resistor values and checking them at specific times in the handshaking process. The communication in the latest spec is supposed to happen at about 300 kbps using a specific protocol on a specific CC line. There have been multiple specs since about 2011, some of which used communication over the VBUS line using BFSK, which has since been deprecated. This adds to the confusion.
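To make the resistor story concrete, here's a rough sketch (in Python; function and table names are mine) of the basic CC-line current advertisement from the Type-C spec's resistor pull-up option: the source advertises its capability with a pull-up Rp, the sink has a fixed 5.1k pull-down Rd, and the divider voltage on CC tells the sink what it may draw. The real spec also allows current-source pull-ups and defines exact voltage detection thresholds; this is just the idea.

```python
RD_OHMS = 5_100  # sink's fixed pull-down on CC, per the Type-C spec

# Source pull-up value (ohms, for a 5V rail) -> advertised current
RP_ADVERTISEMENT = {
    56_000: "default USB power (500/900 mA)",
    22_000: "1.5 A @ 5 V",
    10_000: "3.0 A @ 5 V",
}

def cc_voltage(rp_ohms, vpullup=5.0):
    """Voltage the sink measures on CC: a plain resistive divider."""
    return vpullup * RD_OHMS / (rp_ohms + RD_OHMS)

for rp, advert in RP_ADVERTISEMENT.items():
    print(f"Rp={rp // 1000}k -> CC ~= {cc_voltage(rp):.2f} V -> {advert}")
```

The sink never "reads the resistor" directly; it measures that divider voltage against thresholds, which is exactly why a wrong resistor value silently produces a wrong (and possibly dangerous) advertisement instead of an error.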

There are some USB-C compliance issues which are, yes, literally just the wrong resistor.

What you're describing might go beyond that though; in particular, if there are multiple ports and hubs involved, it could be more than just a USB problem.

Hmm, could this have anything to do with why trying to hook my MacBook up to an external monitor with a USB-C to DisplayPort cable blew my motherboard today?

They could always ask the user if they are that confused then remember the answer so they don't have to ask again.

Can you imagine a typical laptop user who assumes everything 'just works' (like me, by virtue of thinking I need to power my laptop while attaching a powered monitor) seeing a prompt in the OS about which thing should provide power and which should either reject it or use it to charge a battery?

Can you imagine someone not understanding a clearly worded UI that they use to do the things they have to do to function in society?

On Macs, it will choose the power supply that advertises the highest available wattage. This has been working flawlessly on my MacBook Pro.

This was the case when the original USB first came out, too. It took a good couple of years to mature. I think these are normal standardization issues...

The difference is that when the original USB spec came out it wasn't delivering enough power to kill somebody; contrast that with today, where a USB-C error could be fatal.

See: https://www.theverge.com/2016/2/4/10916264/usb-c-russian-rou...

This is incorrect. The high-power charge is 20 volts, below the safe threshold of ~50V set by such agencies as OSHA, NFPA, ANSI, IEEE, UL, IEC and others.

The real difference is USB was a completely new technology with zero market penetration so growing pains were inevitable. USB-C should not have these growing pains as they should have learned from 20+ years of developing the standard.

Is it really large, established manufacturers who are making these mistakes? Or is it random first-time hardware entrepreneurs in Shenzhen?

Analogy: we don’t blame Javascript-the-language-standard for the Node community’s constant (and usually half-assed) reinvention of build tools. Javascript itself didn’t cause that. More than likely, it was a glut of newbie developers joining the community—and building tools for one-another, rather than leaving it to the experienced people—that caused that.

> Is it really large, established manufacturers who are making these mistakes?

Yes. Just read the NathanK or Benson google+ pages. Very few USB-C accessories are compliant - almost every one has some bug in its implementation, some worse than others. Apple cables are good, but even they took a couple iterations to get it right.

Considering this article is about the switch not being compliant, yes. Nintendo has quite a lot of experience making hardware, and they didn't get it right. Right now USB-c is a mess, and I'm not aware of any controllers on the market that are compliant and easy to use.

Yes, Anker had to issue a recall for a USB C cable that could fry devices:


There is relatively little risk of electrocution but 100 watts can still be more dangerous than 10 watts. Overheating and fire are a much bigger risk.

5V to 20V can be fatal?

Indirectly, it could start a fire...

A 9V battery in the wrong place can kill you.

As can a AAA battery, if accelerated sufficiently beforehand.

Well, I mean from electrocution specifically. 100-200mA through your heart is probably going to be fatal, and the low resistance required for that to happen with a simple 9V battery source has at least once been achieved by inserting multimeter probes in the thumbs such that the current passed via the blood stream through the heart, causing fibrillation.

Interestingly, there exists a sort of upper threshold where the amperage is less likely to be fatal due to severe cramps preventing fibrillation IIRC.

Sadly it's tiny watch batteries in the wrong place doing most of the killing and disfiguring.

Would you care to give us a scenario? The best source I could find places the minimum resistance of already damaged human skin at 500 ohms. With 9v of potential, that translates to .018 amps of current, which is just barely above the threshold for sensation, let alone death.

0.018 is way above the threshold of sensation, well into the territory of very painful shocks (18 times the threshold of sensation at ~1mA). Of course this will usually be local with a 9V battery, but with some deliberation you can pass the current through your heart via the blood stream, where the resistance is sometimes much lower which can get you very close to what'll likely be fatal.

Edit: the above holds true for AC but not necessarily for DC from a battery, where the thresholds are typically higher. But the heart is very sensitive and fibrillation occurs at very low currents passing directly through it.

I can kill you with a 12V car battery at 8 amps clamped to your fingers, no problem.

No you can't. It has the capacity to deliver 8 amps, but your body's resistance is too high to draw that current.


EDIT - here's some math:

The U.S. Navy FIRE CONTROLMAN Volumes 01 - 06 & FIREMAN gives 1500 ohms as a common resistance approximation between extremities, either hand to hand or hand to foot, for an average human body.

I = V/R = 12V / 1500 Ohms = 0.008 A

8 mA is nowhere near enough to kill you.
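For anyone who wants to check the arithmetic in this thread, it's Ohm's law in one line (helper name is mine):

```python
def current_ma(volts, ohms):
    """I = V / R, reported in milliamps."""
    return volts / ohms * 1000

# 12 V across the ~1500-ohm hand-to-hand resistance cited above
print(current_ma(12, 1500))  # 8.0 mA

# 9 V across the 500-ohm damaged-skin figure cited earlier in the thread
print(current_ma(9, 500))    # 18.0 mA
```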

On the other hand, if skin is breached, the resistance is quite a bit smaller, and it won't take too much voltage to cause a deadly current.

There's a story floating around the internet of someone having done this by stabbing multimeter probes through their thumbs at 9V, and maybe it's possible, but I'm skeptical of the story's veracity. I can see those wacky stories for a lot of situations, but you'd think with somebody dying that there'd be some kind of incident report or other documentation.

But sure, it's probably best to not touch the car battery terminals when your hands are bleeding.

No way. If that were the case, people would be dying all the time when trying to jump cars. You would need to wear lineman gloves to do it safely and working on cars in general would be quite hazardous. I've touched both car battery terminals before. You don't feel a thing.

I'm genuinely curious how you'd do that... I can touch both battery terminals with my hands and feel nothing. Even if my hands are damp, nothing happens. My battery is capable of outputting over 700amps.

People have latched on to the mantra that it’s not the volts that kill you, it’s the amps, but they never seem to remember that you need a certain voltage to get those amps in the first place.

Exactly. And also the mantra "electricity takes the path of least resistance". While that is true, some people seem to interpret it as "electricity only takes the path of least resistance" which is very wrong. Electricity takes all paths!

I don't know squat about electricity

If electricity takes all paths, then what's the point of the mantra? It includes the highest and least resistance, and everything in between. The mantra is almost deliberately meant to mislead?

It's technically correct, but the path of least resistance is the path of least resistance at the instant of measurement. Voltage and current are abstractions that represent the aggregate motion of charge carriers in a medium. Like charge carriers repel each other. At any instant, an individual charge carrier will take the path of least resistance. Calculus is required to model the current in a non-trivial system. It frequently leads to counter-intuitive results.

In short, it's complicated.

A better mantra would be: electricity takes all paths in inverse proportion to their resistance.

So with a single low resistance path, give or take, all the electricity will go that way. But with two paths of equal resistance, half the electricity will go each way.
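That improved mantra is just the current-divider rule; a tiny numerical sketch (illustrative Python, names mine):

```python
def branch_currents(volts, resistances):
    """Current through each parallel branch across the same voltage: I = V / R."""
    return [volts / r for r in resistances]

# Two equal 100-ohm paths at 10 V: the current splits evenly
print(branch_currents(10, [100, 100]))  # [0.1, 0.1]

# A 1-ohm path in parallel with a 1000-ohm path: "give or take, all"
# of the current takes the low-resistance path -- but not literally all
i_low, i_high = branch_currents(10, [1, 1000])
print(i_low / (i_low + i_high))  # ~0.999
```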

> If electricity takes all paths, then whats the point of the mantra? It includes the highest and least resistance, and everything in between. The mantra is almost deliberately meant to mislead?

Maybe mantra wasn't the right word; Wikipedia calls it a heuristic. Regardless, I agree it is very misleading and I would never say anything like that when teaching or explaining electricity. Why does it keep getting repeated? Who knows. Inertia of the masses without much understanding of electricity would be my guess...


Most of the power goes along the path of least resistance. It's an oversimplification, but you'll make much more dangerous mistakes if you let intuition lead you into assuming a pretty spread-out flow.

Electricity takes the path of least resistance...and all other paths, in amounts depending on their resistance.

Well, to be fair he said he could. Yes, it would be possible using a boost converter to jump the voltage, you can certainly draw enough current from an auto battery. But without using external components the only way a car battery would kill you is by creating a spark and igniting hydrogen that the battery produced.

If you pierce the skin with the "teeth" of the clamp you can do it, the lower resistance will be enough to send lethal amounts of current.

You don't die instantly. You are able to take exactly 5 steps before you die.

False. USB-C cannot be fatal.

Yeah, agreed - those things never worked. It took something like a decade before it got properly reliable.

It must be more than "engineers can't read a spec", because devices using previous versions of USB tend to be fine. So why can engineers read those specs? (Why didn't they "choose an incorrect resistor value" for previous generations of USB, then?)

This spec, by contrast, must be far more complicated[1]; also, since it is so much faster, won't quality components supporting it all be more expensive - leading to corner-cutting as a cost-saving measure?

Also if as a consumer you can't predict what "should" be about to happen (which way power will flow, what will be host) that is down to a bad spec re connector types. For example couldn't a universal visual indicator on a port (including the bottom of a phone) indicate its capabilities and what will happen very clearly? (Through symbols.) They do not. Couldn't a software popup make you choose? They do not.

Not to mention that it is not clear what should happen. If a phone can power a USB peripheral (like usb on the go) that means it could charge a second phone.

So if you connect two phones, which one should charge the other? With old versions of USB the answer is simple: whichever one you plug an On-The-Go adapter into becomes the master, instead of taking a phone's usual role as a slave.

The fact that the spec does not make these as clear for USB C is down to the spec and design.

Can't USB-C simply be badly designed? (Overdesigned, underdesigned, badly designed.)

[1] Consider that a 1-page spec that can be read and implemented quickly in 15 minutes, can be implemented by anyone. A five-thousand page spec that takes 1,000 hours to read and understand can be implemented by no one (only teams). In between we have both previous versions of USB and USB C - but is it possible that USB C is too far in the latter direction?

> Couldn't a software popup make you choose? They do not.

They do when I e.g. connect two android phones together.

Got it (never tried it).

Is it just a resistor, or is it one of hundreds of possible resistors depending on factor X? USB, or any connection, should be simple - you know, like code. The more exceptions and options you add, the harder it becomes to implement properly. This is exponential, too, not linear.

A sane USB-C spec would not have required a resistor for an A-to-C cable.

A single resistor is seriously not challenging to get right. If they can't do that they sure as hell shouldn't be making a cable.

Your average cable is some wire, some connectors, and plastic. Adding a resistor adds a whole extra element to the process, which is a big deal for a part that sells for probably well under $1 wholesale. Cable shops probably have no experience with circuits. The USB-C spec for some reason made it so that cables with no resistor or with the wrong resistor still seem to work but are unsafe. Of course the result is a mess.
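For what it's worth, the notorious bad A-to-C cables came down to exactly this: per the Type-C spec, a legacy A-to-C cable must use a 56k pull-up (advertising only default USB power, since the A-side charger may only supply 500 mA), but some shipped with 10k, which tells the connected device it may draw 3 A. A toy sketch of the check (function name mine):

```python
# Required pull-up for a legacy A-to-C cable per the Type-C spec:
# it must advertise "default USB power" only.
LEGACY_CABLE_RP_OHMS = 56_000

def is_compliant_legacy_cable(rp_ohms):
    """A legacy A-to-C cable may only use the 56k 'default power' pull-up."""
    return rp_ohms == LEGACY_CABLE_RP_OHMS

print(is_compliant_legacy_cable(56_000))  # True  - compliant
print(is_compliant_legacy_cable(10_000))  # False - advertises 3 A the
                                          # upstream charger may not supply
```

And, as the parent says, a cable with the wrong value still "seems to work" - the device charges, just while requesting more current than the source can safely provide.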

Just about every video game console made between the late 1980s and mid-2000s had an official video-out cable with one or more resistors [1]. It's not an unheard-of thing.

[1]: http://members.optusnet.com.au/eviltim/gamescart/gamescart.h...

It's cost cutting to make cheaper cables and ports.

Or intentional trivial incompatibility to sell _more expensive_ cables and ports :)

That's a scary possibility. Flood the market with barely-functional garbage in the low range so that consumers can't figure out which cheap ones are actually usable and just pay more to avoid the headache.

It's not dissimilar to the box-store HDMI/audio/etc. cable strategy: cheap, absolute-garbage cables at a mid-range price, or upper-mid-range cables at grossly over-inflated prices. Consumers are tricked because of course the $200 HDMI cable looks noticeably better than the $20 one. In reality, the retail prices of the cables they're comparing should be more like $2 and $20, and the difference between that $20 cable and one that would actually cost $200 is only noticeable with high-end test equipment (or by 'audiophiles', who have the super-power of being able to see/hear a difference in cables as long as they know the price).

There needs to be a way to decide which device is the master, like the prompt on Android devices asking if you wish to power or be powered.

This. I’m always completely bemused when I first connect a USB-C charger to a USB-C battery pack before plugging into the wall - the LED on the charger lights up like the battery is powering the charger.

There’s also tons of times where the device you want to power from the battery starts charging the battery instead. With both ends of the connector being identical I’m not sure there even is a _good_ way to fix this that ordinary consumers will understand, the better solution might have been to still keep different cable ends. I hope this mess eventually gets better.

What's more annoying is when you connect a wacom tablet to your laptop and your laptop says it's charging. Only when your laptop is dead do you realize it was low battery. I had this happen during a demo and it was terrible.

Wonder if you could do it on the cable with little embedded LED lights. One end turns red for "charging", the other turns blue for "supplying power". It would add to the BOM cost of the cable for sure, but it would be not only cool-looking but also rather useful.

oh yeah, because my bedroom doesn't have enough glowing devices when i'm trying to go to sleep. let's add LEDs to every cable too!

Hah. Good point. Yes, too many devices want to put big giant blue LED's on their product.... even ones that are specifically targeted for the bedroom and specifically designed for sleeping...

Barring the comments about adding more sources of light in a dark room (yep, I agree with this), I still appreciate the MagSafe adapter for my MBP lighting a tiny LED to say charging/full. The new USB-C MBPs we have at work don't have any indication whatsoever that the adapter is working unless the machine is booted and logged in (to pre-empt comments about the chime, what if it's muted, as is often true of workplace laptops?). MagSafe tells me no matter what state the computer is in.

The LEDs don't have to be bright, just dim little indicators that are only really noticeable if you're looking for them. Then you get an instant idea of what the devices are doing.

The simplest solutions are directional cables (remember directional HDMI cables?) or plug-in order (supplier first, receiver last).

Plug in order is a terrible idea. With potentially every restart/power cycle, the cable would have to be unplugged and re-plugged into the receiver.

Not an EE, possibly a dumb question, but... do USB-C devices know that a cable has been plugged into them before the other end of the cable is plugged in to something else?

I recently got a new workstation at work that is a MacBook Pro that connects to 2 monitors via USB-C.

I've learned somewhat superstitiously that if I don't connect and disconnect the USB-C cables in the right order the laptop may stop responding and I have to do a hard reboot. I thought this solved the problem but even now it still occasionally happens.

Your question makes me wonder if something like this isn't the cause (Mac misreading USB-C signal).

Based on my experience with a pre-USB-C Macbook connected to multiple monitors, USB-C is not at all required to get them to do strange things, up to and including hard crashing, depending on the order and speed of connecting monitors and sleeping the system.

If all devices actually follow the standard, then yes, a device should be able to detect a cable was plugged in even if the other end is disconnected. The trouble is finding compliant hardware, I fully expect it will be at least 2 more years before we start seeing common hardware that follows the spec.

The plug can have (has?) resistors that let the socket "feel" it, even if the cable is not attached to anything.

Yes, the cables are active and can ID themselves when connected. Detection of another connected device (and orientation) is separate.

So yes, it is possible to detect which end gets connected first, but this absolutely shouldn't have any side-effects. It is unintuitive and fragile (if the 'master' device restarts, it will think the other end connected first, suddenly becoming 'slave').

Just imagine this: Device A <--> Device B

Device A receives the cable first, upon receiving the cable, Device A notices that the cable is not "hot" ("hot" in the sense that there is voltage/activity on the wire), so then Device A decides to "turn on" the cable, make it "hot".

Now Device B receives the cable, it notices that the cable is already "hot" and decides that it must be the second device.

I would imagine it could work that way.

I'm also pretty sure that the USB cables are "dumb" cables and not active cables like QSFP (those have chips at each end).

This is not how USB Type C works. The cables are active (chipped), and the end devices have sense resistors. Each device can detect the presence and type of the cable, as well as the presence and type of an end device and total cable orientation (swapped or not).

This is all done before power is applied, as USB Type C is very explicit about not going hot (apart from a weak 5V V_conn for powering the cable).

Also, QSFP is not a cable, but a pluggable spec. A QSFP module is active, but the fiber optic cable you connect to it is dumb. Link detection there works by sensing beam power.

That assumes a device can tell the difference between not hot and not there, which was the parent's question.

Yeah, I don't get how they went from a reversible USB plug (you can already find those in the A format, using a small lip with contact pads on both sides) to a 20(?)-wire monstrosity carrying anything from several amps of 20V electricity to Thunderbolt-wrapped PCI traffic.

Well, the problem is that you can't patent Ethernet anymore, so they have to complexify it. I'm being a bit snarky, but if you look at practically all the modern standards (HDMI, Thunderbolt, USB-C), they are basically 2-TX and 2-RX pairs like Ethernet.

The core problem with power is that you can't really pump laptop charge wattage down a USB cable without raising the voltage above 5V. Once you accept that, the engineering gets quite a bit more complicated and you have to negotiate so one side doesn't blow the other up.

It isn't extremely hard; lots of IC vendors make USB-PD chips that are fully compliant with USB-PD.

And, they're not expensive. EEs are just cheap mother fuckers, often, and they think they can do it "well enough" themselves and save a cent or two on the bill of materials.

This is not the fault of the connector.

It is the fault of the protocol.

USB is a clusterfuck.

More and more devices are implementing it. What's your source that it's on its way to the bottom?

Murphy's Law: If it can go wrong, it will go wrong.

Why does USB-C have such terrible failure modes? I'm sure that the people behind it are bright engineers, so why do I have a fear unlike anything I've ever felt when buying a USB-C cable or charger or device?

Is this just a "tragedy of the commons" where every manufacturer expects the others to follow the spec so that they can skimp, or is there some kind of fundamental flaw in the USB-C spec that is making it so seemingly dangerous and difficult to use correctly?

Prior to USB-PD, USB maxed out at 4.5W, and that was a later addition; originally it maxed out at 2.5W. The USB-PD spec allows for 100W power transfer. There are far more modes in USB-C/PD, and they're much higher energy. If all you ever need to deal with is 5V and 0.1-0.9A, then there's a much narrower range of failure states.

What voltage is the 100W coming through at? If it is low voltage, that is gonna be a really beefy cable... At 12V that is more than 8 amps. 5v is 20 amps. That is a lot of current to send over a typical USB cable...
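The arithmetic behind those numbers, for reference (I = P / V; helper name mine):

```python
def amps_for(watts, volts):
    """Current required to deliver a given power at a given bus voltage."""
    return watts / volts

# Current needed to carry 100 W at various bus voltages
for volts in (5, 12, 20):
    print(f"{volts:>2} V -> {amps_for(100, volts):.1f} A")
```

This is why 100W delivery runs at 20V: it keeps the current at 5A, which a (specially marked) cable can plausibly carry, instead of the 20A that 5V would require.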

Yes, the cable can get warm. That said, the PCIe spec now uses two tiny power wires to deliver north of 200W at 12V.

In order to reach 100W, you need to use a special cable that is capable of actively participating in the negotiation process in order to advertise its ability to handle 5A@20V.

Profile 5 is 20V@5A

Suffice to say, this makes me very hesitant to connect a USB-C phone (which I don't have, thankfully) to a laptop USB-C charger, because even though they /should/ negotiate the correct voltage, the idea of a bug in the implementation causing the charger to fry my phone, perhaps even setting it on fire in the process, isn't worth the risk.

As others note, the poor implementations wind up causing us to use only the adapters shipped with the device in the first place, which is actually a worse situation than with standard USB - right now, I know that I can plug a device into any USB charger and get 5 volts. Amperage may vary considerably, but unless the adapter is a real lemon (and yes, I know those exist), it's not going to kill my phone or burn my house down if I don't get the planetary alignment of adapter, cable and device right.

They are getting rid of explicit profiles btw...

The limit for a normal cable is 3A, the limit for a specialized charging cable is 5A.

It's almost as if hard USB A and USB B ends were there for a reason. I've heard good things about the Cypress EZ-PD[0] for handling power delivery issues (you'll need it in addition to the USB core).

[0] http://www.cypress.com/products/usb-type-c-and-power-deliver...

This seems to be a problem with the spec, then. It's about the customer. Among the customers are:

- the consumer, who buys the product
- the product maker, who buys from the manufacturer
- the manufacturer

If a manufacturer wants to implement only a subset of the full spec, they should be allowed to do that. This has additional benefits with regard to minimizing the resources used.

Granted, it's a cool spec with great potential, and I enjoy seeing it in action with regard to SBCs. But the manufacturer is also a customer. If it doesn't work for them, then there probably needs to be some changes.

What people usually refer to as USB-C is actually three independent specs.

First, there is the C plug and cable, including provisions for converting between C and the various A and B sizes. This is the spec that introduces the various resistors that signal whether the cable is a converter or (a very big "or" that has created many problems) whether it can handle various wattages at 5V.

Then there is the power delivery spec, which on paper can be used with any USB plug (yes, even your old A and B formats), and allows current to go in either direction at up to 20V.

And thirdly, there is a continuation of the 3.0 data spec, 3.1, which includes a provision for using various wires in the C cable in an alternate mode. This mode allows anything from digital video to PCI bus traffic to travel over the same cable, if both ends support the protocol traveling over the alternate-mode wires. Outside of alternate mode, the 3.1 data spec can also be used with 3.0 A and B ports.

So even if your device has a C port, it may not be able to handle more than 5V at 0.5A and USB 1.0 data speeds...

It depends on what you consider normal, I suppose. It's the 'devil you know'... or maybe 'glass houses'. HTML, JS, CSS, etc. are all so buggy that we have not managed to create a reference implementation for any of them (AFAIK). Imagine 100 browsers implementing the HTML/JS/CSS spec. Yeah..

Because USB-C has a million features that require complex logical controls and hardware.

While this is more of an interesting fact than anything: The Nintendo Switch website [1] does not advertise that the Switch has a USB-C port. It has HDMI and USB 2.0 on the dock... but no USB-C anywhere.

[1] https://www.nintendo.com/switch/features/tech-specs/

Depends where you look. [1] lists:

> USB terminal: USB Type-C terminal. Used for charging or for connecting to the Nintendo Switch dock.

As does the Japanese site[2]:

> USB Type-C™端子

[1] https://www.nintendo.co.uk/Nintendo-Switch/Specifications/Sp...

[2] https://www.nintendo.co.jp/hardware/switch/specs/

Then why did they use the port? I'm guessing it's the obvious answer of "it was cheaper".

As much as it pains me to say it, a "pay us to use the port shape" group like HDMI that will threaten litigation unless you pay them to use the port would probably have prevented this kind of thing from being as widespread with USB-C as it is. While just about everyone doesn't want that to be the case (myself included), I don't see any other way of aligning incentives to make it harder to use the port/spec incorrectly than it is to make it correctly.

More than cheap, it was convenient: you solve the whole power, video output, and external devices problem with a single chip on each side. And it also looks nice compared to any usual docking port.

It solved enough design issues for them to let go their old habit of having proprietary connectors for charging. Which made them a lot of money.

With a broken implementation, their connector is effectively proprietary.

For what it's worth, I think the linked article is misleading as to how incompatible the device is. I charge my Switch regularly at work with a Lenovo 45W USB-C power adapter (model: adlx45uccu2a); it seems to perhaps take longer to charge than with the Nintendo one, but it has worked reliably. Unfortunately, I have tried several 3rd-party USB-C HDMI adapters with no success, though there are several options for HDMI+power on Amazon that reportedly work - as I understand it, this is simply because the Switch does not enable video output unless it is also on AC power, so you need a combo adapter.

I've also charged my Switch with a random USB-C cable that someone let me borrow. It's a bummer that it doesn't handle HDMI properly or whatever else people might want from it, but being able to borrow a commonly-used cable and charge my Nintendo device (or any other dedicated gaming system)? I don't know if that's ever been done before... I certainly haven't owned one. And it's a nice step forward.

I've had good luck with this one: http://a.co/86ahf5e though I'm concerned about the recent reports of bricking devices using 3rd-party adapters...

I've charged mine a few times with a USB A to USB C cable off an iPhone charger. It takes many hours and the device was in sleep mode at the time.

I dunno, I tried a couple of my home USB C chargers and neither of them would work with the Switch.

Best of both worlds for them then.

or not pay to use the port shape but pay to certify the port, which is most often the case in the compatibility industry and adds a useful service...

They should stop using HDMI and switch to DisplayPort.

Fun fact: the Switch does output a flavor of DisplayPort to the dock, which converts it to HDMI using a Megachips STDP2550 MyDP-HDMI converter for output to TVs[0]

[0] https://www.tweaktown.com/news/56650/switch-dock-uses-mobili...

Hm, so the Switch itself does output proper DisplayPort (as MyDP)?

It's for use with TVs which all have HDMI.

You can always get a converter for that.

Or .. they could do what they currently do and ship HDMI for use with what most people have.

(Why do you care about DisplayPort? Can't you get a converter in the other direction?

No, you can't. You can convert DP to HDMI, but not the other way around. DisplayPort uses a packet-based protocol for its data (so it's almost network-like), while HDMI just uses a raw digital signal. So it makes sense to always prefer DP.

Make a simple device popular and cheaply available ($1 - $3) that can plug into a port and pound the device through all the modes, blowing it up in spectacular fashion if it's not compliant. Make them ubiquitous enough that mischievous sorts are prone to quickly destroy any device put on the market with flawed ports.

Note: There is probably an ethical issue with this, but it doesn't jump out at me.

I love the idea of USB-C - Being able to have one cable which can do everything is great.

But I've become very leery about actually trusting it in practice - There are so many examples of bad cables, or devices which don't quite follow the specification, causing things to break badly.

If you end up having to follow a defacto "Only use 1st party tools" rule for safety reasons, I'd almost rather manufacturers went back to proprietary connectors. Those aren't inter-operable, but at least they don't pretend to be and risk me breaking everything.

It's insane to me that you can have two USB-C cables that look exactly the same externally yet provide different (and incompatible!) connections. You can have USB 2.0, USB 3.0, USB 3.1, Power-only, Display-only, Thunderbolt-3-only "USB-C" cables that all look identical outside yet won't work for your use case. A prime example to me was that the USB-C cable supplied with the MacBook Pro cannot be used to connect the LG displays sold by Apple, which also use USB-C, and the user gets absolutely zero explanation as to why it doesn't work, no error, no warning - just a blank screen. That's nuts.

> You can have a USB 2.0, USB 3.0, USB 3.1, Power-only, Display-only, Thunderbolt-3-only "usb-c" cables that all look identical outside yet won't work for your use case.

Half of those are not spec-compliant. There are only 4 different legitimate types of cable: 3.0 non-power/thunderbolt, 3.1 non-power/thunderbolt, power-capable non-thunderbolt, and thunderbolt. That's 3 more types than there should be, but let's not make the problem worse than it is.

Wait what.

That is insane. Who thought this would be ok?

Is there any cables that are fullspec?

Yes: Thunderbolt. I only buy those because the $10 extra I spend on a cable now will never matter if I have to deal with this nonsense even once.

Only 0.5 m TB3 cables are full spec. Longer ones apparently only support USB 2.0

Do you have a source for that? I have a 1 m TB3 cable that appears to do USB 3.1 just fine, and Newegg sells a 2 m one that claims the same. [0]

[0] https://www.newegg.com/Product/Product.aspx?Item=9SIAERN68H8...

https://appleinsider.com/articles/17/08/15/psa-thunderbolt-3... is the first result I found. I could be out of date, but I can't find a cable that's 2 m long, 40 Gbps, and supports 87 W+ and USB 3.

I thought 0.5M TB3 was for 40G and any longer (up to 2M) is only 20G...

I got downvoted, but I was right:

> At launch, there'll be one passive Thunderbolt 3 cable that supports Thunderbolt, USB 3.1, and DisplayPort 1.2, but with a max bandwidth of only 20Gbps.


> Who thought this would be ok?

Whoever thought that using the same cable for Thunderbolt was OK. Probably Apple or Intel.

Thunderbolt is not even part of the base spec; it just piggybacks on the generic alternate mode that can just as well carry DisplayPort video (or anything, really).

Most of it is not even with the cables, but the chips and software at either end.

"only" 4???!

Are you sure 2.0 C cables are invalid? Can you back that up?

My favorite is I have an HP Envy USB-C monitor to use with my Macbook, and using the HP factory supplied cable, every time I power it on I get a warning that I should use an "official HP USB-C cable". It's literally the one straight from the box!

I think this error is caused by a mismatch between the monitor's capacity to deliver power and your MacBook's desire for it. The HP Envy 27 is rated to deliver 65W, but MBPs want 85W. The MBP will negotiate down to 65W (although the battery may discharge under heavy usage).

I could be wrong though. I've got a similar setup and I've only seen the error once or twice.
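The negotiation described above can be sketched in a few lines; this is an illustration of the general idea (sink asks, source grants what it can), not the actual USB-PD message exchange:

```python
def negotiate_power(source_max_w, sink_desired_w):
    """Sketch of wattage negotiation: the sink requests what it wants,
    but settles for whatever the source can actually deliver."""
    granted = min(source_max_w, sink_desired_w)
    # If the grant falls short, the device may discharge under heavy load.
    battery_may_discharge = granted < sink_desired_w
    return granted, battery_may_discharge

# A 65 W monitor feeding a MacBook that wants 85 W:
print(negotiate_power(65, 85))  # (65, True)
```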

My macbook won't wake up my usb-c monitors consistently. I have tried 8 cables at this point. I have to unplug the cable and plug it back in.

It also refuses to mount my phone over straight USB-C, and the known workaround is to go USB-C -> A -> C.

At this point in my relationship with Apple, I can only assume this is deliberate.

So _that's_ why my monitor doesn't wake up.

I got the new XPS 13; it only has Thunderbolt/USB-C ports, and it's pretty much 50/50 whether my external monitor will wake up.

I figured it was the cable and was about to buy another one but you saved me the trouble.

I also got a hootoo adapter that's supposed to support power delivery but that hasn't worked either :/

This isn't just a problem with USB-C. There are also tons of video adapters on the market that say they support 4k, but really don't. We have grips of 2560x1440 monitors at work, and they had tons of issues like this where the monitors may or may not wake up, may handle the native res for a while, then power down or switch back to 1080p. The only consistent fix for them has been to buy one particular brand of adapter that we found works and making sure it's active. So I guess what I'm saying is, display tech has already been having issues for a long time with crappy components, and USB-C is just adding one problem into the mix now.

What they meant is that you should use an official HP Envy laptop :)

Regular USB had all the same problems. I can't charge my JBL speaker with my iPad charger. And I can't charge much of anything with my older Toshiba laptop USB ports.

Why anyone thought it would be different this time around, I don't know.

Different classes of failure though. Older USB ports would just fail to work (in the majority of cases), whereas with USB-C there is the risk of damaging the device.

That is a different kettle of fish.

First of all, Apple came up with a slightly different set of resistors on the data pins (meaning that the data lines will read slightly different voltages) to signal to their devices that they could go above 1A draw while charging.

The official charging spec says to put a resistor between the data pins (inside the charger to support detachable USB cables), or simply to short them with a blob of solder.

Your JBL speakers are likely reacting as if it was plugged into a normal USB port.

Also keep in mind that the charging spec came about because China and the EU wanted to deal with the piles of incompatible chargers that were going into landfills.

Thus older USB ports max out at 5V 0.5A, as that was the spec back when USB 1.0 was launched. Also why some external HDDs come with a Y cable that draws power from two USB ports to get the motor spinning.
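The detection schemes described above can be sketched roughly as follows. The BC1.2 shorted-pins check is per the official charging spec; the Apple voltage pairs are commonly reported values rather than anything officially documented, so treat the exact numbers as illustrative:

```python
def classify_charger(d_plus_v, d_minus_v, pins_shorted):
    """Rough sketch of downstream charger detection. The Apple voltage
    signatures here are commonly reported, not official -- illustrative only."""
    if pins_shorted:
        # BC1.2 dedicated charging port: D+ shorted to D-
        return "BC1.2 dedicated charger"
    apple_levels = {
        (2.0, 2.0): "Apple 1 A",
        (2.0, 2.7): "Apple 2.1 A",
        (2.7, 2.7): "Apple 2.4 A",
    }
    def near(a, b):  # allow some analog tolerance
        return abs(a - b) < 0.25
    for (dp, dm), label in apple_levels.items():
        if near(d_plus_v, dp) and near(d_minus_v, dm):
            return label
    # Nothing recognized: behave as if on a plain USB port
    return "standard USB port (draw at most 500 mA)"

print(classify_charger(2.0, 2.7, False))  # Apple 2.1 A
```

This is why the JBL speaker above sulks on an iPad charger: it doesn't recognize Apple's signature, so it falls back to plain-USB current limits.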

What you're describing is simply that those ports were not designed to deliver a lot of amps, because USB devices at that time did not need it. I don't think it's the same as not being fully spec compliant at the protocol level. I've never had two USB 2.0 devices refuse to work together.

Can the title be changed to note that this is from 2017? It's new to me, so I assumed it was recent (but it was also posted on Google+....).

Some consoles are getting bricked since the latest Switch update because of this problem, that's why it gained visibility.

More info on that here:


Both the update and third party accessories seem implicated to a degree.

It's absolutely insane that Nintendo still has no way to backup saves in the event of console failure. This compounds the already ridiculous situation where you risk bricking the console when you plug it in to a charger.

However, because it was USB-C compliant (followed the darn spec) and robustly engineered, it will work with the Switch even though it came out nearly two years before the Switch was released. (Hooray!) Innergie had the foresight to add 15v as an "optional and extra" voltage level and now it reaps the rewards. (It also has $1mil in connected device insurance, so I can recommend it.)

I think maybe I just found a new preferred provider of cables and chargers.

Unrelated, but I really wish G+ posts were a bit wider. They look great on mobile but reading a "card" on a laptop screen is insane.

Reader Mode seems to work fine to reformat the page

I miss their older layout.

In case others are looking for something similar, I've had great success using Anker's 20100mAh USB-C portable battery to charge my Switch on the go. It's the only USB-C battery I've tried that actually works with the Nintendo Switch.

Link: https://www.amazon.com/Anker-PowerCore-Ultra-High-Capacity-P...

It also gets me a full recharge of my 15" USB-C MBP (though you can't use the MBP and charge at the same time, it takes a while).

I've got the same battery, bought ~2 years ago. Seems to work fine with my Switch and an Anker-brand USB A-to-C cable.

My 40W Anker PowerPort wall-charger also works.

Now I'm a bit worried about trying anything else, though. o_O

I have a different RavPower battery pack (also about 20Ah), and my 2017 15" MBP makes it REALLY FRIGGIN' HOT. OTOH it would be fine to charge it off of a USB A -> C charge cable; it would take ages, but I would feel better about it not cooking. Oh well.

I used a Xiaomi 20,000mAh external pack that has worked really well with the Switch. I also charged it perfectly from both LG and HTC phone chargers. On the other hand, the Switch charger refused to charge my phones.

Well, this is exactly what I was wondering last week after noticing the USB-C port on the Switch dock is labelled "Power Adapter". I thought it was a little off that there is no mention of USB-C, either so as not to confuse users or perhaps because they didn't have much confidence in their hardware compliance.

In my humble opinion, the greatest sin of the USB-C standard is the lack of introspection. Given the huge number of possibilities, it should be possible to query a USB-C socket with a cheap device to learn what Alt Modes it is capable of, what device roles it is capable of, what USB PD it supports, etc.
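For the PD part, at least, the information is already on the wire: a source advertises its capabilities as a list of 32-bit PDOs, and a sniffer can decode them. A rough sketch of decoding a fixed-supply PDO (bit layout per the USB PD spec: bits 31:30 select the PDO type, bits 19:10 carry voltage in 50 mV units, bits 9:0 carry max current in 10 mA units):

```python
def decode_fixed_pdo(pdo: int):
    """Decode a 32-bit Fixed Supply PDO from a Source_Capabilities message.
    Returns (voltage_mV, max_current_mA), or None for non-fixed PDO types."""
    if (pdo >> 30) & 0b11 != 0b00:
        return None  # battery, variable, or augmented PDO: not handled here
    voltage_mv = ((pdo >> 10) & 0x3FF) * 50  # bits 19:10, 50 mV units
    current_ma = (pdo & 0x3FF) * 10          # bits 9:0, 10 mA units
    return voltage_mv, current_ma

# Example: a 5 V / 3 A fixed PDO (voltage field 100 = 5000/50, current field 300)
pdo = (100 << 10) | 300
print(decode_fixed_pdo(pdo))  # (5000, 3000)
```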

One out of three: this device will show you what power delivery modes are available, request the most it can get, and make that available on terminals.


(Also, if you didn't know about tindie.com, and you are a certain sort of person, then I apologize for consuming the next couple hours of your day.)

That's 79 dollars. I said cheap :) 20 bucks is realistic, even if it's just the hardware and needs a phone or even a laptop to be plugged into. Even a laptop is reasonable.

I've been looking for soil moisture sensors for months and had never come across that site; just on the first page of IoT stuff there are 5 good options on Tindie.

Amazing, thanks.

I can't make heads or tails of the technical details of this, but I think it's fair to start with the prior that Nintendo doesn't care too much about non-proprietary formats/interoperability/etc. I developed this impression when the Gamecube came out -- weird minidisks when Sony had already switched over to CDs and then DVDs, no ability to play external media, an online community that seemed pretty half-baked.

I haven't played videogames regularly for a decade though so perhaps things have changed. I now use a PS4 controller to play some games on my computer, and it's really nice, integrates naturally.

I'm actually pretty surprised that the comments here are discussing the complexity of the USB C spec. That may well be true, but somehow I think Nintendo would have screwed it up no matter how simple you try to make it...

The Wii and Wii U and DS have dozens of these stupid "quirks" which are the result of Nintendo just not knowing how to do things correctly. The reason I didn't buy a switch was because I was so infuriated by the Wii U's incompetence that I was certain the switch would bring me nothing but misery. I'm feeling pretty smug and validated right about now. This isn't the first of the switch's failures, either.

Nintendo are just miserably incompetent at firmware and software, I really wish they would just back some other hardware company to make the console for them, and Nintendo themselves should just focus on making games.

Better yet, Nintendo should just make their "fun" peripherals for PC, mobile, and existing consoles, and release their titles cross platform. Yes I'm bitter.

I hadn't played video games for close to a decade before I kind of randomly bought a switch, and it's actually got me back into gaming. Nintendo has kind of always done their own thing though, sometimes for better, sometimes for worse.

I have a feeling I'll be buying one of those fancy USB-C amperage/voltage meters. I hit issues with the charging spec on a GPD Pocket (tiny laptop). Its maximum supported charging voltage is 12V, which is now optional in the USB-PD 2.0 standard. So most chargers I see now are 5V/9V/15V

The MacBook charger, for instance, is missing the 12V, so the Pocket charges slowly off that charger since it ignores the 15V rail. It has to fall back to 5V or 9V in theory...
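The fallback behaviour described here is just "pick the highest voltage both sides share"; a quick sketch (the charger voltage list below mirrors the MacBook-style 5/9/15/20 V example above, and the 12 V-capped device stands in for the GPD Pocket):

```python
def pick_charge_voltage(offered_mv, supported_mv):
    """Return the highest voltage (in mV) that both the charger offers
    and the device supports, or None if they share nothing."""
    common = set(offered_mv) & set(supported_mv)
    return max(common) if common else None

# Charger offers 5/9/15/20 V; device tops out at 12 V -> falls back to 9 V.
print(pick_charge_voltage([5000, 9000, 15000, 20000], [5000, 9000, 12000]))  # 9000
```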

My Apple USB-C charger and cable (which came with my 2017 MacBook Pro) work fine to charge my Switch. No crashes here.

The distinction here is powering the Switch in portable mode vs. powering it via the Dock. Per discussions on Reddit (https://www.reddit.com/r/NintendoSwitch/comments/87vmud/the_...), portable mode draws the expected 5V so it should work with USB-C compliant chargers, but the dock draws more. (although the article says the 87W charger crashed the Switch, I haven't seen that either)

There's also discussion about the new 5.0 update being buggy and compounding the issue.

Unfortunately, it cannot power the dock. That was a bit of a letdown for me

My thinkpad charger and the charger of my Nexus 5X also work without trouble.

Apple's USB-C chargers skirt around some standards, as well.

Out of curiosity which standards? The only one I can think of is providing a slightly higher voltage (but still within USB-C spec) to account for resistive losses in the cable. That's why it says 61 watts instead of 60.

Can you say more? Thanks.

Apple uses a non-standard 14.5V @ 2A PDO unit in the smaller MacBook charger, they also have some protocol-level 'quirks'.


That's quite interesting, as in the last 20 years that I've been dealing with Apple hardware they have typically been excessively strict with hardware spec conformance (not especially "liberal in what they will accept"). They of course had plenty of in-house stuff (adb, AUI etc) back in the pre-jobs-departure and pre-jobs-return eras, but that was often because no standard existed at all.

I wonder if something has changed or if this is an outlier, or perhaps it's just my sampling (which of course is hardly enormous) has simply entirely fallen in a pool of strictly conforming equipment. I can certainly believe the last possibility.

Presumably this investigation was prompted by the Nyko dock catastrophe.

I'm a software guy; what do these revelations imply about which other accessories may or may not be unsafe to use, whether they work or not? In particular, third-party charging cables. My girlfriend uses her MacBook charger to charge her Switch sometimes, and now I'm worried it could brick it one day.

I find it absolutely mind-boggling that Switches bricking when 3rd-party accessories are plugged in isn't being treated as a bigger issue. Everything I've heard is that the Nyko dock is at fault, but why the hell aren't people mad at Nintendo? Then again, I haven't followed it very closely; maybe people are? All I've read is how the dock is bad, and nothing about the Switch's apparent fragility.

People are somewhat mad, but Nintendo's only recently bothered adopting anything akin to industry standards around power adaptors. Even the 3DS is proprietary, and the Switch dock itself is some voodoo magic with a chip inside.

USB C itself has been sort of a minefield of adaptors getting the spec wrong, so I doubt anyone is surprised that docks are buggy and even capable of damaging parts.

As long as she doesn't use the charger to power the dock (so only directly to the device itself) then she's probably okay.

I just wish the included adapter had a longer cable. The 6-ish feet is just long enough to get you off of a wall, but not quite long enough to move away comfortably. I assume that it's due to the spec though.

It's from May 29, 2017.

The Switch is advertised to only work with official Nintendo cables and accessories. Anything else and you're just risking damage to your console/tablet/thingy.

I don't think they have ever claimed that their USB-C connectors were compliant with the spec.

What they claimed is almost irrelevant - the device bears the USB-C logo and has a USB-C port - so plugging a USB-C device into that port should in the worst possible case result in a system message "sorry, this device is not supported", not brick the device! There's a reason why you can't use a USB-A cable in your device to carry 220V - because someone will plug it into a normal USB-A socket and burn their computer down. If both sides of the connection have the official USB logo, then connecting them together should never result in either device becoming damaged.

The Switch has many logos on it, but a USB-C logo is not one of them. Nintendo does not use the term "USB-C" in its literature for the Switch.

Nintendo of America may not, but Nintendo UK and Japan both refer to it as a "USB Type-C terminal"

UK: https://www.nintendo.co.uk/Nintendo-Switch/Specifications/Sp...

JP: https://www.nintendo.co.jp/hardware/switch/specs/

I stand corrected. Good catch.

Are you sure it has a USB-C logo?

I wonder why they would even use the same shape then...

Nintendo (and up until recently, the videogame hardware industry at large) has a long history of making their own connectors...

My guess is that if they make their own shape, then third party accessories with that shape are guaranteed to work. Since they use USB-C, then there is fear, uncertainty and doubt by the consumer about if an accessory will work or not.

That's overcomplicated 4D chess. They just wanted to use standard tools and components.

True enough, but then deciding to use a usb-c port on the console is asking for trouble.

The dock is cool but I wish they'd release like a "travel" version which could be just like a cable.

When I travel and stay away from home with my Switch, sometimes I really yearn for the dock-to-TV functionality so I can play it with family/friends.

The dock is a pain to carry around

I bought a simple USB Type-C to HDMI adapter and used it successfully a few times. Of course, now that people's Switches are getting bricked, I'm scared to keep using it. :(

I'm not sure I buy the vendor lock-in thing; if that was the core motivation, why not at least bother to use a custom connector? This way you wouldn't have to deal with the reports of users plugging in other USB-C peripherals and getting crashes.

Seems more likely that whoever implemented that part of the Switch did a poor job. Was it even done in-house, or did they contract a 3rd party to hack it?

My son just bought himself a Switch, and TBH this isn't surprising at all. What is surprising is that Nintendo didn't modify the port so that a USB-C cable wouldn't fit. I don't think it's advertised as being a USB-C port, though.

The specs page[0] says there are three USB 2.0 ports and a power port (which, by looking at it, uses the same connector as USB-C).


Why not require some kind of USB-C compatibility details on all USB-C related products? So you know if you use the device/cable what will actually work/not work?

I thought Type-C was just the shape, with USB 3.1 Gen 1, USB 3.1 Gen 2, or Thunderbolt 3 being the common specs that usually carry this shape.

Or is it more complicated than this?

Type-C can carry things that are not USB, that's the problem. USB-PD can't be carried over non-C connectors.

USB-PD is still allowed on classic USB-A/B ports

On a related note, what are some reputable brands for USB-C adapters that actually do have proper implementation and quality assurance? Anker? UPTab? Belkin?

I want to say Anker but even they made a mistake and had to recall faulty cables


Although at least they did the recall and presumably will be careful not to make such a costly mistake again.

I have successfully used this (battery and power adapter):


And this:


Both have been solid with the Nintendo Switch. I have tried the Apple charger and it did exactly what this article said: basically stopped the ability to charge the Switch until I did a hard reboot.

Cheers, presumably the AC wall plugs have international equivalents for the non-US market?

(i.e. their 'compliance' has been tested on 240V scenarios)

Actually, I don't think so. These have been tested by one Google engineer, and he lives in the US.

Anecdata: I've had zero problems with my Anker power hub.

USB-C peripheral makers desperately need to agree on a color/marking scheme for the devices and specifically the cables. Right now it's chaos. Red, blue, green, black, ABCD, 1234, square, circle, triangle, cross... Whatever. Just make it so users can tell what's going on.

I was under the impression that some of the issues with third-party adapters were caused by insufficient tolerances, causing damage to the type-C connector. In addition to the non-compliance.

Maybe it's USB-C because of future usage requirements, that they envisioned they might need USB-C to interface with attachments (VR headset? New GB/DS generation?).

I’m curious, how many of the issues enumerated could be fixed by a firmware upgrade? Personally, I wouldn’t be too surprised if the dock isn’t upgradeable at all.

huh, I thought Plus was dead

I was just amazed that a google plus post got so high on hn.

Perhaps if Nintendo had boarded the standardized-connectors train at the terminal, rather than just hopping on at the last station, they would have been more familiar by now with the need to read and follow all the rules of the standard.

How is this at all surprising? Controlling what third parties are allowed to make accessories and games has been part of Nintendo's business model since the 80s. They truly believe (and you can argue this if you want) that the Nintendo Seal of Quality is part of their success.

I believe you might be interpreting this as an attempt at DRM or otherwise limiting use of third party accessories.

All of this is just errors and flaws, rather than attempts to trip third parties. They're extremely easy to replicate if you want to make a Switch accessory, but it means that shit might happen if you try the Switch with a non-Switch accessory.

Considering Nintendo's past of custom connectors and priority for making children-friendly items, I do not think Nintendo would make a device with a standard plug where children plugging in a standard charger might blow things up.

Given the video game crash of the early 80s, it's entirely likely that the Seal (or at least the principles behind it) was part of Nintendo's earlier successes. Whether it's still relevant today is questionable, but I wouldn't write it off.

Also, every other console may not have an official "Seal", but they all have the same process. Nintendo created that idea when anyone could release games for consoles without the console manufacturer approving or even knowing about it. I can't just make an Xbox 1 game and release it legally either. Blaming this on the Seal is completely misunderstanding the context.

I remember reading a long article on how Nintendo brought back video games. According to the article, they deeply analyzed the early '80s crash to determine what went wrong, so they could avoid whatever caused that.

The article said that their conclusion was that the main factor was the proliferation of a large number of low-quality games from numerous third-party publishers. This was a time when most people's only source of game reviews was magazines or the personal experience of friends. So a lot of games were bought blind, based just on the claims on the box. After people got burned on a couple of low-quality games, which cost about as much in today's money as today's games, they often stopped or greatly reduced their game buying. Hence, Nintendo decided that they had to control who could publish for their console.

Speaking as someone who was programming games for the Mattel Intellivision at the time of the crash, and got laid off because of it, Nintendo's analysis seems quite plausible to me [1].

Consider Intellivision games. The history for the other consoles is similar, but I know Intellivision best so that is what I will use.

Once upon a time, the only people who could write Intellivision games were engineers from APh. APh was/is an engineering consulting firm in Pasadena, CA, which was founded by and largely staffed by former and current Caltech students. APh designed the bulk of the electronics for the Intellivision, and more importantly wrote the ROM and the early games.

Not too long after that, the set of people who could write Intellivision games expanded to include Mattel engineers.

Games written at APh or Mattel could take full advantage of the ROM, which provided a lot of tools to make writing games easier. If you did not use the ROM you had to write a lot more code, know a lot more about low level hardware details, and you might need to use a bigger cartridge which could raise your costs significantly. (For example, the ROM provided high level sprite [2] handling. You could just give it a list of images, and tell it to animate the sprite with them at a specified rate. You could tell it to give the sprite a velocity and it would deal with moving it. You could give it callback routines for various types of collisions, such as sprite hits sprite or sprite hits background, or sprite hits edge).
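The ROM facilities described above (animation lists, velocities, collision callbacks) amount to a small sprite engine. A hypothetical sketch of what such an API could have looked like; every name and the 160x96 bounds check here are invented for illustration, not taken from the actual Intellivision ROM:

```python
class Sprite:
    """Illustrative model of a ROM-managed "moving object"."""
    def __init__(self, x, y, frames, frame_rate):
        self.x, self.y = x, y
        self.frames = frames          # list of images the ROM cycles through
        self.frame_rate = frame_rate  # animation speed handled by the ROM
        self.vx = self.vy = 0
        self.on_hit_sprite = None     # collision callbacks the game registers
        self.on_hit_border = None

    def set_velocity(self, vx, vy):
        self.vx, self.vy = vx, vy

    def tick(self):
        """The ROM would move each sprite per frame and fire callbacks."""
        self.x += self.vx
        self.y += self.vy
        if self.on_hit_border and not (0 <= self.x < 160 and 0 <= self.y < 96):
            self.on_hit_border(self)

s = Sprite(10, 20, frames=["frame0", "frame1"], frame_rate=4)
s.set_velocity(2, 0)
s.tick()
print(s.x, s.y)  # 12 20
```

The point is that a game using the ROM only filled in data and callbacks, while a game bypassing it had to reimplement all of this per-frame bookkeeping itself.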

The set of Intellivision game writers further expanded when people who had been Intellivision developers at Mattel or APh started leaving to form their own companies. Several key APh Intellivision people formed Cheshire Engineering, which wrote Intellivision games for Activision. Imagic was formed by former Atari and Mattel people.

These companies had people who had used the Intellivision ROM. They couldn't take the documentation with them when they left, but they could direct a clean room reverse engineering effort to produce their own documentation sufficient to train new programmers in how to use it. So those companies could produce top quality Intellivision games, too.

But then companies started coming on the scene that did not have any ex-Mattel or ex-APh people, or anyone from Imagic or other places that had their own documentation on the ROM. These companies most likely started with the hardware documentation (both the CPU and the graphics system in the Intellivision were off-the-shelf General Instruments parts), and then reverse engineered just the boot code from the ROM to figure out what it expected to find in a cartridge. Then they handled everything themselves. This made it quite a bit harder to do a top-tier game.

There were also console makers who wrote games for competing consoles. Mattel did games for Atari and Coleco consoles. Coleco did games for Mattel and Atari consoles. I don't remember Atari doing games for other consoles, but Wikipedia claims they did Centipede and Defender for Intellivision and Colecovision, and Pac-Man for Intellivision and Galaxian for Colecovision. The console makers, unlike the small, poorly funded companies from the above paragraph, had the resources to properly reverse engineer the ROMs of other consoles and then use them in their games.

A very large fraction of the games from this last group of companies were pretty bad. They were often rushed, because these companies were often not well funded--they had to get their game done and out fast. That also meant they could not get review copies to magazines with enough lead time for the reviews to come out before or concurrent with the game's release. The market went downhill rapidly at that point.

Anyway, I find it quite plausible that if the industry had stayed with each console only having games written by its own associated developers, the competing console makers, and a couple big third parties (Imagic and Activision), the crash could have been avoided.

[1] Actually, at the time, I was programming for what would have been the next generation Intellivision, which at that point was just a big box containing the CPU and a prototype of the next generation Intellivision graphics chip implemented as several wire-wrapped circuit boards full of 74xx or equivalent discrete logic chips. We could leave work at night, get back in the morning, and find that the hardware people had pulled an all-nighter and changed the design of the graphics, massively breaking all our code...except for Hal Finney's code which always managed to just take a few tweaks to fix no matter how radical the hardware change.

[2] They were actually called "moving objects", not "sprites", in all the Intellivision documentation and code, but I'm going to use sprites because this is what nearly everyone else calls hardware moving objects.
