Thunderbolt (apple.com)
320 points by arnemart on Feb 24, 2011 | 197 comments



I am delighted that Thunderbolt isn't reusing the USB plug form-factor. Early rumors showed Light Peak plugs that were the standard rectangular USB shape with fiber optic channels blended in: http://www.macrumors.com/2011/02/19/apple-to-introduce-light...

The outer rectangular, doubly symmetrical shape of USB is a usability nightmare! You know what I'm talking about. Good riddance.


Too bad they didn't make the plug really usable: connect it any way you want to.

There are probably many ways to implement that. A USB concept for it: http://www.wired.com/gadgetlab/2011/01/double-usb-plug-conce...


One obvious way to implement that would be the MagSafe connector. Indeed, Apple has some patents relating to data-enabled MagSafe connectors.


Ever notice that an Ethernet jack is just the right height for a USB plug to go into it snugly? I got confused by that one day when I plugged something in without paying attention.


Yes... a 50% chance of getting it right every time, even though it feels like less than that! It doesn't help that some motherboards put the USB ports upside-down for some inexplicable reason, and when it's dark you can't see the little USB logo on the cable anyway.


It's actually more like a 33% chance. "Oh, wrong side." *flip over* "Wrong again? Oh, I had it right the first time!"


The bigger problem isn't so much that you've got a 50/50 chance just blindly plugging it in. With all considerations, 50% probability of connecting a device by just sticking your hand under your desk and jabbing randomly with a USB cord is actually fantastic compared to every other connection on a PC.

However, the problem is that the USB ports have no guide to lead the USB in so they stick on the edge and no matter how hard you push it just sits on the edge and makes you think you have it upside down. You would improve usability greatly if the ports just readily allowed the plug to slide in.

I've never had to make an effort to plug in a 3.5mm as the plug design complements the port design. You either miss or it's in.


Happens all the time.


Oh, it's not so bad once you take advertising sub-clause of the spec into account. That's the hidden clause that states the USB logo will be on the top of a properly-oriented plug unless you're Microsoft or Logitech, in which case your corporate name is allowed top-billing, so to speak.

Of course, if the slots are oriented vertically...


Yes, I have a couple on the side of the monitor I'm typing this on and they always seem wrong to me once I get something plugged in (after one or two failed attempts, of course).

Also doesn't help when the lighting's not so good and you have a USB logo embossed in black plastic on a black plug with the manufacturer's logo embossed in the same way on the other side. Trial-and-error probably isn't as bad as squinting at it trying to figure out which side is "up"...


Microsoft USB plugs all have a little nub on the "top" side, if memory serves.


Interesting. The only Microsoft USB plug I have these days is one of the wireless stubs, where the logo is the only clue. I have noticed that some Logitech products have a concave top, presumably for the same tactile feedback approach.

Of course, you can always just look at the end of the plug and orient it so the white plastic key is on the bottom, but where's the fun in that?


The mini USB end on my Droid cable has a bit of shiny foil on the side that goes up. I can plug it in while in the dark. Then, I usually have to turn the phone on to plug in the actual USB side.


Wait, USB ports have a "correct" orientation? I just assumed it was random...


lol... it gets more confusing if the USB ports are placed vertically


With vertical plugs I always just plug it in whichever way I'm holding the plug first, then flip it as necessary. It's not like there's any standardization between motherboard manufacturers on this, so why bother trying to remember which way it goes?


Although, I still regularly try to plug in my mini DisplayPort connector the wrong way--just like you describe with USB. Apple is using the miniDP connector for Lightpeak/Thunderbolt.



Pleasantly surprised that it's backwards-compatible with the mini-DisplayPort. Was expecting another round of buying dongles.


Since there's only one port, you'll most likely need a splitter to use both data and video simultaneously, unless you want your existing monitor to be at the end of the daisy chain. (DP was brought to market for terminal devices, not pass-through like FireWire; any Thunderbolt devices will almost certainly have two ports.)

I assume most monitors will transition from DP to a Thunderbolt pass-through quickly, but in a daisy-chain situation, the monitor would be disconnected the least, meaning it would need to either be the first in line, or split off, allowing the other peripherals to be removed without re-connecting the monitor.


DisplayPort has allowed daisy-chaining monitors since version 1.2, if not earlier.


Apple's displays don't have that second port, but you are correct about the daisy chaining, I see.


Sadly, Thunderbolt only supports 1.1a


This is a bit of a moot point, as far as Apple is concerned.

"[...] It is completely backward compatible with DisplayPort v1.1a and requires no new cables or other equipment [...]" - Bill Lempesis, VESA Executive Director.

A 1.2 device will work as a 1.1a device.

The features that you'll lose out on:

* Driving displays in excess of 2560 x 1600 x 30 bpp @ 60 Hz (the Apple LED Cinema Display tops out at 2560 x 1440 currently).

* Multiple-display daisy chaining (LED Cinema Displays only have a single Mini DisplayPort jack and as such cannot be daisy-chained).

* AUX channel data transport at 720 Mbps (beaten by the PCIe transport layer rates within Thunderbolt, which, when driving an LED Cinema Display at full resolution, leave about 2Gbps available for data).

* 3D stereoscopic display support (aww, no Avatar 3D).

* Additional audio format support (mostly related to Blu-ray, which Macs still don't do).

It's sad, yes - but there's not much incentive for Apple (not sure about Intel) to support 1.2 currently. And since Apple is going to have exclusive use of the Thunderbolt interface until it starts showing up on other PCs in 2012, Intel only really has to meet Apple's requirements while they finish up the rest of the spec (optical cabling, etc).

Maybe they'll find time to add in 1.2 support by then.
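The "about 2Gbps available for data" figure above can be sanity-checked with quick arithmetic. A rough sketch in Python; the 10 Gbps channel rate comes from Intel's materials, but the ~20% blanking/timing overhead is my assumption for illustration, not a number from the DisplayPort spec:

```python
# Rough bandwidth budget: Thunderbolt channel minus LED Cinema Display video.
# The 10 Gbps channel rate is from Intel's materials; the ~20% timing
# overhead (blanking intervals etc.) is an assumption for illustration.

CHANNEL_GBPS = 10.0

def video_gbps(width, height, bpp, hz, overhead=1.20):
    """Raw pixel data rate in Gbps, inflated by timing overhead."""
    return width * height * bpp * hz * overhead / 1e9

video = video_gbps(2560, 1440, 30, 60)
print(f"video ~ {video:.1f} Gbps, left for data ~ {CHANNEL_GBPS - video:.1f} Gbps")
# prints: video ~ 8.0 Gbps, left for data ~ 2.0 Gbps
```

With a less pessimistic overhead the leftover grows accordingly, so "about 2Gbps" is the conservative end of the estimate.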


Wasn't the original idea of Light Peak to be optical? Why didn't they deliver on that promise? I want my future back; it's frickin' 2011 now and we still don't use optical connectors to connect our devices :)

(I guess they'll have a hard time with the 'Thunderbolt' icon when they go optical; it makes even less sense for optical than for a low-voltage connector)


Because optical is expensive, less flexible, does not provide power for devices (think portable HDDs), and not many people use long cables.

Once the technology is widely used, the name doesn't matter anymore I guess.


Technically it seems nice, but what's up with reusing the "high voltage" symbol for this? And "thunderbolt" is a very tacky name...


IMO, nearly all of the icons on the side of the laptop don't make sense. The headphones are clearly recognizable, and the rest are just pretty shapes. I'd argue that the headphone jack is the only place you actually need the icon, since it shares the same shape as the goatse-plug next to it (whatever that's supposed to be).


The other mini-jack is an audio line in.


I was surprised about the symbol as well. It's just a matter of time until someone gets hurt while trying to connect their peripherals to a transformer.


Perhaps true, but how exactly would you mate the connector with a high voltage power source without a massive amount of kludgery?


If you've ever done field tech work, freelance or otherwise, you'd know that mere physical incompatibility isn't going to stop some people from plugging nearly any connector into nearly any port.


Sad, but true.


My sister recently plugged a USB cable into an Ethernet port. Don't assume that because something seems like a massive amount of kludgery to you, a user won't do it anyway.


I plugged a USB cable into an Ethernet port many times! Not intentionally, obviously, but I did it because the USB plug has just the right width to fit snugly in an Ethernet port. It’s easy to do when you are not looking. You should try it, it even feels sort of right in a very wrong way.

There is no kludgery involved when plugging USB cables into Ethernet ports. They just fit perfectly.


I once plugged an Ethernet cable into a USB port...I was a lot younger, and I took a file to it because I was absolutely convinced I'd been shipped a cable that was too big.


No, it will work exactly as intended. They'll experience a thunderbolt.


I don't think using a "warning! high voltage!" symbol for this will fly in Europe... (thankfully)


This was the first thing I thought. This is pure and simple bad UI design. UIs should move away from ambiguity, especially around concepts of safety. The high voltage symbol is a safety symbol.

Sure, this very second, there's going to be no confusion between the two symbols, but if this catches on and the symbol becomes ubiquitous, it will muddy the discrimination of the safety logo.

This is simply irresponsible behaviour from a tech giant who intends for this symbol to become ubiquitous. All in the name of "oo, it looks cool!". Bad Apple!


Of all logos for connectors that I have seen in a long while, this is the first where the logo actually evokes the imagery of the name/vice-versa.


$20 says it was Intel's idea.


Do you think it's more or less tacky than "Firewire"?


Always thought Firewire was a pretty good name.


So I guess the official branding of Lightpeak is Thunderbolt, even from Intel? And the standard connector (even outside of Apple products) is essentially Mini-Displayport?


I wonder if Apple will eventually replace the dock connector on the iPhone/iPad with Thunderbolt? That would be a compelling reason to upgrade: your music sync time would be freed of another bottleneck.

Of course, it kind of messes with the third-party accessory market, but I'm sure they'd love another reason to get people to buy new stuff.


Ever try to buy a car mount for an iPhone? :) Mine came with about 10 different plastic snap-on mounting brackets. Apple could give a crap about the third party accessory market.

The big issue with replacing any iOS connectors with desktop standard connectors is power. Thunderbolt specifies 10W of power; USB specifies 2.5W; the Camera Connection Kit only supplies 0.5W. I doubt that Apple is going to ship anything that provides much more power than that.


Thunderbolt may work through the 30-pin connector; it needs four lanes of copper. We should know Tuesday if they do it with the iPad 2.


I'm no hardware engineer, but I'd imagine a new iPad would integrate a Thunderbolt controller in a similar manner to how the older iPods integrated a FireWire controller.


Also remember that the dock connector is a revenue source. I'm having trouble finding a source right now, but if I'm remembering right, Apple sells the actual dock connectors to accessory makers.


Yep, Apple charges a royalty on the 30-pin connector, which varies in cost depending on what it's used for. You have to be enrolled in Apple's MFi program too. Apple makes money everywhere.


I heard Apple charges $4 per unit.


Pretty sure that Apple's dock connector is sold directly by JAE -- it's the 'DD1' listed here: http://www.jae-connector.com/en/general_en.cfm?series_code=D...


You can't just get them from JAE; you have to license the male plug through Apple, and it's a big process. And Apple lets no one but Apple use the female 30-pin plug.


License under what legal theory? It doesn't seem like a fairly ordinary plug would meet the standards for either patent or copyright.


I don't know, but there is a reason no other manufacturers use it. And it's a massive legal and due-diligence process to get into the Made for iPod program, which gives you all the specs and supplies the connectors, along with charging royalties.

http://developer.apple.com/programs/mfi


Sparkfun sells male and female connectors: http://www.sparkfun.com/categories/101


I doubt it, PCIe (apparently the main underlying protocol) isn't a common feature of ARM SoC systems.


Marvell's cores have one lane of PCI express. Check out the OpenRD: http://www.open-rd.org/


Even if it's not part of ARM's reference designs, could Apple integrate it into a future A-series chip?


This goes beyond my knowledge of hardware design, but I suspect there's no fundamental reason preventing it. The signal processing at those sort of clock rates may however consume more power than is practical - I don't know.


PCIe essentially trades greater hardware complexity for lower power requirements (and higher performance). Original PCI is similar in this regard (it also reduced the number of required external passive components on motherboards compared to ISA).

So if anything prevents usage of PCIe in embedded devices, it's the added complexity, not power requirements. By the way, the electrical interface of most modern true-color TFT panels is identical at the lowest layer to PCIe (and SATA and who knows what else).


I'm pretty sure the dock connector isn't going anywhere anytime soon, though I could see a Dock-to-Thunderbolt cable optionally replacing the Dock-to-USB cable.


The dock connector is actually wide enough to allow the device to dock! So don't expect it to go away while docking is a priority for Apple's design decisions.


Two questions:

Is it royalty-free?

If the answer is negative, what are the licensing terms?


Thunderbolt technology leverages the native PCI Express and DisplayPort device drivers available in many operating systems today. This native software support means no extra software development is required to use a Thunderbolt technology enabled product.

http://www.intel.com/technology/io/thunderbolt/325136-001US_...


Not enough. That's talking about support from developers, not whether hardware vendors need to pay for licensing to include this tech in their hardware.

That said, Mini DisplayPort is free of hardware licensing fees... so there's a good chance.


The poster of the question is a developer. I answered to help get the information he asked for, not have my answer graded on completeness or lack thereof with a curt "Not enough".


I think the "Not enough" was aimed at Intel, not you. But you seem to have missed the point of the question.

No hardware developer in the consumer space charges developers to access their hardware. Spec owners frequently do charge other companies to implement their specs, though.

Intel has a mixed record on this: some specs, like USB, they have led but have been implementable license-free.

Other things they consider their proprietary interfaces and have sued over; see the recent lawsuits between Intel and Nvidia over their memory bus.

So it is very reasonable to wonder if Nvidia and AMD are going to be able to implement this freely, or are they going to have to pay a tax to Intel. As far as I can see your answer doesn't address that question at all.


Yeah, sorry for the confusion. As nl says in the sister post, that was meant as: Intel has released "not enough" info for us to know the full scope of the licensing terms.


Has anybody read anything explaining how giving external devices DMA access is or isn't a problem? It seems one could make a Thunderbolt device that silently injects itself into the running system, writes itself to the hard drive, etc. without even touching the CPU.


Intel's page sheds some more light on it, too: http://www.intel.com/technology/io/thunderbolt/index.htm


The problem with USB 3.0 is that it still isn't supported by Intel and probably never will be, since they developed Thunderbolt. If you want USB 3.0 today, you need an extra chip on your board because it's not integrated into any chipset. When Intel integrates Thunderbolt natively, the game is over for USB 3.0.


“Intel fully supports USB 3 and plans to integrate it in the future.” — http://www.engadget.com/2011/02/24/intel-promises-native-usb...


Got me there. Seems like they see Thunderbolt not as a competing technology but more as a successor to FireWire. Fine by me then, but they haven't integrated USB 3.0 into their chipsets yet!


So the whole USB2.0/Firewire is going to repeat? Sigh.


Thunderbolt aims to replace nearly every kind of single-use connector (HDMI, DisplayPort, eSATA, USB, Ethernet). Unifying the connector for displays, peripherals, network and power is a great idea, so I can't complain if they're going up against USB 3.0.


Thunderbolt is a high end connector so it isn't going to displace USB in low end devices.


Ha. Good luck to them. Thunderbolt will be lucky to replace firewire. There's no way in the world it can replace USB, Ethernet or HDMI. Just think of the number of devices out there with these ports.

It might replace SATA eventually, but it will have to fight against USB.


From the Apple page on Thunderbolt: "you can use existing USB and FireWire peripherals — even connect to Gigabit Ethernet and Fibre Channel networks — using simple adapters."

So, twice as fast as USB 3.0 and compatible with existing devices. Doesn't seem like much of a fight. USB won't disappear overnight, but it's already obsolete.


Thunderbolt may be technically superior in every single way, but unless other manufacturers use it for their laptops/motherboards, Apple's 10ish% market share is going to ensure that USB 3.0 "wins" ultimately. This will wind up being just like firewire where aside from Apple's stuff and a select few "Apple" manufacturers, nobody uses it and consumers either buy the more expensive Apple stuff, or use a dongle and see 0 benefit from the superior tech.


Apple also pioneered USB with the iMac. Because USB was the only way to connect anything to the iMac, it served as a catalyst for device makers to come out with USB devices, since they knew they had a captive market.

Once the number of USB peripherals reached critical mass, the general PC market followed suit.

So, will Thunderbolt adoption more closely resemble that of Firewire, or USB?


I believe the new macs still have USB ports, no? So what incentive do peripherals manufacturers have to use Thunderbolt and target just the new macbooks vs USB 3.0 and targeting everything?


But we're not talking about Apple's marketshare. We're talking about Intel's influence. And that's quite a bit different.


Thunderbolt will replace them. And it will itself be subsequently replaced for its own shortcomings.


Is this a qualified statement or are you just guessing?


I’m happily using my FireWire 800 port on my 2007 MacBook Pro. I don’t really have a problem with the competition.

(What’s nice for Apple is that this port isn’t exactly risky for them. If it doesn’t succeed, their Macs have a glorified Mini DisplayPort with a lightning bolt symbol next to it. It might cost a bit more for them to add, but that’s about it. It seems to me that only Intel has a problem if this fails.)


Seriously. As a non-Mac user, I really can't see going out of my way to use anything other than USB at this point, especially with USB3 being in the same ballpark in terms of speed.


> especially with USB3 being in the same ballpark in terms of speed.

For some uses it's not just a matter of speed - USB2 and FW are similar in terms of speed but USB2 is pretty much unusable for multi-channel audio recording purposes.


And so is FW unless you buy one of the very expensive interfaces that happens to have highly optimized fine tuned drivers.

Source: personal experience finding the drivers of most FW audio interfaces under $1000 impose outrageous CPU loads when doing many channels and using more than one device on the supposedly daisy chainable FW bus.


Funny, my experience is the opposite: FireWire drivers are generally very DMA-oriented and require little to no CPU at all, while USB drivers tend to need more CPU for the protocol overhead. This is because FireWire is an address-oriented bus and the hardware can map FireWire addresses to CPU addresses and DMA in and out with (literally) zero CPU overhead. A driver writer would really have to go out of their way to make a FireWire protocol become CPU bound. I suppose the audio protocol is isochronous-based, which would be slightly different, but that is still DMA-based in every FW implementation I know of.

8 channels of 24-bit/192KHz audio is only about 4.6MB/s, which is basically nothing. I'm very surprised that you would see any issues.

I have about 3 or 4 devices in my FW chain (including disks and scanners) and my sub $1000 FW audio interface works just fine.
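For what it's worth, the raw PCM arithmetic is easy to check. A quick sketch; this is data rate only, ignoring any bus protocol or driver overhead:

```python
# Raw PCM data rate for multi-channel audio capture (no protocol overhead).
def audio_mb_per_s(channels, bits_per_sample, sample_rate_hz):
    return channels * (bits_per_sample / 8) * sample_rate_hz / 1e6

print(audio_mb_per_s(8, 24, 192_000))  # 4.608 -> about 4.6 MB/s
print(audio_mb_per_s(8, 24, 96_000))   # 2.304 -> about 2.3 MB/s
```

Either way it's a trivial fraction of what FireWire 400 or USB2 can move, which supports the point that driver CPU load, not raw bandwidth, is the bottleneck for audio interfaces.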


Same here. I have a $250 FireWire audio interface that imposes basically no load on the system even with upwards of 10 channels in use.

I chose FireWire over USB because of all the complaints I saw in the Amazon reviews for every USB audio interface I looked at; many of the FireWire audio interfaces had absolutely glowing reviews.

I've had similarly excellent experiences with disks connected via FireWire.


Yeah, this has been my experience as well. Even cheap Firewire audio interfaces are light-years better than their USB counterparts.

I still use a USB 2.0 MIDI controller and the latency is absolutely killer. Press a key, a beat later, see the note appear on piano roll...


That's interesting. I've had no problems with USB MIDI adapters. As long as I have less than 5ms latency in my audio path, I'm able to play comfortably over the cheapest USB-MIDI adapter on Amazon. I've also had reasonable success with USB audio interfaces, with latency around 5ms (IIRC - I switched back to my ancient emu10k1 card because of the DSP).


That's USB2, not USB3, and also something that 99% of computer users are not going to run into.


USB2 and FireWire were close to the same speed. Thunderbolt is twice as fast as USB3 and can push DisplayPorts, so I doubt it will be the same.


Yes, but

1) USB3 has a 1 year lead - check out how many products are out already

2) 5Gbps vs 10Gbps. There are just a couple of SSDs that need SATA III (6Gbps) because 3Gbps isn't enough. You just don't need the speed for disk I/O. What for, then? Current DisplayPort 1.2 has 17Gbps, so I don't really see Thunderbolt replacing it.

Also, FireWire 800 was substantially faster in theoretical and particularly in practical throughput.


FireWire was 400Mbit/s for 5 years before USB2 was released (USB1 was only 12Mbit/s). Look who won that one.


IIRC, it was because Intel made USB chipsets really cheap vs IEEE1394a (aka FireWire) chipsets.


I recall it being because Apple screwed up the licensing so badly ($1 per port) that everyone got together to create and then push USB 2.0 as a replacement, even after the royalties got reduced to something semi-reasonable like 25c per device.


I worked at a place that developed FireWire peripherals. I can tell you the $1/port price never mattered to us, despite the loud backlash on the internet.

The real reason USB "won" is because Intel really pushed it and integrated it into every one of their chipsets. Possibly even licensing it to the other PC chipset makers for cheap/free up front.

Given that Intel is backing Thunderbolt, it stands a good chance at achieving the same widespread usage.


USB can largely be driven by the CPU with minimal dedicated hardware, while FireWire required a real chip that did the negotiation and sat directly on the memory bus for DMA.


I don't care who won, I like my FireWire CF card reader.


Right up until you have nothing to plug it into.


He can plug it into the new Thunderbolt plug with a simple adapter. Thunderbolt is already compatible with USB and Firewire.


It sounds like Thunderbolt is "compatible" with USB and Firewire in the same way ExpressCard and CardBus are -- by putting the host controller that would normally be on a PCI/PCIe card on the bus.


#2 is classic technology short-sightedness. How many times have people said that a 100 GB Hard Drive, 2 GB RAM, etc. is more than sufficient?


I disagree. He's just saying that the speed difference doesn't currently matter, not that it never will. The battle for adoption is fought based on today's usefulness, not tomorrow's (in most cases).


1) true 2) true, but keep in mind that this bandwidth is shared by the entire bus.


Re #2: higher performance external video cards (think 3D accelerator dongles for laptops), cluster interconnects, RAID arrays of SSDs, RAM-based "disks" for use as another layer of cache, lower-latency ultra-high-channel-count external sound cards, etc.


I have maybe 20 products at home that plug into my USB 3 ports.

I don't have anything that plugs into thunderbolt.

I couldn't and wouldn't buy a computer without USB. I can live without Thunderbolt.


Doesn't USB rely on the CPU, while FireWire offloads processing to a dedicated chip?

Does anyone know if Thunderbolt is similar to FireWire in this way?


Apple seems to be targeting a whole different sector with this technology. See Cringely's take on this (http://www.cringely.com/2011/02/attack-of-the-minis/) about how this technology could go into data centers. Interesting move by Apple.


Oh, that's just Crazy Bob talking. If that had really been Apple's strategy, they wouldn't have alienated so many customers in the way they killed the Xserve line last year. If they had really wanted to keep their toehold in that market, they wouldn't have said, "Let them eat Mac minis." I think it's simply the case that Apple's signature advantages in the consumer space don't translate into a 1U world, and they recognized that.


But on the other hand, it does work as a move by Intel...


So Thunderbolt is PCIe at the end of a cable. Cool. I can see people building neat, cheapo NUMA boxes with this. Think SGI Altix on the cheap.

For those that don't know, the SGI Altix has a special chip that intercepts memory accesses and maps other systems' memory to be seen as "local" on each system. If Thunderbolt is just PCIe on a wire, you may be able to connect a few systems together and just map memory across systems. It'd take some trickery, and wouldn't be quite as fast as InfiniBand, but the thought of building a ghetto supercomputer would be useful to many people.


Anybody notice how it shares the name with HTC's 'ThunderBolt' 4G phone being released, and how it looks like both Intel and HTC have trademarks on the word?


It's perfectly reasonable for two entities to have trademarks on the same word, so long as they aren't in the same business. In this case (without looking at the relevant legal paperwork), HTC could trademark 'Thunderbolt' with respect to phones, mobile devices, whatnot, while Intel may have the trademark with respect to peripheral data connections. Nobody (except maybe Monster Cable) would have an issue with that arrangement.


And what if a phone one day wants to support a Thunderbolt interface? I think that they're a bit too close to one another, both being parts of the consumer electronic space.

Thunderbolt the phone, however, can be expected to have a much shorter lifespan than a new connector like this.


Is this peer-to-peer like FireWire, or is it a host/client model like USB? I see people talking about this being copper or fibre. If it's fibre, then it can't supply power to the device like USB? I don't see that catching on for most portable devices (e.g. hard drives). It's extremely convenient to have just one cable for a device that needs connectivity and power.


You can always have one optical fiber for data, and a metal conductor for power, wrapped up in the same cable / connector.


It’s copper, for the time being at least. Intel’s document says it can transmit up to 10W of power.


Peer to peer.


I can't seem to find an answer to this in the materials - is this optical or copper? Light Peak was supposed to be optical, but the Wikipedia page has unsubstantiated claims of it initially being copper.


> Electrical or optical cables

From http://www.intel.com/technology/io/thunderbolt/index.htm

I don't know what that means though. Does the newly released MBP support both or only one?


Engadget seems to think the MBP will support both at least in theory: Intel will eventually bake the optical transceivers into the cables themselves.

http://www.engadget.com/2011/02/24/intel-thunderbolt-a-close...


Light Peak right now is copper only. Optical will most likely be added in the future.


I'd like to read about this. Is there an available document explaining why? Or do you know the reason?

(Not that I hate copper or anything, it is just that they designed it for fiber from the beginning and then suddenly bailed out)


Same reason why every interface standard that flirts with optical eventually ends up supporting copper: optical components are more expensive, and over short runs you can get acceptable performance with copper.


They're also not as flexible as copper...


At which point they'll call it Lightning? Nah, too obvious :)


But thunder always follows lightning...


It's copper for now. I haven't found any confirmation of this, but I believe Intel's strategy for optical is to have the lasers and light detectors built into the cable itself, so that the plug connection is still completely electrical.


Has anyone considered using an external video card with this? That would be a great use with desktop replacement laptops since it would actually be upgradeable.


Codenamed Light Peak.

Intel's page on Light Peak (not the same as theirs on Thunderbolt): http://techresearch.intel.com/ProjectDetails.aspx?Id=143

Wikipedia has a very informative article on it: http://en.wikipedia.org/wiki/Light_Peak


It's not an implementation of Light Peak, it is what Light Peak is now called.


You're right; developed by Intel (under the code name Light Peak). (I've corrected my comment.)


Can anybody shed light on the possibility of DRM or some implementation of Tilt-bits to restrict output from this port to high resolution screens etc?


DisplayPort already has two different flavors of DRM. Thunderbolt, being basically PCIe over a DisplayPort cable, doesn't add any additional DRM.


Not that it hasn't been possible strictly due to lack of a suitable interconnect technology, but I wonder if this could facilitate using your mobile device as a sort of "personality module" that could plug into a monitor with a built-in graphics chip and maybe some additional processing power (accessed via OpenCL).

Obviously the OS would have a long way to go to support that kind of thing, but I would be surprised if in five years your typical "home directory" isn't either entirely cloud-based or uses a scheme like this.


How long before we can buy external graphics cards that utilize thunderbolt? It'd save me the hassle of having to build gaming PCs every couple years.


If you're serious enough about your gaming hardware that you're currently building gaming PCs every couple of years, you're still going to be replacing your CPU and your graphics card every couple of years, whether graphics are inside the case or not.

Besides, Thunderbolt = 10 G*b*ps, PCIe x16 = 8 G*B*ps. No idea on latency/transfers per second, though. (http://www.intel.com/technology/io/thunderbolt/index.htm, http://en.wikipedia.org/wiki/PCI_Express)
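Since the difference between Gbps (bits) and GBps (bytes) is easy to miss, here's the comparison in a single unit. A quick sketch using the figures above; "PCIe x16 = 8 GB/s" is taken from the comment (a PCIe 2.0 figure), and encoding overhead is ignored on both sides:

```python
# Put both link speeds in the same unit (Gbit/s) to compare them.
# 8 GB/s for PCIe x16 is the figure from the comment; overhead ignored.
thunderbolt_gbit = 10              # one Thunderbolt channel, one direction
pcie_x16_gbyte = 8                 # bytes per second, not bits
pcie_x16_gbit = pcie_x16_gbyte * 8 # convert bytes -> bits

print(pcie_x16_gbit)                     # prints: 64
print(pcie_x16_gbit / thunderbolt_gbit)  # prints: 6.4
```

So an internal x16 slot still has roughly 6.4x the bandwidth of one Thunderbolt channel, which is why an external GPU over this link would be bandwidth-constrained.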


You can lease your card: http://cuttingedgegamer.com/


Those prices aren't bad at all based on some newegg comparisons. I was expecting a massive markup.


Can a Thunderbolt device (eg display) expose USB to peripherals? With my current setup, I connect my display via USB and DVI to my MBP in order to connect USB peripherals via the display. I'd love to break that redundant USB connection with Thunderbolt.


Yes. It looks like you would have to implement USB on top of PCIe in the monitor, though.

> Intel's Thunderbolt controllers interconnect a PC and other devices, transmitting and receiving packetized traffic for both PCIe and DisplayPort protocols. Thunderbolt technology works on data streams in both directions, at the same time, so users get the benefit of full bandwidth in both directions, over a single cable.

http://www.intel.com/technology/io/thunderbolt/index.htm


> It looks like you would have to implement USB on top of PCIe in the monitor, though.

Isn't that basically UHCI?


It would be awesome if the iPad 2 had the same port... I say it's a possibility!


I would love to have an iPad or TouchPad as a second display.


I wonder how many monitors this can push. Also, what will happen to Apple's 30-pin connector on their iPods, iPhones, etc.? I guess we will know Tuesday. Are there any external hard drives with Thunderbolt yet?


The bandwidth is apparently entirely unimpressive. DisplayPort is currently at 17.28 Gbps, so Thunderbolt can push ... 0.6 monitors? ;) (In the worst case, at least)

I have been excited about Light Peak/Thunderbolt for a year or so now, but in that time it seems the ambitions have become smaller, and the competition has developed as well.


It has two independent 10Gbps channels, so it can do 20Gbps.


That's not really relevant as long as you can't choose the directions for the channels. You are only going to get one channel in each direction.

Edit: Whoa. I just re-read parts of the Intel documentation, and it seems there are indeed two independent downstream channels. My bad. It also looks like there is no provision for using both channels for a single device? I have no idea.


2x10Gbps is still enough to drive two 2560x1600 displays at 60Hz, which is all the prior DisplayPort implementation could do.


Check out the Wikipedia page for DisplayPort, as it has a good explanation of the bandwidth requirements for video.

It lists 2560x1600x30bpp @ 60Hz as using just over 8Gbps, so any one consumer monitor will work fine over Thunderbolt.

The 17.28Gbps speed would only be needed for multiple monitors or a high-resolution, high-color-depth, high-refresh or 3D monitor. Thunderbolt and full-speed DisplayPort both have enough aggregate bandwidth for four 1080p streams.
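A quick sanity check of those figures in Python (raw pixel data only; a real DisplayPort link also carries blanking intervals and encoding overhead, which is why the cited figure for 2560x1600x30bpp comes out slightly above this raw number):

```python
# Back-of-the-envelope video bandwidth: raw pixel data only,
# ignoring blanking intervals and link encoding overhead.
def raw_gbps(width, height, bits_per_pixel, refresh_hz):
    return width * height * bits_per_pixel * refresh_hz / 1e9

# 2560x1600 at 30 bpp, 60 Hz -- close to the "just over 8 Gbps" figure
print(raw_gbps(2560, 1600, 30, 60))      # ~7.37 Gbps raw

# Four 1080p streams at 24 bpp, 60 Hz -- well within 17.28 Gbps
print(4 * raw_gbps(1920, 1080, 24, 60))  # ~11.94 Gbps raw
```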


Well possibly, if it's in displayport mode using all four lanes in one direction, it can push that much bandwidth. How much bandwidth does a 27" apple display use?


Well, let's see:

2560 x 1440 resolution at 24 bits per pixel = 88,473,600 bits

At 60Hz, that's 5,308,416,000 bits per second. So a little more than half the claimed 10Gbit bandwidth of Thunderbolt. I guess you're not going to be running two of them on the same port.
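The arithmetic above can be checked directly (raw pixel data only, no blanking or encoding overhead):

```python
# Verify the numbers: 2560x1440 at 24 bits per pixel, 60 Hz.
pixels = 2560 * 1440               # 3,686,400 pixels per frame
bits_per_frame = pixels * 24       # 88,473,600 bits per frame
bits_per_second = bits_per_frame * 60

print(bits_per_second)             # 5,308,416,000 -> ~5.3 Gbps
# Two such displays against a single 10 Gbps channel:
print(2 * bits_per_second / 10e9)  # ~1.06 -- just over one channel's capacity
```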


Depends on what you're using it for, I guess. For my typical day of coding / web browsing / etc., I'd rather run two at 16-bit than one at 24-bit.

If I was editing movies it'd be a different story.


I might have missed it shooting through the threads, but has anyone announced a PCI Express card with Thunderbolt? I'd love to have 10 Gbit between my workstation and my NAS without having to buy multiple FC HBAs.

In fact, I'd probably take all of my current DAS and add it to the NAS pool as well. It would make my home office much quieter if I could hide all the spindles in another room and still have fast storage access.


Five years ago I would have been eagerly anticipating this, but I can't really get excited anymore. USB3 has probably already won, and while I love the idea of monitors and other devices all using the same port, I don't think it will be enough to drive adoption.


USB3 has probably already won

Why do you say that? Are there more USB 3 things on the market than I'm aware of? With Intel and Apple behind it, I wouldn't say Thunderbolt is a sure thing, but it seems promising.


For a moment I thought the device on the left side of the image was an apple version of this on its side ...

http://gemsres.com/story/apr08/536976/ENGELBART_2.jpg


I sense multiple external screens on a MBP :D


That's already been possible for a few years - http://daggle.com/macbook-pro-multimonitor-4-monitors-at-onc...


Possible, but not great. The Matrox and Diamond solutions have always been a bit hacky - USB simply can't push bits fast enough and there are lots of cases where you will max it out and stutter.

This is actual support for dual monitors, without compromises.


I know, but the Matrox solutions make your external monitors into one big monitor, so the Dock and basically every new window that pops up will be in the middle, split between two monitors. It's not really that convenient.

USB2DVI is nice, but USB 2.0 is too slow. USB 3.0 is solving that problem already, and Thunderbolt will too, I am sure. DisplayLink is already working on USB2DVI for USB 3.0, which makes stutter-free 1080p playback over USB possible, and Thunderbolt is a lot faster.


So if PCIe x16 is rated between 8 GB/s and 16 GB/s, would it be possible for someone to come out with a PCIe enclosure so I could hook up a semi-decent Nvidia or AMD/ATI card to my MacBook?


Awesome. Two monitors on a MBP with Thunderbolt. Now, where the hell is the adapter to buy that will actually let you connect two monitors?


Kinda cheesy name. Also, the plugs on the Intel site don't look like Mini DisplayPort. Is the Apple version proprietary?


The pictures I've seen look like mDP to me. Apple invented Mini DisplayPort, so mDP is proprietary, but it's licensed for free and VESA later adopted it as part of the DisplayPort standard.

http://en.wikipedia.org/wiki/Mini_DisplayPort


Now the question is how long it will take the market to catch up with adapters, hubs, or updated hardware.


No hubs; it's a chaining protocol.


What happens if I plug a PC into another PC via thunderbolt?


Thunderstruck?

Snarky I know but I just couldn't help myself.


I sincerely hope there's a debug option that causes it to play AC/DC in some fashion.


Hopefully the same as when you plug a Mac into another Mac via FireWire. (One of the Macs becomes the most expensive external HDD ever. This requires one Mac to reboot into Target Disk Mode. It would be nice if the same trick were possible with Thunderbolt without rebooting.)


You don't have to use TDM. You can also use Firewire as a network interface.


You have to boot into target disk mode for the same reason you can't plug one Firewire disk into two computers simultaneously.

TDM is supported in Thunderbolt, to my great glee. Terrific Firewire feature.


Something I've really wanted: when I connect my laptop to my home computer, the laptop could use the home computer's CPU, GPU, and RAM, and gain an additional screen.

I believe Thunderbolt will make this dream true, since the Intel page mentioned "workstation performance expansion".


It really seems to me a lot of this is about making your current peripherals obsolete so they can sell you new ones.


In the long run, Thunderbolt aims to replace USB, FireWire, Ethernet, DisplayPort/HDMI/etc, eSATA and be considered an external variant of PCIe.

This is feasible and would be a good future, just like the future we live in now where USB has replaced serial ports, parallel ports and PS/2.


The weird thing is that there already is an external cable for PCIe. An 8-lane PCIe cable can beat Thunderbolt, at 16Gb/s instead of 10. (Although I'm not sure if that's bidirectional; if not, you'd have to go to a 16-lane cable.)

https://secure.wikimedia.org/wikipedia/en/wiki/PCI_Express#P...


I for one don't miss serial, parallel, and PS/2 ports.


I miss serial ports.


Why?


Because serial is a trivial interface to implement in home brew electronics, just throw in a MAX232.

Even many major consumer electronics are still developed via a serial console. Sure they have fancier connectors to talk to your PC and the network but serial is so simple it's practically idiot proof. So it breaks far less often, makes new hardware much simpler to bring up, and it allows you to debug the fancier interfaces without interfering with their operations.
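To illustrate just how simple the wire protocol is, here is a hypothetical sketch (in Python, for clarity) of how an 8N1 UART frames a single byte before a chip like the MAX232 level-shifts it onto the wire; the `uart_frame` helper is mine, not from any library:

```python
# Hypothetical sketch: frame one byte the way an 8N1 UART does --
# 1 start bit, 8 data bits LSB-first, 1 stop bit. This framing is
# essentially the entire protocol, which is why serial is so easy
# to bring up on home-brew hardware.
def uart_frame(byte):
    bits = [0]                                   # start bit (line driven low)
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits += [1]                                  # stop bit (line idles high)
    return bits

print(uart_frame(0x41))  # 'A' -> [0, 1, 0, 0, 0, 0, 0, 1, 0, 1]
```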


Ah, I see. I can definitely understand that.

Still not crazy about it as yet one more connector for mass-manufactured devices, though.


Yes, this is what they say about every new standard. And once you have thrown out all your working peripherals and bought all new ones, then LightningBlast is released, not compatible with Thunderbolt. This has been going on as long as I have been alive.

I know, I know: Thunderbolt is the last time people will have to throw out all their things and start over. This time really will be the last time because it is a new standard to rule over all, just like NuBus was.

And it's definitely not going to be dead on arrival like PCI-X, or not supported by drivers on the Mac for most peripherals like PCI-e. No, Thunderbolt is going to be truly universal and forever lasting world without end. If only you believe, amen.


It does USB, HDMI, FireWire, DisplayPort, et cetera... it's one connector to rule them all (though there will need to be a few adapters!)

No need to replace your peripherals.


Is this why Apple dumped FireWire a few years ago?


All Macs except the low end plastic MacBook and the MacBook Air have a FireWire 800 port. So, no. (Also: Why would they dump FireWire without a replacement in place or even just the lab?)

This will probably replace FireWire sooner or later.


This is what I meant, but thanks for your downvote anyway: http://www.wired.com/gadgetlab/2008/10/apple-quietly-k/


A simple pin-adaptor turns a FW800 port into a FW400 port - they’re electrically compatible.


I wouldn't be surprised if the 10Gbps chips are interoperable with 10Gbps Ethernet chips at some level.


Sounds like the sequel to Steve's Disney "classic" Bolt (2008).


On another note, the new MacBook Pro is nothing special:

- still no SSD built in (+$250)

- still no 8GB of RAM built in (+$200)

- less battery life than the last generation


Are there any laptops with 8GB built in the default?


No doubt it is a great innovation, but its logo/branding is an epic fail. It should be something distinctive, like the USB, Ethernet, or audio symbols.



