I'm pretty sure I was one of the many naysayers who said the Ouya would never make that launch date. Hell, I once ordered underwear from a Kickstarter clothing outfit and that came at least three months late. I've supported several Kickstarters and none of them have delivered the physical goods within two months of their promise (I've even had one delayed by a year). The Ouya is late by a couple of weeks, if even that.
Congrats to Ouya for meeting their goal, here's hoping that their system is a success.
They might have shipped, but it remains to be seen if they shipped something that's any good.
There are a couple of games built with an older version of the OUYA Unity plugin that introduced some lag, so that might explain it.
Full disclosure: I am an employee of OUYA. But any developer that has had a dev unit for the past 3 months can chime in. We never got this feedback until a couple people in the press brought it up.
Polygon's hands-on review mentions this latency, along with a few other criticisms, including the feel of the controller:
I've talked to the developers of the ouya port and they're baffled by these reports as well.
People subconsciously looking for something negative? No clue.
Because the time between a frame being rendered and it appearing on screen can increase, you react late to events on the device, which you perceive as control lag when in fact it's display lag.
Most TVs will allow these to be disabled (often in a specifically labeled "game mode"), but a few inexplicably won't let you disable them all, leaving a certain degree of latency hardwired in.
AirPlay, on the other hand, introduces a lot of latency because it's compressing the image prior to transport and uncompressing it on the other end. This generally introduces significantly more lag than any TV image processing.
It's a consequence of using framebuffers in the display device. I mentioned AirPlay because it exaggerates the effect, but any digital display will likely introduce a little extra lag.
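To make the framebuffer point concrete, here's a tiny back-of-the-envelope sketch (my own illustration, not anything OUYA-specific): every stage that buffers a whole frame before passing it on adds roughly one refresh interval of lag.

```python
# Toy model: each buffering stage in the display pipeline holds a frame
# for one refresh interval, so display lag grows with the number of
# framebuffers in the path. Numbers are illustrative, not measured.

def display_lag_ms(buffered_stages, refresh_hz=60):
    """Added latency from frames sitting in framebuffers, in milliseconds."""
    frame_time_ms = 1000.0 / refresh_hz
    return buffered_stages * frame_time_ms

# A TV that buffers 2 frames for image processing at 60 Hz:
print(round(display_lag_ms(2), 1))  # 33.3
# The same TV in "game mode", down to 1 frame of buffering:
print(round(display_lag_ms(1), 1))  # 16.7
```

Even one extra buffered frame is a sixtieth of a second, which is right at the edge of what latency-sensitive players notice.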
I think they are reading [HDMI] specifically as [HDMI] rather than [The use of an external display, via HDMI].
EXACTLY. HDMI encodes frames and pipes them to the TV which has to render them. There's no guarantee this will be instantaneous (indeed it's virtually guaranteed that it won't).
Airplay definitely adds more potential bottlenecks but the principle remains the same.
It most definitely does. With DVI/HDMI each frame sent is just that, a frame: at time X the screen should show Y. If a frame is damaged or lost you have only lost the information from that one frame (or just part of the frame).
However, with compressed data each frame is intertwined with its neighbors. If a frame is lost or damaged, you have lost X frames until the dependent frames have passed.
I'm not sure if you've ever had the experience of a damaged HDMI cable, but you can see the exact pixels that are affected, and they change frame to frame as you twist the cable making it worse or better.
Compressed video is like Netflix or a damaged AVI where when the corrupted data is hit the entire stream goes wonky for a short while until it suddenly snaps back into clarity when a keyframe is hit.
Uncompressed is as near real time as you can get, the video is directly passed through. Compressed you have a buffer, decoder, etc and there is more delay/processing.
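The keyframe behavior described above is easy to sketch in a toy model (purely illustrative; real codecs like H.264 are more complicated, with B-frames, slices, and error concealment):

```python
# Toy model of keyframe dependency in a compressed stream: every frame
# that isn't a keyframe depends on the frame before it, so losing one
# frame corrupts playback until the next keyframe arrives. With
# uncompressed HDMI, each frame stands alone and only the lost frame
# itself is affected.

def frames_ruined(lost_index, keyframe_interval, total_frames):
    """Count of frames unwatchable after losing frame `lost_index`."""
    # The next keyframe sits at the next multiple of keyframe_interval.
    next_key = ((lost_index // keyframe_interval) + 1) * keyframe_interval
    return min(next_key, total_frames) - lost_index

# Lose frame 7 in a stream with a keyframe every 30 frames:
print(frames_ruined(7, 30, 300))  # 23 -- wonky until it snaps back
# Uncompressed equivalent: losing frame 7 loses exactly 1 frame.
```

That "snaps back into clarity" moment in a damaged Netflix stream or AVI is exactly the next keyframe landing.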
You can still get lag on CRTs — phosphors don't light up instantly — but digital TV adds all kinds of opportunities for screwing up timing, and ironically it's the fancier TVs that do more processing (e.g. interpolating frames, decoding 3D, etc.).
And compressed vs. uncompressed does matter, especially for video compression: the encoder needs some minimum amount of data before it can even begin to compress, let alone start sending the compressed bitstream over a physical link. Then you add more latency with the decode step. Not to mention that a wifi link makes no latency guarantees at all...
While it's possible the controller lags, it seems a little more likely that the lag is caused by the transition from the iPad's screen to a lagging LCD screen.
I'm not a game developer (or hardware expert), but are there not tests that can be performed to capture input lag? At least that way it could be ruled in/out at the hardware/software level, assuming you have a baseline reference of any input lag from TV/receiver.
Edit: Also on his blog http://www.altdevblogaday.com/2013/02/22/latency-mitigation-...
tl;dr High speed video camera capturing button input and screen in the same frame to measure the complete cycle. Diagnosing why is harder.
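The camera trick boils down to simple arithmetic once you have the footage; something like this (my own sketch of the calculation, not from the linked article):

```python
# Sketch of the arithmetic behind the high-speed-camera method: film the
# controller and the screen in the same shot, find the video frame where
# the button goes down and the frame where the game visibly reacts, and
# convert the frame gap to milliseconds. The hard part is the filming
# and the diagnosis, not this math.

def end_to_end_latency_ms(press_frame, react_frame, camera_fps):
    """Total input-to-display latency implied by a frame gap in the footage."""
    return (react_frame - press_frame) * 1000.0 / camera_fps

# e.g. a 240 fps camera sees the press at frame 100, the response at 130:
print(end_to_end_latency_ms(100, 130, 240))  # 125.0
```

That one number captures the whole chain (controller, radio, game loop, render, HDMI, TV), which is why it's good for ruling lag in or out but not for assigning blame.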
Have you guys not had the same experiences w/ Canabalt, or have you not heard any other reports of this?
I personally have a hard time noticing latency like some others report w/ games, unless it directly impacts gameplay (i.e. music games like Rock Band). I don't notice it on other games like FPSes, but many people are very sensitive to latency in games like this.
Hardware MVPs are hard, but this certainly feels like a step in the right direction of faster hardware iteration.
From the linked article.
I wonder how long this will last.
This is the new normal. The connector is the valuable part.
I gave a talk on the Internet of Things where I tried to communicate this point clearly: for more and more applications the marginal cost to add a computer is nearly $0 and the marginal cost to add a network is about $0.35, so a lot of things that wouldn't have had networks or computers in them before suddenly do.
Back when Sun Micro was trying to get everyone in the Java group to write up patents on anything they could think of, James Gosling in what was a great comment on the process wrote up one for Java inside a light switch. He reasoned it was the most ridiculous patent ever since a light switch from Home Depot was $1.50 and there was no amount of "coolness" you could add with Java that would merit building a light switch with its own processor. What he didn't count on was that the cost of the computer that could run Java would drop below the cost to make a mechanical switch.
This statement amazes me. It may be true, but it is very counterintuitive to my experience.
A micro USB connector is about $0.35/unit (for quantities of 1500). An HDMI connector is about $0.50/unit (similar quantities). These prices are from Digikey. So, in larger quantities from larger distributors, you could certainly get cheaper prices. The cheapest ARM processor (Cortex-M0) is about $0.78/unit on Digikey. I understand your point is about licensing the ARM design and integrating it into your silicon, but most low-cost devices I've seen have just been PCBs that integrate these off-the-shelf components. I would have to imagine the cost of hiring engineers to do the VLSI design/integration, the cost of licensing the ARM CPU, and then the cost of fabricating/testing your silicon would have to exceed integration on a PCB. So, I would then assume that the processor, while certainly cheap, is still a very substantial portion of the cost of the device. But, I have no data on the cost of ARM licenses to back that up. You're definitely right in asserting the trend is cheaper and cheaper processors, but I don't think we've arrived at the "processors are so cheap they're basically free" world quite yet.
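For what it's worth, a quick sanity check on those quoted Digikey prices (my arithmetic, using only the three numbers above) shows the MCU really is the biggest single line item in a minimal discrete build:

```python
# Using the per-unit prices quoted above (Digikey, ~1500-unit quantities).
# On a bare-bones PCB build the CPU is the most expensive of the three
# parts, which supports the point that "processors are basically free"
# only once the core is folded into custom silicon.

parts = {
    "micro USB connector": 0.35,
    "HDMI connector": 0.50,
    "ARM Cortex-M0 MCU": 0.78,
}
total = sum(parts.values())
for name, price in parts.items():
    print(f"{name}: ${price:.2f} ({price / total:.0%} of ${total:.2f})")
```

By that crude measure the processor is nearly half of this (admittedly incomplete) three-part BOM.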
The ARM7/TDMI core is about 100K transistors; that is nominally a square of 316 x 316 transistors, which with a 22 nm process is about 3 microns square. The marginal cost of adding 3 square microns to a chip is very nearly zero. As an example, the 'test feature' on a chip at the company I worked for in 2000 was 18 square microns of "wasted space" in the final chip. I say it's 'nearly' zero because while the cost to produce the chip doesn't change measurably, the yield curve does, and the 'cost' is the chips that fail due to this extra core not functioning.
So if you make a consumer electronics gizmo in quantity and it has a semi-custom chip on it, adding a computer to that chip these days won't make your semi-custom chip that much more expensive, and by adding a programmability aspect you can add features without re-spinning the chip.
As for the cost, TSMC offers "add an arm core" to your ASIC as a design service. I've not been part of a wafer start negotiation for over a decade but it would not surprise me in the least if they offer to throw that in for free these days to sweeten the deal.
The "test feature" is a part of the chip that the fab uses to verify the wafer processing worked correctly; it can generally be probed with a simple voltage or current pulse to quickly screen out die that didn't get baked correctly.
So we haven't quite reached the day Eric Drexler hinted at with "so-called microcomputers still visible to the naked eye".
That said, we are still talking about an incrementally small addition to a chip.
Right, and I read this as "the connector is the next thing to go". Someday (one hopes) all of this will be done by software radio, anything that needs to be really fast will be over a generic optical interconnect, and the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated... like steam engines, or cursive handwriting.
Some vested interest or IP complication will reassert itself into the same chokepoint as soon as (insert your future state here) comes true...
I completely disagree about cursive handwriting, though. I use it every day.
> the idea of buying a particular computer because it has a bit of copper that mates with your particular display will be humorously antiquated
Is the next stage going to be everything playing nicely with each other as it should, or are we going to get a range of proprietary wireless standards that won't work with each other in the name of squeezing a few more dollars out of users?
I'm sure there will be growing pains, but users will place such a high premium on systems that work together that it will have to happen at a software level even if there's differing technology under the hood.
Android is already rich in Bluetooth controllers and micro-USB-to-HDMI adapters like this one: http://www.amazon.com/SlimPort%C2%AE-SP1002-Connect-connecto... .
As someone that would rather pay a little more for a real console, I really don't see the appeal of Ouya.
But I think the significant thing is price: you cannot buy a comparable smartphone for $99. This will always be true, because the smartphone needs a display and a battery, and has to be compact, light, and not drain that battery too fast. What will change is that, eventually, smartphones that are good enough for games will be cheaper than $99 (or whatever price point is then relevant). Although batteries are not improving at Moore's Law rates, this point will probably be reached surprisingly soon.
Bold prediction: the next generation of xbox/playstation/wii will fail for this reason. (smartphone GPUs will likely reach xbox360 levels this year; and the mainstream hasn't been demanding the more powerful GPUs of PCs, unlike previous generations).
I can plug a tablet into a computer by USB, but it can't be a keyboard. Even though it contains a keyboard - is a superset of keyboard components - it still doesn't have the software to do it. Nor can it be a USB soundcard, or a USB display.
(But it can be a second display over TCP/IP and that can be done over a USB-Ethernet dongle which it does support).
Exactly what can be usefully plugged into what is a website I've considered creating for a long time.
This paragraph got me thinking: "I can't point to a single game that would make one need to buy a system at launch. Much of the value of the OUYA hardware lies in what you can do with it, from media functions to creating your own games. It's very possible that a breakout game is coming, and we just don't know what it is yet, but at this point it's hard to point to one single game that will get you to buy a unit."
For a traditional console, that would be a huge issue. (Ask Nintendo about the Wii U launch.) Sony, Microsoft, and Nintendo spend a lot of cash making sure there will be excellent launch titles. Traditional wisdom is that launch titles drive console sales.
On the other hand, the current generation of smartphones didn't have big launch titles. Possibly the landscape has changed.
On the third hand, smartphones do other things than play games.
On the fourth hand, sounds like the Ouya might be a strong media console, depending on how slick that XBMC integration is. C.f. the number of people who bought Playstation 3s as a Blu-ray player.
In any case, I'm impressed that hardware is shipping and I was wrong to think it wouldn't. I'd keep an eye on those lag reports, though; I would think Penny Arcade and Polygon are smart enough to think about video lag as a possibility.
From their email:
"You'll need a credit/debit card to download games. All games are still free to try. Your card will only be charged if you buy content you love. We do want valid payment information for everyone. This is to ensure that game developers can get paid when you love their game."
The article mentions gift cards, but I just don't want to pay for a gift card that I may not use just to download demos. Well, I guess I'll use the hardware as a MAME box and video player.
I agree that storing a CC number has a number of problems (including the lack of parental controls, as mentioned in the article).
 Yes, calling something a credit card when it doesn't allow me any credit, and doesn't provide some of the advantages of credit cards is annoying.
I thought that some merchants didn't accept them for various reasons.
This leads to impulse purchases, and higher sales as the friction on purchase is reduced to a click/tap.
In any case, for games, you probably want to start learning OpenGL ES.
The few Android APIs available are wrappers around JNI calls.
Java is the native language of the platform and Google does not seem willing to change that, even with the ongoing court issues.
It apparently can build for iOS and the web (via GWT) too.
Though I can't see any reason they couldn't start supporting some HTML5 games as well assuming it can support a browser.
I agree that we'll see a lot more games developed in garages and on weekends. And I think that's where a lot of the unique games come from. But this won't destroy the gaming industry at all. It will help usher out folks who shouldn't be in the industry in the first place.
These days, we're literally running the same software we run on everything else, but in a little box that has an audio/video output and a port for a controller. And when I realized that, I immediately realized that the console is mostly dead. The only case where this isn't true is where performance characteristics are consistent. This is why development on platforms like the PS3 or 360 results in shorter dev cycles and higher-quality results: the hardware is all the same. But that only matters when you're writing software that isn't shielded from the system, so with Java it's a non-factor, making the Ouya nothing special.
I believe the next Playstation, Xbox, and Nintendo will all have their merits -- high-end hardware that is consistent for years, which will allow developers to rapidly build games without having to concern themselves with the lowest common denominator (it's ridiculous to see software designed to run on a 512MB 1 core machine performing horribly on a 24GB machine with 6 cores, 2 GPUs, and 3GB of GPU memory because it was decided by someone that progressive enhancement of features would be too expensive a development cost, or for those high-end features to be completely non-optimized).
For me, I am summarily unimpressed and not excited. For me, this is packaging Java in yet-another-box that I have to buy. Why can't I just download an app and play Ouya games on my PC? That's a -1 for Ouya and a +1 for what Valve is doing with Steam.
I bought a gaming computer last summer and was quite impressed with it until its SSD died (I have to get it shipped to me and replace that SSD at some point). With an ordinary Windows 7 installation it ran a full gamut of emulators, ran the Source engine with the quality settings turned pretty far up, and ran everything else available on Steam with good quality, too. It also played DVDs and downloaded movies in high-def and with good sound, as well.
My real question is: over how many years of usage can I amortize the cost of that gaming PC? Because a lightly-used or "last year" gaming PC costs $600-$800, while a new one with top of the line hardware costs about $1000-$1200. If I can keep it for 6 years like I would with a console, the new console can match the one-year-old gaming PC for price, while the PC has general media functions, retains backwards compatibility via emulation, and gives me choice of peripherals.
Hmmm... but the traditional disadvantage of PCs was having to upgrade your hardware, operating system, and APIs continually to keep up with new features in the gaming world, whereas with a console you'd just drop $200-$300 every 4-6 years for the new system. With a PC, upgrading the graphics card, motherboard or the hard drive might easily cost that much, depending on just how up-to-date you keep it.
Seems to me there's a space for a "Ship of Theseus" model of PC, where the cost of hardware upgrades made every few years can approach the cost of a new console with the same frequency while retaining backwards compatibility.
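As a rough illustration of that amortization (all the dollar figures and lifespans here are my own assumptions picked from the ranges above, not real market data):

```python
# Back-of-the-envelope amortization of console vs. gaming PC. Prices and
# ownership lifespans are assumptions drawn from the ranges discussed
# above, not quotes.

def cost_per_year(price, years):
    """Purchase price spread evenly over the years you keep the hardware."""
    return price / years

options = {
    "new gaming PC, $1100 kept 6 years": cost_per_year(1100, 6),
    "year-old gaming PC, $700 kept 5 years": cost_per_year(700, 5),
    "console, $250 kept 5 years": cost_per_year(250, 5),
    "PC upgrades, $250 every 3 years": cost_per_year(250, 3),
}
for name, dollars in options.items():
    print(f"{name}: ~${dollars:.0f}/yr")
```

By this crude measure the console still wins on raw dollars per year, so the PC case rests on the extras: media functions, emulation, backwards compatibility, and choice of peripherals.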
It's that it's part of a (potentially and to a degree actually) huge Android ecosystem with the same games running on phones, tablets, consoles, Smart TVs, media centers, mini-PCs and netbooks.
Doesn't that change things?
While it sounds nice to run the same game on a console and a touchscreen I don't see that working too well in reality.
Some data points:
1. Many phone/tablet games compromise their controls due to the lack of buttons or stick controllers. They would actually be improved by a gamepad.
2. If a market exists then people will adapt the games to that market. It's not hard to imagine how many touch-only games could be altered in fairly minor ways to be D-pad friendly.
3. Developers will innovate new approaches to game input if the hardware is out there.
What about the fact that this thing boots up faster than a PC, is dedicated to entertainment around the TV as consoles are, and is priced at $99? Call me a realist, but a PC can do a lot of the things my MP3 player can. But I don't want to have to lug around my PC just so I can listen to songs.
Oh, and you do know Steam is also coming out with their own console, right? This only solidifies the notion that there is still a growing market for non-techie individuals who want entertainment in their living rooms. Personally I prefer crowding around a big TV when playing console-style games with friends. It's a bit hard to do that with one computer and one keyboard.
There's a tradition of putting operating systems on anything. FreeBSD runs on cameras and game consoles and all sorts of devices that did not intend the user to install an OS.
So it's just a fun gimmicky project.
But the biggest irony is that I would really dig this thing if it were a portable: touchscreen controls suck, there's no way around it. The Xperia Play was a fiasco and the Vita is going to die any day now. There are portable consoles running Android, but spec-wise they are all pathetic, and the build quality of some is just subpar. If the Ouya were a GBA SP with a Tegra 3 it would be the best portable console ever made, and would blow any phone out of the water when it comes to gaming.
Hopefully the game controller for Apple TV isn't really an April Fools' joke, since the Apple TV could well be the next big console.
1) The dev SDK is not yet complete. Try figuring out how to bring up the in-game menu, or "pause", using the controller. It seems logical that's what the middle button should be for, but there isn't any sort of guidance on this in the API.
2) My game is developed in Unity and runs on Android (it's in the Google Play store). We got the game building with the official OUYA plugin, but without dev hardware our ability to test is extremely limited; we do our best by plugging in an Xbox controller and making sure everything works. ... So we submit the game to OUYA, and they reject it citing "the game takes turns every one second after starting, and the music keeps playing after exiting the game"... Since my game doesn't actually have "turns", and we use Unity as an engine so we don't do anything special on exit, I emailed them back asking for clarification... and got no response.
Rant: they named the controller's buttons O, U, Y, A. wtf, seriously? They couldn't use ABXY?