A big glossy launch at CES with no details does absolutely nothing for my project. When an NXP or STMicro rep comes into the office and shows me what's new, maybe drops off a devkit...that's useful.
The NUC/stick are the same, and the Surfaces are Microsoft's way of pushing their OEMs forward. Nexus/Pixel is Google's same realization that they can't just provide one component and pray for third parties to deliver consistent, top-tier products. These companies create a specific component (processors, Windows, Android) and then release a reference to prevent a race to the bottom. If an OEM can't beat the reference, consumers will buy it instead.
And their reference devices are generally poor... it's not hard for OEMs to beat them.
Note: I was an Intel employee, but on the software side.
They want wins like automotive platforms. That's really the upcoming battleground. NXP/Freescale has a very strong hold in that area though, which is why Qualcomm bought them.
Also, on the forums you'd see old, nagging bugs with basic operations, like getting networking up and running without problems; by default, usb0 owned the default route.
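For reference, the workaround amounts to deleting the default route that lands on usb0 and re-adding it on the interface you actually want. A minimal sketch using pyroute2; the interface names and the gateway address are assumptions about your setup:

    # Hedged sketch of the workaround, not Intel's fix: drop the
    # default route that landed on usb0 and re-point it at eth0.
    # 'eth0' and the gateway address are assumptions.
    from pyroute2 import IPRoute

    ip = IPRoute()
    usb0 = ip.link_lookup(ifname='usb0')[0]
    eth0 = ip.link_lookup(ifname='eth0')[0]

    # Delete the default route currently owned by usb0.
    ip.route('del', dst='0.0.0.0/0', oif=usb0)

    # Re-add the default route on eth0 (gateway is hypothetical).
    ip.route('add', dst='0.0.0.0/0', gateway='192.168.1.1', oif=eth0)

    ip.close()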
You'd think that Intel would have addressed these things. Eventually, I decided to not trust Intel. I was worried that they would stop supporting the platform and leave me hanging, having spent thousands of hours building a solution around it.
Now we can see that they're off on a new platform. I won't be spending any time with it.
As a Linux system-on-module, it doesn't seem badly priced for low quantities/one-offs. Although for a little more you can get an ARM SoM that has display capabilities.
I had a better time as a dev with the Arduino Yun, but it was way too limited (memory and flash) for what I wanted.
While they have been producing endless press releases about "makers" and "IoT" in the last couple of years, they still charge $4k+ for a USB VID (and prohibit sharing) as part of the USB-IF. So yes, I wouldn't expect much change.
"interactive refrigerators and smart kiosks to security cameras and IoT gateways."
I can't see a strong advantage for Intel here. I don't think there's any way a modular solution can compete with an integrated design using low-cost ARM SoCs (typically <= US$1 for something capable of running Linux in China).
I was kind of hoping this would be targeted at high density server applications, which could be interesting but doesn't seem to be the case.
Smart kiosks? Really? It's 2017, not 1995.
Intel can keep screwing around with this stuff while Qualcomm+NXP+Freescale slowly captures the rest of the market they don't have yet.
Just embed a tablet with an app in a kiosk.
- Not everyone has a smartphone. I mean, maybe 95% do, but it's still pretty harsh on the other 5% to just abandon them altogether
- Your phone might not have any power
- A mall directory board is maybe ~20 times bigger than my phone - MUCH easier to use
- Much harder to locate whatever website I'm supposed to use than to just walk up to a screen that's already displaying the correct 'site', doesn't pop up all sorts of notifications or start ringing in the middle of the interaction, and doesn't let me get distracted by a million and one other things.
I was actually in a mall the other day, looking for a video directory board, and couldn't find one. I'm not convinced there wasn't one, but I had my phone on me, tried to find useful info on what I thought was the mall's website, and just gave up.
Complete directories and indoor maps for 1250 malls.
Search is easier to use than scanning the giant board, as is having the info where you are instead of having to hunt for a board.
How much money is there really to be made catering to the poorest of the poor, given Intel's position in the supply chain? Intel is a multi-billion dollar company - they didn't get there by caring about poor people, and they won't move the needle on billions in earnings by starting now.
I wonder at what point LCD / LED panels will be so cheap the outside of your fridge is just one panel per door. We already have fridges with cameras inside, so maybe at some point you can browse the fridge without opening it.
Talk about over-engineering! What about glass?!
At home... who needs it? The inside of your fridge is probably the messiest-looking surface in your kitchen; why show it off?
While that's true, most merchandiser refrigerators (as they are known - also "display coolers" is a term) use dual-pane glass on the doors (and I wouldn't be surprised to find triple-pane either).
I've been looking into one of these for a while now; my wife and I want to get one to replace our current side-by-side, because we hardly ever use the freezer side, and we already have a chest freezer in our garage anyhow. The problem hasn't been the cost (commercial fridges aren't really much more expensive than consumer units), but the size - they are too tall for the area we have (without mods to our cabinets). Of course, we have found that new consumer fridges have the same problem!
Our house was built in the early-1970s (block construction, copper wiring and water pipes, and no HOA - so if I want to weld in the front yard on my vehicle while it's on jack-stands overnight - NFG), and so it has the old-style "crappy cabinet over the fridge" area - so we are limited by that, unless we take it out - which may end up being the solution.
The only other downside to a commercial unit is the fact that the warranty is instantly voided if you buy one and install it for home use. Furthermore, service is more expensive (parts, labor, and the whole nine yards). Still, it should be more reliable in the long run, which is what we want. We did the same thing recently with our microwave, purchasing a standard Sharp-branded commercial unit that nearly every restaurant and food service place uses; they are super-tough, and much better quality than a consumer unit. Easy to wipe out, and only one control (100% power, one knob for up to 6 minutes). We haven't had a problem adapting to it - it's so much nicer, and stainless steel!
(PS I am quite happy to have LoweBot follow me home and help me find my glasses.)
- Mid 2017
- Needs a dock to be powered and cooled
- USB-C plus another unnamed connector, looks like this (http://i.imgur.com/887iweg.jpg)
- Built in WiFi/Bluetooth
- Up to 7th gen Intel vPro processor
Actually, my phone isn't much bigger than a credit card, excluding thickness. But this device might be a lot more powerful. Though what "7th gen vPro" actually means for performance isn't disclosed.
> Needs a dock to be powered and cooled
I feel like USB-C gives much more flexibility: it has a well-defined configuration enumeration protocol that's already supported everywhere. There are a lot more options for extensibility and future-proofing with USB-C.
With the onboard storage this is one step closer to my dream: the ability to plug a phone's compute module into a desktop to give me access to a more powerful GPU, extra storage, etc.
My big question: will this card format be an open standard, or will Intel be locking it down?
In terms of process migration I still think Google has missed a trick by not marrying Android and light-weight containers. If I could flick my wrist to send a phone process to my Chromebook, Android tablet, Andromeda device or smart TV life would be pretty interesting. Maybe one gesture for screen mirroring and another to move rather than mirror or copy the process.
This would also open up new channels for malware sure but also cool things like trusted processes / data for authentication. You have an invitation to my party, you say? Dock it with my Google Home device which signed your invitation to prove it, then...
Our next hardware generation will have a dedicated FPGA handling the communication with the GPS, IMU, and servos. The compute platform will communicate with the FPGA over Ethernet. This means that all hardware drivers can be written against a network-based interface, resulting in loose coupling between the compute and IO (see the sketch below). Both the IO hardware and the compute hardware can then be replaced individually without breaking the setup, and without having to rewrite any drivers. It also gives high transparency for debugging IO, with the possibility of easily recording and playing back a scenario. You can even test the full flight software on a developer's desktop machine during development, without the embedded compute platform.
As embedded compute platforms get better all the time, while the IO stays more stable, this will make it easy to replace the compute every 2-3 years.
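To make that concrete, here's a minimal sketch of what a driver against such a network-based interface can look like. The UDP port, addresses, and packet layout are invented for illustration, not any real FPGA protocol:

    # Illustrative only -- the port, addresses, and packet formats
    # are made up; a real FPGA IO board would define its own.
    import socket
    import struct

    IO_BOARD = ('192.168.1.50', 5005)  # assumed address of the IO board
    IMU_FMT = '<Q6f'  # timestamp_us + gyro xyz + accel xyz (32 bytes)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(('0.0.0.0', 5005))

    def read_imu():
        # One IMU sample per UDP packet; the same packets can be
        # captured with tcpdump/Wireshark and replayed later, which
        # is what makes record-and-playback cheap.
        packet, _ = sock.recvfrom(64)
        ts_us, gx, gy, gz, ax, ay, az = struct.unpack(IMU_FMT, packet)
        return ts_us, (gx, gy, gz), (ax, ay, az)

    def set_servo(channel, pulse_us):
        # Command a servo channel on the IO board.
        sock.sendto(struct.pack('<BH', channel, pulse_us), IO_BOARD)

Because the driver only ever sees UDP packets, pointing it at a simulator on a desktop instead of the real IO board is just a matter of changing the address.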
You're applying the same mental model to hardware, which shows in the similarity of the tools that such an abstraction enables. Way to go!
The flexibility that emerges from decoupling our software and hardware solutions is a recent trend that will only gain traction as more people become aware of its implications for how we create solutions to technological problems.
How can we evangelize this better? From my experience it's complicated to communicate this idea well.
Not sure if that's true; it may be recency bias, or the fact that I'm still young and inexperienced in many ways and lack historical background knowledge in tech.
Our opinion is that Ethernet is easier to analyze using tools such as Wireshark, in addition to being easier to run over a cable between different PCAs. It also allows the IO board to be easily networked, for communicating with development tools to reprogram the FPGA memory, debug issues, do testing from a developer machine, etc., without concern for driver issues on various host operating systems.
Another of their summer projects this year was "CoastalShark", a sophisticated autonomous jetski / surface vessel. For their demonstration they used it to pull a guy on water skis:
Then you realize why nobody shows up to your parties. :P
>Then you realize why nobody shows up to your parties. :P
Sure they do, but they're all hackers or cypherpunks. Listen, all three of us had a great time. :-) :-)
But seriously, wouldn't it be fun to host a party where everyone had to hack an invitation?
It has two major downsides right now. There's no Win32 support, so there's not much to run on the desktop, which MS is going to address by bringing x86 emulation to ARM. Secondly, the hardware isn't quite powerful enough, which limits the max resolution and the ability to multitask; the new Qualcomm 835-based phones with 4+ GB of RAM should address that.
It is impressive, but I haven't found a use for it. After all, when you have a laptop, why use it to run apps from your phone?
This is exactly why x86-emulation-on-ARM (http://arstechnica.com/information-technology/2016/11/x86-em...) is a very exciting project from Microsoft.
As mobile chips continue to improve, emulating x86 becomes a viable way of tapping into the huge Windows ecosystem of applications when not running on battery power.
I think this is Microsoft's last shot at becoming relevant in the mobile space.
You could combine it with something like a NexDock to also turn it into a laptop. That would be cheaper than getting a phone, a desktop PC, and a laptop, and it would avoid all those nasty issues with keeping devices in sync.
The Qualcomm 820 already performed at Core M levels, and the 835 is 25% faster, so it should be fine if paired with enough RAM and storage.
The problem isn't power. Phones have been powerful enough for about 5 years (since around iPhone 4s, as a means of dating it).
Android was out at that point, but could barely do video out, by blanking the phone screen. Forget about operating it via other means. And I don't think the iPhone was doing much better.
One reason why I feel the focus on those platforms set mobile device progress back by a decade.
Speaking to retailers recently, the demand is for office apps usable on touch. I.e., if it can possibly be done without a keyboard, people will do it that way. From listening to users, they will moan and complain about touch "productivity" apps... but they will choose them.
Personally, I switched entirely to a phone a year ago... though with a bluetooth keyboard.
Where Intel can win is if a standard x86 Linux distribution could boot off of this with little modification. Right now in the ARM world, unless your device supports device tree configuration, kernels are highly customized per device.
Actually, it's PCMCIA connectors (which have nothing to do with PCI Express).
USB-C is a really cool solution, but a solution for the needs of laptops/other personal devices, not servers.
* Dead simple design.
* Easily hot-swappable without expensive proprietary connectors.
* Low cost solution that provides adequate bandwidth.
This isn't something that will perform better than a proper server with actual PCIe lanes, but it will perform better than a blade from two generations ago that didn't have that kind of backplane bandwidth and it will cost less to manufacture.
A nice side-effect might be that interfaces become more open, and you can run e.g. whatever OS you like inside your smart cheese grater.
I mean, they were trying to sell incremental advances in top speed to people who were buying 20 year support contracts for IBM mainframes to run their legacy COBOL code. They obviously didn't value faster computers, they wanted stability. Having to recompile was just out of the question. And x86-64 was just the thing to get them to upgrade: they could continue running their same crufty code on the same processors that their new code could run on.
The mobile and very low power market, however, will never be taken over by x86. The architecture is just too power hungry, which means batteries need to be large and heavy and more expensive. Phone manufacturers gladly pay premiums for every incremental performance boost, as long as it doesn't consume more power. And the backwards compatibility problem doesn't exist. People throw their phones out every year or two anyway, and nobody has any legacy software they need to keep installed on their new phone. In other words, this market is ripe for ILP, predication, etc., that push off scheduling and branch prediction and other energy hungry tasks onto the compiler.
Funny how they now really want into the very low power market, but have abandoned all the ideas that could have given them an edge.
x86 is harder to decode than ARM, since x86 instructions are anywhere between 1 and 15 bytes while ARM instructions are 4 bytes each. That has quite an effect if you want to decode more than one instruction per cycle (toy illustration below).
x86 has a stronger memory model than ARM, which will come at an extra cost when implementing any sort of out-of-order execution. I'm not very familiar with how it affects the cache implementation.
The x86 instruction set is more complicated than ARM's, and instructions frequently decode into multiple uops. I believe that microcode is used more extensively in x86.
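A toy illustration of the decode-width point (not a real decoder; insn_length stands in for actual x86 length decoding, which involves prefixes, ModRM, SIB, etc.):

    # Toy model of the decode problem. With fixed 4-byte instructions,
    # the offset of instruction N is just 4*N, so a wide decoder can
    # attack many instructions at once. With 1-15 byte instructions,
    # you only learn a length by decoding, so finding where
    # instruction N starts means decoding everything before it.

    def arm_offsets(code: bytes):
        return list(range(0, len(code), 4))  # embarrassingly parallel

    def x86_offsets(code: bytes, insn_length):
        # insn_length is a hypothetical length decoder:
        # (code, pos) -> 1..15
        offsets, pos = [], 0
        while pos < len(code):
            offsets.append(pos)
            pos += insn_length(code, pos)  # serial dependency chain
        return offsets

This is part of why wide x86 front-ends need pre-decode or length-marking tricks that fixed-width ARM decoders simply don't.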
ARM also offer a selection of lower-end devices which have both lower MIPS and lower watts, especially in standby. Intel don't have anything to match. You can't buy a $1 Intel device that draws microamps in standby and comes with a selection of useful microcontroller peripherals.
I wish Intel would stop creating products nobody wants and actually work on shipping their core products on-time.
If a company cuts X employees from Product B, those employees probably are not going to Product A to work (different skill sets, budgets, diversify product line, etc).
You're right on both points, but as a discriminating consumer I will prefer dumb devices to smart ones, and upgradeable smart devices to static ones, simply because it's greener. Unless the fridge refuses to function, it's much more likely to have its network connection disabled completely the first time there's an unpatched vulnerability than to be replaced in its entirety.
>I wish Intel would stop creating products nobody wants
Those are two very different things. I, as a consumer, want this, because I would very much like to upgrade the processor in my laptops without replacing the whole device, for environmental and cost reasons. (I backed the EOMA-68 project for the same reason.)
If Intel can get a few manufacturers on board with the idea, more power to them.
However, they're not common, and I can see how low-end laptops could become docking stations for a compute card. They're going to need an internal USB hub to connect the IO devices to the card's USB-C: disk, screen, touch, keyboard. I wonder how much that's going to degrade performance.
There's going to be a limit to the upgrade options, depending on the cooling capabilities of the laptop.
>- USB-C plus extension connector will provide USB, PCIe, HDMI, DP and additional signals between the card and the device
That sounds pretty interesting. Hoping they don't keep these priced too far above the hobbyist price point.
At least to me, that suggests they will retail in roughly the same price range as the Compute Sticks: $150 to $500, depending on the model.
There’s no word on pricing, and Intel stresses that although anyone can buy a Compute Card when they’re available, you will still need to build the dock to power the device and cool it, and that will likely be outside the realm of possibilities for your average tinkerer.
But somehow I think that the cost of this card plus a home/work dock will be more than the price of 2 laptops.
This actually works really really well for me (on Linux).
I wonder if the HD (or SSD?) connectors are designed for thousands of pull/push cycles. How long have you been doing that?
I read the review at http://www.notebookreview.com/notebookreview/lenovo-thinkpad... and it seems there is a screw in the way of the HD. How quick is it to remove the HD?
I've been doing it for years with no issues. I just leave the screw out.
I still carry it most days, though. And I don't think I'd be a whole lot happier to carry this thing. I can't really put it in a pocket; not if I intend on sitting down.
This stuff costs us something like $35.
With this, I can just bring a card or a stick, and plug it in on my company card/stick reader and start working.
Granted I'd rather just have better gaming capability in a smaller, dockable device like a Surface but I see the merit for sure.
I think there is a big opportunity in creating such an easy-to-use device. I like to think it's similar to traditional services that have gone digital or transformed into web services. This would be the next step: physical objects gaining digital awareness.
I remember when all of a sudden retailers needed to extend their presence into the web. Soon even furniture makers like IKEA will have connected furniture.
Why? It will be cheap and easy to implement. There are benefits that we don't consider important today. What if a smart table could keep track of its owners and its use, almost like an odometer? Perhaps it could be sold or traded more easily. Perhaps this would unlock a new system of trade, if objects could track their use. I wouldn't need to worry about getting a "fair" price or trade if it were more objective. Or a chair monitoring my posture, and my couch recording when my dog decided to tear apart its cushion.
The only way sometimes is to try the internet connected rabbit hutch and see if there is a market, the strangest things sometimes win.
*sigh* How much am I going to have to pay in the future to not have hackable surveillance embedded in my furniture? It's bad enough when it's creeping into televisions and lightbulbs.
Once you get to fridge scale, a low cost tablet tacked on the front gives me better UI, more security updates, and easier upgrades than anything an OEM would put in.
Assuming connectivity is available, storage is effectively infinite.
Power requirements are down and sources are becoming more efficient and plentiful.
Processing is becoming cheap, potentially "free" at some point.
What happens as a result? What becomes feasible that wasn't 5 or 10 years ago? What problems go away when you have massive amounts of compute capability at the point of the problem?
Cheap hardware is pretty much a solved problem - you can get Orange Pi PC2 for $20 with very impressive specs, both computational and connectivity-wise.
The really hard part is creating new devices and hooking them up to these cheap computers.
Software will eat the world and I for one have a dizzying amount of excitement and fear.
The idea of minimum viable computer has always been attractive to me.
That sounds like they've just stopped trying... This is sad, because their processors are far from ideal. For example, in the area of sound synthesis the CPU is still a bottleneck, and people have to compromise on quality or accuracy.
I wish AMD could push the boundaries, but in terms of performance they seem to have abandoned the race as well...
All we really need is a port that can drive all the peripherals at once (we have that on most phones via OTG), the power to run a 1080p display (we have that too, phone displays are usually even higher res than that), and the ability to run common home-use desktop apps (we're getting there, as MS/Google/Apple continue unifying their mobile and desktop OSes).
The next step would be to have all that be wireless, which I'm guessing isn't too far away on the Bluetooth roadmap, or even wifi streaming like Chromecast/Nvidia/Steam.
I'm sure professional use will lag behind, but even that will catch up eventually, especially with the possibility of docks adding hardware to an existing device (like the better video card in the Surface Book's keyboard module).
Came here to see the top post complaining about it. Not disappointed.