
Intel Management Engine JTAG Proof of Concept - adulau
https://github.com/ptresearch/IntelTXE-PoC
======
codedokode
Why does Intel encrypt the code? It is unlikely that anyone would steal it
because it is usable only with Intel CPUs and is protected by copyright. So
the reasons are either that Intel wants to hide something (code that changes
CPU behaviour when a benchmark is detected?) or that it wants features that
cannot be disabled or modified by the user.

It is difficult to believe that there is no backdoor.

~~~
nickpsecurity
There are two reasons I've seen companies obfuscate as much of their hardware
and firmware as possible:

1. Protecting their trade secrets. Even if it's eventually leaked or copied,
the time between introducing the product and a clone appearing might represent
billions in sales.

2. Reducing patent damages. Hardware is a patent minefield, with suits and
licensing deals pretty common. The companies that want to troll need to see
clearly that their patents were infringed. Making the product as opaque as
possible reduces the number that will notice in the first place and/or how
certain their claim will look. Patent suits are the single greatest threat to
open-source hardware companies that operate in an existing ecosystem.

~~~
ratmice
IANAL, but I find 2. weird; my first thought was that obfuscation would give
the opposing litigant an argument for willfulness, potentially increasing
patent damages.

I.e., it may reduce the total sum of patent damages across all potential cases
by reducing the number that result in litigation, while increasing the
likelihood of higher damages after an unsuccessful defense.

Can't help but think 2. is really a double-edged sword.

~~~
nickpsecurity
Well, I know they do No. 2 and it seems to work a bit. My guess is they never
bring up that motivation. They just say they hid their design to stop Chinese
copycats and delay hackers.

------
pingec
Does this help with reversing the Intel ME and/or figuring out how to disable
the potential spyware parts of it?

~~~
1996
Tremendously. It is a huge first step. I worked on BIOS stuff before. I
decided to wait until AMT was more exploitable to come back to the scene.

This is big news. It should be #1 on HN, instead of the failed pentest story.

Right now I am ordering a GB-BPCE-3350C to replicate their work and see how to
extend it to other platforms. Intel will limit the vulnerability. No time to
wait.

A modern AMT-free laptop would be of great value.

~~~
Sephr
This isn't as highly upvoted because it's old news from two months ago.

I remember this being very well received when it was first published on
Twitter.

~~~
1996
I should have followed more closely.

I get most news here now.

~~~
craftyguy
Consider multiple news sources. Getting info only from HN must leave you with
a feeling of 'Facebook rules the world, blockchain all the things, ICO or
GTFO, apps are only written in Go or Rust, Tesla Tesla Tesla'

------
zzo38computer
Can we disable the Management Engine entirely, both to avoid wasting energy
and to avoid running stuff that we don't know about and don't need?

~~~
bubblethink
Only partially for new machines. For pre-Nehalem ones, you can disable it
fully. You may be interested in this:
[https://github.com/corna/me_cleaner](https://github.com/corna/me_cleaner)

------
therein
That was a fun read. Interesting trick with the debug USB cable that has some
of the pins isolated.

~~~
rzzzt
I don't quite get how it works. A quick search tells me that USB 3.0 indeed
allows A-A crossover cables to connect two hosts:
[https://superuser.com/a/945523](https://superuser.com/a/945523)

...but this leaves only a single conductor to do the entire thing?

~~~
AnssiH
A USB3 A connector has 9 pins, not 4 like USB2, so there are still 6 left.

~~~
snops
And those 6 pins are separate TX and RX pairs, plus 2 grounds. The pins
disconnected are 5V, which could damage things or trip an overcurrent flag
(turning off both ports) if it were connected on both ends, and D+/D-, the
bidirectional USB 2.0 data pair. That pair normally has fixed host/device
roles and needs a special dual-role controller (like the one in your phone),
so it's easier to just disconnect it. The USB 3.0 TX/RX pairs are each one-way
(think RS-232 null modem, but much faster), so crossing them over is much
simpler.

This is a good example of how USB3.0 works almost like a separate, parallel
bus to USB2.0.

------
tempodox
So do we finally have reliable backdoors into any and all chips?

------
dafrankenstein2
is there any specific reason for using python 2 instead of 3 or it is just how
you guys do it

~~~
Freak_NL
Python 3 is ten (!) years old and perfectly usable, but it suffers from a
well-known but curious problem where people who occasionally use Python for
some light scripting tend to go for Python 2.7. Rarely is it a choice made
because of library availability; more often it is because of the perceived
ubiquity of Python 2.7 installs on the target user's computer, or simply
because of a mindset where Python 2.7 is good enough and getting the hang of
Python 3 seems like a huge obstacle (it isn't).

I'd love to know why this project specifically chose Python 2.7 too. Python 2
reaches its end of life in 2020; any new Python code should really be written
in Python 3.

By the way, if you want to prevent getting a bunch of downvotes like your
comment did, be polite! Not including any capitalization or punctuation is
considered downright rude by many. Your text reads like a robot's printout.

~~~
josefx
You forgot one reason not to switch from 2 to 3: zero benefit. Nothing Python
3 does would benefit my scripts to any extent over just using Python 2 as I
always did. Is learning 3 a huge hurdle? No; in some cases, however, it is
completely unnecessary.

~~~
rjeli
Yeah, for writing scripts like this - Python 2: 0% chance of thinking about
Unicode. Python 3: 5% chance I have to waste time debugging some random str
decoding issue, with no benefits. Why bother?

~~~
shawnz
This might be true if you're an English speaker, running the script on an
English platform, and only consuming data from English services. And also you
are sure that no non-English speaker will ever take over the development of
your script, and that you will never have to localize it to other languages.
Otherwise, it's exactly the opposite: Python 3 is what you should be using if
you want a 0% chance of being stopped by a string encoding issue.

Python 3 might occasionally require some extra steps when consuming strings
compared to Python 2, but the reality is that those steps were always
necessary. Python 2 just hid those details in a way that was only really safe
for English-exclusive development. That doesn't mean that Python 2 is easier
to use or less brittle. In fact, I would say it means the opposite.
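
To make that concrete (my own minimal sketch, not code from either commenter):
in Python 3 the bytes/str boundary is explicit, so a wrong encoding fails
loudly at the decode call instead of silently producing mojibake later:

```python
# Python 3 separates bytes (wire/disk data) from str (text).
# Decoding is an explicit step, so encoding bugs surface at the boundary.
raw = "Grüße".encode("utf-8")  # bytes, as they would arrive from a file or socket
assert isinstance(raw, bytes)

text = raw.decode("utf-8")     # the explicit step Python 2 used to hide
assert text == "Grüße"

# A wrong codec raises immediately rather than corrupting data downstream.
try:
    raw.decode("ascii")
    failed_loudly = False
except UnicodeDecodeError:
    failed_loudly = True
assert failed_loudly
```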

~~~
josefx
For most strings I don't care about language, encoding, or related overhead.
In my scripts they are best dealt with as opaque bytes, with a few specific
byte patterns that are the same in ASCII and UTF-8, as well as various other
encodings.
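
A minimal sketch of that opaque-bytes style (my own illustration, with
made-up marker names): match ASCII-compatible byte patterns directly on the
raw bytes, and the encoding of the rest of the payload never matters:

```python
# Treat the payload as opaque bytes; only ASCII-compatible markers are matched.
# The bytes between the markers may be in any encoding - we never decode them.
payload = b"HEADER:\xc3\xbc\xff\xfe;END"  # deliberately not valid UTF-8

start = payload.index(b"HEADER:") + len(b"HEADER:")
end = payload.index(b";END")
body = payload[start:end]
assert body == b"\xc3\xbc\xff\xfe"  # extracted without ever decoding
```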

The last Unicode issue I had was with German characters on one system, because
some library assumed it had to explicitly perform encoding with a bad default
setting. If the library hadn't tried to be smart, the program would have
worked independently of system or language; instead it failed on any
non-English system by trying to convert a perfectly fine, system-specific
encoding to UTF-8.

