Experiments and leaking company secrets through your testing infrastructure (jonlu.ca)
92 points by jonluca on Feb 25, 2019 | 20 comments



Since these comments imply that research firms already mine this, I think there's a hilarious opportunity for rank-and-file engineers at these tech companies to vend fake features/experiments (in which nobody receives the treatment), and watch hedge funds overstep and then get the rug pulled out from under them. At the very least, it would cheapen the practice of mining experiment names without the company having to implement any substantial fix.

Starting points could be DRIVERLESS_DRONE_KILL_BOTS or IR_CALL_DISPLAY_BANNER_RECORD_40_PCT_PROFIT_MARGIN.


Very cool! I find it fascinating to see what experiments other companies run, and especially what the winners were.

I've had a similar idea on my mind for a while now: find sites using Optimizely (e.g. with NerdyData), automatically screenshot the variations, and revisit later to see which variation was chosen as the winner.
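A rough sketch of that loop, assuming Playwright as the headless browser (any would do); the URL is a placeholder, and the later "revisit and diff" step is left out:

    from playwright.sync_api import sync_playwright

    URL = "https://example.com"  # placeholder for a site found to be running Optimizely

    with sync_playwright() as p:
        browser = p.chromium.launch()
        for i in range(5):
            # A fresh context has no cookies, so each visit can be bucketed
            # into a different variation.
            context = browser.new_context()
            page = context.new_page()
            page.goto(URL, wait_until="networkidle")
            page.screenshot(path=f"variation_{i}.png", full_page=True)
            context.close()
        browser.close()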


These kinds of side channels often contain alternative data which is of very high interest to the financial industry. I don't think most companies leaking this information are aware of it, but it's very actively mined.


Who mines it?


Private equity firms, hedge funds, and any other firm that invests at large scale will use every advantage they can get to understand whether a company or a competitor is testing new things.


Research firms mine it, then sell the raw or value-added (analyzed) data to hedge funds. Some hedge funds (like Two Sigma) build out teams/divisions internally which are dedicated to this kind of data mining.


JFYI: iOS does not use OpenSSL (however, an app might choose to bundle it). I believe it uses a FIPS-compliant custom TLS implementation in Security.framework. I believe ssl pinning is circumvented on iOS by leveraging the objective-c runtime to hook the callbacks that an app would use to inspect the remote peer certificate. More info: https://www.guardsquare.com/en/blog/iOS-SSL-certificate-pinn...


My memory must have failed me; you're correct. I used to use [iOS kill switch](https://github.com/iSECPartners/ios-ssl-kill-switch), and for some reason assumed it overrode OpenSSL! Thanks, I'll correct the article.


> I believe ssl pinning is circumvented on iOS by leveraging the objective-c runtime to hook the callbacks that an app would use to inspect the remote peer certificate.

I believe this only works if apps are using their own custom stack to perform this validation. If they are using the iOS TLS implementation, you will need a jailbroken device to get around it.


Discord keeps it on the down low:

https://discordapp.com/api/experiments produces

    {"assignments": [[1927765909, 0, 0], [2969373038, 0, 1], [518926094, 1, 1], [3089664276, 0, 1], [3747495958, 1, 1], [600804427, 2, 1], [2078807847, 2, 1], [372914062, 0, 0], [4002407165, 0, 1], [2827351180, 2, 1], [183948708, 2, 1]], "fingerprint": "..."}
If you prettify their main JS bundle, you can figure out what these map to.
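A minimal sketch of pulling those assignments, assuming the endpoint still answers an anonymous GET with the shape shown above (what each integer in a triple means is a guess, not anything Discord documents):

    import requests

    # Fetch Discord's public experiment assignments. The first value in each
    # triple looks like a hash of the experiment name, which is what you'd try
    # to match against strings in the prettified JS bundle; the trailing two
    # integers appear to be bucketing fields.
    resp = requests.get("https://discordapp.com/api/experiments", timeout=10)
    resp.raise_for_status()
    payload = resp.json()

    for triple in payload.get("assignments", []):
        print(triple)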


This is great!

Part 2 is going to be more analysis-focused: checking the delta in how many experiments these companies are running, and how quickly they are moving.

I'll also mention how to properly disguise it - I'll use Discord as an example. Thanks!


You should make another post that explores private APIs and (undisplayed) data sent to mobile clients. For example, many (perhaps most?) delivery apps have an API endpoint that will send the client information about every single restaurant location within a bounding box. You can often make the bounding box arbitrarily large to pull down a JSON list of every location and its metadata (like store hours, age, number of orders per day, menu, unique promotions, etc.).
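The pattern is roughly the following; the endpoint, parameter names, and response shape are entirely hypothetical, just to illustrate the oversized-bounding-box trick:

    import json
    import requests

    # Hypothetical endpoint and parameters - a real delivery app's private API
    # will differ. The bounding box is simply made large enough to cover
    # roughly the continental US.
    API = "https://api.example-delivery.com/v1/stores/search"
    params = {
        "min_lat": 24.0, "max_lat": 50.0,
        "min_lng": -125.0, "max_lng": -66.0,
    }

    resp = requests.get(API, params=params, timeout=30)
    resp.raise_for_status()
    stores = resp.json()

    # Dump whatever metadata the API returns (hours, menus, promotions, ...).
    with open("stores.json", "w") as f:
        json.dump(stores, f, indent=2)
    print(f"Pulled {len(stores)} locations")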


This is a pretty common thread in a lot of my blog posts - with the rise of SPAs and the ubiquity of REST APIs, all the difficulties of "scraping" are gone. These services will often open up a route that lets you pull everything down in an easy-to-digest JSON format.

https://blog.jonlu.ca/posts/ryan-air and https://blog.jonlu.ca/posts/uber-stats are two blog posts of mine that follow a very similar path.


I always like these things. People have realized that absolute secrecy is very expensive, but doesn't buy you much. So they give up and just let the information out.


There was once a bookmarklet to explore and preview Optimizely experiments: https://growthhackers.com/questions/show-gh-spy-on-optimizel...


I'm curious if this violates any laws. Perhaps the Computer Fraud and Abuse Act?


The real-life equivalent would be a company writing its secrets on the back of a letter it sent you in the mail, and just relying on you not looking at the back.

There's no security breach, or even an unauthorized network request - their own client is requesting this data and receiving it on your machine. I think they just aren't thinking about the potential repercussions of their test names.


Aaron Swartz didn't breach JSTOR's security or make any (technically) unauthorized network requests, either.

https://en.wikipedia.org/wiki/United_States_v._Swartz


How would it? The API endpoint is sending the data to the user's device anyway, because the mobile client initiates a request to the API by design.


Inspecting JSON payloads that are returned during normal use of the app doesn’t seem like an illegal activity.



