Hacker News

Funny, you seem to be working on systems very similar to Palantir (twitter handle on HN profile):

https://twitter.com/FogbeamLabs/status/1086757478312960002

What safeguards do you have in place to ensure that the personally identifiable information held by companies using your tech is not aggregated and pulled together for nefarious use, for example highly targeted lead sourcing or targeted advertising?

Oh, none, you actually advertise that you are mining all the databases for:

> Prospect and identify leads

But I understand that it can be hard to see certain things when your income depends on you not seeing them.




At the end of the day, the only aspect of what we do that is really anything like Palantir is that we build a search index, using Lucene. That's it, an inverted text index. There really is no meaningful way to know that the data being put in is PII, or to regulate how the orgs that use it do so. And even if we did put in any such safeguard, all our stuff is Open Source, meaning anybody could rebuild it without the safeguard, and we'd be none the wiser.
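(For readers unfamiliar with the term: an inverted index is just a map from each term to the documents that contain it. A toy sketch in Python, purely to illustrate the data structure, not Lucene itself:)

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document IDs containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

docs = {
    1: "open source search index",
    2: "search engines build an inverted index",
}
index = build_inverted_index(docs)
print(sorted(index["index"]))     # -> [1, 2]
print(sorted(index["inverted"]))  # -> [2]
```

Nothing in that structure knows or cares whether the tokens it stores are PII; that context lives entirely with whoever feeds it data.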

The differences between us and Palantir then:

1. We don't pitch our software to intelligence agencies, law enforcement, etc., or encourage its use for these kinds of ends. But we can't specifically block those uses, or we'd be in violation of the Open Source Definition (OSD).

2. We've been very public with our unwillingness to embrace working with intelligence agencies and the like. See, for example: https://www.wraltechwire.com/2014/04/30/why-a-triangle-tech-...

3. Everything we do is Open Source, meaning that at least the public can take a look inside and see what's going on... modulo any changes a given end user organization makes and keeps private.

4. Our technology is positioned primarily for internal knowledge management / collaboration use inside organizations. But, again, we have no means, legal or technical, to stop somebody from using it for other purposes. And even if we did, they could just download Lucene, ManifoldCF, blah, blah, etc., and build up their own Nefarious Indexing System.

> But I understand that it can be hard to see certain things when your income depends on you not seeing them.

There is nothing in this regard that we "don't see". Taking your argument to its logical conclusion, even a worker mining sand somewhere, to be used to fabricate silicon chips, which can be used to power computers, which can be used to run privacy-violating software, is "guilty". I don't think I need to point out the absurdity of that position. Furthermore, if we really just cared about "get all the money at any cost", we would have immediately jumped at the chance to talk to In-Q-Tel and the possibility of juicy, rich contracts supplying the CIA and their brethren with technology.


> There really is no meaningful way to know that the data being put in is PII, or to regulate how the orgs that use it, do so. And even if we did put in any such safeguard

Since you advertise that your tech is suitable for lead sourcing, you obviously don't see anything wrong with mining and linking databases of PII, as long as it's done by "the good guys".


> Since you advertise that your tech is suitable for lead sourcing

We don't. The line you quoted above is from my LinkedIn profile, where it describes my responsibilities as founder of the company. So, of course, part of what I do is prospecting and identifying leads. All companies do that, and I don't think anything we do falls into the "nefarious" range. We aren't using retargeting or buying user information from data mining companies, etc. Our prospecting basically starts and stops with Twitter and LinkedIn... not exactly spook stuff.

And while I can respect that some people take things to such extremes that they can even find mundane things like advertising unethical, I'm pretty comfortable with my own sense of ethics and our attempt to do the right things. In either case, attempting to draw any parallel between us and Palantir is an exercise in absurdity. Notice that there's no HN top-page story titled "Fogbeam Labs' Secret User Manual For Cops", etc. :-)


I apologize for reading the LinkedIn line in the wrong key.


No worries. I understand where you're coming from. And believe me, I've spent a not inconsiderable amount of time thinking about these issues. I tend to look at raw technology as being "ethically neutral", but it does bother me that there doesn't seem to be any way to truly ensure that tech is only used for noble / beneficial ends. But I don't feel like I can let that stop me from working on tech in general... the only alternative seems to be to turn Amish or something. And somehow I'm not quite comfortable with that.



