Hacker News | SamInTheShell's comments

You miss the point. This is a law enforcement tool. The average American doesn’t want a surveillance state and that’s literally what’s happening. The legal aspect of it is not in question here.

Just because something is legal doesn’t make it right. Anyone deploying or involved with this technology should be embarrassed and ashamed of themselves.


That's not the point the video makes. Flock didn't invent CCTV. Not that I am trying to defend mass surveillance or incompetent silicon valley companies.


Flock "invented" CCTV in the USA that doesn't require going to multiple locations and asking for their tapes in order to track someone across locations.


I mean - no, they didn't, Ring existed long before Flock (but has since removed the police "request everything in the area" feature.)

Flock just brought this to the public right-of-way.


Rather just see them get Flocked honestly. Seems like the type of tech a child would dream up only to realize when it's too late that it's dystopian, creepy, and a detriment to society.


Building the torment nexus...


Nothing will be done until one of the investors in the tech ends up embarrassed by weaponization of the tech against themselves. These people have no clue how creepy some of their technological betters can be. I once witnessed a coworker surveilling his own network to ensure his girlfriend wasn't cheating on him (this was before massive SSL adoption). The guy had just gotten a role doing networking at my company, and thankfully he wasn't there for very long after that.


> Nothing will be done until one of the investors in the tech ends up embarrassed by weaponization of the tech against themselves.

I propose that it become mandatory for all senior management, board members, and investors in Flock to have these Condor cameras and their ALPR cameras installed out the front of their houses, along their routes to work, along the route to nearby entertainment precincts, outside their children's school, and at their spouse's workplace (or places they regularly visit if they don't work) - all of which must be unsecured and publicly available at all times.

(Yes I know, I'm dreaming. I reckon every Meta employee's children should be required to have un-parental-controlled access to Facebook/WhatsApp/Messenger/et al...)


Flock is a YC startup.

We have met the enemy and he is us -Pogo


Are we the baddies?


Yep


I am the "Y-combinator". Do you have any questions?


  no questions asked  
  go eat yourself now   
  or at least your own dog food


As O’Brien passed the telescreen a thought seemed to strike him. He stopped, turned aside and pressed a switch on the wall. There was a sharp snap. The voice had stopped.

Julia uttered a tiny sound, a sort of squeak of surprise. Even in the midst of his panic, Winston was too much taken aback to be able to hold his tongue.

‘You can turn it off!’ he said.

‘Yes,’ said O’Brien, ‘we can turn it off. We have that privilege.’


It's 2025. The ISP gateway I got comes with more default security than these cameras. The barrier to entry on security is lower than it ever has been in history. Whoever let this past the QC phase is an idiot.


> Whoever let this past the QC phase is an idiot.

It's all a matter of perspective. I'm sure to some executive somewhere, the people who approved all of this are seen as heroes, as they shaved 0.7% or whatever off the costs of development, and therefore made shareholders more money.

Until there are laws in place that make people actually responsible for creating these situations, it'll continue; for a company, profit goes above all.


It probably makes close to no difference in development or production, but it does significantly cut down on the number of tech support calls from people who can't figure out how to set the password, or immediately forget the password they set. If it has no password then you can just plug it in and have it work. Sure, it's totally insecure, but it's also trivial to install.


Generating a password that is unique to the device and printing it on a sticker on the underside of the device isn't exactly rocket science, and ISPs somehow figured this out at least two decades ago, which was the first time I came across it myself. Surely whoever developed this IP camera has an engineering department who've seen something like this in the wild before?
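The scheme ISPs use can be sketched in a few lines: derive the sticker password deterministically from the device serial number and a factory-held secret, so the factory never has to store anything per-device. Everything below (the secret, the serial format, the length) is a hypothetical illustration of the idea, not any vendor's actual process.

```python
import base64
import hashlib
import hmac

def default_password(serial: str, factory_secret: bytes, length: int = 10) -> str:
    """Derive a stable, device-unique default password from a serial number."""
    digest = hmac.new(factory_secret, serial.encode(), hashlib.sha256).digest()
    # Base32's alphabet has no 0 or 1, which cuts down on sticker-reading ambiguity.
    return base64.b32encode(digest).decode().rstrip("=")[:length]

# Hypothetical secret; it lives at the factory, never on the device itself.
secret = b"example-manufacturing-secret"
print(default_password("CAM-00017", secret))
print(default_password("CAM-00018", secret))
```

Each device gets a different password, the factory can re-derive it for support calls, and a leaked password from one unit tells an attacker nothing about the next one off the line.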


Yep, but if you do that you need to staff a help line with people who can say "turn the box over and look at the sticker, no the sticker with the numbers on it, it's white with black letters and says PASSWORD in a big font, no the password isn't literally PASSWORD, it's the line below that with the strange letters, yes, to type that one you need to hold the shift key and press 3..."

Remember that ISPs often have people who come to your home to hook stuff up.


Yes, which costs money, which is exactly my original point. It's not because "Oh I'm so hassled because customers are dumb", it's "No, hiring people to do support would cost us money, which we don't want".

> Remember that ISPs often have people who come to your home to hook stuff up.

I can't recall a single time a technician wasn't required to come to my flat/house to install a new router. I'm based in Spain; maybe it's different elsewhere, but I think it's pretty much a requirement: you can't set up the WAN endpoint or ISP router yourself.


Last time I moved I opted for the "self install" kit, which was fine because I'm technical and the previous owners already had the service so there was nothing that needed to be done except hooking up the pre-configured modem. Saved me $200 in truck roll fees.


Interesting stuff. I've asked if I could do the installation myself every single time I've moved to a new place, and never has the ISP (three different ones) said yes. There isn't any installation fee here (probably by law?) so that isn't an issue; it's just a hassle having to coordinate a visit between 12:00 and 18:00 or some super wide range of time for them to come and install it.


In the US for the past 5+ years Xfinity/Comcast, Charter, and whatever CenturyLink is called these days have all heavily pushed the "self-install kit" option vs traditional tech install each time I've moved.

Worked 4/5 times (all with cable); the only time it failed was because I had apparently subscribed to a DSL plan from CenturyLink without realizing it, and they needed to wire up the extra lines upstream for the "modern" version of DSL to work in my apartment. This was after they insisted multiple times that the self-install kit was 100% plug-n-play at my new address, despite my intense skepticism, since I really needed reliable internet from day 1 of COVID remote work.

I was seriously missing Comcast/cable by the time that 1 yr contract was up, the devil you know and all...


Yep. Until we start holding decision makers responsible for the consequences of their decisions, they will always choose the selfish option.


So you're trying to justify this type of rampant negligence in tech? Do you think justifying such malfeasance makes up for the fact that we literally have surveillance networks that bad actors can tap to do really awful things?

Anyone that cares about their perspective has missed the point.


I don't think the person you're replying to is justifying it, but saying there are no laws to prevent the abuse.

Personally I think tech CEOs should be put in stocks in the town square on the regular, but they're protected from any form of repercussions besides extreme cases of fraud. Even then, they're only held accountable when the money people have their money affected, not when normal people are bulldozed by the abuse.


If I was 10 years younger, I might agree that they aren't justifying it, but I have enough experience with passive speech to just not let it pass anymore.

Regarding remedy, we really need laws on this stuff yesterday. The problem is that we'd have to gut First Amendment freedoms for some of this stuff, which won't go anywhere because there will always be too much overreach with today's representatives.


You should probably read the comment you're replying to before replying

> Until there are laws in place that make people actually responsible for creating these situations, it'll continue; for a company, profit goes above all.

They obviously meant that we ought to be holding these people responsible.


> You should probably read the comment you're replying to before replying

Congrats, you spotted the thing we agreed on between comments. If you fail to see the agreement through the parity of the part that was echoed, idk what to tell you. The education system is failing everyone in it these days.


> So you're trying to justify this type of rampant negligence in tech?

Don't know how you reached that conclusion; I'm obviously not trying to justify anything. But maybe something I said was unclear? What exactly gave you the idea I'm trying to justify any of this?


Nothing against you personally, just so you know. But I have to point out that anyone caring about the reasons for the shortcomings of Flock on stuff like this is just crafting soft reasons they can use to justify things later. Being up front here: I care not for their reasons, because the entire business model is frankly disgusting and an affront to a functioning society. This is the type of tech that evolves into social credit scores and precog crime units, stopping crime before it happens.

At the end of the day your rationalization only affords comfort to those that have a vested interest in this stuff being successful and it needs to be clear to those people driving this that they’re not doing something popular or even good.


An explanation is not a justification.


Why stick your neck out, swim upstream to do a good job that will not be recognised as such?

Fix the corporate incentives and engineers will be able to do the right thing without suffering. Not everyone gets the luxury of a secure career doing morally ok things.


Counterpoint: whoever let this past the QC phase got paid very generously, and everyone involved is ignoring the laws that already exist to combat this, because law enforcement, too, gets paid generously. And the laws that forbid that aren't getting enforced because the police doesn't police the police, and dad has made it perfectly clear that flagrantly ignoring the law is fine if you're in power.


What makes you think QA/QC is paid handsomely? It's a bloody cost center mate, and you can't measure "damage prevented" consistently, or at least in a way most high-risk tolerating exec types won't immediately undermine.

t. Former QA veteran


Kinda scratching my head as to why this has a backend... uses ruby... needs a database. It's cool though.


Seem to recall license plates are required to be illuminated as well. What's stopping someone from just adding an additional IR light to those enclosures? Couldn't you just slap in an additional bright-enough IR light that makes it impossible to even see the plate clearly through cameras?

Personally, if I cared enough to obfuscate my plate info from these devices, I would just taint their data by wrapping my car in a wrap with various license-plate-themed art. I like cars, and the exterior has traditionally been treated like art. Tainting data is just as effective at making the core dataset useless as omitting data in the first place.


> What's stopping someone from just adding an additional IR light to those enclosures

Nothing.

> Couldn't you just slap in an additional bright-enough IR light that makes it impossible to even see the plate clearly through cameras?

You could, but it will only work at night (and even then, I don't know if the amount of light you could concentrate in that area would be enough to blow out the letters), because all of these cameras have switchable IR cutoff filters.


Surely the sensor would detect the IR separately from the visible light and could easily filter it out?


Most digital cameras see farther into the infrared than humans do, and that can do some odd things. It made the news with the Sony(?) camera that had a mode specifically to use this: turns out it sort of sees through many swimsuits. Or a video I've seen of firefighters caught in a burnover: the fire looked very weird!


Genuinely don’t know. The hack here is exploiting overexposure artifacts from the camera sensor. As to whether they have mitigating features to filter non-visible light, I’m actually curious.


I doubt it, that is a lot of extra cost against an attack that doesn't exist today.


I just imagine the most hilarious form of this idea being a panel that sits behind the plate as part of the car, containing an array of IR LEDs that flood everything behind the car with invisible light. Imagine going outside, seeing nothing, then popping open your phone's camera app and everything is illuminated for some reason. Would be wild.

Edit: I have no concept of what camera sensors are doing these days.


Don't think that really works. I think that's been debunked.


Which thing?


The thing I really don't get is why people are devaluing their own voices with slop. Too many people are just pressing what they think is an "easy" button and calling it good. What they lose in the process is their voice, their opinions, and some of the intangible humanity that was encoded in their content.

At first I just blocked the people doing it, but today I just stay off the platforms where I'm seeing these people. If things stay the same, in the future I can only imagine that small gated communities are where real humans are communicating (think places like lobste.rs, but for normies). Smaller Discord communities are still working.

It's just wild to me how some platforms are fine with bots fluffing content creation. There comes a point where people will realize all the messaging on platforms like Facebook is AI generated, and they'll just leave for greener pastures.

I believe people really want to connect with real people. It just can't be done if people aren't being themselves. I think there are a lot of creative people that are up in arms against the right things, but for the wrong reasons. I don't think the legal IP implications are as damning as the societal implications of everyone filtering their speech through AI.

Sorry, this came off way more ranty than I would like, but it's been bubbling in my head for a while now. So much so I'm actively doing something about it.

Personally, the only place I filter my own speech through an AI is when I couldn't care less about the person I'm communicating with but need them to get the message (typically because I have too many four-letter words to say about the subject matter).


> The thing I really don't get is why people are devaluing their own voices with slop. Too many people are just pressing what they think is an "easy" button and calling it good. What they lose in the process is their voice, their opinions, and some of the intangible humanity that was encoded in their content.

Have you considered that people producing material that is regarded, by the creator or audience, as "content" very often aren't trying to encode their voice, their opinions, or any of their intangible humanity in it? They are working a system and seeking to provide what the system demands. They aren't sacrificing those things; those things were never the point of the activity.

(OTOH, the people who do see themselves as creating art, including those that use AI among their tools, probably do see that as the point, but they aren’t doing what they see as pushing an "easy" button.)


Really doubt most people put much thought into it at all. Critical thinking is a dying skill.


Kinda funny if it returned an ad for puppy chow. Realistically I doubt an ad presented would actually be tied to the context beyond the seed used for a vector lookup.


I tried out EmbeddingGemma a few weeks back in A/B testing against nomic-embed-text-v1 and got way better results out of the nomic model. It runs fine on CPU as well.
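For anyone curious what such an A/B test can look like: the harness below scores any embedding function on top-1 retrieval accuracy over a small labeled set, so two models can be compared on the same data. The toy character-count embedder, the sample docs, and the metric are all illustrative assumptions; in practice `embed` would wrap a real model's encoding call (e.g. from sentence-transformers), not this stand-in.

```python
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def top1_accuracy(embed, queries, docs, relevant):
    """Fraction of queries whose labeled-relevant doc ranks first by cosine."""
    doc_vecs = [embed(d) for d in docs]
    hits = 0
    for query, rel_idx in zip(queries, relevant):
        qv = embed(query)
        best = max(range(len(docs)), key=lambda i: cosine(qv, doc_vecs[i]))
        hits += int(best == rel_idx)
    return hits / len(queries)

def toy_embed(text: str) -> np.ndarray:
    """Toy letter-frequency 'embedder' standing in for a real model."""
    v = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            v[ord(ch) - ord("a")] += 1
    return v

docs = ["dogs chase cats", "stocks fell sharply", "rain expected tonight"]
queries = ["cats and dogs", "stock market drop", "tonight's rain forecast"]
print(top1_accuracy(toy_embed, queries, docs, relevant=[0, 1, 2]))
```

Running the same queries/docs/labels through both models' embedders and comparing the scores is the whole test; the only subtlety is that some models (nomic's included) expect task-specific prefixes on queries vs. documents, which the wrapper functions would need to add.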


Not a lawyer, but if you're talking USA, would recommend checking regularly for updates: https://www.copyright.gov/ai/

Even if AI output isn't copyrightable, could you really argue a solid case if I taint random parts of the source code with my own isms? This is something I do: I don't care if the AI-generated portions of my code base are copyrightable, and perhaps my licensing is not valid for those portions, but throughout my code are parts I hand-crafted or patterned out because the AI just can't get it right. Those snippets are proverbial land mines waiting for a copyright infringer.

