remus's comments | Hacker News

It's an apples to oranges comparison.

It's apples to apples if you care about 8-bit (a lot of people do these days).

AFAIK, there wasn't a faster 8-bit super computer to compare to - which is why they made the comparison.


Indeed. Relevant quote from the article:

"The Commission said previously that the simplification plan will focus on reporting requirements for organizations with less than 500 people, but will not touch the “underlying core objective of [the] GDPR regime.”

Adjustments could include limiting requirements to keep records of data processing activities, or reforming how businesses provide data protection impact statements — two rules seen as overly cumbersome to smaller firms."

Sounds pretty sensible to me.


It seems reasonably likely they were using Signal to avoid record-keeping requirements and public scrutiny. If you found a group of employees using Signal with disappearing messages to talk about work outside of your normal work chat (Slack etc.) you'd be pretty suspicious, let alone if they were working in public office!


Oh, absolutely, I try to remind people that even starting this group chat was criminal, because it's an attempt to break public records laws.


100%, we have public records requirements for a reason and they need to be followed. I assume you take the same issue with private email servers.


Yes. This is as bad as or worse than Clinton's email servers. It deserves to be talked about at least as long and investigated at least as thoroughly. Which in the case of Clinton involved bringing it up for nearly a decade, an FBI investigation, an Inspector General's report on the FBI's and DOJ's handling of the case, and a three-year State Department investigation. It's only fair to apply the same standard here.


To offer a concrete example, their choice to delete everything after 7 days is an unambiguously worse aspect. Premeditated spoliation of evidence.


We wouldn't know if a private email was configured similarly, or if emails on a private server were manually deleted.

Either way, yes: using a communication system on the public internet and outside of official documentation processes is bad no matter how you do it. Deleting those communications could be a bad sign, though to be fair it could also be a sign of someone trying to do what they think is more secure (avoiding old messages being leaked or hacked later).


Absolutely. And it’s astounding that Waltz also uses private email for official business.[1]

This is a problem and it needs to be stopped.

1. https://americanoversight.org/investigation/the-trump-admini...


By all means, bring up something vaguely similar from ten years ago to somehow justify the idiotic and criminal actions of the current batch of clowns. Why can't people just say "this is bad and they deserve to be punished"?


How did I justify their actions today? I specifically agreed that it's a problem and referenced a very public case where something similar happened 9 years ago and we still haven't done anything about it.


The other commenter covered this:

"in the case of Clinton involved bringing it up for nearly a decade, an FBI investigation, an Inspector General's report on the FBI's and DOJ's handling of the case and a three year State Department investigation. It's only fair to apply the same standard here"

Does that clear it up for you? Do you still need justification to treat this seriously? Or are you a person unwilling to try and address poor leadership because of the (R) after their names?


I think both should be treated seriously, I'm not sure why you'd assume otherwise from what I've said here.

There should be just as thorough of an investigation into this one, and assuming there isn't that's a miscarriage of justice.

That said, I'm of the opinion that it's great and all that they investigated Clinton's email server, but the fact that nothing came of it is a problem. It absolutely violated the intent of the law in my opinion. The mere fact that they found so many emails with information that should have been marked confidential is, in my opinion, a violation of the intent of confidential-information protection laws.


There was literally no point in referencing it. It's classic "whataboutism" and it adds nothing to the conversation.


Where in my originally raising it did I say anything along the lines of whataboutism, or imply that this situation shouldn't be taken seriously?


Also if you then saw colleagues discussing company secrets or legally privileged information on a group chat, the onus is on everyone in the chat to call that out as wrong.


>the onus is on everyone in the chat to call that out as wrong

that would be in a group of even minimally qualified professionals, not the clowns who got their jobs on a whim of the bigger clown. You know the monkey with a nuclear bomb. I hope we wouldn't see how they handle the real bomb, and for now just the tariffs have like an extremely large nuke just wiped $10T+, and it isn't "just stock market", it is large complicated efficient logistics chains and trade relations that were built over years and were powering this civilization. To compare, the damage from the Ukraine war - you'd need several tens of nukes to produce such damage - is just around a "meager" $1T.


Should be, but the US is run by cults of personality where they defer to people they see as their superiors - that is, "if Hegseth does it, it's ok, right?"


Most large companies monitor devices. You start Gmailing source code to yourself, you get fired.


A very good friend of mine was going through a nasty divorce. Although we weren't talking about the divorce/case, he set our messages on auto-delete. Apparently his (now ex-) wife had SMS and WhatsApp messages presented in court to be used for her case.

Any 'loving message' to her from the early days ("you are so perfect") and any 'nasty message' to others ("oh that bitch!!") sent to anyone was presented in court. So as a precaution he auto-deleted even the messages that were innocuous, just in case they could be used against him: "oh, he wanted to spend money on a new phone/laptop, thus he has money, thus I will take it".


Nothing good comes out of keeping records so it is natural that people do not want to keep them.


Wait, what? Sarcasm or implicit “if you’re a criminal”?

For most people and companies record keeping is important and valuable.


It is sometimes legally mandatory, but in what context does a company ever look again at old Slack chats or work SMS?


I find value in old Slack messages daily. You seriously never consult any records at all in your work?

From what I've gathered, Signal use was prolific among people in this administration and the past one.

I'm not surprised. My own company sends out several emails saying that WhatsApp can't be used as it's not secure, yet I get WhatsApp messages from leadership I work with constantly.

People ignore directives all the time. Usually out of convenience.

People have even called it out in a WhatsApp chat ("hey guys, we're not supposed to use WhatsApp") and people usually ignore it.


That's not at all what it's like to work with classified data, even deep down the subcontractor chain. They constantly drill into you to respect data classification and the consequences of ignoring these rules. Nobody ever does what you describe.


https://www.snopes.com/news/2025/03/27/biden-authorized-sign...

During Biden's administration, CISA even encouraged the use of Signal.

I'm not excusing it, just saying it was, and still is, incredibly common among White House people.


The parent poster is right, though: Signal is permitted and encouraged for any discussion that would've happened over SMS. The issues are somebody dropping details into a channel meant for planning a meeting, as well as somebody accidentally adding somebody who should not have been there.


HN has recently adopted a habit of downmodding inconvenient facts to oblivion but you can easily confirm Signal is approved for this use yourself.


We do not know for sure, but this would be very unlikely. Several of the participants do not need to know this information, and it should be compartmentalized.

Information concerning capability or location should not come near unclassified networks or civilian phones. Somebody could drop or steal that phone or glance over a shoulder which could necessitate cancelling the operation or much worse.


Presumably to keep the repo size reasonable. Say I want to make an ad hoc contribution to a book; if step 1 is "download this multi-gigabyte repo", then that's a fairly big hurdle.


If you're part of the US government, with access to the most sensitive information, which will put people's lives at risk if compromised, then yes, this is a vulnerability, because "Russian GRU agent nicks your phone and scans your Signal QR code" is a real threat.


If you're part of the US government, you're not supposed to use Signal to discuss this kind of stuff.


Bringing in a phone with decryption keys for this conversation is a risk, then, not just Signal's feature set...

I agree it could be hardening to allow users/organizations to disable this feature, and also other features such as automatic media decoding and other mechanisms that are trade-offs between security and usability, but this simply does not meet the definition of a vulnerability (nobody will assign this a CVE number to track the bug and "resolve" it).


imo it's best to just avoid it altogether. Requirements change, and what was once a trusted input can become untrusted input.


Generally you shouldn't be passing random data from the web to shell scripts. Maybe I haven't done the right type of work, but having dealt with the fiddly bits, it's much more likely that not passing it to the shell will cause issues (with stuff like executable paths).
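To illustrate the distinction being discussed, here is a minimal Python sketch of why handing untrusted data to a shell is risky, and the usual safer alternative. The filename below is a made-up example of an untrusted value:

```python
import shlex
import subprocess

# Hypothetical untrusted value, e.g. scraped from the web.
filename = "report.txt; rm -rf ~"

# Dangerous: a format like f"wc -l {filename}" run with shell=True
# hands the whole string to /bin/sh, so "; rm -rf ~" would execute
# as a second command.

# Safer: pass an argument list. The filename becomes a single argv
# entry and is never interpreted by a shell.
result = subprocess.run(["wc", "-l", filename],
                        capture_output=True, text=True)

# If building a shell string is truly unavoidable, quote each
# untrusted piece first.
quoted = shlex.quote(filename)
print(quoted)  # 'report.txt; rm -rf ~'
```

With the list form, the worst case here is `wc` complaining that no file with that odd name exists, rather than anything being deleted.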


In some ways it is surprising, but the examples you gave are only currently straightforward because of massive investment over many years by thousands of people. If you wanted to build ChatGPT from scratch I'm sure it would be pretty hard, so it doesn't seem so unreasonable that you might pay someone if you care about keeping your data around for extended periods of time.


Which law makes this illegal? Presumably the live stream is set up by the Flemish government themselves, and they're all public figures acting in a public capacity. Maybe there is something else here, or some detail of Belgian law, but from the outside there doesn't seem to be much of a privacy argument.


There's a brand-new AI law too. You'd likely need the explicit consent of these people to have their personal data (their faces) processed by AI.


Facial detection is processing of article 9 GDPR data which requires explicit consent.


> It sucks.

It depends on your perspective, surely? As a lawyer, your job is typically to protect your client from legal risk, so if users are happy to sign a really expansive set of terms (which experience shows is the case) that grants lots of permission to do stuff with their data, then that's low risk. If you as a business don't want that, then you need to make it explicit that you're willing to take on some extra risk.


Also by using “standard boilerplate” they are using language with meanings well established by precedent. Craft your own version in “regular English” and it’s much more open to litigation.


> It is very very slow

Could that be partially due to a big spike in demand at launch?


Possibly. Repeating the prompt I got a much higher speed, taking 20s on average now, which is much more viable. But it remains to be seen how it holds up once more people start using this version in production.

