"The Commission said previously that the simplification plan will focus on reporting requirements for organizations with less than 500 people, but will not touch the “underlying core objective of [the] GDPR regime.”
Adjustments could include limiting requirements to keep records of data processing activities, or reforming how businesses provide data protection impact statements — two rules seen as overly cumbersome to smaller firms."
It seems reasonably likely they were using Signal to avoid record-keeping requirements and public scrutiny. If you found a group of employees using Signal with disappearing messages to talk about work outside of your normal work chat (Slack, etc.), you'd be pretty suspicious, let alone if they were working in public office!
Yes. This is as bad as or worse than Clinton's email servers. It deserves to be talked about at least as long and investigated at least as thoroughly. Which in the case of Clinton involved bringing it up for nearly a decade, an FBI investigation, an Inspector General's report on the FBI's and DOJ's handling of the case and a three year State Department investigation. It's only fair to apply the same standard here.
We wouldn't know if a private email was configured similarly, or if emails on a private server were manually deleted.
Either way, yes, using a communication system on the public internet and outside of official documentation processes is bad no matter how you do it. Deleting those communications could be a bad sign, though to be fair it could also be a sign of someone trying to do what they think is more secure (avoiding old messages being leaked or hacked later).
By all means, bring up something vaguely similar from ten years ago to somehow justify the idiotic and criminal actions of the current batch of clowns. Why can't people just say "this is bad and they deserve to be punished"?
How did I justify their actions today? I specifically agreed that it's a problem and referenced a very public case where something similar happened 9 years ago and we still haven't done anything about it.
"in the case of Clinton involved bringing it up for nearly a decade, an FBI investigation, an Inspector General's report on the FBI's and DOJ's handling of the case and a three year State Department investigation. It's only fair to apply the same standard here"
Does that clear it up for you? Do you still need justification to treat this seriously? Or are you a person unwilling to try and address poor leadership because of the (R) after their names?
I think both should be treated seriously, I'm not sure why you'd assume otherwise from what I've said here.
There should be just as thorough of an investigation into this one, and assuming there isn't that's a miscarriage of justice.
That said, I'm of the opinion that it's great and all that they investigated Clinton's email server, but the fact that nothing came of it is a problem. It absolutely violated the intent of the law in my opinion. The mere fact that they found so many emails with information that should have been marked confidential is, in my opinion, a violation of the intent of confidential information protection laws.
Also if you then saw colleagues discussing company secrets or legally privileged information on a group chat, the onus is on everyone in the chat to call that out as wrong.
>the onus is on everyone in the chat to call that out as wrong
That would be true in a group of even minimally qualified professionals, not the clowns who got their jobs on a whim of the bigger clown. You know, the monkey with a nuclear bomb. I hope we won't get to see how they handle the real bomb; for now just the tariffs have wiped out $10T+ like an extremely large nuke, and it isn't "just the stock market", it's large, complicated, efficient logistics chains and trade relations that were built over years and were powering this civilization. For comparison, the damage from the Ukraine war is just around a "meager" $1T, and you'd need several tens of nukes to produce that much damage.
Should be, but the US is run by cults of personality where they defer to people they see as their superiors - that is, "if Hegseth does it, it's ok, right?"
A very good friend of mine was going through a nasty divorce. Although we weren't talking about the divorce/case, he set our messages to auto-delete. Apparently his (now ex-) wife had SMS and WhatsApp messages presented in court to be used for her case.
Any 'loving message' to her from the early days ("you are so perfect") and any 'nasty message' about others ("oh that bitch!!") sent to anyone was presented in court. So as a precaution he auto-deleted even the messages that were innocuous, just in case they could be used against him: "oh he wanted to spend money on a new phone/laptop, thus he has money, thus I will take it".
From what I've gathered, Signal use was prolific among people in this administration and the past one.
I'm not surprised. My own company sends out several emails saying that WhatsApp can't be used because it's not secure, yet I constantly get WhatsApp messages from the leadership I work with.
People ignore directives all the time. Usually out of convenience.
People have even called it out in a WhatsApp chat, "hey guys, we're not supposed to use WhatsApp", and people usually ignore it.
That's not at all what it's like to work with classified data, even deep down the subcontractor chain. They constantly drill into you the importance of respecting data classification and the consequences of ignoring these rules. Nobody ever does what you describe.
The parent poster is right though: Signal is permitted and encouraged for any discussion that would've happened over SMS. The issues are somebody dropping details into a channel meant for planning a meeting, and somebody accidentally adding a person who should not have been there.
We do not know for sure, but this would be very unlikely. Several of the participants do not need to know this information, and it should be compartmentalized.
Information concerning capability or location should not come near unclassified networks or civilian phones. Somebody could drop or steal that phone or glance over a shoulder which could necessitate cancelling the operation or much worse.
Presumably to keep the repo size reasonable. Say I want to make an ad hoc contribution to a book, if step 1 is "download this multi-gigabyte repo" then that's a fairly big hurdle.
If you're part of the US government, with access to the most sensitive information, which will put people's lives at risk if compromised, then yes, this is a vulnerability, because "Russian GRU agent nicks your phone and scans your Signal QR code" is a real threat.
Bringing in a phone with decryption keys for this conversation is a risk, then, not just Signal's featureset...
I agree it could be a hardening measure to allow users/organizations to disable this feature, and also other features such as automatic media decoding and other mechanisms that are trade-offs between security and usability, but it simply does not meet the definition of a vulnerability (nobody will assign this a CVE number to track the bug and "resolve" it).
Generally you shouldn't be passing random data from the web to shell scripts. Maybe I haven't done the right type of work, but having dealt with the fiddly bits, it's much more likely that not passing it to the shell will cause issues (with stuff like executable paths).
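A rough Python sketch of the trade-off (my own illustration, not from the comment above; the filename is hypothetical): interpolating untrusted web data into a shell command lets crafted input run extra commands, while passing an argument list goes straight to the program with no shell parsing.

    import subprocess

    # Hypothetical filename taken from a web request; an attacker controls it.
    untrusted = "report.txt; rm -rf ~"

    # Risky: shell=True hands the whole string to the shell, so the "; rm -rf ~"
    # part would run as a separate command.
    # subprocess.run(f"cat {untrusted}", shell=True)

    # Safer: the argument list is passed directly to the program; the shell never
    # parses the untrusted string, so it is just an odd-looking filename argument.
    subprocess.run(["cat", untrusted], check=False)

The flip side, as the comment notes, is that the list form skips shell conveniences such as tilde and glob expansion, which is where the fiddly path-handling issues come from.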
In some ways it is surprising, but the examples you gave are only currently straightforward because of massive investment over many years by thousands of people. If you wanted to build ChatGPT from scratch I'm sure it would be pretty hard, so it doesn't seem so unreasonable that you might pay someone if you care about keeping your data around for extended periods of time.
Which law makes this illegal? Presumably the live stream is set up by the Flemish government themselves, and they're all public figures acting in a public capacity. Maybe there is something else here, or some detail of Belgian law, but from the outside it doesn't seem like there is much of a privacy argument.
It depends on your perspective, surely? As a lawyer, your job is typically to protect your client from legal risk, so if users are happy to sign a really expansive set of terms (which experience shows is the case) that grants lots of permission to do stuff with their data, then that's low risk. If you as a business don't want that, then you need to make it explicit that you're willing to take on some extra risk.
Also by using “standard boilerplate” they are using language with meanings well established by precedent. Craft your own version in “regular English” and it’s much more open to litigation.
Possibly. Repeating the prompt, I got much higher speed, taking 20s on average now, which is much more viable. But that remains to be seen once more people start using this version in production.