Maybe we should think about how we make schools spaces of freedom for our children rather than turning them into the next panopticon.
The fact that we have kept a piece of the Middle Ages inside our modern societies continues to fascinate and frighten me.
Going from a Swedish to an American high school felt like stepping back to elementary school with regard to personal freedom and trust.
First, to set the scene: in Sweden you generally have all your classes with the same group of 25-30 students instead of a different group every period. From 7th grade we had our own schedules, complete with free periods where we would just hang out, walk around, or go to the library and jump on a computer. In high school this ramped up even more. Loitering as a concept doesn't exist in Sweden.
To accommodate this, the design of schools is completely different. The endless hallways don't really exist; instead there are areas between the classrooms with sofas, tables and so on, so people can hang out or study together. My school even had a café open all day. Group projects are the epitome of this: we'd show up at the start of class for attendance and then take off to the library or some other place.
Contrast this with America: hall passes to go to the restroom, hall monitors. We even had CCTV surveillance throughout the entire school building. If you showed up late you could be sent to the cafeteria, forced to miss the class, and made to write some essay about how wrong you were. How can a person learn to take care of themselves in such an environment?
And that isn't even getting into the pledge or the national anthem over the PA system.
Sorry for the rant.
Back to the topic at hand: from what I've read in the Swedish news about it, everyone involved was generally positive. No more annoying wait while attendance was taken. For the parents it was no different; even before this system they immediately got a text if their kid didn't show up for class.
The questions raised by the DPA are relevant, and my guess is that this will continue to be explored, but with more caution about how the data is processed in order to comply with this ruling.
I am French and I live in Japan; my criticism applies to these two countries as well. Among European progressives, the Scandinavian countries are seen as having a really different education system, and we wonder what we are waiting for before stealing it!
On the topic at hand, I am sure it is very convenient. That's the problem: maintaining privacy is a bit less convenient, but it is important, which is why it is good that governments do not allow people to trade short-term convenience for long-term privacy erosion.
I see these things as my basic human rights. And nobody should accept having them restricted.
In the country where I live, schools have the legal authority to be more restrictive and enforce their own rules, which I see as necessary up to a point. But I think we should teach kids and teenagers to make decisions on their own and take responsibility for their actions, not just enforce arbitrary rules.
It was my first lesson in dealing with unjust authority and that most authority relies on the principle that most people under it will do as they say most of the time.
A useful lesson, and after that I treated asking to use the bathroom as a courtesy, because I was going either way.
It destroyed my fear of and respect for arbitrary authority instantly, so I look back on it as a good thing today. I realised that rather than my peers turning on me out of shame, they turned on the teacher, because it was going to be them next.
It also positioned me well for dealing with the somewhat backwards ideals in my own children's school.
It was one of those things that chipped away at my respect for authority - it didn't destroy it, but it dinged it. Peers still turned on me; no one thought "it could be me next". We were 7 or 8(?) - I'm not sure most people could reason that far ahead rather than living in the moment.
What sticks with me is that we had an announcement earlier (few days or weeks earlier?) that no bathroom breaks would be given out because a few kids had 'abused the privilege' (stayed out of the room longer than they should have). So... someone else violates a trivially stupid 'rule', and my own life is negatively impacted, which is wholly unjust. That was my takeaway, which has stuck with me for ~40 years. I still remember a couple of the kids who'd been 'abusing' that privilege beforehand, and remember what impact they and that teacher had on me that day and beyond.
They still use that ridiculous methodology. For example, my eldest's entire class was held back for 15 minutes at the end of the day due to one child's behaviour. Unfair, and her younger sister, who has ASD, relies on her to get home and had a complete panic and breakdown over the routine change. Teachers are, surprisingly, never available for comment after such things, due to the layer of bureaucracy over everything. A formal complaint was raised and pushed to the very end of their own self-inflicted process, as it is every time they screw up!
That's not the issue (at least as I see it).
The big questions in my head include: "Who's storing the biometric data? How are they securing it? Who has authorized access to it, and what processes and mechanisms are in place to ensure they only use it in authorized ways? What are the authorized uses of the data? How is it ensured that the authorized uses will not be increased in scope? What processes or mechanisms are in place to detect attempted or successful acquisition of that data by an unauthorized third party? What penalties are there for unauthorized use of the data, and to whom will they be applied? What penalties are there for inadequately securing the data, and to whom will those be applied? What processes, mechanisms or policies are in place to ensure that unauthorized access, or failures to secure the data, are detected and disclosed? How are those processes, mechanisms or policies measured to ensure they're working?"
Pretty much _anyone_ dealing with EU citizens' data should already have the answers to those (and related) questions written down. (If you do not, think about how sure you are that the answer to those "...to whom will the penalty be applied?" questions will not be: "Who's responsible for the data breach? Oh, that'd be Barrin from the dev team. Here, let me give you his full contact details and his HR file! BTW, he'll be fired and marched out before close of business, so it might be best if you call him on his personal cell phone.")
- Data and analysis was done at a server in the class room (locked in a cabinet)
- All parents signed a consent form before the trial started.
- All data was erased after the trial ended.
The Swedish DPA decided that:
1. Consent is not valid if there is a power imbalance between the subject and the requester.
2. They should have documented a PIA (Privacy impact assessment).
3. They should have contacted the DPA before starting the trial.
Parents should be able to consent on behalf of their children and retraction should come from parents as well.
This used to be considered common sense but not anymore.
Good! Some progress at least.
Ignoring reality for the sake of ideology isn't progress.
By that logic, a doctor will next refuse to vaccinate an 8-year-old because the child refused to give consent.
Exactly what the parents can consent to is the real question, and it should be argued properly, not treated as any kind of "common sense". Things that are necessary for the safety of the children, like vaccines: yes. Other things are on much shakier ground.
Well (as you said - and thankfully) not any more.
I try to refer questions of consent to my kids, but in this case the question wasn't asked; they just said "hold your hands out" and went ahead.
I don't think minors can retract parental consent, but IMO schools should accommodate an individual's personal consent if practicable. Indeed schools should where possible be advocating for pupils before their parents if necessary (things like a tutor contacting a parent to request consent on a child's behalf).
Aside: I once overheard an exchange between security staff suggesting they'd chosen a particular person to search "because you like doing the $ageGroupIdentifier", which might have been innocent but came across as wholly wrong.
What’s the use case? Automatic student registration? If so, I can certainly imagine some HR consultant going “weeeeeell, we can squeeze in five minutes of extra education if we remove this teacher-student interaction of registration, and that’s a gazillion hours of extra education a year!”. I know this is a strawman of my experience, but it’s frankly the only way it would happen at my place. Which would be crazy, considering the primary focus of schools in Scandinavia is to educate democratic citizens, and automating something as meaningful as attendance registration damages that focus, because the “AI” won’t be able to have a conversation about the usefulness of attendance.
So I absolutely agree with you. This is completely crazy, but what I really wonder is why no one asked “why?”.
As far as the GDPR goes, it actually didn’t have a huge technical impact on the public sector. We’ve had stricter local laws for decades and have built our systems accordingly. We also don’t track you for advertising. So for us the GDPR has mostly been a bureaucratic change, and you’ll notice that’s also where the majority of violations are. It’s not that data aren’t protected; it’s that no one knows where the contract is, or that we haven’t documented elaborate procedures for whatever. 95% of the GDPR’s impact on the public sector has been law and legislation. So the GDPR actually doesn’t affect O365 or public cloud at all, as long as you go to ISO 27000-certified vendors who provide Privacy Shield or whatever it’s called these days.
That’s not to say we aren’t debating public cloud, because we are. But this has more to do with national laws; we have war-time contingency plans from the Cold War era. Like I said, we had much stricter policies before the GDPR was even a thing.
The issue is that it makes public cloud illegal, yet we can’t operate the most digitalised public sector in the world without public cloud. So far everyone is moving to AWS and Azure, pretending the flawed bureaucracy will eventually go away, but our politicians and our national digitalisation agency have been refusing to give any meaningful heading, so who knows?
At some point, though, someone is going to ask whether the privacy bureaucracy is really worth the money it’s costing. At our place you could hire 10 extra teachers a year just to cover the bureaucratic processes that don’t actually increase security, because a contract or a nice incident plan isn’t actually going to stop anyone from hacking you.
"Microsoft says it. We believe it. That settles it."
Really, there's no other option: even if they magically got the ability to audit the source code, the whole point of a cloud service is that the code can be changed at any time. Even if you extract a promise from Microsoft that the code won't be changed... well, see Figure 1. You're operating on trust that Microsoft will abide by the agreement, and trust that you'll magically be able to tell if they don't.
Especially considering that kids have poor judgement, tracking things in school that could be in any way used later against the individual is dangerous.
"It said there were less intrusive ways that their attendance could have been detected without involving camera surveillance."
Yeah, especially considering the students do not have a choice about being there.
Seriously though, where exactly is the privacy if attendance is taken either way?
There is no need to record more information than that. The expectation of privacy is exactly this: any information that is not needed is not recorded.
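The data-minimisation point can be sketched concretely. Everything below (the record shape, the pseudonymous identifiers) is a hypothetical illustration of "one bit per pupil per lesson", not how the trial actually stored anything:

```python
from dataclasses import dataclass
from datetime import date

# A minimal attendance record: one present/absent flag per pupil per lesson.
# No images, no biometrics -- nothing beyond what the stated purpose needs.
@dataclass(frozen=True)
class AttendanceRecord:
    pupil_id: str      # opaque pseudonymous identifier
    lesson_date: date
    period: int
    present: bool

def mark_attendance(roster, present_ids, lesson_date, period):
    """Record exactly one bit of information for each pupil on the roster."""
    return [
        AttendanceRecord(pid, lesson_date, period, pid in present_ids)
        for pid in roster
    ]

records = mark_attendance(
    roster=["p1", "p2", "p3"],
    present_ids={"p1", "p3"},
    lesson_date=date(2019, 10, 1),
    period=2,
)
absent = [r.pupil_id for r in records if not r.present]  # ["p2"]
```

Anything a system stores beyond this flag (video frames, face templates, timestamps of movement) is data it does not need for the attendance purpose.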
Human teachers are unable to capture or retain all the extra information they see with their eyes. Cameras can.
If the teacher sells attendance reports (together with detailed lesson transcripts and audio recordings) to Google, Amazon, Netflix, US and Russian governments, all major data brokers and The USA Association of Rich Pedophiles, all at the same time — yes, there is no difference. Otherwise there is a substantial difference.
It is amazing that a person directly reporting such detailed information elsewhere would be considered a pervert and a criminal, but using an automated camera to do the same thing is somehow all right?!
- The school had obtained consent from the pupils, and each pupil could opt out at any time. The Swedish Data Protection Agency (DPA) did however find that, due to the power imbalance between a school and a pupil, these consents were not ”freely given” and are thus void. The school therefore has no legal basis for processing special categories of data/sensitive data, which processing facial recognition data (”biometric information”) for identification purposes is. Violation of GDPR Article 9.
- The school had not completed a sufficient Data Protection Impact Assessment which properly identified the risks of this processing. In addition, due to the nature of this processing, they would have been obliged to consult the DPA about the impact assessment before starting. Violation of GDPR Articles 35 & 36.
- The DPA also found that the processing was more extensive than necessary, since attendance could be taken in less invasive ways. This is a violation of GDPR Article 5(1)(c) (the data minimisation principle).
I hope the decision will be appealed by the school, which is run by the municipality. There are several interesting questions, such as if consent can be freely given in schools, if the data minimization principle was really breached (the school claimed this way of taking attendance saves a calculated 72,000h of teachers time per year in just this school), etc. I am not sure however that the municipality has the right resources and competency to make a competent appeal.
Which is 197h per day. I find this number extremely dubious.
That still seems suspect: estimating a school year of 35 weeks, that's 175 days, and thus over 400 hours per day across all teachers. Even at some mega school of 1,000 pupils and maybe 100 teachers, that would mean each teacher spending around four hours of a six-hour day taking and dealing with attendance on average! (Obviously based on assumptions, so this could vary.)
It feels like a number that was come up with (or at least inflated) to justify the project.
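Rerunning the arithmetic makes the scepticism easy to check. The school-year length and staffing level below are the commenters' assumptions, not published figures:

```python
# Sanity check of the claimed 72,000 teacher-hours saved per year.
claimed_hours_per_year = 72_000   # the figure the school reportedly cited
school_days = 35 * 5              # assumed ~35-week school year, 5 days/week
teachers = 100                    # assumed "mega school" staffing level

hours_per_day = claimed_hours_per_year / school_days   # ~411
hours_per_teacher_per_day = hours_per_day / teachers   # ~4.1

print(f"{hours_per_day:.0f} h/day across all staff, "
      f"{hours_per_teacher_per_day:.1f} h per teacher per day")
```

On those assumptions, every teacher would have to spend roughly four hours a day on attendance for the claim to hold, which is hard to believe.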
The school could simply use RFID cards for tracking attendance.
Why not use a fingerprint reader if automation is needed, or just trust the teacher not to cheat and give them a web page or app for entering absences?
Attendance is a safety thing (or is viewed that way). If my 8-year-old unexpectedly doesn’t show up to school, I would like to find that out at 8:30 rather than 15:00.
Yes, just like you and your friend can do a lot of other cheating in school. And the mechanism used to prevent that is severe punishments once caught by a teacher.
I don't see much difficulty in spotting an empty desk that should be occupied according to today's automatically generated seating chart.
On another note, I feel like I'd be more okay with facial recognition and other biometrics in the form of you going to a scanner and initiating a scan. But certainly not surveillance with recognition built on top. Article doesn't seem to specify what they used, though the image chosen implies surveillance.
They actually studied this as well as the facial recognition system. Two different classrooms over an 8 week period.
The biggest problem they found with the RFID system:
> During the first week of testing (week 41), 38% of pupils didn't bring their tag to each lesson, while in the second week of testing (week 42), 47% forgot it.
 - http://pages.tieto.com/rs/517-ITT-285/images/SummaryFutureCl...
So you could have a dot moving around on a map as tracking, but as soon as you associate that dot with a person in the same dataset, that causes dramalama. Yet you can have that dot associated with a person in a separate dataset held by somebody else and skirt around the fact that the same data is effectively held, albeit obfuscated at a remedial level that placates the technothuds.
The dataset you are talking about is not just in/out but is a lot more comprehensive than that.
I did not say RFID or a mobile phone is biometric information!
I was saying that such datasets can be matched up with another dataset that holds individual information, and as long as those datasets are separated in the right way, you can skirt around this whole area and still have the same problem.
But people seem happy to allow that.
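The point about "separated" datasets can be illustrated with a toy join. The tokens, rooms, and names below are invented for the illustration:

```python
# One party holds pseudonymised location pings; a "separate" party holds
# the token-to-identity mapping. Anyone who obtains both datasets can
# re-identify every ping with a trivial join.
location_pings = [
    {"token": "a91f", "room": "library",   "time": "09:05"},
    {"token": "c22d", "room": "cafeteria", "time": "09:05"},
]
identity_map = {"a91f": "Pupil A", "c22d": "Pupil B"}  # held "separately"

reidentified = [
    {**ping, "name": identity_map[ping["token"]]}
    for ping in location_pings
]
```

Splitting the data across custodians only protects privacy for as long as the two halves are never combined; the linkage key makes the combined dataset recoverable at any time.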
This facial fingerprint will need to be pretty strongly unique so as to avoid false positives or false negatives. If that information is stolen, the thief has the fingerprint for your face, which is very difficult for you to change.
The level of protection for that stored information then must be very high, because the damage to the individual in case of its loss is very high. The GDPR is in place, in part, to make certain that the justification is sufficient for an organization requiring you to handover such personal and valuable data.
"I want to take attendance" is apparently not sufficient. Further if someone wanted to get creepy, we can imagine what could happen should someone place a foreign agent along the video-recognition-attendance workflow, which then told its owner when and where certain students positively were at any time.
It’s this kind of second-guessing which is extremely concerning about GDPR-type regulation.
Not that the information was insecure, or leaked, or mishandled, or that consent wasn’t obtained; even if all of that is done right, it's just, “We don’t think you had a good enough reason.”
First, the school district already has a picture of every child at the school. Second, they already track each child (manually) at all times throughout the day.
So they are not collecting any more information than they already had, nor are they collecting more information than is legally required (attendance tracking is likely legally mandated).
The end result is the same exact data exposure / privacy risk that they started with, and more classroom hours to actually teach instead of taking attendance.
The classroom setting is a particularly good example of the narrow use case when facial recognition is acceptable. Unlike, for example, in a retail establishment, where a casual customer who browses the store but does not buy anything would otherwise remain anonymous.
Quite the contrary, you do not want anyone in the classroom who is anonymous.
As part of my job, I had to be able to recognise and put a name to all of my students. This is legitimately difficult, and I spent a lot of time doing it. However, it was not for taking attendance! If you are in a school system below university and your teacher doesn't recognise you or know your name... I mean... I can't even comprehend what a bad situation that is. How could you ever teach a student if you know so little about them that you can't even recognise them? When you are marking their paper and see their name, if you don't even know who they are, how can you give appropriate guidance? That kind of situation would be shocking. And to be fair, because it is really, really hard, I know a lot of teachers who have no clue... but seriously, this is not what we aim for!
Once you know the students, attendance is easy. Why is that chair empty? Who is supposed to sit there? Look at your seating chart (seriously, it's the best way to learn their names...). Is that person anywhere else in the room? No? They are absent. Mark it on your list. That takes all of 2 seconds. If you are really doing your job, you can ask why the person is missing. Maybe you can have their friend take them some homework. A surveillance system can't do that.
I mean... Why would you ever want an automated system in this circumstance? It just makes no sense.
I think the reality of the situation is that they really wanted a surveillance system. They didn't want to record attendance; they wanted to know where the students were every second of the day. They wanted to time how long students are in the toilet so that they can catch them smoking or dealing drugs. They wanted to see them leaving the school so that they can lie in wait and nab them as they cut class: students 2 and 5 are supposed to be in band class but they are headed in the opposite direction -- go all you zigs! For great justice!
I just can't imagine any other reason you'd want this. Perhaps I'm wrong. I certainly hope so!
GDPR explicitly prevents this. Data is collected for a purpose and that purpose only. Collecting data for attendance then using it for catching drug dealers is forbidden.
What is probably happening is that this is some sneaky way of introducing facial recognition in schools and something benign like attendance was the excuse. That, however, is a separate conversation from my original post of how would you automate attendance without using facial recognition.
The GDPR's fundamental principles are set out in article 5. You should collect the least amount of data possible, you should process it only for specific, explicit and legitimate purposes and you should delete it as soon as possible once you've finished.
Can anyone legitimately argue that a facial recognition system is the least intrusive way of collecting attendance data? Would any reasonable person believe that constant video surveillance with facial recognition technology is no more intrusive than taking the register at the start of class? I think not.
The actions of the school authorities were in flagrant breach of the GDPR and they fully deserve to be fined. There's no second-guessing in this case, no grey area, just an organisation that used advanced surveillance technology to monitor children without giving a moment of thought to the privacy implications. This fine is precisely in line with the intentions of the European Parliament when they signed the GDPR into law.
According to the article, the experiment was constructed as opt-in, and several students opted in.
There is no practical way for a student (or their parents) to say no, given the circumstances.
On the other hand, this type of scenario doesn't feel much like the consent given was really voluntary: (1) because no one wants to be the one person who stopped the school from doing something everyone else was on board with, and (2) because it involved adults consenting on behalf of children, for something the children may not be happy about in future, when they are adults and it's too late.
That is certainly a statement that no one could argue with. I think the argument is whether those intentions are actually good, logical, or workable.
For example, the fine structure is explicitly designed so that it could take all of the assets of a small business and wipe it off the face of the map, but only take 5% of annual revenue from a large business. Does that reflect an intention to consolidate market power in the hands of a very few companies? That would be a bad intention, IMO.
It's 4% global revenue and it's really a lot. Fines are issued based on the amount of data and amount of people affected.
Small companies rarely need anything but contact information. Storing the data securely is no trivial task for any size of organization, though.
Small companies would have significantly less data and customers to be worthy of a hefty fine. Again, if you can't keep the data secure, don't go into such a business.
The GDPR fines are specifically designed so that large multinationals can’t ignore them.
In this case it seems that all involved were informed and OK with it but that did not matter.
I would also think that knowing where pupils are at all times is quite a legitimate aim for a school.
Considering that the only words the person who oversees both high schools had to say on record were that the technology is "fairly safe", I somewhat doubt that the parents had a fully formed view of the potential downsides of the program if or when it is breached, misused, or simply malfunctions.
I think when we are dealing with advanced tech such as facial recognition it is analogous to the informed consent issues doctors face. How can someone possibly obtain informed consent from non-experts over something that requires years of schooling to understand?
It was just about using facial recognition in a school, not about having brain implants in children...
Schools already have pictures and detailed personal details of all pupils.
This new technology can have real benefits. For example it could be used to locate a pupil that does not show up in class when they should, or to trigger an alert if a pupil leaves school when not expected to or when someone unknown is detected.
To me, the law should protect but not prevent useful applications that people agree to, and I think this specific case shows that, as it stands, the GDPR can be over-restrictive.
I thought since I put:
"Do you have any evidence that all parties provided informed consent?"
And you said
"This is going over the top."
That you were replying that it is over the top. I'd be happy if you could clarify what you meant.
"Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data."
"In order to ensure that consent is freely given, consent should not provide a valid legal ground for the processing of personal data in a specific case where there is a clear imbalance between the data subject and the controller, in particular where the controller is a public authority and it is therefore unlikely that consent was freely given in all the circumstances of that specific situation."
A school cannot rely on consent as a lawful basis for processing the personal data of pupils.
I am commenting on the spirit of the law itself. You're quoting it back at me, but that's not the point. The point is to discuss what we think of those legal restrictions and whether they are too restrictive. In fact, what you quoted reinforces my opinion that the GDPR is over-restrictive.
Oyster cards in London: your journeys are tracked, and consent is not asked for, let alone any special privilege given to children.
Then there's the whole aspect of under-age children committing crimes, and evidence. A smart lawyer could abuse this to quash any evidence that placed them at the scene of a crime, since they never gave consent, and even if they did, they didn't know what they were doing.
Basically, if a school can't rely upon consent from children, nobody can.
Google are subject to multiple GDPR investigations as we speak. They have already received a number of large fines.
>Oyster cards in London: your journeys are tracked, and consent is not asked for, let alone any special privilege given to children.
TFL have a legitimate need to know where you tapped in and out in order to calculate fares. As long as they aren't using that data in an identifiable form for other purposes and they delete it as soon as practicable, they're compliant. A facial recognition system collects far more data than is minimally necessary to track school attendance, which is contrary to the principles set out in Art. 5.
>Then there's the whole aspect of under-age children committing crimes, and evidence. A smart lawyer could abuse this to quash any evidence that placed them at the scene of a crime, since they never gave consent, and even if they did, they didn't know what they were doing.
That evidence is necessary for the purposes of mounting a prosecution so processing it (in accordance with the rest of the GDPR) is lawful under Art. 6. Consent is only one lawful basis for processing personal data; it is not always necessary, nor is it always sufficient. Consent does not give anyone carte blanche to do as they please under GDPR, particularly where that consent might not be fully informed or freely given.
I think that is a fairly well-established legal principle. In many places, a contract cannot be enforced against a minor, no matter whether it was freely entered into. Statutory rape laws say it doesn't matter whether the minor consented. And so on...
In the more general case, I was under the impression that informed consent was sufficient to authorize a data controller to collect/process private information and so the ruling didn't make sense to me. I'm using "informed consent" here as a short hand for all the applicable GDPR requirements on consent (reasonable language, etc).
It isn't clear to me from this language in Recital 43, though, how a data controller with an "imbalance" relative to the data subject could easily get clarity on any particular use case. It also seems strange that in this case an imbalance was deemed to exist between the school and the parents (I'm assuming here that parents are indeed authorized to give consent in their role as parent/guardian). If parents are in an imbalanced situation regarding school attendance, then pretty much all government relationships are imbalanced.
If the school/parent relationship is considered imbalanced and the imbalance language isn't specific to a government data controller, then it would appear that every data controller (government entity or not) is in danger of having their relationship deemed "imbalanced" and the data collection subject to analysis by the data authority at any time.
It seems like this ruling destroys the clarity of "consent" and replaces it with "(consent AND balanced relationship) OR (imbalanced relationship AND legally adequate reason AND prior approval from regulator AND consent)"
Consent is not always necessary, nor is it always sufficient. If you rely on consent as a lawful basis for data processing, the burden of proof lies with you to demonstrate that such consent was informed and freely given. The authors of the GDPR were fully aware of the fact that coerced consent was rampant, with stuff like shrinkwrap agreements, incomprehensible terms and conditions and "by entering these premises, you consent to give us your first-born son"; as a result, consent is very tightly regulated under GDPR. As a rule of thumb, ask whether a data subject could a) refuse consent without any repercussions and b) would not be surprised by any aspect of your processing; if you aren't certain that both a & b are true, you probably can't rely on consent.
The school already had a means of collecting attendance data that didn't involve constant video surveillance and had a far lower risk of misuse and security breaches. They didn't need consent to take the register, because it was justified under Art. 6 (1c, d and e). They relied on consent as a lawful basis for the facial recognition scheme, even in a situation where it would be difficult for the data subjects to refuse consent and where the data subjects would be unlikely to understand the full extent of the data processing and the risks that they would be exposed to as a result. Using consent in that way is very much contrary to the spirit of the regulations.
>If parents are in an imbalanced situation regarding school attendance, then pretty much all government relationships are imbalanced.
I realize that this particular situation is about facial recognition, but I was trying to point out that this ruling changes the game for everything, basically creating a situation where the only way for a data controller to minimize legal risk is to get prior authorization from the data authority. That is problematic.
For a prison, it is. A school's purpose is to educate.
Coming from a billion-dollar business: securely storing document copies, required by law for certain types of contracts, is a HUGE burden and liability (aside from the permission management, etc.), alongside the necessary audits, both internal and external.
Having footage and training data on kids would be a resounding 'no' from me (both as a parent and an engineer). Schools would never, ever have enough resources to do it properly.
In this particular instance they need one bit of info, "present", and that should be all.
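To make the data-minimization point concrete, here is a minimal sketch (not from the thread; all names are hypothetical) of the only record an attendance system actually needs to keep: a pseudonymous ID, a date, and one presence bit. No image, no biometric template.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical minimal attendance record: no name, no photo, no biometrics.
@dataclass(frozen=True)
class AttendanceRecord:
    student_id: str   # pseudonymous ID, not a name or personnummer
    day: date
    present: bool

def take_attendance(roster, present_ids, day):
    """Return one minimal record per enrolled student: just the 'present' bit."""
    present = set(present_ids)
    return [AttendanceRecord(sid, day, sid in present) for sid in roster]

records = take_attendance(["s1", "s2", "s3"], ["s1", "s3"], date(2019, 8, 20))
```

Anything beyond this, such as retaining the camera footage that produced the "present" bit, is data the school has to secure and audit without any corresponding need.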
The fine is well justified, considering someone won a contract to install the system without any oversight from the school.
Schools have a duty to provide a safe environment, including keeping track of who's on school premises, where pupils are, and whether pupils are allowed to leave the premises when they are not expected to. Obviously, the younger the pupils, the stronger those duties.
For example, facial recognition could tell the system that you are in the library but doing so would not erode your privacy in the library at all.
Again, with the current technology and literacy level, I'd not trust a school to be able to afford to employ proper security policies. It's already hard for businesses that actually spend substantial effort and resources.
I'd argue exposing their children to mass surveillance and face recognition in school was not in the best interest of the child. I'd go further and argue that the parents violated the GDPR themselves, by mishandling their kids' data or by allowing, with parental consent, a third party to mishandle it. This was at least grossly negligent. If I were making the decisions about fines, I'd have fined the parents a minor amount too, just enough to make them (and other parents reading about it) think twice the next time somebody comes around with an indecent proposal to collect their kids' data.
Also, we place, in my opinion rightfully so, certain restrictions on individual choice and decisions. For instance, you cannot kill your neighbor for your pleasure even if they gave you informed consent. These restrictions are of course not set in stone, as e.g. the assisted suicide discussions show, or gay marriage.
There are certain areas that seem benign at first, like health care providers getting you to wear "health" monitoring devices for a reduction in premiums... until the reductions become so severe that you cannot really opt out anymore, if any health care provider will even take you after you opt out. Or "voluntary" drug tests to get a job, or when already employed. Everybody knows these de jure voluntary tests are de facto mandatory.
I'd argue that mass surveillance and even biometric processing in schools falls in this category too; looks somewhat benign in the beginning, until it isn't.
As I already wrote, I think that being able to locate pupils at all times is a legitimate concern for schools, and parents.
Certainly schools already work hard to achieve this especially with younger children.
I think facial recognition can bring real benefits without many drawbacks if used properly.
An effective blanket ban is not productive or based on reason.
No, no. It's a want. Of course parents want to know where their children are 100% of the time, of course schools want to know where the students are 100% of the time, but children are people too.
Children have the right to lie and keep secrets from their parents. They have the right to privacy, they have the right not to be under automatic surveillance all the time, because they're not fucking prisoners or slaves.
Schools do have a duty of care and safety. They already do make a lot of efforts to know where their pupils are at all times, especially for younger children (obviously).
This is completely different from preventing children from having secrets and from affording them privacy.
I wish we could have a mature discussion on this and avoid hysterical terms and comparisons.
Depends on the age, really. It goes from all the time for really young children, to "locate in a reasonable amount of time, if needed". How long that is depends on the comfort zone of the parents of course, but has to account for allowing children to develop, learn by themselves and grow, or else it's wrongful imprisonment to be honest.
By the way, the "being able to locate all the time" wasn't even the goal of the school. It was to take attendance.
It's not clear (and honestly, not likely) that that was true either, but if it was:
> I'd have fine the parents a minor fine, too
This is child abuse, and should be prosecuted accordingly, including jail time.
Does continual surveillance improve educational outcomes such as life-fulfillment and academic achievement?
Why is it a legitimate _aim_ for a school?
1. The Short-Run Effects of GDPR on Technology Venture Investment: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3278912
Given that 9/10 startups fail even without GDPR, it's not surprising that early-stage companies form the lion's share of failures, and it surely can't be good for any data that was slurped up during whatever these experiments at market fit were doing.
And given that the ultimate goal of GDPR is to protect privacy, it doesn't make sense to exempt startups, especially when the early stage stakes are high and a failure to squeeze out every drop of value legally possible out of your data (while your competitors do) could mean the death of your venture.
As such, comparing to American standards or even current European standards doesn't quite work when there's a clear shift of the moral bar for GDPR compliance.
I'm saying that you can have GDPR or you can have a thriving startup ecosystem. The data shows that you can't have both.
Personally, I think the GDPR is a colossal waste of time that only benefits incumbents. I've had to help implement GDPR compliance at a company and it did absolutely nothing to protect the privacy of customers. However it did cost several hundred thousand dollars.
The tech economy seeing such upheaval right now could be construed as a signal demonstrating how dependent it was on fundamentally unhealthy and untenable data practices that were previously endemic to the industry.
I'm sorry that you had to spend a tonne of money to attain GDPR compliance. I imagine most "incumbents" have had to spend a good deal as well; I can only hope that the next generation of companies learn from your company's mistakes and structure their data processes from day 1 to avoid accruing such sensitive data in the first place.
At any rate, a tech sector is possible. A thriving one that can sustain as much employment? Maybe not quite; there'd have to be some adjustments, but the people would be better off. A tech sector with the same market cap? Unlikely, but we need to get over ourselves and question whether preserving the technocracy's wealth is more important here.
Again, I'm saying that the cost is borne by all startups, not just the ones you don't like. For startups, the main cost of GDPR isn't fines, it's less investment money and more compliance costs. That means good startups and bad startups alike must pay the price. They have equal chances of being killed in the cradle by these costs.
> I'm sorry that you have had to spend a tonne of money to attain GDPR compliance. I imagine most "incumbents" have had to spend a good deal as well; I can only hope that the next generation of companies have learnt from your company's mistakes and to structure their data processes from day 1 to avoid accruing such sensitive data in the first place.
The costs weren't high because of anything we were doing that was out of the ordinary. GDPR affects you if you even keep source IP addresses in your server logs. It mandates processes as well as restrictions. You have to train employees. You have to pay lawyers to ensure your processes are compliant. Even if everything you're doing is totally unobjectionable, the costs are significant. Current evidence indicates that many new companies are solving this problem by incorporating outside of the EU and avoiding doing business with EU customers until they're large enough to afford the costs of compliance.
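On the server-log point: one common mitigation (a general technique, not something proposed in the thread) is to truncate IP addresses before they ever reach the logs, as several web servers and analytics tools do. A minimal Python sketch:

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Truncate an IP before logging: zero the last octet of an IPv4
    address, or keep only the top 48 bits of an IPv6 address."""
    ip = ipaddress.ip_address(addr)
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)
```

Whether truncated addresses still count as personal data depends on the remaining re-identification risk, so this reduces exposure rather than eliminating the compliance question.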
> A tech sector with the same market cap? Unlikely, but we need to get over ourselves and question whether preserving the technocracy's wealth is more important here.
You seem to have a zero sum view of wealth. Tech companies create wealth. They make things people want. Preventing companies from existing doesn't help others (except for incumbents with inferior products).
No no, if investment money disappears from tech, that means you get Dropbox and something else that isn't a tech company. The investment money doesn't just disappear into thin air, it gets invested into something else, a different kind of company, a different sector, some place that isn't dependent on privacy-violating ad-tech bullshit and selling user behaviours to make money.
And as a result, society is better off.
Given the year-over-year trends, I feel one should not overemphasize the timeframe that article looked at. The decline seems relatively contained and future development is unclear. And honestly, who cares, even if we suddenly got 20% of startup failures due to GDPR concerns alone? If those concerns are reasonable, I'm totally fine with that.
I remember warning about this when GDPR was being considered. People said it wasn't a concern. Later in 2018 I linked to a draft whitepaper that showed a decline in funding for EU startups. People replied that it was a statistical fluke and that we needed more time before we could draw conclusions. Now it's been a full year and the US-EU investment disparity is higher than ever.
At this point I don't know what would change people's minds. It's like talking to climate change denialists. I get the same response: "There's not enough data. The data doesn't support that conclusion. Even if it did, the trade-offs are worth it."
The US-EU funding disparity was always there; in my book this leads to fewer people copying ethically dubious startup patterns from their US counterparts. If GDPR is the sole reason for the decline... doesn't the benefit to the populations of quite a few countries outweigh the impact on a few startup folks and their investors?
Companies like Microsoft, Google, Uber, Facebook, and PayPal started in the US, but they eventually expanded to the rest of the world. Since those companies weren't founded in the EU, EU workers missed out on being early employees. That means those company cultures are far more American than they otherwise would be. The EU missed out on having those companies headquartered in Europe where they'd create more jobs, more wealth, and more profits that could be taxed. Having them headquartered in the EU would also make those companies easier to influence.
It's hard to quantify exactly how much these things matter, but I think the long term cost is far higher than the consumer benefits of GDPR.
To me that sentence just isn't logical. Yes, it would prevent some of them from starting in the European market with their exact original business model if they were starting out now (and not, like half of them, in the last millennium). None of them started out here back in their time, so that sounds more like a question of market penetration than of benefit for the startup culture anyhow. With the GDPR we will hopefully also prevent said market penetration by bad actors when the time comes for the next generation of startups.
If GDPR is the sole reason and, subsequently, the US chooses to foster such businesses to win in a market, more power to them. Your examples might be huge successes, but many of them are at this time considered to be either unethical (Facebook, for the most part; Google had their share of negative attention) or plain illegal (Uber for example still hasn't managed to really start out in Germany, let's throw in AirBNB for good measure - I think they were founded in about the same year - huge financial hit that leads users to break laws and contracts left and right).
The calculation of long-term cost and benefit is honestly the point where this discussion switches from economic concerns to political ones, at least for me, since I'm not versed enough in macroeconomics to begin to judge that and would always place society over monetary concerns. I'm sure neither of us can win over the other, so I'd suggest we leave it at that.
Correlation is not causation. That the GDPR caused this decline is a pretty hefty claim for which you have offered zero evidence.
But this is the exact kind of situation GDPR was designed to prevent. One in which the school obviously used its position to coerce consent, because that's not consent anymore. It means companies and institutions can't abuse their power to get consent for unnecessarily broad data collection. The law is absolutely crystal clear about this and spinning it as "second-guessing is concerning" is absurd. Any less onerous requirements would easily be loopholed and technicalitied to death because that's what companies and institutions do whenever they see the opportunity. If you're upset with GDPR, blame the companies who aren't respecting our rights.
Linked article states:
"The General Data Protection Regulation, which came into force last year, classes facial images and other biometric information as being a special category of data, with added restrictions on its use."
EDIT: phone numbers are 10 digits, SSN is 9.
Most folks don't have passports. Children do not have drivers licenses or state IDs. Some places want more than one form of ID and some require a SS#, though you don't always need the card.
It also plainly contains your birthday and sex.
So... Sweden... it's very convenient when you're in the system and impractical when you're not.
This exists everywhere, though, so Sweden's personnummer isn't exactly unique in this case.
For a small subset of examples, take Norway's fødselsnummer or Iceland's identification numbers or Ireland's PPS number.
 - https://no.wikipedia.org/wiki/F%C3%B8dselsnummer
 - https://en.wikipedia.org/wiki/Icelandic_identification_numbe...
 - https://en.wikipedia.org/wiki/Personal_Public_Service_Number