This article seems to lack common sense. Yes, the amount of vitamin C decreases the longer you store a vegetable or fruit in a fridge, but the amounts of calcium and iron do not change. Nor does the density of other minerals.
Why is the author nominating Walmart as the winner based just on speed to market? What about other factors, like soil degradation, which affect the amount of iron and other minerals?
Deep in the text they mention that a study looked at nutrient content and found Walmart was the best:
“I learned this from talking to Brent Overcash, co-founder of a startup called TeakOrigin, which specialized in testing nutrient content in groceries from retail grocery stores. For years, every week, his team would walk into grocery stores, buy thousands of produce items the way normal consumers would, and bring them back to the lab to assess nutrient content.”
How did Facebook determine that Schrems was gay? Do they even know he's gay? Just because they showed him an ad for something that gay people would find relevant doesn't mean they targeted him for being gay. It's possible he likes a lot of stuff gay people like. I get targeted by advertisements for institutions offering degrees in Christian studies or some such thing, even though I'm an atheist. But say I was a Christian, should I then deduce that Facebook knew I was Christian and used that information?
They are just spraying and praying with their ads on best guesses as to what is relevant to you.
You get targeted by advertisements for institutions offering degrees in Christian studies or some such thing, because you're an atheist.
I fixed that quote for you, because advertising is often targeted to the opposite demographic for various reasons.
Just try enjoying your favorite show on radio or regular TV, and you'll see ads for stuff you would never touch. But those advertisers are paying good money to support that show you like, to get in front of everyone possible, and perhaps to wear you down with brand recognition and catchy jingles so they can influence your buying decisions in a moment of weakness.
However, targeted advertising may know exactly what you like, and be an effective means of call-to-action and conversion to sales.
On the back end, Meta advertisers fill out a list of audience interests and demographics. So yes, if the advertisements and their buyers were documented, Meta should also have sales info on the intended audiences.
It's a lawsuit because Schrems only needs enough of a basis to force the courts to consider certain issues, and to make statements about how GDPR should apply in principle in certain situations, in order to effectively restrict big tech's use of data.
It's a case brought strategically in order to trigger certain questions of interpretation of GDPR rules to be litigated.
Schrems' specific claim only needs to hold enough water to give him standing to get the case through enough filters in the court system to facilitate this.
Facebook didn’t determine he was gay, it’s just spray-and-pray by the algorithm exactly like you said. He’s just bringing the court case in bad faith to raise the issue and create another foothold for the EU to extract further billions in fines. The outcome will be more laws so nobody creates anything new in the consumer space or takes any risks that aren’t sanctioned by EU central planners ever again.
Knocking over US big tech companies for fines is literally the fastest growing EU industry by total profits.
I’ve never heard of a malware that tells you exactly what it’s doing with your data upfront.
But anyways, not sure the EU will love it so much once the US finally puts its foot down and the EU capitulates. Unfortunately it’s hard to say no to the only country that will protect your sovereignty (NATO doesn’t work without the US and I’m sure Russia would like to keep going further into Europe).
After that data privacy revenue stream dries up, the only thing left will be the laws and regulations that permanently keep the EU tech industry from ever being competitive with the US or China.
But maybe the ballooning population of European social benefits retirees will pay for themselves? Who needs evil private industry to generate tax revenues when you have bulletproof data privacy over what toilet paper you bought last week. Who can afford the insane risk of somebody creating products that might disclose this highly sensitive data?
Believe it or not surveillance capitalism isn't the only business model out there. Builders and makers may perhaps find something more interesting to do.
But sometimes when I see projects like this in other languages, I think: are you sure you don't want to use Erlang, or something else on the BEAM runtime, and just call Rust or C via NIFs?
I used Erlang about a decade ago, and even then it was robust, easy to use, and mature. Granted, you have to offload anything performance-sensitive to native functions, but the interface was straightforward.
In the Erlang community back then there were always legends about how WhatsApp needed only 10 people and 40 servers to serve a billion customers. Probably an exaggeration, but I could totally see it being true. That's how well thought out and robust it was.
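For readers who haven't used an actor system: the core pattern both BEAM and libraries like kameo provide can be sketched in plain Rust with a thread and a channel. This is a hypothetical, minimal illustration (not kameo's actual API, and nothing like BEAM's scheduler): an actor is just private state owned by one task, mutated only in response to messages.

```rust
use std::sync::mpsc;
use std::thread;

// Messages this actor understands.
enum Msg {
    Add(i64),
    Get(mpsc::Sender<i64>), // carries a reply channel
    Stop,
}

// Spawn a counter "actor": a thread that owns its state and
// processes its mailbox one message at a time, so no locks are needed.
fn spawn_counter() -> mpsc::Sender<Msg> {
    let (tx, rx) = mpsc::channel();
    thread::spawn(move || {
        let mut total = 0i64;
        for msg in rx {
            match msg {
                Msg::Add(n) => total += n,
                Msg::Get(reply) => {
                    let _ = reply.send(total);
                }
                Msg::Stop => break,
            }
        }
    });
    tx
}

fn main() {
    let counter = spawn_counter();
    counter.send(Msg::Add(40)).unwrap();
    counter.send(Msg::Add(2)).unwrap();

    // Ask the actor for its state via a reply channel.
    let (reply_tx, reply_rx) = mpsc::channel();
    counter.send(Msg::Get(reply_tx)).unwrap();
    println!("{}", reply_rx.recv().unwrap()); // prints 42

    counter.send(Msg::Stop).unwrap();
}
```

What BEAM adds on top of this basic shape, and what no std-library sketch captures, is supervision, preemptive scheduling of millions of such processes, and sending to a pid on another node with the same syntax as a local send.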
Having said all that, I don't mean to diminish your accomplishment here. This is very cool!
I think a lot of the problems BEAM was trying to solve were instead solved by processors getting bigger and gaining more cores.
BEAM's benefit 10-20 years ago was that inter-node communication looked essentially the same as communication within one process: I could talk to an actor on a different machine the same way as if it were in the same process.
These days people just spin up more cores on one machine. Getting good performance out of multi-node Erlang is a challenge, and it only really works if you can host all the servers on one rack to simulate a multi-core machine. Erlang's built-in distribution doesn't work so well in a modern VPS/AWS setup, although some try.
“Just spin up more cores on one machine” has a pretty low scale ceiling, don’t you think? What, 96 cores? Maybe a few more on ARM? What do you do when you need thousands or tens of thousands of cores?
Well, what I do is think of functions as services, and there are different ways to get that, but BEAM / OTP are surely among them.
I’m just saying, Erlang was built for telephony at scale, not for building some REST website. “You probably won’t need more than one big host for any given request” isn’t really a winning argument for scaled systems.
Correct me if I'm wrong, but I believe "scale" in the original context meant building a system with strong fault-tolerance properties, so that if a node went down due to, e.g., a hardware failure, the system as a whole would keep working normally.
So, did you run into any systems that needed to scale to tens of thousands of cores for a reason inherent to the problem they were solving, and that were built on top of BEAM?
It's a nice point. I am a fan of the beam runtime, and it has been an influence on the design decisions of kameo. However I don't see myself switching to another language from Rust anytime soon, especially with the amazing advancements with wasm and such.
Although Elixir is a nice language, I struggle to enjoy writing code in a language without static types.
This is why institutions break down in the long run in any civilization. People like you, people of principle, are drowned out by agents acting exclusively in their own interest without ethics.
It happens everywhere.
The only solution to this is skin in the game. Without skin in the game the fraudsters fraud, the audience just naively goes along with it, and the institution collapses under the weight of lies.
They can "tell" employees all they want, but the engineers who really get stuff done are not going to get fired for working from home, and a bunch of them are choosing to stay remote ... or paying lip service to office days by promising to come in and then not showing up.
What are managers going to do with critical engineers who are delivering? Fire them? They will have another job doing their individual contributor work in a week. There's a real shortage of individual contributors with skills and the ability to deliver consistently.
On the other hand, if you are in one of those political mid-career positions that mostly involves communication, coordination, impressions, etc., then yeah, the org doesn't need you as much as you need the org so you better haul your ass back to work.
Not really. The utility of credentials is falling fast relative to recommendations, networking, and work experience.
If you want to learn some deeper technical things, I suggest online one-off courses or online mini-certificates.
On the other hand, if you are still in one of the few remaining societies where credentialism is still rampant, then you may want a Masters for the prestige.
I don't really think a Yale degree makes you better than someone with a community college degree. There's no magic at Yale. The education you get everywhere is pretty good now because of all the resources universally available to all students.
But Yale basically applies a filter function and attracts the top 0.01% of high school graduates every year (plus some less elite legacy students and DEI admits). When you hire a Yale graduate, that's what you are paying for. Not the Yale education. If you could find a similar filter function some other way, you'd hire that 0.01% of high school graduates via that filter function.
And in fact companies are always trying to get ahead of their competitors and find other, less well-known, filter functions to get high performers who others don't know about. In the 1980s and 1990s Microsoft was among the first to discover that Indian IIT graduates were products of an extreme filter function applied to Indian high school students (IIT grads are like top 0.0001% of Indian high school grads). For a long time Microsoft hired those engineers for cents on the dollar. By the 2000s though, the word was out ... hiring IIT grads is as difficult as getting any other high performing grads.
There was also a brief period when Google had a recruiting edge from identifying high school kids who did well in online programming competitions or who contributed to Google's open source projects. But now that signal is well known too.
So John's community college degree doesn't matter if John is an elite performer.
As someone who has studied at both kinds of schools I can tell you there is a WORLD of difference.
In a middling school the professor was constantly providing remedial education to the students and had to cut down the curriculum breadth and depth.
When we are talking about the elite 0.01% of students the professor is irrelevant for regular coursework. They are almost uniformly autodidacts. A mentor is of course useful at the very boundaries of knowledge that textbooks and papers don't cover. But by the time you are in that range of work, you will be recognized through your performance, and you can find mentors by reaching out to them with links to your work.
Generally, people are completely unaware of what top 0.01% of performance looks like because we are so rarely around these people unless we are in some very elite institution or working on some project which attracts such people.