

Wolfram has got to be a liar, right? Hand-curating trillions of pieces of data? - andrewljohnson


======
andrewljohnson
"Wolfram says that trillions of pieces of data were selected and managed by a
team of experts at Wolfram Research, and that these experts also tweak the
information to ensure that it can be read and displayed by the system."

If 100 experts each curated 1000 pieces of data a day, it would take them
20,000,000 days to curate 2 trillion pieces of data.

Has WA been in development for longer than I suspect, or am I missing
something here?
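A quick sanity check of the arithmetic above (the team size and per-person rate are the commenter's hypothetical numbers, not anything Wolfram has claimed):

```python
# Back-of-envelope check of the curation time estimate.
experts = 100                     # hypothetical team size
rate_per_day = 1_000              # hypothetical pieces curated per expert per day
total_pieces = 2_000_000_000_000  # 2 trillion

days = total_pieces / (experts * rate_per_day)
print(f"{days:,.0f} days, roughly {days / 365:,.0f} years")
# → 20,000,000 days, roughly 54,795 years
```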

~~~
noodle
why does all new data have to be hand-curated and why just 1000 pieces of data
a day?

it says it makes use of lots of public data sets. i'm sure there's some
implicit trust there.

that's not to say that there isn't exaggeration happening. there probably is. i'm
just saying, let's not throw them under the bus just yet, either.

~~~
andrewljohnson
Well, I am giving them credit for 1000 pieces of data per person.

Even if each person curated 100,000 a day, a team of 100 would still need
200,000 days, which is still way too long a time frame.
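Running the same check at both hypothetical rates (these numbers come from the comments above, not from Wolfram) makes the gap obvious:

```python
# Even with a much more generous hypothetical per-expert rate, the total
# time stays far beyond any plausible development window.
total_pieces = 2_000_000_000_000  # 2 trillion
experts = 100

for rate_per_day in (1_000, 100_000):
    days = total_pieces / (experts * rate_per_day)
    print(f"{rate_per_day:>7,}/day -> {days:,.0f} days (~{days / 365:,.0f} years)")
# →   1,000/day -> 20,000,000 days (~54,795 years)
# → 100,000/day -> 200,000 days (~548 years)
```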

I just object to someone throwing around the word trillion in connection with
something people do by hand. I understand it's probably not an outright lie,
but it's a dumb statement regardless.

------
dsil
Think of it this way. Say someone was trying to collect and use as much GPS
data as they could find to map the world's trails. Each source and format might
require a different algorithm to "curate" it into a usable form for your data
structures, but you certainly wouldn't have to tweak each individual bit of
data.

------
mr_eel
Just think about it; it’s likely that a lot of the information was already
structured and comes from vetted sources. Of course they don’t inspect every
single bit of data! Instead they’re likely doing some checks,
editorialisation and general shepherding to make sure that the data is
accurate.

