My company is sourcing AI from MTurk. It's actually cheaper than running fat GPU model training instances. The network learns fast and adapts well to changes in inputs.
I envision the sticker "human inside" strapped on our algorithms.
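For the curious: posting a labeling task to MTurk programmatically is only a few lines with boto3. A minimal sketch, assuming the sandbox endpoint and a hypothetical labeling form hosted at example.com:

```python
import boto3

# Sandbox endpoint, so no real money is spent while testing.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# ExternalQuestion pointing at a (hypothetical) labeling form we host ourselves.
question_xml = """
<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/label?image_id=42</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>
"""

hit = mturk.create_hit(
    Title="Label objects in an image",
    Description="Draw boxes around every car in the image.",
    Keywords="image, labeling, annotation",
    Reward="0.05",                     # dollars per assignment
    MaxAssignments=3,                  # three workers per image so answers can be cross-checked
    LifetimeInSeconds=86400,
    AssignmentDurationInSeconds=600,
    Question=question_xml,
)
print(hit["HIT"]["HITGroupId"])
```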
AI now is like "cyber" was in the 1990s: it seems to be nothing but a buzzword for many organizations to throw around.
The term AI is used as if humanity has now figured out general AI, or artificial general intelligence (AGI). It's quite obvious organizations and people use the term AI to fool the less tech-inclined into thinking it's AGI - a real thinking machine.
Remember 5-ish years ago when IBM's marketing department was hawking their machine learning and information retrieval products as AI, and everyone in the world rolled their eyes so hard we had to add another leap second to the calendar to account for the resulting change in the earth's rotation?
I suppose their only real sin was business suits. Everything seems more credible if you say it while wearing a hoodie.
It's been like that for quite a while; it has ups and downs, with the 90s marking a winter season for AI. Now the hype machine is at full steam again, until people once more find out a lot of it is marketing BS to get funding. Then the worthwhile research gets mixed up with that, falls into a funding drought, and in 30 years or so it's back to full hype again.
20+ years ago I used to refer to this as artificial artificial intelligence (AAI), specifically as part of a pitch to MGM for an MMORPG to run their non-player characters. Not surprisingly, it didn't catch on...
Totally should have trademarked it along with "IGlasses" in the very same pitch. The pitch was apparently rejected because our level design ideas were better than the actual episodes of the show upon which it was based: "Poltergeist: The Legacy."
You mean high carbon, low silicon? Because humans usually have a higher carbon footprint than computers, it takes a lot of computers to match one human. Plus we're made of carbon.
I'm not sure, if you factor in the CO2 footprint of computer manufacture, and the fact that AI needs powerful computers and networking to be delivered. Our body carbon is almost 100% recyclable.
If only the carbon footprint of a human was the body carbon.
Modern humans have a very heavy carbon footprint, especially in the US. Think of all the things you do and consume and all the carbon involved all through the chain. It's a big number. Computers are extremely efficient compared to that.
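Rough back-of-envelope, with the obvious caveat that every number here is a round assumption for illustration rather than a measurement:

```python
# All figures are coarse assumptions, not measurements.
US_PER_CAPITA_CO2_KG_PER_YEAR = 16_000   # ~16 t/yr, a commonly cited ballpark for the US
DESKTOP_POWER_KW = 0.2                   # assume a ~200 W machine
HOURS_PER_YEAR = 8 * 365                 # assume it runs 8 h/day
GRID_CO2_KG_PER_KWH = 0.4                # rough grid-average emissions factor

desktop_kg = DESKTOP_POWER_KW * HOURS_PER_YEAR * GRID_CO2_KG_PER_KWH
print(f"desktop: ~{desktop_kg:.0f} kg CO2/yr vs human: ~{US_PER_CAPITA_CO2_KG_PER_YEAR} kg CO2/yr")
# -> roughly 230 kg vs 16,000 kg: one person's footprint is on the order of
#    dozens of always-on desktops, under these assumptions.
```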
A typical computer used by a typical person uses only a fraction of the energy of the GPU farms utilized to train DNN models. We're not at the point where you can pick an existing DNN off the shelf and just use it; you have to train (at least partially) a new one for each new task, often many times.
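That "at least partially" in practice usually means fine-tuning: freeze a pretrained backbone and retrain just the head for the new task. A minimal sketch with a torchvision ResNet (the 10-class task and the random batch are placeholders):

```python
import torch
import torch.nn as nn
from torchvision import models

# Start from an ImageNet-pretrained backbone and freeze its weights.
model = models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False

# Swap the classification head for our (placeholder) 10-class task;
# only this new layer gets trained.
model.fc = nn.Linear(model.fc.in_features, 10)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# One dummy training step on random data, just to show the loop shape.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, 10, (8,))
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.3f}")
```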
Even companies like Facebook, Apple and Google employ humans to do work that people believe is done by "computers", and none of the companies seem keen on informing the public that they do in fact have humans scanning through massive amounts of data. So perhaps it is in fact cheaper, or the problems they face remain too hard for current types of AI.
Given the number of people Facebook employs to censor content, and the mistakes they make, I would label most of Facebook's AI claims as snake oil.
Recaptcha is probably one you've actually interacted with, but even then you're mostly reinforcing existing predictions. Other applications within Google Maps are things like street number recognition, logo recognition, etc. Waymo contracts out object detection from vehicle cameras and LIDAR point clouds. Google even sells data labeling as a service.
I believe Google Maps has a lot of humans who tidy up the automated mapping algorithms (such as adjusting roads).
Annotation is time consuming and therefore extremely expensive if you have a $100k engineer doing it.
Yes, you can report problems with the road network and people update GMaps manually. And up until a couple of years ago users could make the edits themselves, but they took that down for some reason.
Changes to other types of places can still be made manually by GMaps users themselves, and other users can evaluate them; I guess if a change is "controversial" (a low-rep user made it or people voted against it) a Google employee evaluates it. And if you're beyond a certain level as a GMaps user you can get most changes published immediately.