My handwaving analysis would be that advertising around search is slowing, and growth is instead mostly happening around content. This is because the web has matured and users are now educated about what it has to offer. They spend less time searching and browsing the web and instead spend most of their time consuming content (via streaming services and social feeds) on established platforms (YouTube, Instagram, Netflix, Spotify, Twitch, etc.). And Alphabet's main content platform, YouTube, is somewhat harder to monetize.
I think it's just finally tied directly to the growth of internet use. Their YoY gains used to exceed it because they were grabbing more above-the-fold space every year, experimenting with different ad looks, etc.
I believe the well of new tricks is drying up... everything above the fold is an ad now for high value queries.
That's not a bad place to be... it's just not as nice, comparatively, as consistently doing better than general growth.
I suppose one caveat is that they make less on mobile ads versus desktop, and desktop share continues to shrink.
After going up by the same amount in the last two weeks.
I've seen this pattern in previous quarters: an irrational increase in the days before earnings, then a fall back to the same price. Not sure what's happening; could be options/day traders.
This has been a thing on the stock market for a very long time and is not Google-specific. Basically every ticker is going to see action before and after earnings reports.
TL;DR: Revenue increased by 19%, which didn't meet growth expectations because the same quarter last year saw 26% revenue growth. If you don't take the fines into account, margin improved. Paid clicks increased by 39%, but cost per click went down 19% YoY, which seems to be the big factor in the missed growth expectations. GOOG headcount went up from 85K to 103K YoY.
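Assuming, simplistically, that ad revenue scales as paid clicks times cost per click, the two figures above combine like this (a back-of-envelope sketch, not Alphabet's actual revenue model):

```python
# Back-of-envelope: ad revenue ~ paid clicks * cost-per-click (CPC).
# Figures quoted above: clicks +39% YoY, CPC down 19% YoY.
clicks_growth = 1.39    # paid clicks up 39%
cpc_change = 1 - 0.19   # cost per click down 19%

revenue_multiplier = clicks_growth * cpc_change
print(f"implied ad-revenue growth: {revenue_multiplier - 1:.1%}")
```

That works out to roughly 12.6% growth from those two factors alone, below the 19% headline; the rest of the headline number comes from other revenue lines.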
GOOG is taking a beating from analysts right now because Amazon and FB are perceived to have done much better with their ad businesses. People on TV are asking them to reduce costs, cut investments like Waymo, and monetize properties like Maps. They seem to have lost faith in further growth of Google's ad business due to increasing competition from FB and Amazon.
I must say, nearly 20% revenue growth sounds pretty amazing for such a large company (I guess not every quarter is equal, and note this is year-over-year growth, not quarter-over-quarter: a 20% increase every quarter would mean roughly doubling revenue in a year). Is this typical for all the big tech players?
Is it really not crazy? Are the markets still expanding that much?
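To put a number on the doubling intuition upthread: a 20% quarter-over-quarter rate would compound to roughly 2x over four quarters, but a year-over-year figure doesn't compound that way. A quick sketch:

```python
# If revenue grew 20% quarter-over-quarter, four quarters compound to ~2x.
# A 20% year-over-year figure is a single annual rate and does not compound
# like this within the year.
qoq_annual = 1.20 ** 4
print(f"20% QoQ compounded over a year: {qoq_annual:.2f}x")  # ~2.07x
```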
To many, investing is not about the now but the future. Looking at Google's curves, revenue growth is slowing and costs are growing. The numbers look great right now but meh in the future.
For me, in the past few years Waymo was the most interesting growth part of Alphabet, but after the Tesla presentation I feel that Tesla is in a better position because of the number of edge cases they can gather for their models.
Waymo is more than likely vaporware. Traditional ML isn't going to work for all the edge cases. Tesla has thousands of cars on the road. The Tesla deep-learning play is the only way this is going to work.
What isn't going to work is ML without hardware-side radars or redundancies like high-resolution maps of the areas they operate in.
Name a deep neural net with anywhere close to 100% object recognition precision/recall.
You’re engaging in hand-waved reality distortion from Musk: that somehow the cheap, low-quality sensor suite with coverage gaps that they shipped in hundreds of thousands of cars can be patched over by software and made to drive across the country and serve millions of people in robotaxis one year from now, despite the fact that they haven’t even filed any significant L4 miles with disengagement reports.
This seems like a rush-to-market disaster waiting to happen. You don’t ship an MVP/beta of software that can kill people so you can gather training data.
2. You get multiple bites at the apple: the chip is processing 200 images a second. It has plenty of signal in the next frame to identify an issue.
3. All Teslas have redundant side sensors and radar, and an adversarial system for checking its conclusions.
4. They don't actually need to file those things in Tesla's case. Why would they? They could easily test the functionality, tell no one, and not worry about regulation until they've reached a certain level of accuracy.
5. The beta is Navigate on Autopilot, and it's proven to be safer than regular drivers by far.
And most importantly, what do you think will happen if Elon releases self-driving on local roads in a year? What more would the car need to do, other than be able to do that without people touching the wheel? If he just releases that as a "feature" and it works, there won't be a marketing issue. In fact, he already has.
1. 89% != 100%. If you miss a pedestrian and hit them, you're in trouble. Do you really want a car that misidentifies things 11% of the time? The kind of nines you're looking for in a self-driving car are much higher.
2. Multiple frames misinterpreting something the same way don't help you. Also, many times you can only draw one conclusion from many frames, e.g. you need N seconds to tell a person's intent.
3. Redundant sensors are great. Radar + camera can help give a good estimate of where things are in the world, but cameras have disadvantages (e.g. occlusion, low sun angle, etc.) and radar too (poor angular resolution, interference, etc.). If you don't have redundancies for the weaknesses of each sensor, you are going to have problems in the long tail.
4. Shadow-mode driving doesn't actually capture the long tail of crazy situations, especially since their current sensors almost certainly don't store all the data needed to recreate the world. You need to drive a very long time to see some real-world situations that break your assumptions about your sensor robustness.
5. Highway autopilot is more or less solved (with varying safety bars). Surface streets are the hard part and Tesla hasn't demonstrated any progress there beyond PR / hype.
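The "nines" point in (1) and the multiple-frames debate in (2) can be made concrete with a toy calculation (the per-frame miss rate and frame count here are made-up illustration values, not measurements): extra frames only buy you safety if their errors are independent, and systematic causes like glare or occlusion violate that.

```python
# Toy model: per-frame miss probability p, object visible for n frames.
# Independent frame errors: all n must fail, so misses shrink as p**n.
# Correlated errors (one shared cause, e.g. glare): extra frames add ~nothing.
p = 0.11   # hypothetical 89% per-frame detection rate
n = 10     # hypothetical number of frames the object is visible

independent_miss = p ** n   # every frame fails independently
correlated_miss = p         # one systematic cause dooms all frames together
print(f"independent: {independent_miss:.2e}, correlated: {correlated_miss:.2f}")
```

Real failure modes sit somewhere between the two extremes, and environmental causes (washed-out images, blocked lenses) push them toward the correlated end.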
2. Multiple frames mean you don't misinterpret the same way: the rotation, scale, and other variances change.
3. You do have redundancy in the sensors, and something that no sensor besides a brain has: the ability to predict objects' directions and speeds relative to yours. That's something no amount of sensors gives you, and it's the real goal.
4. You should point that argument at Waymo, not Tesla. Tesla is the only one looking at all the cases they can and learning from that data.
5. Highway piloting is solved in Tesla's case. You know who hasn't solved it? Every other automaker. No other automaker can drive from on-ramp to off-ramp. They can't even really change lanes.
1. Incorrect. Comparing human performance on _recognition_ over ImageNet != comparing total human visual ability (object detection, which is a superset of recognition, plus tracking) to a machine. Machines are far worse at accurately saying what things are, where they are, and what they will do next. For example, cyclists move very quickly. Can your system not only see them but understand their intent with extremely high accuracy? What if your camera is even partially occluded? What failure mode do you have then?
2. That's just wrong. If your image is washed out from a low sun angle, all 200 frames per second are washed out, and you might misconstrue a red light as a green one.
3. You need n+1 of each kind of detector, resilient to different failure modes, to call yourself fault tolerant. Two cameras susceptible to the same problems aren't fully fault tolerant. Camera and radar have overlapping and joint uses, but they don't overlap enough: e.g. if the radar fails, the camera can't do the same things the radar can.
4. This is unsubstantiated. You really think other self-driving-car companies aren't collecting data from cars on the road? What do you think all those cars you see in Mountain View and SF are for? What evidence do you have that Tesla is "looking at all cases"? Do they even have the infrastructure necessary to simulate their car's behavior over long-tail situations? (Hint: rumor is it's pretty lacking.)
5. The serious L4+ companies, e.g. Uber, are pretty good at driving highways. They're all working on the next phase of the problem.
5. See, this is why I know you're not actually reading the papers or looking at the comparisons of their performance, because Uber is doing terribly on the highway and otherwise. Raquel Urtasun already got her program shut down for nine months for terrible performance. That has never happened to Tesla.
This is more due to regulatory games than performance. Teslas aren't regulated like L4+ autonomous vehicles, so when they kill people (which they have!) they don't get shut down.
I would be scared to trust a self-driving vehicle based on high-resolution maps: the government can change the roads at any time, and I don't trust my government to notify Google in time about it.
"You don’t ship mvp/beta of software that can kill people so you can gather training data."
I think you haven't watched the presentation: the new data gathering and verification is based on shadow mode triggering on model mispredictions, and a feature isn't enabled until its misprediction rate goes down to an acceptable level.
It doesn't help them in agent modelling, which is the hardest part of self driving. They are doing sophisticated agent modelling in simulation, while Tesla is mostly using shadow mode to verify the agent models.
Sorry, but I don't believe that gathering data is going to be an effective advantage for Tesla over Google. It's an advantage over the bloke off the street and his autonomous driving startup. Google has proven time and again that they will brute-force problems. Google wanted pictures of every single road in America for Google Maps, so they got them. If that data is useful, Google will get it, and they'll be doing it on a standardised platform that collects all the information they need. Meanwhile, Tesla has a fleet of cars, most of which don't have the most recent sensor suite, and even their most recent sensor suite may well not be enough, at which point they'll need to convince another 100,000 people to buy a new Tesla to get the sensors.
Putting aside the EC fine, operating expenses grew 19.2% year-on-year while revenue only grew 16.7% year-on-year. This is clearly not sustainable; if it continues, expect to see some belt-tightening at Google.
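To see why that gap matters if it persists, compound the two growth rates for a few years (the growth rates are the ones quoted above; the starting expense ratio is a hypothetical round number):

```python
# If opex grows 19.2%/yr while revenue grows 16.7%/yr, the expense ratio
# creeps upward every year and operating margin erodes.
revenue, opex = 100.0, 60.0   # hypothetical starting expense ratio of 60%
for year in range(1, 6):
    revenue *= 1.167
    opex *= 1.192
    print(f"year {year}: opex/revenue = {opex / revenue:.1%}")
```

At that differential the expense ratio climbs by a factor of about 1.192/1.167 ≈ 1.02 per year, from 60% to roughly two thirds after five years.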
Eventually, it'll be hard not to argue that some part of the revenue and profit misses arises from activism against serving the US government or against opening a China search engine.
At that point, will Google choose to pursue business interests or pander to activists?