I never saw a source corroborating or adding more detail to Yegge's post, and I wonder if this wasn't Bezos himself talking. I think someone put him up to it.
This uncorroborated story is not 'conventional understanding'. It doesn't even demonstrate any link to AWS vision or architecture. Service-oriented architectures existed years before this mythical memo, and AWS didn't exist until years later. This memo reads like a Steve Jobs parody - just blatantly attributing all major tech and architectural decisions to Jeff's genius and foresight.
How do you know? Did you work at Amazon at the time? What's with this reflexively hostile posting style?
Tons of people also say Musk is a massive fraud who takes credit for others' work and is simply a cutthroat businessman like Edison without any technical expertise. Your posts sound the same. http://www.paulgraham.com/fh.html
Bezos and Musk are, in my opinion, clearly very good at both technology and business.
Did you read the linked story? How does it demonstrate any tech vision or leadership for AWS by Bezos? Do you just blindly believe all of the claims (which don't even relate to AWS vision/tech leadership) made in the story? Do you honestly believe Bezos - the business leader and highest-level manager of Amazon, who has an enormous amount on his plate and no technical background - is coming up with all of these high-level architectural and tech decisions, instead of, say, the highly qualified and experienced engineers at Amazon? You seem to think billionaires are some kind of superheroes from fantasy land instead of actual human beings. You have to be a naive child to believe half of the claims about Bezos made in this story.
I did read it. I have no idea how much of it is true. But it seems fairly likely that enough of it is true to suggest that Bezos has a good mind for both technology and business.
The main claim made in there seems to be that Bezos wanted Amazon employees to switch to a more dogfooded, API/service-oriented approach, with clear interfaces between different teams and areas of concern rather than internal private channels, to enable future renting out of hardware and services to customers, and that after he issued that edict people scrambled to implement it. Maybe Bezos originally came to that decision in part or wholly due to suggestions from others, but either way, I've seen several people say he did indeed do something like that.
He used to program. He has a degree in "Electrical Engineering and Computer Science" from Princeton. Amazon had a ton of spare resources they could rent out. He's a pretty smart guy and is known for thinking about the long-term. It's not a massive stretch to say it's possible he could've made a decision like that back in 2002, when he had less on his plate than he does now.
> Attributing stuff to a malicious effort to make the public more manipulable (as if that was necessary) should be dropped in favor of the natural incompetence of the press in framing issues reasonably.
Except we've known for a long time that the mainstream press works, very intentionally, to manipulate public opinion.
Probably sometimes, but it's intuitively pretty damn obvious it's not all the time - and based on what evidence? What a huge sweeping claim to just drop in one sentence.
The press has influence. The press has always had influence. In a society involving mass media, the press can't not-have influence.
We might talk about what form a society would have to take for the press to not have great influence - maybe "very strong education and civil institutions" or "directly democratic workers' councils" - but if we're going to have a modern capitalist society with multiple poles of elites and atomized consumers, then the press, the corporations, the state, and the highest professionals will all have disproportionate influence.
Which is to say, now, all the different press outlets manipulate, sometimes in distinct and opposite ways, sometimes in agreement with each other. And all the various economic and political institutions manipulate.
And the thing with this manipulation is that it doesn't require a crazy plan like the ggp/op insinuates. They don't need to intentionally create a "deep fake" as a vague threat; human psychology just naturally drifts that way, including the psychology of the reporters themselves. And manipulation for a specific purpose just requires the poor framing of ideas, which can be leveraged whenever you need it. So yeah, the press certainly manipulates, but it doesn't generally have a "master plan" of manipulation. That wouldn't help (not that it hasn't been tried).
A big chunk of the issues you're writing that test suite for, and constantly rerunning, is solved, out of the box, by having a decent type system. The type system also gives you superior code completion (with proper IDE support), which already outweighs any potential reload-time costs in Java vs Python/Ruby, for me personally.
If your development productivity is entirely reliant on how quickly you can reload your changes, then you're doing something wrong. Though I guess you'd have to write mounds of unit tests and constantly rerun them to prevent common issues that Java's type system solves for free.
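To make that concrete, here's a minimal sketch (the class and method names are made up for illustration) of the kind of bug Java's compiler rejects before the program ever runs, while in Python/Ruby the same mistake only surfaces at runtime, typically via a unit test exercising that path:

    import java.util.List;

    public class TypeCheckDemo {
        // The signature both documents and enforces the element type.
        static int sum(List<Integer> xs) {
            return xs.stream().mapToInt(Integer::intValue).sum();
        }

        public static void main(String[] args) {
            System.out.println(sum(List.of(1, 2, 3))); // fine: prints 6

            // sum(List.of("1", "2", "3"));
            // ^ rejected at compile time. In Python/Ruby the equivalent
            // mistake is only caught at runtime, so you end up writing
            // (and constantly rerunning) tests to guard against it.
        }
    }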
Actually, Java - its syntax, the explicit types, and how it's often taught - only obscures solutions that could be much more obvious. Ever since I saw Clojure, I've been cursing that nobody showed it to us right at the beginning, when I was at university. Even back then Clojure was already an established, stable language, and that was a decade ago. I could have just skipped the C, Java, and other classes and would have become a much better real-world problem solver/engineer much earlier. If I needed the specifics, e.g. for embedded or legacy applications, I could always look up/learn the details for C/Java etc., but that is not needed for more than 95% of the tasks a software engineer encounters in the real world.
You're not saying anything specific about what's wrong with Java. Java is not perfect, but you haven't actually given a single good reason. The type system alone is a huge benefit over languages such as Python/Ruby. Lisp has existed and been taught for a long time, and I like Lisp languages such as Clojure, but the allusion to Clojure being some kind of magic bullet is also pretty baseless.
Well, I did say at least two specific things (syntax, explicit types). But maybe let's approach this a bit differently and talk about what is great in Clojure that is not good in Java:
- persistent data structures that can easily be made transient in specific performance-critical cases
- consistent syntax
- the language itself is a data structure, so manipulating code is very easy and you get a serialization format (EDN) basically for free
- the REPL
- most of the code transfers 1:1 to ClojureScript; that is not the case for Java and JavaScript, which are completely different languages with very different strengths
- dynamic types, but type hinting is available out of the box for the corner cases where it improves performance or makes interoperability a bit clearer, and e.g. with clojure.spec you can, with some effort, get something approaching a dependent type system/very strong tests, incl. generative testing
- Clojure/ ClojureScript interoperability with the Java/ JavaScript ecosystem
- developer productivity
- performance, actually, especially compared to Python/Perl/Ruby: very carefully written Java would win microbenchmarks, but in Clojure you can probably improve the overall performance of a large codebase, because in the time you would need with Java you can do many, many more iterations and therefore explore your way to the optimal solution
- you can become very proficient in Clojure in about 6-12 months, whereas with Java you probably need 3-5x that time to tackle the same problem space
Yes, some of those things are not so specific to Java. If you only care about performance in micro-benchmarks, Java would probably win, but probably 95% of the problems in the real world are way more complex. Also, good luck writing correct multi-threaded code in Java vs Clojure - Clojure is uniquely positioned for multi-threaded workloads thanks to persistent data structures, atoms, agents, etc.
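To illustrate the multi-threading point with a minimal, hypothetical sketch (not from any real codebase): a plain read-modify-write on a shared HashMap in Java silently loses updates under concurrency, whereas the idiomatic Clojure equivalent - an atom holding a persistent map, updated with (swap! counts update "hits" (fnil inc 0)) - is safe by construction, because the map is immutable and the atom simply retries the pure update function on contention.

    import java.util.HashMap;
    import java.util.Map;

    public class LostUpdateDemo {
        public static void main(String[] args) throws InterruptedException {
            Map<String, Integer> counts = new HashMap<>();

            Runnable work = () -> {
                for (int i = 0; i < 100_000; i++) {
                    // Unsynchronized read-modify-write: threads can
                    // interleave here, so increments get lost (and an
                    // unsynchronized HashMap can even corrupt its
                    // internal structure).
                    counts.merge("hits", 1, Integer::sum);
                }
            };

            Thread a = new Thread(work);
            Thread b = new Thread(work);
            a.start(); b.start();
            a.join(); b.join();

            // Expected 200000; on most runs this prints less.
            System.out.println(counts.get("hits"));
        }
    }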
I don’t think op was stating developer productivity completely relies on reloading changes, but developer experience does matter. Many folks end up working with legacy code bases that don’t have clean code, weren’t designed with testability in mind, among other issues. Reloading changes, writing to a logger, etc. end up being common techniques. Depending on the complexity and how much your org is willing to invest in such a system, you can be in a tough spot.
> Now, Google comes and creates a whole new UI toolkit from scratch, couples it with a very beautiful SDK and component framework and offers a far better programming language than js could ever be but we're still nagging
All of these things, more or less, already exist in great variety in the JS ecosystem (angular, react, vue, redux, webpack, etc.). Many people have also already adopted other languages such as TypeScript, which is easier to learn for someone coming from JS, and superior to Dart imo.
Does anybody know if there is a filter or something that increases oxygen concentration in the air? Maybe something that pumps air from the outside and only lets oxygen in?
You can get air exchangers with filters. They use the hot/cold air from inside to heat/cool the air from outside as they filter it. They reduce CO2 by venting it outside.
I’ve seen ads for medical “oxygen concentrators”, which involve a face mask. No idea how they work, or what they do.
> And like that there are a lot of other UX small things that make for a more pleasant experience in the Mac, particularly for end users.
There are also a lot of bugs and inconsistencies in the Mac UX that make it very annoying to use. A few that I have to deal with on a regular basis:
- The app Dock will randomly break - either it will not auto-hide, or it will not show when pushing the cursor down.
- For some reason the OS needs to disable/reset all displays multiple times in order to redetect external monitors. Not only can the monitor detection take a few minutes, but it often messes up window/workspace positions.
- You cannot drag fullscreen windows from one monitor to another. You cannot drag a non-fullscreen window to a monitor which has a fullscreen window.
- Audio will inconsistently switch between native speakers and HDMI, completely ignoring the user's manual override.
- Windows will randomly disappear - the app is still running and shown in the Dock, but you cannot alt/command-tab to the window, or show the window from the Dock.
- Top bar will randomly not auto-hide, and/or not show when hovering.
macOS UX is far from the gold standard that fanboys try to make it out to be.
I'm more or less forced to use Windows or Mac for work (Mac happens to be the lesser evil), but my personal Debian PC is so much more intuitive, consistent, and stable.
> You cannot drag fullscreen windows from one monitor to another. You cannot drag a non-fullscreen window to a monitor which has a fullscreen window.
I think that there's a concept that just doesn't click with you (which is okay), but it does click with me.
There's actually no "fullscreen window". If you make an app fullscreen, the app creates a new "screen" (for lack of a better term; maybe it's called a Space?). The app and the screen are now one.
You can't drag a fullscreen window from one monitor to another, because there's no window for you to drag. You can only drag the whole screen (in Mission Control).
You can't drag an actual window to a fullscreen screen (eeh), because that screen is an app and is not supposed to contain any windows.
This feels intuitive to me, I use it with joy and I miss this concept badly when using various Linux DEs.
(the other stuff you mentioned are real bugs that can sometimes happen, yes, no problem with that -- but I've also had plenty of those in Ubuntu and others)
Eh, that's a lot of extra steps and complexity for no apparent reason. Virtual desktops, supported by many Linux UXs for a long time, provide essentially all of the same functionality without the extra mental overhead of having to deal with 'mission control' and thinking about monitor vs space vs window, or thinking about 'which state is my window in?'.
> You can't drag a fullscreen window from one monitor to another, because there's no window for you to drag.
But you do get the window bar on hover, which is the same 'control' you use for dragging non-fullscreen windows... the fullscreen window is still a window, except its controls have been restricted and its behaviour modified to prevent it from acting as a window, for no apparent reason. The way this concept is implemented in macOS is just ugly imo.
> For some reason the OS needs to disable/reset all displays [...]
Yes, I hate this. Also, many monitors do not like whatever the Mac is doing, and will tolerate it only a small integer number of times before you have to interrupt power to them for a reset.
> You cannot drag fullscreen windows from one monitor to another. You cannot drag a non-fullscreen window to a monitor which has a fullscreen window.
I was actually replying to insist that you can, but then I realized (just now) that you said "monitor", not "desktop", so I'm not sure this would work. Does it work for you with desktop spaces? 'Cause it does for me, although it makes the incoming window chromeless as well and tiles it with the existing window, which I didn't really expect.
> Haters pointing out a few flaws (which it does indeed have) doesn’t invalidate the fact that it’s still better than Windows or most Linux UIs.
I agree that it's marginally better than Windows, but it's inferior to Linux, given that you can build a superior UX on Linux. Obviously this is based on my personal preferences.
Not to mention, there's no walled garden of applications on Linux. While the "free for all" aspect of applications may be scary for some, the "welcome to adulthood, now make your own decisions" is a welcome change for many. Again, this is my personal opinion.
An issue I see with this is that not everyone has the time or will to customize their OS to their specific needs. In this regard, it might be arguable that Apple has done a reasonably good job at advertising/showcasing the ease of use of macOS out of the box.
Cmd-Opt-D to show/hide the Dock, btw, if you didn't know that. Can be handy for those weird cases where it's not showing/hiding as it should. It may or may not help with this particular case you're seeing, but give it a try!
> By this logic though, if you're already in a short squeeze, then the retail buyers (the ones who are doing the squeeze) not buying any more should not cause the price to drop: the person getting squoze (did I use that word right?) will have to buy whether or not Robinhood traders are. The price would still be shooting up. In a short squeeze all that WSB had to do was hold.
Obviously the blocking of buys will have an effect on the price and availability of the stock. Not that it provides any reasonable justification for RH either way.
> If your "short squeeze" is dependent on people buying more in order to create a short squeeze, then you are just coordinating to create a short squeeze, not dealing with a "natural occurring" one.
Even if you want to buy the stock because you know there is a short squeeze ongoing, that is not illegal. That is just analysing the market conditions and seeing the obvious.