Modern Hebrew gets the name from the Mishnaic Hebrew name for the city. It's also spelled with an unvoiced glottal in Arabic. Note that Israel was in the Hellenistic part of the empire, and Caesarea was very much a Greek-speaking city, where there was no palatalization of Latin c (which was rendered as a kappa). Note Russian Czar.
It's a little funny for me to say this (my shtick in my lab is that I use randomized algorithms for everything), but whether this is true is actually a major open problem in complexity theory. Many famous problems are polynomial using randomized algorithms (famously determining whether a number is prime, and min cut of a graph), but over the years deterministic poly-time algorithms have been found for many of these problems. Most complexity theorists, I find, believe that the randomized complexity classes (ZPP, RP, and BPP) are probably just P (BQP, the quantum analog of BPP, is almost certainly not P). That of course doesn't mean that randomization can't speed you up! Primality testing is O(n^6) deterministically but O(n^2) with a randomized test, n being the number of bits.
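For a concrete sense of what the randomized side looks like in the primality case, here's a minimal Miller-Rabin-style sketch in Python (a Monte Carlo test; the function name and round count are mine, not from any particular source):

    import random

    def is_probably_prime(n, rounds=20):
        """Miller-Rabin sketch: a False answer is always correct; a composite
        survives a single round with probability at most 1/4."""
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        # write n - 1 as d * 2^s with d odd
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)      # random base
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False                    # found a witness: n is composite
        return True                             # "probably prime": error <= 4**-rounds

Each round that fails to find a witness cuts the error probability by at least a factor of 4, which is why a handful of rounds suffices in practice.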
Yup. To elaborate (informally) on why theorists believe BPP = P: basically, if we believe cryptography works and hence PRGs exist, then every BPP algorithm can be derandomized.
How does this follow? I understand that PRNGs let you construct randomized algorithms in practice, but how do you transfer probably polynomial runtime to definitely polynomial runtime?
It's not "probably polynomial time", it's polynomial time with bounded error probability. Derandomization means bringing the error probability to 0.
The definition of a PRNG in complexity theory (for the purposes of this topic) is that no polynomial-time algorithm can look at its output and distinguish it from real randomness (with high confidence etc etc).
Now imagine you have a PRNG whose seed size is logarithmic in the output size. You can compose it with your BPP algorithm and iterate over all the possible seeds in polynomial time (because the seed is logarithmic), taking the majority result. You've built a polynomial-time algorithm that, if it gave incorrect results, could be used to distinguish the PRNG from real randomness. So it must be correct.
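A toy Python sketch of that construction, assuming a hypothetical prg(seed, m) that stretches a seed of ~log2(m) bits into m pseudorandom bits, and a decide(x, bits) standing in for the BPP machine run with bits as its random tape:

    from itertools import product
    from math import ceil, log2

    def derandomize(decide, prg, x, m):
        """Run decide(x, bits) on the PRG's output for every possible seed and
        take the majority answer.  Because the seed length s is logarithmic in
        m, there are only 2**s = poly(m) seeds, so this is a polynomial
        slowdown instead of the 2**m of brute-forcing the random bits."""
        s = max(1, ceil(log2(m)))                # seed length, logarithmic in m
        yes_votes = 0
        for seed in product((0, 1), repeat=s):   # enumerate all 2**s seeds
            bits = prg(seed, m)                  # stretch seed to m pseudorandom bits
            if decide(x, bits):
                yes_votes += 1
        return 2 * yes_votes > 2 ** s            # majority vote over all seeds

If this procedure ever answered incorrectly, then decide itself would be a polynomial-time test that behaves measurably differently on prg outputs than on uniform random bits, contradicting the assumption that the PRNG fools all polynomial-time algorithms.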
It can be probably polynomial time... that's ZPP! (ZPP is the complexity class of languages that are accepted by a randomized Turing machine in expected polynomial time. However, in the worst case, the Turing machine is allowed to take arbitrarily long.) Note that in ZPP the error probability is 0... it always returns YES or NO correctly.
It's unknown whether ZPP = P (but it almost certainly does). It is known that ZPP is the base of the randomized hierarchy... ZPP = RP ∩ co-RP, where RP is the class of languages accepted with one-sided bounded error (an accept is always correct, but a reject can be wrong with bounded probability).
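To make the "always correct, only the running time is random" flavour concrete, here's a toy Python sketch of the easy direction of ZPP = RP ∩ co-RP, with hypothetical rp_machine and corp_machine callables standing in for the two one-sided-error machines:

    import random

    def decide_zpp(x, rp_machine, corp_machine, num_coins):
        """rp_machine(x, coins) only ever accepts members of the language;
        corp_machine(x, coins) only ever rejects non-members.  Alternate them
        on fresh coins until one of them gives a certain answer.  The answer
        is always correct; only the number of rounds is random (constant in
        expectation, since each round succeeds with probability >= 1/2)."""
        while True:
            coins = [random.randint(0, 1) for _ in range(num_coins)]
            if rp_machine(x, coins):        # an accept from the RP machine is never wrong
                return True
            coins = [random.randint(0, 1) for _ in range(num_coins)]
            if not corp_machine(x, coins):  # a reject from the co-RP machine is never wrong
                return False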
Any good complexity textbook beyond Sipser will discuss the randomized complexity classes in extreme detail, since they are crucial for constructing all the other interesting complexity classes (notably IP and MIP).
Sure, there are plenty of complexity classes :) but TFA is about derandomization of BPP; even if it doesn't mention the class by name, it talks about Nisan-Wigderson.
If you can generate pseudorandom numbers well enough deterministically in polynomial time, you can simulate any algorithm in BPP in polynomial time on a deterministic Turing machine, so BPP = P.
A BPP algorithm can be seen as a deterministic algorithm that takes a string of random bits as part of its input. One way to derandomize a BPP algorithm is to run it with every possible random string and pick the answer that the majority of random strings lead to. This blows up the running time by 2^r with r being the length of the random string, making the running time exponential.
If the PRNG can take a seed of length log(n) and generate a random string from it, then you might try to take a BPP algorithm and reduce the length of the random input to log(n) using the PRNG. We assume that the PRNG being good enough means that this new algorithm still has the usual BPP guarantee that it gives the correct answer for at least 2/3 of the seeds.
However, now the derandomization of running it on every possible seed just gives a slowdown of 2^log(n) = n, which means that the running time stays polynomial.
The cryptographic definition of a PRNG is an algorithm whose output is indistinguishable from true randomness by any polynomial-time process. You can think of a BPP algorithm A as a deterministic one that takes randomness as an additional input. If the PRNG is truly a PRNG, then one can replace this random input to A with the output of the PRNG, and A's acceptance probability can't change noticeably; otherwise A itself would be a polynomial-time distinguisher.
Astronauts also need to be able to survive 1) high-g reentries (Apollo reached ~7 g on reentry) and 2) abort modes, which can involve very high acceleration indeed.
For example, the Space Shuttle used its wings (and body) to generate quite a bit of lift and spread the reentry g-forces over a much longer period: about 10 minutes at 1.7 g.
Though that's from LEO. Apollo came in directly from the Moon at a much higher velocity, resulting in ~7 g; for the Apollo missions that never left Earth orbit, reentry was more like 3.5 g.
A spaceship aiming to carry untrained passengers will pick designs and mission profiles that are within its passengers' ability to withstand, for both launch and reentry. Apollo picked a design and mission profile with 7 g reentry acceleration because they knew their trained astronauts could withstand it.
As for abort... it's only brief spikes of high g. It only needs to be survivable for the passengers, while the pilots need to be able to stay in control through it.
Not if you already have one! In general, a second bachelor's in the US and Canada requires 60 credits, or two years of study. Schools generally don't require you to redo gen-ed requirements.
This is important to consider, since many boot camp graduates, including many of the most successful ones, already have a degree.
Again, this is not about people but about businesses... in this case startups, which are somewhat unusual businesses since they have tons of cash and no income, and often don't spend money on things like CFOs that aren't really necessary for them yet.
If they have tons and tons of cash, they are rich. Which means they should know much better. They have less than zero excuse. They are often not making positive money, so they should be extremely prudent with their finances. It says a lot that nothing is expected of them and that they get treated like kids...
I would have said this shows that they are necessary, but it turns out you're right. Who needs a CFO when you have the nation's politicians in your pocket?
While it's easy to say it's administration… it's more complicated than that. Salaries for professors didn't increase much, but benefits increased massively (because of the increased cost of health care and pensions). Administrators also get paid more. Higher ed is a labor-intensive business, and labor-intensive businesses are expensive.
I disagree… CMake is very clear about what problem it's trying to solve: build portability. If you need something to build on Windows, Mac, and a variety of unixen, CMake works much better than make.
The trouble is that a) you probably don't need that and b) CMake is contagious. Soon every library and its grandmother demands CMake. CMake is also less orthogonal than make is… make is endlessly customizable to your workflow; CMake is too opinionated.
That's because CMake's design and practical realities discourage portability, despite what the docs say. Almost any nontrivial project will need the non-portable target_add_*() calls eventually, so you may as well use them from the start and implement an (almost inherently faulty) compiler/platform switch if you want to target multiple platforms, say to set sane optimization levels.
CMake doesn't have constructs like "I don't know which C++ compiler is selected, but turn on all warnings for it" and such. So you need to write compiler-specific options by hand. Many people know only gcc and/or clang.
CMake claims that it shields the developer from knowing the nuances of build systems and toolchains, but it does this very badly.
I've had to use CMake in a bunch of work projects, but never have I needed them to be portable. They've always been targeted only at Linux. CMake's biggest claimed feature is one that, to me, is utterly useless: I don't care about building on Mac or Windows.
No... lattice-Boltzmann discretizes the Navier-Stokes equations. This is just a cellular automaton. It uses a small number of rules to determine the change of every cell. You can of course conceptualize an Eulerian solver of Navier-Stokes as a cellular automaton... but this system simply uses a very small set of simple rules to get something that looks right.
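To make "a small number of rules determining the change of every cell" concrete, here's a generic cellular-automaton update loop in Python with a made-up local smoothing rule (NOT the rules the linked demo actually uses):

    def step(grid):
        """One cellular-automaton update: each cell's next value is a pure
        function of its current neighbourhood.  The rule below is an
        arbitrary local averaging rule, just to show the shape of the loop."""
        h, w = len(grid), len(grid[0])
        nxt = [[0.0] * w for _ in range(h)]
        for i in range(h):
            for j in range(w):
                neighbours = (grid[i][(j - 1) % w] + grid[i][(j + 1) % w] +
                              grid[(i - 1) % h][j] + grid[(i + 1) % h][j])
                nxt[i][j] = 0.5 * grid[i][j] + 0.125 * neighbours   # purely local rule
        return nxt

The point is that nothing here references pressure, viscosity, or any discretized PDE; the look of the result comes entirely from iterating simple local rules.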
So there's an interesting prehistory to all this. The US music industry began in the late 19th century… and this is before the record! Music publishing and songwriting was literally publishing… of the sheet music. The standard form of this was a piano score, and that's the form that songwriters generally had copyright on. Now, if you were to perform this you would obviously need an orchestration, and this was subject to copyright as well, but orchestrations were generally not published.
An important thing to consider is that orchestration is a technical skill, and many great songwriters had limited formal training in music. So orchestration was almost always separate from songwriting. It still is on Broadway; almost all great Broadway composers outsourced orchestration (including composers like Richard Rodgers, Gershwin, or Sondheim, who were perfectly capable of writing it).
Many Hollywood composers outsource orchestration as well, often due to time constraints. John Williams is perfectly capable of orchestrating his work, but often doesn't. I understand he leaves very detailed notes for those who do, however.
Orchestration also means something different these days for most film, TV and game composers compared to the traditional definition. Traditionally, orchestration would be taking a piano or short score and expanding it to be played by an orchestra. Nowadays, almost all media composers create full digital mock-ups of the music first with the entire orchestra and then some. Orchestration then is mostly a process of transcription, typesetting and adjustments for live ensemble so the recording and performance accurately conveys the intent. John Williams and Howard Shore are two of the old guard who write short scores with notes for the orchestrator about what to do. Nearly everyone else is writing for the full ensemble and orchestrating as they go. Then the orchestrators translate that to traditional music notation so it can be played.
The actual act of orchestration (translating to the different instruments) is an exercise in "backwards compatibility" and musical knowledge. (Because every instrument has a range and a clef, and many play in one range but read in another, etc., etc., because of some weird thing that happened in the 17th century.)
If you look carefully in the endless credits scroll you can see the credits for the orchestrator(s), if the composer doesn't do the orchestration themselves. Herbert Spencer, for instance, is credited for orchestration of Star Wars Ep. 4, A New Hope. He worked as orchestrator on a lot of other John Williams movies and was a film composer in his own right as well.
In John Williams' case it's particularly funny, since he got his start as an orchestrator! The notable exceptions to this rule on Broadway are Kurt Weill and Leonard Bernstein.
Bernstein "touched up" the orchestrations, and of course orchestrated the overture, but you're right. Hershy Kay and Maurice Peress did the orchestrations. West Side Story was also not orchestrated by Bernstein (except for the orchestral suite).