Scott Aaronson's blog is so awesome. I would be reading it now if I still had room in my head for the whole 'rationalist' thing (along with Overcoming Bias, etc.) Check out his "Favorite Posts" in the right-hand column.
Lambda the Ultimate is also a rare gem of a community on the internet and likewise I'd still be reading it if I wasn't trying to care less about design and more about hustle right now. I'm really interested in dataflow programming and I've learned a lot by searching through previous discussions on the concept at LtU.
Had to write my own bitmap processing library, since I couldn't find anything fast enough off the shelf :-D Handled alpha blending, file I/O, etc (check out the bitmap.lisp and color.lisp files in the repo).
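The library itself is in Lisp; as a rough illustration of what the alpha-blending part would compute, here is a minimal Python sketch of the standard "over" compositing rule (function names are my own, not from the repo):

```python
def alpha_blend(src, dst, alpha):
    """'Over' compositing for one 8-bit channel: src over dst.

    alpha is in [0.0, 1.0]; src and dst are channel values in [0, 255].
    """
    return round(src * alpha + dst * (1.0 - alpha))

def blend_pixel(src_rgb, dst_rgb, alpha):
    # Apply the blend independently to each channel (R, G, B).
    return tuple(alpha_blend(s, d, alpha) for s, d in zip(src_rgb, dst_rgb))
```

A fast library would vectorize this over whole scanlines and precompute lookup tables, but the per-pixel arithmetic is the same.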
She's often considered the face of the irrational exuberance behind the first dot com boom/bubble.
Rating many companies that subsequently lose 90-100% of their market cap as 'strong buys' is a pretty scary track record. IIRC, her portfolio of 'outperforms-or-better' lost ~80% of their value in one year.
Granted, "the market can remain irrational longer than you can stay solvent," etc., so she may not deserve all the scorn heaped on her (I haven't done the research to know ...)
To add to smanek's reply: Meeker was part of Morgan Stanley during a period of heavy conflict of interest. MS did not separate the function of investment banking from research, so you can imagine why her investment ratings on no-profit internet stocks were quite high. (Henry Blodget got caught (the infamous "POS memo"); it is unclear how much farther Eliot Spitzer would have gotten if he hadn't been caught with his pants down.) She eventually admitted that she did not author the investment reports with her name on them, so who knows if the article above is actually based on her words or not.
Whether or not it's true that people are "giving up possessions for the internet" is immaterial. The takeaway with Mary should be: a shyster of a saleswoman is trying to sell you something.
But it is too late to use a bank. He was mugged and his money stolen. What do you tell him?
You avoid answering because you know that cash theft is a big hassle.
Bitcoin shares some of the properties of cash. This includes its advantages (direct people-to-people transactions, no account can be "frozen", pseudo-anonymity, etc) but also its drawbacks (thefts are hard to recover). And smanek's point is that if cash is good enough despite its weaknesses, then surely Bitcoin is good enough too.
Furthermore, unlike cash, Bitcoin has numerous ways to mitigate these drawbacks. Two examples:
0. You cannot "back up" cash (if you put it in a bank, you lose all its aforementioned advantages), but you can back up bitcoins by making copies of a wallet file. This is especially convenient with deterministic wallets.
1. Cash can only be secured in primitive ways (physical security), but bitcoins can also be secured with wallet encryption, offline storage, or even just memorization (brainwallets!). Soon you will be able to store them on tamper-proof credit-card-sized hardware Bitcoin wallets protected with PINs or similar.
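The deterministic-wallet point above can be sketched in a few lines. This is emphatically not real wallet code (BIP32-style wallets use elliptic-curve math and chain codes); it only illustrates the backup property, namely that every key is derived from one seed, so backing up the seed once backs up all future keys:

```python
import hashlib

def derive_key(seed: bytes, index: int) -> str:
    """Toy deterministic derivation: key_i = SHA-256(seed || index).

    Illustration only: anyone who backs up `seed` once can
    regenerate every derived key later, in order.
    """
    return hashlib.sha256(seed + index.to_bytes(4, "big")).hexdigest()

seed = b"back this up once, offline"
keys = [derive_key(seed, i) for i in range(3)]
```

With a non-deterministic wallet, by contrast, each freshly generated key must make it into a fresh backup, which is why regular backup schedules matter there.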
I will be the first one to admit that Bitcoin is not yet easy enough to be used securely by inexperienced users. But these problems are solvable, and are being solved.
Yeah, and I know what they are, although I wouldn't describe generating a wallet on an offline PC and keeping it offline as 'very clever'. At any rate, after reading it again I think I confused what the parent was insinuating, so nevermind.
Right, and while you could store cash or bitcoins physically in a bank, there's no point in doing so for the average consumer. Cash is comparable to bitcoins in this regard: it's a riskier form of money because it's anonymous and can be stolen relatively easily, but it's also really convenient for financial transactions.
People in the thread were talking about computer security mostly, though. In the end, the best "protection" is that insurance will reverse your losses in case security fails. Something a lay person "securing" their bitcoins on their own computer won't have.
Currently? Of course not. But when the idea of storing money on your computer (e.g. in a file) becomes more widespread, then the answer would be yes. Or at the very least it would be equally secure.
It's not that hard to provide people with a secure environment, as long as that environment is designed with security in mind. A non-jailbroken smartphone is probably good enough, though a dedicated device might be better. And you can use live boots.
I can see a future where we use such (or similar) things to do money transfers.
Wells Fargo is a bigger target, but they also have many more layers of protections and an interest in providing customer service. My hunch would be that it would be easier for a criminal to profit off of attacking WF than attacking a personal bitcoin user, but that it would be easier to actually cause financial harm to a personal bitcoin user (even without profiting themselves). But to clarify that, I have a couple of questions about how storing your own bitcoins works, since I've never used it:
* How does bitcoin storage work with offsite backups? If someone compromised the backup, would that give them access to your money?
* If you lose the file (hard drive crash, home burns down, backup system fails, whatever), does neither you nor anyone else have that money anymore? I.e., someone wouldn't have to gain access to the money themselves to deprive you of it?
Keep your wallet file encrypted and back it up to multiple locations on a regular basis (so the backups include new private keys created by your client software).
There are also ways of generating Bitcoin keys completely offline, as well as producing signed, valid Bitcoin transactions completely offline. This way you can forward funds to keys that are not on a machine connected to the internet, or keys that are backed up only on paper (in multiple safety deposit boxes, if you like). You can then put signed transactions from the offline machine onto a USB stick or whatever and use a networked machine to forward those valid transactions to the Bitcoin network.
Coinbase is doing something like this for its storage of customer funds. Coinbase seeks to be a Bitcoin bank that won't get hacked, or that, if it somehow does get hacked (cough, inside job, cough), would suffer only very small losses.
Absolutely not. But then again, I used Android at version 1.6, I run a 12-hour-old nightly ROM, and my desktop is my server running 3.7-rc5 with a 12TB BTRFS RAID. "Stable" or "mainstream" isn't really in my vocabulary.
To be sure, I'm not advocating that my parents start using Bitcoin. I just tire of this implication that somehow USD is, by virtue of being USD, automagically more secure than Bitcoin.
Armchair loud mouths (I have one in mind who went into hiding after trolling HN repeatedly) stop by for months following an online wallet incursion to tell us how stupid Bitcoin and Bitcoin users are.
Sure, but if you use a "real" Bitcoin bank, theoretically there is someone you can go sue as well. They won't have the bankroll that WF has, you're right, but there are still grounds for a civil case, I'd imagine.
pg: I have a fair bit of lisp dev experience. If, as a weekend project, I modified the HN src to use postgres and memcache, would you consider using it in production? Obviously, I don't expect carte blanche prior agreement, but I wouldn't want to invest the time unless I thought it was plausible the work could actually help.
I would expect it to solve most of your performance problems for the foreseeable future (at the very least, by letting you scale horizontally and move the DB, frontends, and memcaches to separate boxes - plus ending memory leaks/etc by moving most of the data off the MzScheme heap).
The obvious downside is that it would use your (or someone at YC's) time. First to merge the changes I make to http://ycombinator.com/arc/arc3.tar into the production code, then to buy/set up some extra boxes and do the migration. We're probably talking, roughly, a day. It also has the unfortunate side effect of costing HN's src some of its pedagogical value, since it adds external dependencies and loses 'purity'.
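For readers unfamiliar with the memcache part of the proposal: the win comes from the cache-aside pattern, where page data is served from memory and the DB is only hit on a miss. A minimal Python sketch (a plain dict stands in for a memcached client, and `CACHE_TTL` is a hypothetical tuning value, not anything from the HN source):

```python
import time

cache = {}            # stand-in for a memcached client
CACHE_TTL = 30        # seconds; hypothetical tuning value

def fetch_item(item_id, load_from_db):
    """Cache-aside: try the cache, fall back to the DB, then populate."""
    entry = cache.get(item_id)
    if entry is not None:
        value, expires = entry
        if time.monotonic() < expires:
            return value      # cache hit: no DB round trip
    value = load_from_db(item_id)
    cache[item_id] = (value, time.monotonic() + CACHE_TTL)
    return value
```

Because the cache tier is separate from the app process, it can live on its own boxes and be scaled independently, which is what makes the horizontal-scaling story work.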
Been looking for an excuse to learn arc for a while now ...
Careful now :) It's not like there's anything stopping HN attracting a wider audience anyway; there's no restriction on who can register. Anyone can come and join in, which (in my opinion) is as it should be.
Of course. I'm not suggesting that there should be any limitations on who can join, but as the community moves more mainstream, quality will dilute. As the site is rather un-sexy right now, it seems to attract those who are genuinely interested. Remember what happened to Digg...
Very generous offer, but I would argue that HN's slow performance is a feature, not a bug. The average drive-by user, attracted by sensationalist articles and titles, simply doesn't have the patience for the slow load times on every page. The user seeking intelligent conversation, however, is more than willing to endure 5+ second waits knowing they will get valuable content. Couple that with load times being consistently slow, rather than coming in surges, and I wouldn't put it past pg to have built a delay into page loads to act as a sort of filter. Even if it's unintentional, I would still argue it's useful in driving out some of the riff-raff.
I also believe that Hacker News runs on a small stack of services developed by some past companies from Y Combinator.
I would agree that there is also little to no desire to make Hacker News "the news place" - one that supports thousands of posts a second and is extremely popular. In general, Hacker News is used by startups and people interested in startups (and the hope is that it stays that way) - it's slowly growing out to include more types of people - marketers, companies, bloggers who just want a lot of hits, etc - and not many people want to purposely support that.
On a several decades/century scale it's worthwhile to funnel at least a few percent of GDP into basic science research. But, in any given quarter/year, it's almost certainly a net loss. The trick being that every few decades, you'll get nuclear power, the transistor, etc.
Or, as a 'local' example: non-trivial number theory had basically no benefit for centuries but humanity kept 'investing' resources into it - which a short term optimizer wouldn't. Then, cryptography came along and it suddenly 'paid' for the entire field a dozen times over.
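The cryptography payoff can be made concrete with textbook RSA, which rests squarely on that "useless" number theory (modular exponentiation, Euler's totient, modular inverses). A toy example with tiny primes - completely insecure, purely to show the math at work:

```python
# Toy RSA with tiny primes -- insecure, purely illustrative.
p, q = 61, 53
n = p * q                    # public modulus
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime to phi
d = pow(e, -1, phi)          # private exponent: modular inverse (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)      # c = m^e mod n

def decrypt(c):
    return pow(c, d, n)      # m = c^d mod n
```

Real deployments use primes hundreds of digits long, but every step here - including why decryption inverts encryption - is pure 18th/19th-century number theory.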
Research is sort of like early-stage VC - but with funds that pay out over 70 years instead of ~7.
I would love to write about many more examples in much more depth, but will omit them for the sake of brevity. I roughly feel like Newtonian mechanics was directly responsible for the industrial revolution, relativistic physics for the nuclear age, and quantum mechanics for the computer age (with similar analogues in the biological sciences).
The example Carl Sagan gives is Maxwell's equations. They seemed pretty abstract at the time, but well ... Radio. Television. Fibre optics. Satellite communications.
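For reference, the equations in question, in differential (SI) form - four compact statements that, at the time, looked like pure mathematics:

```latex
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}
```

The last two together predict self-propagating electromagnetic waves - the theoretical seed of everything on that list.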
That said -- these didn't happen all at once. And every technological upheaval, no matter how large it has been in itself, has still appeared in the larger context to be only an incremental improvement.
'Math' is broad - if I can recommend only one book to cover all of Math I'd probably say 'The Road to Reality' (http://www.amazon.com/The-Road-Reality-Complete-Universe/dp/...). More practically (for the subset of math most programmers are likely to care about), you'll do fine with one good discrete math book and one linear algebra book. Throw in one each on Stats, Abstract Algebra, Calc (up to ~diffeq), and Real Analysis (in roughly that order) if you're a bit more ambitious ;-)
I also think for understanding how operating systems work, nothing beats writing your own! I learned most of the concepts by building a toy OS during the better part of my undergraduate studies. I highly recommend this for people who like coding and are afraid of jumping into the theory too quickly. For example, analyzing memory allocation algorithms is never as interesting as when you have to pick one for your own kernel!
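To give a flavor of the memory-allocation point: even the simplest policy, first-fit over a free list, becomes interesting once you have to live with its fragmentation behavior in your own kernel. A toy Python sketch of the policy (a real allocator tracks far more state; names here are my own):

```python
# Toy first-fit allocator over a free list of (offset, size) holes.
free_list = [(0, 64), (128, 32), (256, 128)]

def alloc_first_fit(size):
    """Return the offset of the first hole big enough, or None."""
    for i, (offset, hole) in enumerate(free_list):
        if hole >= size:
            if hole == size:
                free_list.pop(i)          # exact fit: remove the hole
            else:
                # Shrink the hole: carve the request off its front.
                free_list[i] = (offset + size, hole - size)
            return offset
    return None                           # no hole fits
```

Comparing this against best-fit or buddy allocation on your own workload is exactly the kind of exercise that only feels urgent once the kernel is yours.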