
> The content you share lives encrypted in the URL.

I'm a little confused.

If the encrypted data is "in the URL" and you need to share the URL, then why not just share the encrypted data itself?

This seems like it adds a lot of problems just to solve key distribution.

Edit: it doesn't do key distribution; the encryption and decryption are done on the server.

reply


Also, assuming the $50,000 figure (after Uber's cut) is the yearly equivalent of the $19 hourly pay, that requires working 50 hours per week, 52 weeks per year.

For anyone in the UK wanting to compare it to a normal permanent position, that take-home is equivalent to a ~£21-22k-a-year job (pre-tax), but you'd be working 56 hours a week (47 weeks per year) with no employment guarantees or pension. To be required to work this long in your job, you'd need to explicitly agree to it (and they can't make it a requirement of your employment or treat you differently if you opt out) [0].

[0] https://www.gov.uk/maximum-weekly-working-hours/weekly-maxim...
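
As a quick sanity check on those numbers (a rough Python sketch; the $50,000, $19/hour, 52-week and 47-week figures are just the ones quoted above):

    # Back-of-envelope check on the figures above.
    yearly_total = 50_000   # USD per year, after Uber's cut
    hourly_rate = 19        # USD per hour

    hours_per_year = yearly_total / hourly_rate   # ~2,630 hours
    print(hours_per_year / 52)  # ~50.6 hours/week over a full 52 weeks
    print(hours_per_year / 47)  # ~56.0 hours/week over 47 working weeks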

reply


I'm quite surprised at the number of people who have just said "marketing". I don't doubt that's part of it, but even without it I think it's a fairly solid idea.

Kickstarter have all the right tools for selling and a proven capacity to scale well. They've sold over 4 million dollars' worth to over 20 thousand people in three hours without any hassle. They're handling all the user management, accounts, live-updating figures, "stock" and payments.

How much would it have cost them if, right at the peak, their site went down for 2 hours? Or if people weren't sure whether their order had been accepted? Could it cost them 5% of their orders, minus the amount it would have cost them to build their own? And how long would it have taken to build and test their own site to handle that load of orders (given that they only see this peak for a short period after launch)?
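
Purely as an illustration of that trade-off (the $4M-in-three-hours figure is from the launch itself; the 5% is just the hypothetical above, not a real number), a rough sketch in Python:

    # Illustrative only: what a 5% loss of orders at launch would amount to.
    peak_sales = 4_000_000           # USD of orders in the first three hours
    hypothetical_loss_rate = 0.05    # hypothetical share of orders lost to downtime

    potential_loss = peak_sales * hypothetical_loss_rate
    print(potential_loss)            # 200000.0 -- to weigh against the cost of building their own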

Plus, of course, marketing and the fact that a lot of people already have a Kickstarter account (further reducing the barrier between "oh cool" and clicking buy).

reply


There's still a big problem with just upping the clock speed. If we have a chip running at 1 THz, then in each clock cycle light can only travel about 0.3 mm.
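
A quick check of that figure (using the vacuum speed of light; real on-chip signals propagate slower still, so the budget is even tighter):

    # Distance light travels in a vacuum during one clock cycle.
    c = 299_792_458  # m/s
    for freq_hz in (4e9, 100e9, 1e12):
        mm_per_cycle = c / freq_hz * 1000
        print(freq_hz, "Hz:", round(mm_per_cycle, 2), "mm per cycle")
    # 4 GHz: ~74.95 mm, 100 GHz: ~3.0 mm, 1 THz: ~0.3 mm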

Moving to more complex 3D chips could help improve performance, but there are limits to the number of dimensions we have too.

reply


Yeah, I don't think 1 THz is achievable, but we can likely do substantially better than 4 GHz, even once we can't go smaller. Then there are also gains to be made from building bigger, building in 3 dimensions, decreasing waste heat, etc. I suspect that chip innovation will get bumpier, but will likely maintain its overall pace for a couple of decades more, which is all we need to get into really interesting territory regarding AI and other things.

reply


> For instance, wouldn't you need to know details of how different buildings are constructed?

They learned the typical profiles of power use for different journeys and then tried to distinguish between them (at least in one experiment).

http://arxiv.org/pdf/1502.03182v1.pdf

reply


> #1 motivating auto manufacturers to support the oil industry

Are the energy costs for electric cars higher than those for traditional cars? I thought they were significantly lower.

reply


In the UK, all goods must be "fit for purpose" and must be fixed or replaced if they break within a reasonable time frame. For things you'd expect to be of reasonable quality, this is usually 6 years (if your £5 Argos blender breaks after 5 years, then it's probably not covered; if your Apple laptop breaks after 4 years, then it really should be).

reply


> - You have a 6 years warranty in the UK.

It's worth remembering this; however, it's not always 6 years. There's no fixed time, but 6 years for electrical goods is a fairly good rule of thumb.

This all comes down to the Sale of Goods Act, which means goods must be "fit for purpose". If an expensive laptop breaks after 4 years, then you can argue it wasn't fit for purpose and therefore must be fixed or replaced (or a full refund issued). As long as you haven't caused the problem through misuse, you're covered.

> since I ordered it on the Apple website, even if it was on the french Apple Store website, they proposed me to replace it using the UK warranty

If you'd ordered it through someone else, you would have to have gone through them instead (it's up to the seller to sort out, not the manufacturer).

This is a very powerful piece of consumer law, and it's a shame it's not as well known as it should be.

> - Buy your Apple computer from Apple if you can.

Unless you'd bought it from another person rather than a business, you should have received the same treatment. This isn't Apple being nice; this is them fulfilling their legal obligations.

reply


> If an expensive laptop breaks after 4 years, then you can argue it wasn't fit for purpose and therefore must be fixed or replaced (or a full refund issued). As long as you haven't caused the problem through misuse, you're covered.

My understanding is that under the Sale of Goods Act you are entitled to a partial refund [1], which reflects the use that you got out of the product. For example, if a laptop broke after 4 years, and 6 years was a reasonable lifespan, that suggests a 1/3 refund.
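
(A sketch of that proration, assuming a simple straight-line calculation; the price here is a made-up example figure, and in practice the amount is whatever the retailer agrees or a court decides:)

    # Straight-line proration: refund the unused share of a reasonable lifespan.
    price = 1200                # GBP paid (made-up example figure)
    expected_life_years = 6
    years_of_use = 4

    refund = price * (expected_life_years - years_of_use) / expected_life_years
    print(refund)               # 400.0 -- i.e. a 1/3 refund after 4 of 6 years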

I believe that the new EU directive entitles you to a full refund during the 2 years of the warranty. In this regard, it's a stronger protection than the UK Sale of Goods Act. In other regards, the Sale of Goods Act is stronger (5 years to make a claim in Scotland, 6 in England, Wales and Northern Ireland).

> Unless you'd bought it from another person rather than a business, you should have received the same treatment. This isn't Apple being nice, this is them fulfilling their legal obligations.

Some retailers might be quicker to fulfil their obligations than others, so that's worth considering when choosing where to buy the product. My recent experience at the Apple store after buying from the Apple website was pretty positive.

edit: re-reading your post, I realise that you already pointed out that your rights are against the retailer, not the manufacturer.

[1] http://www.which.co.uk/consumer-rights/regulation/sale-of-go...

reply


> My understanding is that under the Sale of Goods Act you are entitled to a partial refund

You are entirely correct; I'd misunderstood things and possibly got confused with a shorter timeframe (if it's very early on, you're entitled to a full refund). Thanks for the correction; I'll update my post to point out the error. (edit - oh, I'm now unable to edit the post :( )

> I believe that the new EU directive entitles you for a full refund for the 2 years of the warranty. In this regard, it's a stronger protection than the UK Sale of Goods act.

Very interesting to know, thank you.

reply


Some vendors are better about it than others. I've had a bad experience with O2 denying my guarantee rights under the Sale of Goods Act, for what was clearly an "unfit for purpose" phone (iPhone 5 with a faulty sleep button [0]). Apple replaced it for me of their own accord, though.

[0] - https://www.apple.com/uk/support/iphone5-sleepwakebutton/

reply


I am not surprised; these big telcos have nothing to gain by being customer-friendly.

That's why I buy my phones straight from Apple. It feels a little pricier that way, of course.

reply


A general rule I like to go by is "if it's important, it'll come up again".

Missed a phone call and there's no message? If it's important, someone will call back, email, or similar. If not, then I shouldn't worry anyway.

> There was a time that I used to check my Facebook about a 100 times a day, but lately I only check it a few times a week. Something is fundamentally broken.

Sounds like something was broken, and now it's ok.

reply


First off, thanks for building and releasing something.

I think I might be missing something, though: I have never understood why having a clean history (when the real history isn't clean) is important.

If you've got huge branches with scary merges, isn't the problem that you have huge, long-lived branches? Why does it matter that my graph shows when I actually did merge things?

I usually hear a git bisect argument around this, but are there good examples where it becomes a problem? Bisect lets you specify which commits to skip, so you can be as selective as you want with it.

Lots of people complain about problems rebasing, and I've repeatedly seen people mess up a rebase and cause more pain, when just merging seems to work perfectly fine. It seems like a lot of work, effort and time to avoid potential issues later (and even then I'm not sure what those issues are).

This isn't rhetorical; given that so many people have strong opinions on this, I assume I'm missing something important.

reply


It's much easier to review a feature branch that has a clean history, especially if it's remotely complex. If every commit is self-contained and each adds only one logical piece of the feature, then every commit can be reviewed individually for correctness and unintended side effects, rather than the entire complex diff as a whole.

reply


But how often is that ideal true? It assumes that developers don't make mistakes in any of the commits, or never add "temporary" code which is removed in later commits of the feature branch. This is why I tend to review a single diff, so that I don't have to browse through numerous commits and check whether some line of code in a commit actually ends up in the final merge diff.

reply


No, it assumes that developers clean up their branch history as necessary before submitting it for code review.

reply


That seems like a lot of work just to make code review easier. Of course, you can clean up the branch history by squashing commits, but that's the same as reviewing the complete merge diff.

reply


Not just code review. It also significantly helps reviewing history later, which is done for any number of reasons, including tracking down where bugs were introduced, understanding the reason for a given change, etc.

I've worked in projects that always keep a clean history. And I've worked in projects where developers don't bother cleaning up history before pushing. And in every non-trivial project that follows the latter I've always hit cases where I try to track down something through the history only to find that the history doesn't capture the meaning of the code. This is not just bad merge behavior, but also things like lumping unrelated changes together into a single commit.

Even something as simple as reading recent history to keep up with the changes being made to the codebase is significantly simpler if developers are diligent about keeping a clean history. For example, the Rust project has a lot of activity, but also has a strict code review policy that means nearly all commits are made with a clean history (very occasionally, long-lived branches with control merges are allowed through, but those are rare). As a result, even though I no longer have the time to actively contribute, I've still managed to keep up with every non-trivial change made to Rust merely by periodically reading through the history since my last pull. I can't imagine trying to do that if the history weren't clean.

reply


Arguably, the only reason it could be a lot of work is when the submitter doesn't really understand the code that multiple mixed-up and muddled commits have resulted in. By giving your reviewer a mess, you're only asking the reviewer to make sense of it instead of doing it yourself.

Instead, breaking up the feature into logical pieces allows you to document every step properly in a separate commit message.

If you properly understand what you're submitting for review, then it isn't really much work at all. It's even easier if, during original development, you commit often. But you can always split commits up during rebasing, too.

reply


If you really care about this, rebasing is a shitty solution. Patch queues like Mercurial's make it much easier to logically decompose a feature into a series of patches and shuffle code between the commits until they all settle nicely.

I say this having gone from a Mercurial user who produced pristine commit histories to a git user who makes dirty, nasty histories.

reply


Why are you making dirty, nasty histories? The git users on kernel.org sure don't, and I bet they have more complex branches in flight than you do...?

reply


Suppose there is an undesirable behavior caused by some subtle interaction of features coming from 2 branches.

With a normal merge, git bisect will likely point to the merge commit as the first commit that introduced the behavior, since that was the first time these 2 branches ever interacted. This means resolving inconsistencies after the fact, somewhat similar to "eventual consistency". It is more coarse-grained and can be harder to reason about, but may scale better (no serialization).

With a rebase, git bisect will point to one of the rebased commits, each of which has already interacted with the branch that came before it. Rebasing is somewhat like the situation where a DB client retries a transaction because the DB doesn't know how to serialize 2 transactions. It is more fine-grained and can be easier to reason about, but may have problems scaling, and may sometimes be tedious.

reply


For me, git bisect really does become much more interesting to use when feature branches are rebased to get a clean history. Just reading the history becomes easier, since each feature branch has a self-contained part of the ordered history.

In addition, it is also useful when merging a feature branch turns out to be a problem later on. Reverting a rebased, merged feature branch is very easy, without having to think about whether there are any issues with mid-branch merges and so on.

reply
