shawnb576's comments | Hacker News

Event Hubs == AWS Kinesis

Neither of those has any built-in rule functionality. You have to set up machines to pull messages and process them.

This offering is quite different in that it builds in declarative rule support for filtering/forwarding the data to other AWS offerings like DynamoDB, S3, Lambda, etc.
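
For a rough sense of what those rules look like, here's an illustrative sketch (the topic and field names are made up): a rule is basically a SQL-like statement over an MQTT topic filter, and the forwarding targets are attached to it as actions:

    SELECT deviceId, temperature
    FROM 'sensors/+/telemetry'
    WHERE temperature > 60

The statement only does the filtering/projection; the DynamoDB/S3/Lambda destinations are configured as actions on the rule, rather than by a fleet of machines polling the stream.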

That said, you're still going to be writing a lot of code to turn it into a useful application or monitoring system.

How much of this is the $$$ and how much is the effort?

IOW, if this thing were to generate $500/month, would you keep bothering to operate it?

I happily pay $24/year for NewsBlur. I would happily pay/donate at that rate for Mitro.

And it sounds like _just_ the people in this thread would get you pretty close to $200/month given a model like that. I know more people who would.

As other commenters said, the platform integration and password sharing are really, really great.


Sorry for the delay. The monthly costs (which are actually closer to $800-1,000/month) are a small part. A bigger worry is that if we take money from people, we really have some obligation to provide "reasonable" service. We've had relatively few system-administration incidents so far, but I'm concerned about something happening when we are on vacation or busy with other things.

The worst-case scenario is Chrome changing how extensions work, which would require us to actually write code, or someone finding a serious security vulnerability.

As a conclusion: It really would take more like a total of $3000/month in fees to make it worth someone's time to deal with the paperwork, the administration, and to be willing to be on call. It seems unlikely we'll get there, but I'm investigating the possibility.


Yeah, I ran a team like this. I was able to attract the best of the best and build a solid team, but the incentive was to hire some people who weren't as good so you'd have somewhere to throw the bottom 10%. It's awful because a lot of those people still work hard and are reasonably good at their jobs; no one wins. I was fortunate that a partner team had a lot of low performers and I got a little break when we combined curves.

That said, I'm skeptical that the stack rank system is gone, based on what I hear from those that are still there (I've been out for many years now). It sounds like a lot of the same curve-fitting is just renamed.


When I was offered a position at Microsoft, I declined in part because, after the interview, I was concerned they were precisely looking to hire someone to be the "fall guy" for the team (this was while stack ranking was still in effect).

Now I'm by no means a great developer (as one of my Microsoft-employed friends brutally said about my declining, "No loss for Microsoft"), but I imagine this concern could scare away genuinely good developers too worried that the politics of "keeping the team together" would outweigh any actual individual contribution they'd make.


> as one of my Microsoft-employed friends brutally said about my declining, "No loss for Microsoft"

That's a pretty poor thing to say to a friend.


> to say to a friend.

I would review his definition of a friend... I'd say that was a very poor thing to say even for an acquaintance.


You might want to reconsider your friends.


A friend who cares enough to tell you when you suck is worth their weight in gold.


The friend said "No loss for Microsoft", showing that he cares about Microsoft.

A real friend might have said "From my perspective you could improve as a programmer. Your weakest point is field x and therefore I recommend that you read the following book."


I think the OP knows that he isn't all that awesome at programming. As a friend, I don't go around pointing out weaknesses that my friends are themselves well aware of. In this situation the friend could have said something like, "Oh, it's a good thing you passed on that job; you are weak in that area, and they were probably looking to hire you as a sacrificial lamb." A statement like that shows that a person is your real friend. They are acknowledging your weaknesses, but with the goal of helping you at the end of the day, not the company.


I don't think so in the general context, unless the relationship is close and you can say anything to each other without hurt feelings.

Saying a person sucks is pretty pointless without a detailed performance review, identifying a few key points to improve in the next span (of months/years) and keeping tabs on progress. Usually saying someone sucks in general is pointless demotivation and hurts feelings without purpose.

Identifying specific sucky aspects of one's work and suggesting helpful improvements is beneficial, if requested.


I agree with you, but he could have said "I honestly don't think you know enough X to be really useful for Microsoft."


I worked with Anders for years (he was on my interview loop, still no idea how I got in), and no, he's not the father of the CLR. That project (Project 42, COM+, NGWS, among other names) I think even pre-dated his arrival at MS. Anders was pretty focused on C#, which _was_ a direct consequence of the dust-up with Sun over Java. The MS Java libraries (WFC, etc), COM+, and C# sort of all converged to become what is considered CLR+.NET today. When VJ++ was killed via the Sun suit, the CLR was already in existence, but still pretty rough. It was a solid (fun, but painful) 3 years between when that happened and when .NET 1.0/VS 2002 shipped.

The "father" of the CLR is probably Mike Tougtangi, or Chris Brumme. I'm not exactly sure who was on that original team.


Who killed VB6 and thought VB.NET would be a good idea? I remember a long-gone Channel 9 video where the VB guys made the case for .NET and showed a demo of the third-party transition tool that shipped as a Lite version with Visual Studio 2002/3.


The idea was that a converged, single, multi-language runtime and framework was the way to go. This is the way the culture works in Redmond - that's considered "strategic thinking", which only exists in that bubble. It wasn't that the plan was bad; it's that it was unrealistic to think that the users of a very mature, very scenario-focused product would be happy with a V1 generalized product that was sorta-kinda the same thing. That assumption was obviously false, and I actually fought against the porting tool because I didn't think it would work (it didn't) and would just piss people off (it did). Keeping VB6 alive until VB.NET matured is antithetical to the way the place runs. Newer is _always_ better, even if it's not, and there was no incentive in the comp system for people to take care of the existing userbase. So, that's what happened.


This really puts into focus the meaning of "strategic thinking": abandoning current customers in favor of potential future customers.


> there was no incentive in the comp system for people to take care of the existing userbase

Can you explain that? Are there incentives to do other stuff?


The basic system - no. As of 2012 when I left, comp was about what your management was interested in. Maybe "take care of existing users" is on your review commitments, but what's really on there if you want to get ahead is: ship features, drive future business, etc. As a result, people wanted to work on the next new thing; it was relatively bad for your career to be on existing, old things. There were massive variations between groups, but basically that was the culture.


Thanks for sharing your experience, it helps to understand some of the business decisions.


Honestly there were a lot of bad design decisions in VB6 that required breaking backward compatibility: requiring "Set" to assign objects, parameters being ByRef by default, an inconsistent base (1 or 0) for various collections. VB.NET isn't that difficult to migrate to, and once you get used to it, it's hard to have much nostalgia for VB6.


Were you there when generics got brought in? What was that like from MS's side?


Yeah - this is where Anders shines. Generics were brought in primarily to avoid boxing/unboxing costs of value types, but really are incredibly powerful when used properly. And I'll say that generics in C#/.NET are done very well. Every time I use Java, I want to use generics until I remember Java's generics are confusing and almost useless.

I really miss having them when doing work in other languages; even with some of the covariance gotchas, they're just amazingly good.
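
For a rough sense of the boxing point, here's a tiny illustrative Java sketch (names made up): with erasure, a List<Integer> still allocates an Integer object per element, which is exactly the per-element cost that reified generics in .NET avoid.

    import java.util.ArrayList;
    import java.util.List;

    public class BoxingDemo {
        public static void main(String[] args) {
            // Erasure: at runtime this is just a list of Object references,
            // so every int added is boxed into an Integer allocation.
            List<Integer> values = new ArrayList<>();
            for (int i = 0; i < 1_000_000; i++) {
                values.add(i); // autoboxing: int -> Integer
            }

            long sum = 0;
            for (int v : values) {
                sum += v; // unboxed again on the way back out
            }
            System.out.println(sum);

            // The equivalent .NET List<int> keeps a flat, unboxed int[]
            // underneath, because each generic instantiation is reified.
        }
    }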


I thought the CLR team had no plan/way to ship generics in the CLR, and C# would have had to go with erasure, like Java. That the first C# with generics came from across the Atlantic. Isn't that so?


I also note that this idea that people care about animals and not the other way around is hard to respect given which direction the killing typically goes.

For the most part animals ignore people. For the most part people kill animals.


A lot of dogs could absolutely kill you, but they don't. My cat could be 100 lbs and she'd still be more interested in kibble and pillows than mauling me. So, no, not all animals are sociopathic (I'm not sure that's the right word in any case). Wild animals are wild and do wild animal things; they absolutely live and die by a different code.

But this whole argument, I think, is spurious. Simply because they follow a different set of rules doesn't mean that they don't deserve our caring and protection.

Your argument sounds a lot like the ones people make about other groups of people when they consider themselves morally superior to that group. Replace "animals" with "muslims" or "jews" or "communists" or whatever and it may sound familiar.

That we recognize and respect that animals have needs and feelings even if they are not beneficial to us is exactly the point of this.


By your argument the cost of food/goods is going up because we're becoming less efficient at producing them. That's not how inflation works, and that's not how deflation works.

Maybe go bone up on your basic economics?


Basic economics are just, like, what the man wants you to think, man.


No... That isn't my argument at all. I guess you should go back and read my argument again because you clearly can't understand it in a single go.

The reason prices go up is that the Fed absorbs all of the technology-powered reductions in price in order to maintain an artificial level of inflation.

You are a baboon.


I find this argument very uncompelling. This product sounds like it never exited MVP if free SQL Server was going to be good enough. I'll agree that at small scale there are lots of potential technologies. But it also makes me wonder what was so challenging here that they felt they had to keep swapping around. I won't speculate here.

However, I disagree with the premise. At scale this absolutely matters. Different databases have vastly different design goals and CAP-theorem attributes. You'd better pick the right one or it'll impact your customers' experience.


> At scale this absolutely matters.

So worry about it when you get to scale, not before. Most products and startups never reach the scale at which such basic technology choices matter; their time is better spent growing the company (adding user features, attracting new users, etc) than fighting with technology.

MySQL was never designed to scale to the levels at which Facebook and Google use it. And yet both companies are still using it today, even as the roles they use it in diminish as better technologies are brought to bear.

Technology shortcomings can be worked around, so long as you're still in business.


> So worry about it when you get to scale, not before.

You are advocating that people spend man-years of work porting their codebase just as their company starts to get successful, instead of stopping for a couple of minutes earlier on to correctly decide what tools they'll need.


> instead of stopping for a couple of minutes earlier on to correctly decide what tools they'll need.

Based on my previous experience, I'm advocating exactly that. The reasons are:

1) Bikeshedding - Everyone knows that N technology is better than M technology, even if nobody has real work experience with N

2) Starting with a new technology that people have only done side projects in costs significantly more time than just a few minutes.

2a) Nobody knows what those costs will be beforehand

3) By the time you actually have to spend those "man years", committing that effort won't negatively impact your ability to reach new customers.

3a) This work will not be required "just as their company starts to get successful", but at some point after that.

3b) You will most likely not have to do a full re-write all at once. Small refactorings of the actual pain points are how it's typically done (there are plenty of examples of this in the marketplace today - Google, Facebook, Twitter)

4) Premature optimization, YAGNI, DRY, etc. all apply to your technology choice as much as they do your codebase.


For some values of startup, this all makes sense. If you're trying to solve a hard computing problem (not a social problem), then you have to write your own code. It turns out open-source projects rarely get around to hardening their code.

Then you get to decide - jump aboard some community and try to help them get the code where you need it, or write your own. The community sounds nice, but the thrashing there can add more work than it's worth - tracking (incompatible) changes, arguing for your APIs and layering, etc. It doubles the job at least.

Then you write your own. Again, time is wasted, but not how you think. With examples of working code and/or a good idea of what you need, you write it and it works. But you spend the rest of your life defending your decision. Every new hire naively says "you could have just used node.js!", and you try to walk them through the design points, but it's almost hopeless.

I'm maybe a little depressed about this right now. My startup just got refinanced under the condition we use open source for everything, and write our app as a web app. Which of course is a pipe dream, since our app does hard things using carefully written low-level code, complex network strategies and heuristics hard-won from a thousand customer issues met and solved.

But no! Chrome can do all that, for free! So I'm asked to quit being a not-invented-here drudge and jump on board.

Anybody hiring?


Ugh. Good luck with that... Maybe you can pull a "Breach" and only use Chrome as your, well, Chrome.





I'd like to know if these sounds are learned, instinctual, or a mix of both. IOW, if you placed a monkey raised elsewhere into this environment, would it know and/or adopt these sounds?

Isn't assigning meaning to otherwise-arbitrary symbols/sounds a key aspect of language?


There is some non-arbitrariness to language.



I guess it gets arbitrary at complex enough levels though.

I'd hazard a guess that hok and krak have some instinctual/physical component to them. I personally think that certain sounds are related to physical experiences or expressions of emotions. An obvious one is the surprised "Oh!" with an open mouth.

"Hmmm" whilst thinking or concentrating, frowning and closing your mouth.

I'm currently watching my son learn to speak and his verbalizing seems pretty closely tied to his emotions at the moment.

"Oishii" means delicious in Japanese and it seems something that makes sense to say whilst you are smiling at enjoying your food. The long "ii" vowel to rhyme with the "e" of "she" in English.


>"Oishii" means delicious in Japanese and it seems something that makes sense to say whilst you are smiling at enjoying your food. The long "ii" vowel to rhyme with the "e" of "she" in English.

Honestly, I don't necessarily agree. For example, in Japanese 'iie' sounds very similar to 'yes' or 'yeah', but it actually means the opposite, 'no', whereas 'hai' means 'yes'.

If we want to talk about individual phonemes caused by emotional reactions, there might be some truth behind what you're saying; however, as soon as we enter the realm of "this word sounds soft so it's positive" and "this word sounds hard hence it's negative", everything collapses.


Obviously you can find tons of examples of words that are different in different languages. I probably confused the situation by bringing Japanese in. My son is Japanese, so we talk to him in Japanese. I wasn't trying to compare languages; I was trying to talk about baby words.

Yeah it falls apart at any level of complexity.

I just think there are certain cases in often used words and words that babies say or hear a lot at first. Like the mama/haha/papa/baba words. I'm talking about a 'language' in the same way the article talks about an animal language. Like a few often used words linked to emotional states.

I don't mind if you disagree I just happen to believe oishii may be one of these words.


All predicate adjectives in Japanese end in -i: ookii, chiisai, mazui, etc. The -i is the suffix indicating it is a predicate.

Oishii thus means "is delicious" - you don't need a "da" after it.


I made a mistake by bringing Japanese into it - see my other comment. It's the language we speak at home so the one I use to talk to our son. The emphasis was supposed to be on baby-talk not foreign languages.


For humans, because of the diversity of sounds we can make, apparently the only thing approaching a universal word is "huh?" [0].

It'd be interesting to see how many sounds a monkey can make. If it's a very low number there'd probably be more universally used sounds but I'd imagine it's probably greater than most animals. I couldn't find a useful way to search for that. Unfortunately a lot of the results tended towards the "What sound does a monkey make" type of response and I'm not versed enough in linguistics or monkeys to query more efficiently.

You can see on the map [1] that the Ivory Coast and Tiwai Island are far enough apart for the monkeys' languages to have split off at a much earlier stage and evolved differently. I'd assume this is more likely the case.

So I'm going to assume no, a monkey taken and raised elsewhere probably wouldn't instinctually jump into the trees upon hearing a "krak". But even the article states there are more experiments that need to be performed (although that particular experiment is a little insidious considering the intelligence of the animals you'd be kidnapping; maybe if you saved one whose parents were incapacitated in some way).

Of course, you could just go to the zoo and yell "krak!" at some of the monkeys and see what happens! Might get you some weird stares! ;-)

[0] - http://www.newstatesman.com/martha-gill/2013/11/what-one-wor...
[1] - https://www.google.com/maps/place/Tiwai+Island,+Sierra+Leone...


Well, it seems strange to me. The article you link seems to suggest "huh" might be a universal word, but in the details, they show that it's pronounced differently all over the place. For example, it tends to be closer to "ah?" in Mandarin.


I'm suddenly very curious about the universality of the similar "uh-huh".


I think that depends on what you call 'meaning'. Is meaning an intuitive, instinctive representation of something?

What I mean is: for every word that exists, does that word trace back to a reality origin through a pattern of substitutions? Substitutions of symbol for reality are really just an associative relation. Substitutions of symbol for symbol operate functionally the same way a substitution between reality and symbol does.

So then my question is, is language really anything more than remembering that the cherry came from the tree? Once the cherry is disconnected from the tree, we have two things - cherry and tree. But before we distinguish them as parts, we recognize them as a whole. When I walk away from the tree, taking the cherry with me - what happens if I still use the tree in my mind to represent the concept of fruit? It's a choice function. Does it matter whether I remember these things using sounds, symbols, images, experiences, or feelings? Language is interpreted and expressed across and using all of these domains. A poem carries greater meaning than the words do individually, and that is because there is emotional association that maps to the selection of words. We don't really call 'emotion' language, nor do we call 'art/music/math' language, yet these things arguably have a strong influence on how we 'know' what language represents.


