Hacker News
Competitive Analysis for Engineers (staysaasy.com)
127 points by _njuy on May 4, 2021 | 28 comments



I've been blown away by how little engineers often know about how their own company's products are actually used, even the product they're working on. It's an area where companies could stand to do more training.


It's a tradeoff of course. The more time engineers spend in customer meetings, the less time they spend on engineering. The other thing I observed when I was more directly involved in such things is that you need to give them a good cross-section because otherwise it becomes "the one customer meeting I was in, they wanted $X so why aren't we doing $X?" See also sales rep discussions.

You can summarize things based on research such as focus groups but it's less concrete than hearing about specific customer situations.


When I worked at Basho, I realized how challenging it was to really understand the experience of using Riak (a distributed NoSQL database) when it was simply not useful to Basho itself.

We dogfooded it, certainly, but Riak was designed for massive data sets that we simply didn't have. So while we tested it, we had to rely heavily on our customers to learn which use cases were working well and which weren't.

To help alleviate that problem, our developers were frequently engaged with the customer support team; I think it was fairly standard that new engineers would start out on customer support.

I really envied companies that could properly exercise their products internally.


Some companies do this to themselves, by isolating engineering and product functions from each other.


Yes. I suspect it might be a result of something I call "organizational laziness". A corporation wants to do a minimum amount of work to justify its expense and "satisfy" the customer. It is an MBA-rational thing to do.

I suspect that if engineers truly knew the competition, they would be motivated to build a better product, which would also take more effort and take longer. So they have to be shielded from doing this by product management.

As an engineer, I find this "organizational laziness" (do just the bare minimum) very demotivating. There are many good things that have been built because people were irrational and just pursued them for their own sake.


The process is known as "dogfooding". When I was working at a mobile OS company, they used to give us a new handset every year so that we understood the product we were building. It also doubled as a perk :)


This is what happens when you hire engineers who are not passionate about your product. Training doesn't help if they're not engaged.


Where does Microsoft find software engineers passionate about Word? The fact is most software developers work on products that either just aren’t passion-worthy, or only are to some small niche of people that probably doesn’t include a lot of software devs.


While I agree, I think at the same time companies push engineers to work as fast as possible, so no slack time means no time to deeply learn the product.


What does it mean to be passionate about the product?


StaySaasy is my favorite tech blog of 2021. I'm glad to see it on HN. I think their posts are a uniquely thoughtful corpus that fills a gap in thought leadership around scaling B2B engineering / product teams.


Did they ever mention who they are or what company they're from? Very curious, as I also enjoy reading their posts on occasion.


Thanks for reading, and glad that you've enjoyed the blog! We've preferred to stay anonymous as it allows us to write more authentically.


Don't disagree. Hard bit is knowing how to find that info out!


>Hard bit is knowing how to find that info out!

You'd have loved trying it pre-Web :-) I was a product manager in the late 80s/90s and I needed to do competitive analysis for pricing and feature prioritization.

One of our sources of information was an analyst firm that collected faxes of product briefs from us and everyone else, then charged us large sums of money for copies of the briefs for relevant products, since we couldn't ask the other vendors directly.

There was another firm that shipped us their own datasheets, in a standard format, covering the products in a space (computer systems); these were often considerably outdated or inaccurate.

I sometimes say if I had to go back to those days, I'd quit in a week as I basically wouldn't have the information to do my job.


>I was a product manager in the late 80s/90s and I needed to do competitive analysis for pricing and feature prioritization.

I've only worked at one company that really went all-out on this. We bought samples of competitive products, did tear-downs and manufacturing-cost estimates, and used ICE machines (in-circuit emulators), where practical, to understand the underlying software to some extent. All in addition to studying usage, manuals, etc. Mid-1980s.

I think it was really useful, but then they also had a QA department that was a peer to and practically as large as engineering.


There were some firms, like IBM, that I'm told had massive competitive analysis teams that did tear-downs, etc. We were actually a fairly large company (Data General) and we still didn't do a lot beyond talking with analysts who had mostly never touched the physical product--and with customers, of course. This was also a period when people went to events like Comdex and returned with literally a box (or boxes) full of paper.


At least Tracy Kidder wasn't breathing down your neck.


I joined DG a few years later than that, though I knew many people in "the book", including Tom West, who I sort of dotted-line reported to for a time when NUMA servers were coming out. (For those who don't know what we're talking about, "The Soul of a New Machine" is still one of the best books about product development ever written.)


Not only do we have the web, we have app stores with public reviews! In addition to easily learning about the competition, we get to learn what their customers think of their own features. Pretty magical!


I'm coming at this more from a B2B perspective.

Certainly there were computer mag reviews of products and I'm honestly somewhat split on the transition from "expert" and theoretically unbiased reviews to a much more heavily crowdsourced set of reviewers. Not sure I'd call modern online crowdsourced reviews "magical."


Mag reviews are definitely not the same thing as user reviews, which are prompted during actual usage. If anything, expert reviews are useless.


When magazine reviews were a seriously paid occupation, they were often pretty good. The problem today is that they're often essentially algorithmic. Reviews in, e.g., The New York Times Book Review insert were generally better than those from random Amazon reviewers (though not always).


Author's co-writer here – there are a few common ways to get this information from public sources. Public documentation, press releases, analyst reports from places like Gartner or G2crowd, field reports from a sales team in an enterprise business are some examples.

Thanks for reading!


Analyst reports are a whole subject themselves - good, bad, and ugly.


They occupy a really strange place in the industry – relied upon by many, but heavily conflicted in terms of their incentives. The ritual and strategy of participating in something like a Gartner Magic Quadrant is worth a whole post in and of itself.

Thanks for reading!


The phrase "competitive analysis" has a well-known meaning in the context of online algorithms.
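
(For anyone unfamiliar with that usage: in online algorithms, competitive analysis compares an algorithm that must make decisions without knowing the future against an optimal offline algorithm that sees the whole input in advance. A minimal Python sketch of the classic ski-rental example, with made-up costs:)

    # Ski rental: renting costs 1 per day; buying costs BUY_COST once.
    # The break-even strategy rents until buying would have been cheaper,
    # then buys. (All costs here are hypothetical.)
    BUY_COST = 10

    def online_cost(days):
        if days < BUY_COST:
            return days                     # only ever rented
        return (BUY_COST - 1) + BUY_COST    # rented, then bought

    def offline_cost(days):
        # The optimal offline algorithm knows `days` up front.
        return min(days, BUY_COST)

    # The competitive ratio is the worst-case online/offline cost ratio.
    print(max(online_cost(d) / offline_cost(d) for d in range(1, 1000)))
    # -> 1.9, i.e. the strategy is (2 - 1/BUY_COST)-competitive

Worst-case guarantees like this, measured against a clairvoyant adversary, are the kind of result competitive analysis provides.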


It also has a different meaning in business. This blog is using it in the software product sense, I think.



