Hacker News
Ask HN: What problem in your industry is a potential startup?
241 points by choogi on May 4, 2018 | 245 comments
This question has been asked twice before:

https://news.ycombinator.com/item?id=13139638 (2016), https://news.ycombinator.com/item?id=9799007 (2015)

Both threads generated a lot of really interesting discussion, and I was curious what the discussion would sound like if this were asked again in 2018.

I'll take a startup in a box. I want Kubernetes, Elastic, Kibana, Filebeat, Prometheus, Consul, Grafana, databases, HTTP gateways, etc. all set up on the cloud (or set of servers) of my choice, plus a dashboard that lets me add users (e.g. like a modern Webmin/cPanel+WHM) and make minor config changes if I don't want to run these myself. I want those things HA and I want stable hostnames for them (e.g. Consul DNS on every box). Then I want an empty app template where I can provide a few things: commands to build my app (including dependencies), a systemd conf for start/stop of my app, a config value to tell you where the logs will be, a config value to tell you how to consume metrics from me (e.g. a local HTTP path for Prometheus), etc. I feel like we're close with Helm and Kubernetes, but I'll be damned if coordinating and setting all of this stuff up HA, getting notified of failures, getting notified when I need to add more servers, being cloud-independent, etc. isn't an extreme barrier to entry.
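To make the app-side contract concrete, here is a hedged sketch (Python, stdlib only; the metric names, values, and port are all made up for illustration) of the kind of "local HTTP path for Prometheus" an app template could expose:

```python
# Sketch of the "tell the platform how to scrape me" contract: the app
# serves plain-text metrics at a local HTTP path and the platform
# (e.g. Prometheus) polls it. Metric names and values are invented.
from http.server import BaseHTTPRequestHandler, HTTPServer

def render_metrics(requests_total: int, queue_depth: int) -> str:
    """Render metrics in the Prometheus text exposition format."""
    return (
        "# TYPE app_requests_total counter\n"
        f"app_requests_total {requests_total}\n"
        "# TYPE app_queue_depth gauge\n"
        f"app_queue_depth {queue_depth}\n"
    )

class MetricsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/metrics":
            body = render_metrics(42, 3).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain; version=0.0.4")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 9100), MetricsHandler).serve_forever()
```

That's the whole app-side burden; everything else (service discovery, scraping, alerting, dashboards) is exactly the glue I'm saying should come pre-assembled.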

I want to write code and deploy, not spend most of my time on ops or marry myself to a cloud vendor. I started my company as the only tech person and I feel like I have to be more admin than dev even though I'm doing the same thing as everyone else.

I don't think a startup needs K8s and most of the things you have outlined! Maybe a late-stage startup does, but by then you are already doing something, so a generic slap-it-there solution might not be the best idea.

You are describing a large-scale setup boilerplate, far away from what I'd call a startup in a box. Slap in a simple server like my own project [1] or no server at all until product-market fit, and then but only then start scaling.

[1] https://serverjs.io/

But if the cost of having those things is low enough, maybe it's worth it to have them sooner rather than later.

I've experienced many of those tools in the context of a 200-person company; I can definitely see how managed versions of them could deliver value to a much smaller org.

Of course you don't "need" them. However, I don't believe I'm describing a large scale boilerplate. You can get this with a few servers and a couple hundred a month these days. Heck, you can do it on a laptop w/ minikube probably (though obviously not ok to do in prod, just goes to show the costs are not the problem here). Sure it's easy to run an app. But it's not easy to admin it as you grow (even though you are not yet "big" and still may be only one or two people).

Who will troubleshoot when shit hits the fan?

It seems to me that you are introducing technical debt from day one because it’s cool.

What technical debt?

The kind of technical debt otherwise known as premature optimisation. If you don't even know yet that your app has to scale - and in which way - preparing for that contingency is a waste of resources.

I wouldn't qualify that as premature optimisation. Kubernetes is the de facto standard for running applications now, and you get all the benefits almost for free. Premature optimisation in this case would be over-provisioning - throwing more servers at the problem than you need.

K8s is the "de facto standard" already? When did that happen? No doubt K8s has a great future, and I personally like it a lot as well. But calling it a standard requirement is itself premature.

Take a look at Cloud Foundry, especially the Container Runtime.

The Application Runtime (CFAR) is like an OSS Heroku, installable on AWS, GCP, Azure, vSphere, and OpenStack. Then just push your app and scale it out. It was first built as an enterprise Heroku competitor, and the commercial flavors power a bunch of big companies’ infrastructures. It even uses the same “buildpack” model that Heroku uses to provide language and framework dependencies.

The Container Runtime (CFCR) packages k8s in a way to make it easier to deploy, maintain, and scale, and it also deploys and manages the health of the underlying VMs. It originated in collaboration with Google.

Source: I’m a Cloud Foundry Foundation project lead.

Hey! This is what we are building at Asyncy.com - we are inviting users into our private Alpha release in about a month. I highly recommend you inquire and check it out. 100% OSS, full stack app out of the box using microservices.

Even if someone assembles a product like you ask, how will they monetize it?

I'm assuming you want the product to be built with open source components, so then the startup would be selling a glue script to package these open components together.

The only feasible way to make money with this would be to sell a proprietary glue script. Not sure how acceptable that would be to you or to potential customer base of this startup.

How about giving it away and then offering consulting?

There is an inherent conflict with that kind of business model. If your product were simple enough, that would reduce your opportunities for consulting.

So to rope in consulting offers, you'd be inclined to add complexity and config options to generate more avenues for consulting.

Depends on your mission.

In my experience, the companies that succeed are those who are sincere in wanting to help their customers.

Take a look at https://robinsystems.com

​Robin enables an App-store like user experience to simplify deployment and lifecycle management of Big Data, NoSQL and Database deployments, On-Premise and in-Cloud. It supports application-level snapshots, clones, time-travel, QoS, scaling, backup/restore, etc. It takes care of HA, stable hostnames, Application templates, events, notifications.

Check out the demo clips:

Cloudera: https://vimeo.com/213037162/a66e0b4e77
HortonWorks: https://vimeo.com/227832200/8d9c749984
MongoDB: https://vimeo.com/206348836/a36e535add
ELK Stack: https://vimeo.com/225899966/6000bae6f2
Controlling IOPS in a shared environment: https://vimeo.com/171608156
1-click deploy, snapshot, and clone of an entire Oracle database: https://vimeo.com/195549219
Even a RAC cluster: https://vimeo.com/223730427/e8cb8c92f8
...plus SAP HANA and many more applications.

Disclosure: I am the lead developer at Robinsystems

That's a solution not a problem :)

The point is to find problems to solve not just ideas.

I did the original posting of that question and wrote an essay about it afterwards:


I've also been searching for this for years, and the closest thing I've found is Convox [1]. Unfortunately they only support AWS for now, but it's a really good PaaS on your own servers. It's open source, too [2]. You can run a convox rack without using their hosted console.

One problem is that it requires a minimum of 3 instances, so it can be more expensive than Heroku to start.

[1] https://convox.com

[2] https://github.com/convox/rack

I'd second this -- and very occasionally I think of spinning off my current code into this.

I find PaaS can be a little too opinionated - you get boxed in too quickly - whereas IaaS is too raw. I had a bit of a head-scratch when I recently moved to Google Cloud and was flummoxed about where to put my (backend) session state. Similar experience trying to get blue-green deployments on GKE (Kubernetes).

I feel GCP/AWS and friends will get there. But for now there are definitely some gaps.

Sounds a lot like Terraform https://www.terraform.io/

I am somewhat familiar w/ this product, and I'm afraid I'm not articulating how simple I expect this to be. Where do I click to search my logs?

re k8s: https://medium.com/@steve.yegge/honestly-i-cant-stand-k8s-48...

But ultimately all these arguments converge to: something that beats k8s's suckiness is a potential startup.

Oh, also: https://www.openshift.com/products/pricing/ is kind of like what you're talking about. It may not have everything, but it's managed k8s on your cloud with support.

Closer. Things like being able to see the logs of all your apps and searching them is a common thing I assume. We have all the tools today, that's not the problem. The problem is me having to read pages like [0] just to do what almost 100% of companies have to do. I am having trouble articulating the simplicity of what I would expect.

0 - https://docs.openshift.com/enterprise/3.1/install_config/agg...

The biggest problem with this is that one size does not fit all.

It is hard to compile a stack that would be useful for most companies.

Yes, attempts have been made - for example, OpenStack. But I feel there is always an element of specific technology based on the needs of the product being built.

Aren't you talking about Heroku?

No. Some of my quotes: "set up on the cloud (or set of servers) of my choice", "being cloud-independent", "not [...] marry myself to a cloud vendor", etc.

You know why you can fill your car's tank on any petrol station? Because all the different players -- pump vendors, car makers, oil companies etc -- are using the same standards for the size of nozzle, fuel composition etc. They have concluded that standardisation is in their interest, as opposed to locking the customers in.

In software there are no such standards, or rather there are too many standards, and new ones are being released every day. Currently there is no clear incentive for vendors to standardise, and it's quite possible there will never be, it's early to tell.

What you're describing currently requires enormous resources, and no single company can pull that off -- even Google is struggling with keeping both back-end stability and front-end simplicity (I'm currently working on a new GCP tool which just got into beta, so I can see it firsthand). Standardisation allows many different vendors to focus only on a part of the whole supply chain, while with the cloud you have single vendors trying to cover everything under one roof.

It would be great if we could combine Digital Ocean's droplets with Google's BigQuery with Amazon's RDS, all through a simple front-end UI -- but the nozzle size is not standardised, and I wonder if it will ever be...

If you bought all of this pre-made from a vendor, you'd be tied to that vendor, subject to their pricing whims, support SLA, etc. Maybe they'd be multi-cloud, maybe not.

I don't want to buy it pre-made. I want to buy the coordination/orchestration software. And I don't want to spend forever in yaml/chef/ansible whatever to setup these things every company does. I want to download it and I want to run it. I'll give it a list of servers and some SSH keys. It can tell me if it needs more capacity. The dashboard can even be local.

That sounds like an open source collection of things, rather than a startup idea.

I like it though; it could really help in some situations.

I mean, everyone has different requirements or preferences, so you don't get it all set up for you generally. But there is at least one such solution which more or less fits your description: DCOS.

> I mean, everyone has different requirements or preferences

I've found this is only kinda true. Most people need the same supporting infrastructure and most people only need to deploy their HTTP-visible app, maybe a cron or backend daemon, and probably a data store. Everything else from logs to metrics to alerting to HA gateways and so on are quite universal.

Are you starting new companies that frequently?

Could be good for a VC that doesn't want its investments wasting money reinventing the wheel.

Or maybe start small to test your startup idea and then you don't need a military grade server setup ready for WW3. :)

Depends on if, by "start small", you mean small effort or small money. Having those things I mentioned shouldn't be expensive, but are huge effort. I want to start small, but if I can't do something as simple as delegate SSH access or search my logs, I can't even test my idea. What I'm talking about is meant to increase the ability to start small for those of us w/out dedicated ops teams.

My point is that you don't need an ops team or lots of ops to begin with. A single server can do miracles for a good period of time.

Just use a PaaS like Heroku? Keep the tech simple and vendor locking is minimal.

Someone recently mentioned to me that platform.sh does some of what you're looking for

Besides the "marry myself to a cloud vendor" part, it sounds like you want a platform as a service. AWS and Google both provide more or less what you've asked for. It just takes minimal glue to turn the features on.

Yup, I basically want self-hosted PaaS in a box, but also w/ opinionated interoperability (e.g. use Prometheus alert manager to tell if my DB disk is reaching 90%, use consul DNS to have stable names for my log server, use Grafana to show me memory usage across every container/app, etc). It's always the "minimal glue" that takes forever and often once "glued" it becomes stale because it's not maintained unless it's someone's job or something breaks.
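The "DB disk reaching 90%" check itself is trivial - a hedged Python sketch below (the paths, label names, and Alertmanager-style payload shape are my own invention); the hard part is exactly the glue that wires a check like this into routing, silencing, and dashboards:

```python
# Hedged sketch of the "alert me when my DB disk hits 90%" glue:
# check utilisation for a set of paths and return alert payloads a
# platform could route somewhere like Alertmanager. The payload
# shape and label names here are illustrative, not a real API.
import shutil

def disk_alerts(paths, threshold=0.90):
    """Return one alert dict per path whose usage crosses threshold."""
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)
        used_frac = (usage.total - usage.free) / usage.total
        if used_frac >= threshold:
            alerts.append({
                "labels": {"alertname": "DiskNearlyFull", "path": path},
                "annotations": {"used": f"{used_frac:.0%}"},
            })
    return alerts
```

Ten lines - and yet unless it's someone's job, even this goes stale the first time a path or a threshold changes.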

Yup. Me too.

Could you not go with serverless and eliminate all of that complexity?

Not easily across cloud. If as part of this package, it included a "cloud functions"/"lambda" component, cool. There are a few vendor-independent ones out there. But I need to search logs, store data in a database, etc. I don't want to go to the Google console to see my logs. Also I think traditional apps have many benefits wrt cross-request state management that are not as easy to do generically when serverless.

I do B2B sales (specifically, customer support for consumer-facing businesses), and I'd pay ~$1,000/mo for a service that solves this problem:

Which warm intros should I ask for? There are ~1,000 companies in my ideal customer profile. I've got ~500 friends who I'd feel comfortable asking for warm intros, and say these friends each have ~500 friends. After deduplicating, that's ~100,000 second-degree connections, some of whom are decision-makers at companies I'd like to sell to.

I'd want someone to go through my LinkedIn/Facebook/Instagram/Twitter/etc., and tell me something along the lines of: "Ben might know decision-makers at Companies A, B, C, D, and E." And, conversely, I'd like to know all the possible warm introductions that could lead me to Company A (e.g. "Ben, Max, and Jennifer could possibly introduce you to Alice, Bob, and Cameron at Company A").

All of this information is available to me; it's just a total O(N^2) pain to clean and aggregate it. Like, I can certainly spend an hour listening to podcasts and looking through Ben's LinkedIn connections, Facebook friends, Instagram followers - and seeing if any of them are COOs at CPG brands. But I'll run out of podcasts eventually, and then it's not a very high-leverage use of my time to repeat that process for Max, Jennifer, Nate, Christy, et al.
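Once the connection data is cleaned, the "who can intro me to Company A" question is just an inverted index over second-degree links, not an O(N^2) manual trawl. A toy sketch (all names and companies are hypothetical):

```python
# Toy sketch of the warm-intro search described above: given my
# first-degree friends and each friend's own connections, invert the
# graph so "who can introduce me to company X?" is a single lookup.
# Every name and company below is made up.
from collections import defaultdict

def intro_paths(friends_of, works_at):
    """Map company -> {(introducer, contact), ...} via 2nd-degree links."""
    by_company = defaultdict(set)
    for friend, their_friends in friends_of.items():
        for contact in their_friends:
            company = works_at.get(contact)
            if company:
                by_company[company].add((friend, contact))
    return dict(by_company)

friends_of = {"Ben": ["Alice", "Bob"], "Max": ["Alice", "Cameron"]}
works_at = {"Alice": "Company A", "Bob": "Company B", "Cameron": "Company A"}
```

The genuinely hard (and valuable) part is the data collection and entity resolution across LinkedIn/Facebook/Instagram/Twitter, not this aggregation step.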

We built intrologic.com to do exactly that. It started as a side project to try to sell our previous product (a sales chatbot) but then took off on its own. Connect your networks and your friends, then search for warm intros. It's free (we plan to make money only when an intro is made). You can sign up at https://intrologic.com/start. I am the founder.

LinkedIn doesn't let you crawl their graph like this but they could do it

You have 500 friends?

I have about 20-30 people I still stay in contact with and it's super tough to schedule activities as it is. :(

This is a good and important problem - finding the right person to sell to (and a route to get to them) is very important in B2B with large businesses.

A next-generation internal/corporate portal. Every place I have worked at had either a really bad implementation of an internal portal or a bunch of wiki pages clubbed together. It was incredibly difficult to get even basic info, such as where the conference room is, who works on the security team, what the expense policy is, and so on and so forth. In my current multi-billion-dollar company, when I am meeting someone new, I have to actually go to LinkedIn to understand what they do in the company and what their focus area is.

I agree. I think if you could branch off what stripe has done with their 'home' feature, you'd have a real business on your hands. https://stripe.com/blog/stripe-home

I worked for a company, within a team that allocated quite some time to internal tools like this, as well as tools for other teams.

The real issue is how custom these solutions need to be, and the weird rules and workflows some need to follow because of internal policies, ISO norms and whatever else.

I don't think a one-size-fits-all solution would ever really work. However, I strongly believe companies need to allocate enough time to build tools that work _for them_.

Software development infrastructure in a box plus strong training materials / guidance for scientists. Increasing numbers of scientists are writing code without any formal CS training, and the outputs are predictably awful and unreliable. It is very common to find no testing, no acceptance criteria, no version control, no formal planning, no code review, no style guide being employed, sparse commenting, fragmented development environments / dependency hell etc. People frequently know that what they are doing is suboptimal, but it is hard to convince them that they should put in the work to use industry best practices for a variety of reasons.

If someone could create a product (probably infrastructure plus a Python IDE) which made doing things the "right way" easy for these users, and which would provide case studies or tutorials to show them WHY doing things correctly is beneficial using analogies to good lab behavior, it would be hugely valuable.

I'll give an example: a friend is a PhD candidate in a neuro lab. He uses Matlab almost exclusively, and his undergrad was in neuro as well. He has never taken Calc, let alone any matrix algebra. Ask him what det() means, and it's a deer in headlights. That said, he's written about 10k lines of Matlab code, with for() and while() loops nested 15 levels deep. That he needs help is a given, as he paid me to help him out for 16 hours the other month. Literally, it was impossible. I could barely wrap my head around his problem, let alone the code. Remember the saying 'Don't do data science in a GUI'? Well, his issues were a good argument for that. Getting what he was doing into a Python IDE would not be possible; it's just too complicated.

I once worked for a company where the accountants had written a massive and convoluted VBA application that had gradually become mission critical to the company.

The IT department allocated a number of people (half a dozen?) for some time (8-12 months?) trying to turn it into a maintainable software product.

They failed.

I suggest you look into the Software Carpentry foundation - they are making a great effort to improve scientific programming through short workshops: https://software-carpentry.org/

Ironic that scientists are not conducting any testing.

I routinely bring this up actually. Testing is literally equivalent to running experimental controls -_-
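A toy illustration of that analogy (the z-score function here is just a stand-in analysis, not anyone's real code): a unit test is a control run on an input whose answer is known in advance, exactly like running a known sample through an assay before trusting it on the unknowns.

```python
# The "experimental control" analogy in code: before trusting an
# analysis on real data, run it on an input whose result is known
# in advance. The analysis here is a stand-in example.
import statistics

def zscores(xs):
    """Standardise a sample to mean 0, (sample) stdev 1."""
    mu = statistics.mean(xs)
    sd = statistics.stdev(xs)
    return [(x - mu) / sd for x in xs]

def test_zscores_control():
    # Control: a symmetric input must come out symmetric around 0,
    # and its middle value must land exactly at 0.
    out = zscores([1.0, 2.0, 3.0])
    assert abs(sum(out)) < 1e-9
    assert out[1] == 0.0
```

If the control fails, you debug the method, not the data - same as in the lab.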

I'm doing this - and have already made good progress. Contact me at wtpayne-at-gmail.com if you want to talk about it more.

I can't see a business model - the scientist is already paying with their time to get something working that is just good enough for publication. All the problems you describe are down the road, and scientists kick those to 'industry'. Unless your 'right way' is also easier, why would they add to their time?

I think that is indeed the perception, but that the time savings would actually be quite pronounced if people did things correctly. I've seen many projects get immediately bogged down by bugs / feature creep / lack of planning and end up taking far longer than if people had done things correctly. Also, many labs hand off code bases when post docs or students leave, creating chaos for the next person that is tasked with working on them.

As an example, I wrote a proof of concept script to show that we could automate some basic image analysis in my lab three years ago. That was immediately grabbed by an investigator and put into production without any further thought. Because it was a proof of concept script, it was of course very buggy and required substantial feature addition. This was added without any thought for design etc. Fast forward to today and this code base is a sprawling shit show which is being rewritten for the THIRD TIME. Each time has ended in failure because people failed to observe basic best practice, and this attempt will likely fail too. That is an ENORMOUS waste of investigator time. Another project I can think of involved a model which had a 10,000 line function. No one could trust what was being outputted by the thing, so they eventually abandoned it. That's hundreds of investigator hours down the drain.

I agree 100%. This is something I've taken to heart after seeing and trudging through academic code over the last few years.

In a way I also think this is a language problem. I hope that for some data-intensive projects productive statically typed languages (aka Swift + Tensorflow + Python interop) can help fix this.

I work in the humanitarian field in a poor country ravaged by war and famine. We have a number of humanitarian actors currently working in this country and no one actually has access to accurate population statistics. Well, none exists. The last census was conducted in 1975. I believe someone could use technology to get a much better estimate of the population. One thing that always springs to my mind is the possibility of using aerial imagery. It doesn't have to be exactly that, though.

I think this need - and it is a real need - could potentially make millions for the enterprising type here.

Why don't I try it myself, you might ask. While I have some ideas, I may not be able to raise the resources needed at the moment.

I've been working in Kenya as the technical co-founder of a startup in the ag space. We focus on smartphones, but I used to work with the telecom operators a lot in my previous job (where basic GSM connections give some indication of the connected population).

Some of the best stats you get for telecom coverage will come from the annual reports of commercial telecom companies. Eg in Kenya you can look at safaricom. In somalia I guess you can look at Telecom Somalia and some of the others (keeping in mind their market share).

GSMA do research every few years on the number of SIMs on average per market. This is how they get to a figure about subscribers (ie people) rather than connections (number of SIMs). The latter is what is usually reported by commercial companies.
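The arithmetic behind that adjustment is simple; a hedged back-of-the-envelope sketch (every number below is made up for illustration - real ratios would come from GSMA-style market research):

```python
# Back-of-the-envelope sketch of the SIMs -> people adjustment
# described above. All numbers are invented for illustration.
def estimate_population(reported_connections, sims_per_subscriber,
                        penetration_rate):
    """Estimate total population from reported SIM connections.

    reported_connections: SIMs reported by commercial operators.
    sims_per_subscriber:  average SIMs per unique person (often >1
                          in markets with multiple providers).
    penetration_rate:     fraction of the population owning a SIM.
    """
    unique_subscribers = reported_connections / sims_per_subscriber
    return unique_subscribers / penetration_rate

# e.g. 12M SIMs, ~1.5 SIMs each, ~80% of people own one -> ~10M people
estimate = estimate_population(12_000_000, 1.5, 0.8)
```

The hard part is sourcing credible values for the two ratios per market, not the division.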

You can quite easily do some online digging for a given market to get fairly current numbers (ie last year). For doing anything with sizing markets etc I'll always start here. Not with World Bank or similar data which is really outdated and not indicative of current trends in technology at all (which is important for me).

The international development sector is unfortunately slow to realise what's going on technology-wise - I'd wager Facebook knows a huge amount more about populations in a lot of these African countries than their governments or the international development sector do. Perhaps if they were a bit more genuinely philanthropically inclined, they could do some amazing, socially impactful stuff right now. But I don't think any of us should count our chickens on that one.

Why do you think there are millions to be made here? I'm sort of familiar with international development, and I'm surprised by that estimate.

I'm curious how exact you would need it to be. Like within 90%? 80%? 50%?

For those who want to follow up on this: you could use current satellite imagery to make population estimates, but it wouldn't be very accurate. You could wait for companies like Planet Labs [1] to get daily photos of the Earth (IIRC 4m resolution) and increase accuracy. You could also hire pilots to strap on a camera and survey the areas of interest. There are also, IMO, unethical spins (especially in oppressive regimes) that you could put on that data (i.e. civilian surveillance). Or, even cheaper, use drones (because you don't need daily or real-time surveillance), which could have other spinoffs too. Or you could literally just send people to go count. I don't know which would be the cheapest.

[1] https://www.planet.com

I am not exactly an expert on matters demography but I think 50% would be low. At the moment, we have a study that was done by one of the UN agencies and the figures they came up with seem very disputed. They used 1975 census as the base for their extrapolations. Quite a lot happened since and their figures are not really believable.

How much penetration do cell phones have in a lot of these countries? It may be a way to tease out a rough population from cell phone use statistics.

What I can tell you from my experience in the Gambia is that many people there have cellphones (not smartphones), but most have 2 or 3 because providers have lower rates when calling within the same network - so one phone for Qcell, one phone for Africell, etc. Not sure if it's like that in many other places. There is a lot of tourism on the coast, and returning tourists often bring their old phones to give to the locals.

Somalia is the country I am talking about, and cellphone penetration is quite high. Most people might not have smartphones, though. I have been thinking about this too: using cellphone network data to come up with estimates.

Not sure how it is where OP is, but in many African countries it's common to have multiple SIM cards, to take advantage of in-network calling deals or to compensate for spotty coverage maps.

Any way to get in touch?

my username 77@gmail.com

Interoperable video and phone conferencing without prior setup required. Probably half the video conferences I've tried to participate in in the last year have been a disaster, with people having technical problems dialing in, dropping out, etc. Even phone conferences (using VOIP conference providers) have awful quality. I want to be able to email some people a link and, with no account creation, registration, or software install required, get an extremely high-quality, low-latency (comparable to FaceTime) voice/video conference.

I have been using jitsi for the past year or so and the experience is pretty much what you describe.


I used to use this in my company and while joining and creating meetings is pretty painless the audio/video quality was pretty crap. Perhaps they have improved in the three years since I've used it.

It's pretty good now.

Even phone conferences (using VOIP conference providers) have awful quality.

VoIP, especially multi-tenant VoIP like conferencing, is 70% dependent on YOUR local network connection and 30% on the provider's infrastructure. Having a high-bandwidth pipe is no guarantee of 100% clear and jitterless VoIP communication. Have you had a network engineer do a deep inspection of the VoIP traffic to see if there's any QoS or packet filtering going on that could degrade performance? (I.e. if you're a small office whose voice traffic runs side by side with literally every other network device, you're GOING to have a bad time.)
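If you want to quantify "awful quality" rather than guess, interarrival jitter is the usual thing to measure. A sketch of the standard RFC 3550 running jitter estimator (the timestamps below are invented; real ones would come from a packet capture):

```python
# RFC 3550 (RTP) running interarrival jitter estimate:
#   D = (recv_j - recv_i) - (send_j - send_i)  [transit time delta]
#   J = J + (|D| - J) / 16                     [smoothed estimate]
# Times here are invented floats; real values come from RTP headers
# in a packet capture.
def interarrival_jitter(send_times, recv_times):
    """Return the RFC 3550 running jitter over paired timestamps."""
    j = 0.0
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s
        if prev_transit is not None:
            d = abs(transit - prev_transit)
            j += (d - j) / 16.0
        prev_transit = transit
    return j
```

A steadily rising jitter on the LAN side, with a clean WAN capture, points at exactly the "everything on one network" problem described above.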

Amazon has the entire company dogfooding Chime[0]. Pretty happy with it. Really good for video/phone conferences. Half-decent imitation of slack for chatting.

You can join a concall via phone number + pin, from your desktop, or from the mobile app. The mobile app will even let you view screenshares or video conf.


I personally didn't mind it, but I saw a lot of dislike of Chime internally compared to something like Slack. Anecdotal of course. Worked great for large chat groups though.

The place I work uses BlueJeans and I really like it.

The dedicated app runs on just about everything: iOS, OS X, Windows, Android.

On phones you can use Video and Audio or just plain Audio (Low Bandwidth mode).

You can send your BlueJeans room URL and they can join with just a web browser (Web browsers get less features though). Non-Hosts can join the meeting without signing up for anything. They just click and join.

I think the audio and video are pretty good - much better than FaceTime, which is not a good comparison because FaceTime is pretty terrible.

There is dedicated hardware that supports BlueJeans, so you can use it in conference rooms.

You can record meetings.

All the standard stuff like screen share and whatnot. You can even ping-pong control back and forth. I worked with someone who had to type in some stuff while I had to type in other stuff on their computer. They just joined my BlueJeans room, and when I was not typing, they could type. No clicking to take control or give it back; it was more or less seamless.

I also want a screen for the audio/video conference to show where the sound is coming from, so we can mute that one guy who doesn't turn his mike off and is heavy breathing on the call.

We've been using http://appear.in/ and it's worked perfectly for zero setup video conferencing.

Full disclosure. I work for Synergy Sky. For a single person our products are way too cumbersome, but if your issue is also on the level of an organisation/company, then your problem is what we solve.

We pay for a high-end solution at work and still experience the same issues.

Have you tried appear.in ?

I’ve used it several times and while the UI was nice the audio quality was poor every time. It doesn’t seem to handle audio feedback loops as well as other videoconferencing tools.

Try using Ethernet instead of wireless, and make sure you have a decent router.

maybe uberconference.com?

I do some illustration, and I've noticed on Twitter how much every illustrator who gets some experience with 3D tools likes to use 3D modelling to create block-outs for scenes they want to draw. By creating a simple scene out of blocks and shapes, you can make your perspective work a lot easier while drawing. A tool that could be really popular would be something that makes 3D mock-ups easier for 2D artists without 3D experience.

You can give https://3dc.io a shot. Disclosure: developer here.

What are your thoughts on tools like https://placeit.net ?

I think both Photoshop and Manga Studio can do it.

I am a network engineer at a medium-size company (I have worked for very large enterprises too), and there are a lot of opportunities for startups in network engineering.

1) Simple network automation platform that works for "my" custom environment, simply and effortlessly, and that doesn't break the existing network. (I don't mean something like HP Network Automation.)

2) Network diagram software - Seriously, any experienced network engineer will agree that this one needs a lot of disruption. Visio is very expensive and even then it is a pain to use. And Lucidchart or Cacoo or draw.io or other online tools too have their flaws/drawbacks.

3) Network monitoring tools - It is a pity that CA Spectrum, which is an ugly and non-user-friendly tool in my opinion, is among the most used network monitoring software. Network monitoring tools are the bread and butter of NOC (Network Operations Center) teams.

4) Network device configuration management and change/topology visualization tools - NetBrain seemed promising at the start, but it seems to do too many things and still has room for improvement.

It is high time more programmers started building and contributing tools in the network engineering field. There are numerous tools for each and every function, but there is a lot of room to make those tools more elegant, easier to use and more reliable.

Yes, there is Software Defined Networking (SDN), where the vendors (Cisco, Silverpeak, Riverbed etc.) themselves provide a nice visual dashboard. But the current "non-SDN" devices are going to stay for quite some time. And why should we depend on one vendor, and hence the vendor-provided dashboard? There will always be customers who want a vendor-agnostic architecture and common tools to manage the infrastructure.

Note: A lot of the current tools (especially the ones I have mentioned above) do work very well and are used by large enterprises for a reason. But Tesla did disrupt the car market in its own way when reliable Toyotas and fast Ferraris already existed.

I'm not a network engineer, rather a security-focused guy, and I'm also very interested in this topic because the basis for a secure network is a well managed network without any unknown/undocumented assets (network equipment incl. layer 2 as well as network participants). A few months ago I attended a talk by Ivan Pepelnjak, the author of a network engineering blog called ipspace. He talked a lot about network automation and recounted some success stories. I remember him describing people setting up several data centers in a relatively short amount of time by using a lot of network automation / configuration management.


> Network diagram software - Seriously, any experienced network engineer will agree that this one needs a lot of disruption. Visio is very expensive and even then it is a pain to use. And Lucidchart or Cacoo or draw.io or other online tools too have their flaws/drawbacks.

I have been trying to find time to do exactly this. I have a plan laid out, but as the popular quote goes: 'ideas are useless in and of themselves'.

Will post it here, should I ever realize my idea.

for 2) did you try yEd? https://www.yworks.com/products/yed

Dia, the diagram editor, seems pretty decent for diagramming. It has the Cisco symbols built right into it, and it's free.

A tool for managing software development that doesn't suck. Especially if your developers are doing a lot of small projects that, while too small to have their own sprint or their own kanban board, are too big to fit into a single card on Trello.

Possibly something that mixes business and process models into it, but again, something simple where you attach a single BPMN drawing and maybe an architectural sketch to the process. Add time management, deadlines and maybe a tie-in to the web services of an ESDH system and it might even work for task management in casework.

Everything is built for theoretical approaches. Like, we do Scrum, but really we're doing scrumish things. We have an odd schedule; we work on multiple projects at once, depending on what resources are available and what has higher priority; sometimes something breaks and then we're all doing operations rather than development; sometimes the mayor has a direct request; and so on. I think we've tried all the tools from Atlassian to Trello and nothing fits. It's all too textbook for a messy place like ours, and often I think we should go back to post-its and a fucking Excel schedule, but I really don't want to ever print an Excel sheet ever again.

Interestingly, I do a lot of networking with other managers in the public sector, and everyone has this problem, not just in digitization. There isn't a single efficient tool for managing your workforce in the public sector.

There are excellent tools, don’t get me wrong, but we can’t have our workers spend hours on them because we can’t sell those hours to anyone.

What you're saying here really resonates, and it's something that we, at Atlassian, have heard from many of our users. The Jira team has been hard at work on a brand new project type that aims to give teams running "scrumish" the perfect opportunity to build a board and workflow that will truly fit any style of work (even work with an odd schedule, multiple projects at once, and that requires a mixture between operations and dev!).

We've gotten a lot of feedback that often times the strict structures of scrum and kanban are overly burdensome, yet teams still want and need some basic guardrails (as well as the ability to modify their processes on the fly). Our Product team is still testing and iterating on this new project type quite a bit, and if you're up for it, we'd love to give you an early demo and get your honest thoughts and feedback.

If you're interested, please shoot me an email and we'll find some time for a demo: jake@atlassian.com


Jira PMM @ Atlassian

There’s a fundamental divide that most of these tools struggle with which causes the sucky behavior: you plan work at the team level but you manage scope at the project level. If you don’t have a clear correspondence between teams and projects (a team owns a project and does all the work), it will be hard to manage in all of these tools, because you’re always missing out on half the story. The process packaging like scrum or kanban merely obfuscates the fact that the team/project mismatch creates the confusion.

Been searching for something like this for a while now. We have a small team that gets distributed across a handful of projects, but finding a good way to manage that has been a chore.

Originally we used JIRA, but it was complete overkill. We've settled on making ad hoc GitHub Project kanbans and very creative use of the labels, and so far it's been OK, but not perfect.

The other thing we have to do is time tracking for specific tasks depending on the project/client, which has us going over to ConnectWise (the primary part of our company is an MSP), which is just terrible.

The final problem is tying all of the documentation together. The MSP side of the company uses ITGlue, but it's not enough for everything we do as developers. Confluence was actually nice for that, but since we've left Atlassian we're just tracking stuff using a doc folder attached to the project source itself.

I would look at clubhouse.io, or maybe cushionapp.com?

I've never used cushion, but I like clubhouse a lot.

I work for a German Big Four car OEM. We need the following for measurement data and we simply have no solution (except Matlab, which is not good enough).

We want to plot big data (up to terabytes). Columns should be selectable by GUI and nameable. The data should then be added to a database with an ID. Everything should be usable without a scripting language.

Right now the terabytes of data have to be loaded into RAM just to see the first few lines and determine what the columns stand for. I know there are editors that can load data partially, but these have to be installed, which requires admin rights etc. This is a huge burden in a big company! The process of simply plotting, selecting and storing data takes a huge amount of time. The solution should be web based because no admin rights are available.

Often I am impressed by how many tools and hacks exist simply to get one thing done: visualize measurement data. Excel is not enough because even the import of dot vs comma vs tab etc. takes too much time and has to be relearned every time. Engineers sometimes only plot data every few months, and by then there's a new Excel version that autocorrects measurement data to dates or whatever.

In my opinion this would eliminate an obscene amount of work. Right now every engineer is hacking together scripts that are extremely inflexible, even when just CSV-type data has to be handled.

Edit: this also applies to smaller amounts of data, in the megabytes. How can we plot them more robustly than in Excel and then select the x and y axes? I am pretty sure we would love to buy a product that solves these issues.
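The "peek first, load later" workflow described here can be sketched in a few lines of plain Python (stdlib only, so nothing needs installing beyond Python itself). This is a minimal sketch, not a product; the semicolon-separated measurement data is made up for illustration, and the delimiter sniffing is what sidesteps the dot-vs-comma-vs-tab import dance:

```python
import csv
import io
from itertools import islice

def preview(f, n_rows=5):
    """Peek at the header and first rows of a huge CSV without
    loading it all: sniff the delimiter, then stream only what
    is needed."""
    dialect = csv.Sniffer().sniff(f.read(4096))
    f.seek(0)
    reader = csv.reader(f, dialect)
    header = next(reader)
    return header, list(islice(reader, n_rows))

def sample(f, every=100):
    """Keep roughly one row in `every` for a quick first plot."""
    dialect = csv.Sniffer().sniff(f.read(4096))
    f.seek(0)
    reader = csv.reader(f, dialect)
    next(reader)  # skip the header row
    return [row for i, row in enumerate(reader) if i % every == 0]

# Made-up semicolon-separated measurement data for illustration.
data = "time;rpm;temp\n" + "\n".join(
    f"{i};{1000 + i};{20 + i % 5}" for i in range(1000))
header, rows = preview(io.StringIO(data))
```

Because both helpers stream rather than load everything, the same approach works whether the file is megabytes or terabytes; only the sampled rows ever reach the plotting layer.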

Thanks for the input, but do you mind being a bit more specific about the following:

- would you actually be interested in buying this service?

- what sort of visualizations do you actually make? Do they need to be interactive? SVG? Size? How do you use them?

- what exactly do you do with the data before it's plotted, other than selecting columns? Is there aggregation or any kind of processing?

- how often is this actually used? You say 'sometimes every few months'; does that mean it's like a quarterly report?

- what other well-established tools have you used other than Excel?

- how big is your largest data set? Size, rows, columns?

- if it applies to small amounts of megabytes, is there a reason besides simplicity why you can't use PivotChart in Excel? Or Excel in general? Or R/Python to generate it?

I am a data scientist who regularly plots quite large data sets, and I like speed :) It's totally doable to build a service that you can run locally: load a CSV, read like 1% of the data, play with it... and when you get what you want, load the rest, wait a bit, and get the visualization you want.

But depending on the visualization requirements there may be many paths to a solution.

You should try approximate algorithms. They don't load all the data, but are able to give approximate (near perfect) statistical results while consuming orders of magnitude less data.

Count sketches, reservoir sampling and similar methods come to mind.
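Reservoir sampling in particular is only a few lines. A minimal sketch of Algorithm R, which keeps a uniform random sample of k items from a stream of unknown length using only O(k) memory:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Algorithm R: maintain a uniform random sample of k items
    from a stream of unknown length using only O(k) memory."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)    # fill the reservoir first
        else:
            j = rng.randrange(i + 1)  # uniform index in [0, i]
            if j < k:
                reservoir[j] = item   # replace with probability k/(i+1)
    return reservoir

picked = reservoir_sample(range(1_000_000), 100, random.Random(42))
```

The stream can be a file iterator just as easily as a range, so statistics and plots can be computed from the sample without the full data set ever touching RAM.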

If I am understanding your problem correctly, I did that for a large American automotive electronics supplier back in the 1990s - though back then 30-40 megabytes of data was pretty big. We trained a bunch of American and Japanese engineers on how to do that, but I don't remember any Europeans.

I think I have an email address in my profile; feel free to send me something. I am fairly certain that your needs can be satisfied with existing Unix tools. Then again, the reason I worked on the problem in the 1990s was to free up engineer time so they could do more valuable things. A gui and other tools could be worth paying for if the bosses have that mindset.

Thanks for your answer. Currently engineers can:

a) try to plot their data alone and spend time hacking the stuff together. This takes time, as the people doing it aren't accustomed to doing it daily. This happens across all kinds of divisions.

b) ask another team (with data scientists) for support. Maybe the engineer has to write a ticket, or the person who should be doing it has other tasks, is on vacation, not willing, not replying to the request, etc.

Either way hours are easily spent on solving this seemingly simple task. The amount of time spent is simply staggering.

Unix would also be my personal choice. But getting permission to put a Unix machine on the network for a single user is extremely difficult. Windows, Internet Explorer and temporary admin rights are the work environment almost everyone has to use. That's why I think a web based solution is the only viable option.

I work in a similar space and one of the tools that might solve your problem is exploratory.io.

I'm located in Germany as well and would love chatting with you.

Is there a way I can reach out to you?


I'm on the board of my condo building's HOA which has a number of things that would be helpful.

* We pay $200/mo for a basic website with forum, some billing things, some file storage and other stuff I never use. It looks like it was written 20 years ago. (If someone can recommend something already out there that would be helpful)

* The doorman accepts dozens of package deliveries each day; residents get an email and pickups are tracked in the above system. He needs to write the apartment number on each box; this deserves its own tracking system

* I have to approve lots of expenses not knowing what fixing the hvac unit should cost

* we're getting screwed by insurance company - I have no idea if our policy is good or not

* Insurance claims for damage is a huge s* show

* Energy management is horrible, we don't know where our electricity is going or how we can cut down

* Contractors are unreliable - I want to know who is blacklisted from neighbouring buildings because they suck

* How do our expenses compare to others? I have no idea.

You hit home. Hiring contractors when you're unknowledgeable is a shot in the dark. Insurers are overcharging and have lots of useless overhead. And given the IoT world we live in, we can do better on electricity.

Rentlytics can answer a couple of those questions about how your costs for energy etc are doing and how they compare to other rental buildings.

What exactly does that website do?

Website for residents


Document storage

Maintains email lists

Track who's paid

+ more I don't really use it

There are a bunch here


I don't know much about hoa's but this sounds like something that would work with a common stack like WordPress + Stripe

I work in health tech. Here are some problems:

* Helping patients select the right doctor. Currently most people use Yelp or referrals. The problem is that Yelp has little correlation with quality of care. It's very difficult for patients to evaluate a doctor - usually what they end up doing is evaluating the customer service aspect (did they speak to me nicely, did they make me wait for an appointment, etc.), but no one is able to evaluate doctors based on the quality of care.

* Helping doctors and patients estimate costs. Neither doctors nor patients understand costs. It is very routine for a doctor to suggest getting a lab test from X place because they have experience working with that center. They have no idea that for your specific insurance plan this will cost you 2x another place, and so you end up with an unhappy patient who blames their doctor for ripping them off. There should be tools to help patients and doctors estimate costs.

* Helping patients select a health care plan from their employer or a marketplace. Most patients have no idea which health care plan suits them best. However, if you have their history of claims and some guess at their future behavior, you should theoretically be able to tell them which health care plan makes the most sense for them.

* Helping patients manage their chronic conditions. Most people are very lax about managing their health conditions: they skip appointments, choose brand names over generics, ignore refills, etc. Technology should be able to nudge them in the right direction and help them optimize for quality and costs.

* Building technology that encourages healthy behavior. A majority of the diseases attacking Americans are caused by lifestyle issues (diet, stress, drugs, exercise). If you could build technology solutions that nudge people toward healthier behavior, you would solve a billion dollar problem for insurance companies; they would love to reduce the risk pool of their patients. This is a tough battle because even people who care about health are inundated with false information (think Dr. Oz, or anti-vaxxers, or people who insist every person in the world is a celiac and should go gluten free).

As you can see the bulk of the problems in the health care industry is understanding how to navigate the huge mess of the US healthcare system. A longer term solution is for us to build a single payer system and incentivize patient care over patient procedures but I doubt that will happen.
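To make the plan-selection idea above concrete: given last year's claims and a deliberately oversimplified model of a plan (deductible, coinsurance, out-of-pocket maximum, premiums), estimating each plan's annual cost is mostly arithmetic. This is a sketch, not an actuarial model; all plan names and numbers are hypothetical, and real plans have copays, networks and tiered benefits this ignores:

```python
def annual_cost(claims, monthly_premium, deductible, coinsurance, oop_max):
    """Very simplified plan model: the member pays everything up to
    the deductible, then a coinsurance share of the rest, capped at
    the out-of-pocket maximum; premiums are always paid on top."""
    out_of_pocket = 0.0
    for charge in claims:
        if out_of_pocket < deductible:
            within = min(charge, deductible - out_of_pocket)
            out_of_pocket += within
            charge -= within
        out_of_pocket += charge * coinsurance
    return monthly_premium * 12 + min(out_of_pocket, oop_max)

claims = [120, 450, 2300, 80]  # hypothetical billed amounts from last year
plans = {
    "bronze": annual_cost(claims, 250, 5000, 0.40, 7000),
    "gold":   annual_cost(claims, 450, 1000, 0.20, 5000),
}
best = min(plans, key=plans.get)
```

Even this toy version shows why the problem is valuable: with modest claims the cheap-premium plan wins, and the answer flips as expected utilization grows, which is exactly the comparison most patients cannot do by hand.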

Fair warning: Healthcare has a long sales cycle, is heavily regulated and is difficult to break into as an outsider. Generally you'll want someone on the founding team with a significant healthcare background and connections (even better if they're a MD). Once you have found a way to sell you'll find that the competition is fairly minimal compared to other industries.

Totally agree. All the problems I listed are quite hard and will monetize slowly. Sales cycles are really long; the company I am a part of had an all star founding team with a history of exits, so they got lots of funding, which gave them a cash cushion to deal with the long sales cycles.

Are there any books you could recommend to learn more about why American healthcare is a mess and how it became this way, as well as comparisons with other healthcare systems and why they work?

Could a PMR system help with some of these issues? Giving patients access to their records, making it easier to aggregate data while abiding by HIPAA?

I recently saw an episode of Dirty Money on Valeant. Is Valeant a good representation for most pharma companies and also a reflection of most for profit medical businesses whose primary obligation is to increase shareholder value?

What would be the best way to start aggregating and comparing prices in the USA? If it isn't possible, why do prices vary so much?

Sorry, I can't recommend any books. The main reason US healthcare is a mess is that we have structured it as a free market system when it lacks the key characteristics for a free market to work. Free markets work really well in some areas (say, grocery stores) and would work terribly in other areas (a national military). See: http://www.who.int/bulletin/volumes/82/2/PHCBP.pdf

Patient medical records won't help with pricing issues, but data transparency would help with automated solutions to manage care.

There is no easy way to aggregate and compare prices. If you could get billions of health care claims then you could build algorithms to estimate pricing, or perhaps you could convince the large health care companies to share their pricing with you (highly unlikely). Prices vary widely because each institution negotiates its own set of prices with a specific set of providers. Basically, imagine your health care plan as a set of discount codes for a set of doctors and providers, except those discount codes are never shared with you, vary for each doctor and each provider, and can change at any time.

What's your opinion on current healthcare startups like Zocdoc or Oscar insurance? I've seen a lot of new doctor review sites popping up, but as you mentioned, that's only one problem among many bigger issues.

Doctor reviews are not correlated with patient outcomes. There are tons of excellent doctors with terrible bedside manner who have bad reviews, and incompetent doctors with excellent bedside manner who get great reviews. Quality metrics should be based on data like patient outcomes (but even that is hard, because surgeons currently reject "hard" patients to juice their numbers). As a non-doctor, you honestly are not qualified to assess the abilities of your doctor.

I do think Zocdoc is a great idea, because anything that makes scheduling an appointment easier is a win. I'm just saying that patient reviews give people a false sense of security that their doctor is good. In reality, almost no one knows whether a doctor is good unless you have directly worked with them and have a lot of stats on their outcomes (which anyway only works in specialties that have a lot of procedures). Assessing doctor quality is a very hard problem.

Patient reviews are good for assessing if your personalities will mesh but beyond that they don't go far.

I am not really too familiar with Oscar; what are they doing?

Hi, would you like to set up a call to talk more about these issues?

Sorry I don't have time for a call but happy to give my perspective on HN if you have any questions.

No problem.

> Helping patients select the right doctor. Currently most people use Yelp or through referrals. The problem is that Yelp has little correlation to quality of care. It's very difficult for patients to evaluate a doctor - usually what they end up doing is evaluating the customer service aspect of the doctor (did they speak to me nicely, did they make me wait for an appointment, etc.) but no one is able to evaluate doctors based on the quality of care.

We actually explored this for a startup idea. The problem is that it is difficult to find a group of people who can do the evaluations in a truthful and holistic way:

* Hospitals will never want to give out outcome data because outcome data will be used against them for ratings by people who don't understand it (for example, some community hospital in Montana may be rated higher than Mass General because of case complexity issues). Or worse, it will be used by people who DO understand it :D

* We explored having doctors rate other doctors in a variety of ways (which I think would reflect the "truest" measure of quality). Residents and fellows could rate attendings, but they might not know how attendings in their hospital compare to attendings in most other hospitals. Additionally, attendings or hospitals might apply pressure to these groups to provide good ratings. Specialists could rate other specialists in their field, but then you might see collusion, false negative reviews, or retaliation. How you would avoid these problems is not immediately clear to me.

* As you point out, patients are really only able to evaluate bedside manner and not quality of care.

One way we thought about it was a rating system with public profiles for physicians and anonymous reviews from other physicians and members of the care team. Ratings from physicians in the same specialty, and from providers at the same institution, would be weighted more heavily than ratings from other physicians. The highest rated physicians would also carry more weight within their specialty than an average rated physician. You could bootstrap the system by asking specialists to name the top X people in their field; those people would automatically be rated highly.

Patients could log in and provide comments about patient experience; hospitals could log in and provide outcome data if they wanted.

I am not entirely sure how you would really monetize it. It's the equivalent of the dating app problem: the better you are at matching, the less money you make, as users exit the platform. I do agree that it would be great if someone could solve this problem though.

> Helping doctors and patients estimate costs - Neither doctors nor patients understand costs. It is very routine for a doctor to suggest getting a lab test from X place because they have experience working with the center. They have no idea that for your specific insurance plan this will cost you 2x another place, and so you end up with an unhappy patient who blames their doctor for ripping them off. There should be tools to help patients and doctors estimate costs.

We had a startup that came at this in an indirect way. We were trying to make it easier for labs and other providers to perform eligibility checks and facilitate prior auths in real time. Our proposed solution would have involved running the check during ordering, and then having the phlebotomist or lab tech at the hospital doing the sample collection contact the doctor and inform them that something would or would not be covered by insurance. The provider could then talk to the patient about out of pocket costs etc. I think this actually could have worked; our team imploded before we could prototype it however :(

While I like the other ideas, glhf trying to get people to practice lifestyle management or be adherent to a treatment regimen XD

As a frequent patient, I see such incredible need for "disruption" in healthcare, both in terms of transparency/ empowering the patient, and in terms of cost suppression. However, anyone who tackles this space has to be aware that there are some huge entrenched interests who like the current inefficiencies and cost opacity very much, thank you. Breaking into this space would be 1 part technical know-how to 10 parts legal jujitsu.

As an example, look upon the bloodied corpse of the failed startup Remedy, which was actually doing something really good -- helping users find billing errors and getting money back for them on bad charges. But incredible amounts of pushback stymied them:


I am also in healthcare tech and also worked with the idea of a doctor rating system focused specifically on using doctor procedure/diagnosis billing data and PQRS. We abandoned the idea due to lack of data for each healthcare field (cardiology, radiology, oncology, etc.) and hospitals not wanting to participate.

However, I have looked into incentivized, distributed platforms such as wings.ai or golem.network and believe this could be applied to healthcare. If patients were incentivized to ask the doctor for a copy of the CPT/DGX codes on their bill, plus survey information around PQRS, you would have a lot of useful data. This data could be used not only to provide a rating system for doctors, but also price comparison on CPT/RVU across the United States and the ability to provide PQRS data for doctors. It's an all-in-one, decentralized healthcare platform for patients, doctors, and insurance providers.

Journalism needs a WYSIWYG editor for stylized content -- content that is more visually diverse/interesting than the linear text+image+embed format, but not so custom/new as to truly require a developer to build it.

What is a good live example of this stylized content?

Not op but The New York Times has some interesting layout options available to their writers and editors. You can see a small sampling of it in https://open.nytimes.com/building-a-text-editor-for-a-digita... which is written by one of the developers of their new news room text editor.

Have you seen WordPress' new Gutenberg editor that they're working on?


... is just about as far from WYSIWYG as possible. LaTeX is not friendly for a newsroom full of non-technical journalists.

We run background checks, and there is definitely a space for public record aggregation.

We directly interface with those interested in the results of the check, and there is an overwhelming amount of work in building integrations with schools, applicant tracking systems, hospitals, public records, courts...

We spend most of our time building XML and JSON parsers to cram their data into our models.

If there was a company that provided a single interface to this data, you could write your own ticket. I know we aren't the only company in this space with this issue.
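The single-interface idea described above is essentially an adapter layer: one internal record model, with one small parser per source format. A minimal sketch; the source names, field names, and `CourtRecord` model are all hypothetical, invented for illustration:

```python
import json
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class CourtRecord:
    """One normalized internal model, whatever the source format."""
    name: str
    case_id: str

def from_json(raw):
    # Hypothetical source A ships JSON: {"defendant": ..., "docket": ...}
    d = json.loads(raw)
    return CourtRecord(name=d["defendant"], case_id=d["docket"])

def from_xml(raw):
    # Hypothetical source B ships XML: <record><party/><case/></record>
    root = ET.fromstring(raw)
    return CourtRecord(name=root.findtext("party"),
                       case_id=root.findtext("case"))

# The "single interface": route each feed through its adapter.
PARSERS = {"county_a": from_json, "county_b": from_xml}

def ingest(source, raw):
    return PARSERS[source](raw)

a = ingest("county_a", '{"defendant": "J. Doe", "docket": "18-cv-101"}')
b = ingest("county_b",
           "<record><party>J. Doe</party><case>18-cv-101</case></record>")
```

The value of a vendor here is exactly the pile of `from_*` adapters and keeping them working as upstream schemas drift; the callers only ever see the normalized model.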

Interesting problem to solve - I guess the pain will come when all the data sources change their endpoints and everything sort of breaks. At least that's been my experience with aggregating public economic data feeds.

Does Checkr (https://checkr.com/) solve this problem? Or is it more complicated than that?

May I ask, how does that work with privacy concerns? I am aware banks run background checks before you open an account, but I always assumed the check provider has a licence.

I think Socrata (now acquired by Tyler Technologies) was maybe tackling some of these challenges?

How broad would a vendor's coverage need to be on day 1 to be valuable for you?

shoot me an email with some more info, may be able to help you out

What is the current state of the art in financial modelling in corporate finance departments?

I used to be a management consultant. We often built financial models of company operations or parts of their value chain, and then looked at the change from process improvement, restructuring or bolting on new business lines. Everything was done in Excel. For the annual strategic planning and budgeting cycle, large companies used expensive proprietary systems to aggregate divisional financial plans.

I now work for a big bank, building out a Jupyter-based data science and machine learning platform. We have hooks into the SDLC with code reviews, commit history, and all the good stuff that software engineers nowadays take for granted.

So what if Finance departments dropped Excel and instead used our dev tools and methodologies? I'm genuinely curious if any companies are doing this, or if any startups are building such solutions.

The cost of achieving new tool literacy is very high. What does this degree of rigor really add?

Many spreadsheets are used as disposable report tools to support management level business decision making. While there are exceptions, in general perhaps they are more like one off report-generation shell scripts than unit operations in a larger business process. This distinction is significant, because rigor adds more value on automating processes than one-off reports, owing to increased lifecycle complexity.

At the management level, time is gold. These are people who have enough money, lots of responsibilities, and no time. They already have a tool that works. You would be essentially asking them to waste their most valuable resource investing in a new tool that may disappear tomorrow without a strong/clear ROI.

I don't doubt you could get some customers for such a product, but I'm skeptical it's going to change the paradigm. Platform-for-everything businesses (Google, Oracle, Microsoft, etc.) tend to have a large minimum snowball size.

I see things developing differently: an open source financial gateway will become the standard accounting interface to many businesses as trade moves toward greater transparency, predictability, speed and automation, and we see features like arbitrary asset settlement, multi-hop transactions, banking automation and multicurrency accounting becoming standard. Accounting departments will begin to thin out as forms on such a system become input to generate figures and reports previously generated manually. It will probably be hosted. We see a little of this now with cloud accounting systems, but I'd wager it will go a lot further with Germany's Industry 4.0 vision and a similar result in China. Supply chains will be the driver, there's just so much fat to trim.

Technology adoption. I'm not necessarily talking about change management (although that could be a feature), but technology adoption in the broader sense. For example: if I'm a Health IT executive, how can I ensure the adoption of say, Kubernetes is the right path for my organization? I may have 20+ stakeholders, from actual practitioners, to finance pushing me in various directions. It's almost like there's space for adoption assurance, or some type of 3rd party integrator that sits in front of the bleeding edge of technology and helps dinosaur industries move faster through adoption. A layer that could understand my IT footprint, and recommend tools/improvements/etc. Like CreditKarma, but for IT.

This is a really simple but basic one. Market size might not be billions of dollars, but a basic learning management system along the lines of Teachable/Udemy that allows for code with built-in testing would be used overnight by a dozen code bootcamps and would pull a lot of people out of the other platforms.

Maybe it's just a feature, not a full product, but the lack of it makes any "learn to code" MOOC unusable.

Stephen Grider on Udemy uses Jest to create automated tests for his JS and React courses that you can run while you watch and code his videos.

Jest and Mocha both have a watch feature that will test a file each time it is saved for "continuous testing", much like using Gulp watch or any type of dev live server environment.

Also, Jest features snapshot testing, which records a component's rendered output, tests all changes made to the UI, and alerts you in tests if the UI has changed. I could see this being used in bootcamps as well.
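For anyone curious what a watch mode boils down to, here is a minimal, language-agnostic analog of Jest's --watch loop, sketched in Python. This is an illustration, not Jest's actual implementation: the `fingerprint` callable stands in for hashing file mtimes over the source tree, and the iterator below simulates a file changing once:

```python
import time

def watch(fingerprint, run_tests, poll_seconds=0.0, max_cycles=None):
    """Tiny analog of a test watcher: whenever the watched
    fingerprint changes between polls, re-run the test callback.
    In real use `fingerprint` would combine file mtimes; here it
    is any zero-argument callable so the loop is easy to test."""
    last = fingerprint()
    runs, cycles = [], 0
    while max_cycles is None or cycles < max_cycles:
        current = fingerprint()
        if current != last:
            last = current
            runs.append(run_tests())
        time.sleep(poll_seconds)
        cycles += 1
    return runs

# Simulated file version: bumps from 1 to 2 once, triggering one re-run.
versions = iter([1, 1, 2, 2, 2])
runs = watch(lambda: next(versions), lambda: "tests ran", max_cycles=4)
```

An LMS that wants this built in would run the same loop server-side against the student's submitted files and stream the test results back to the dashboard.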

Yeah, I just want that built into the LMS itself

learn.co does something similar.

you install a CLI program, write the code, and type something like `learn test` .... the tests are run and the results show. If all tests pass, the dashboard is updated to show them passing.

What exactly do you mean when you say testing? Like being able to write unit/integration tests? Or testing whether or not the submitted code fulfils some requirements/assignment?

Yes, unit/integration tests

I'm sorry, I for one am not really understanding what you're describing. Can you elaborate?

Isn't automated testing commonplace in MOOCs?

MOOCs yes, but not MOOC platforms

The process of getting quotes, sending POs, receiving invoices, paying them Net 30, etc. is extremely manual. Companies have dedicated employees whose entire job is to send quotes, receive POs, and receive payments. The process is so painful.

Edit: I am in the industrial space. Basically all large equipment purchases work via a [Quote > PO > Pro Forma Invoice > Final Invoice > Payment > Receipt] process.
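That [Quote > PO > Pro Forma Invoice > Final Invoice > Payment > Receipt] pipeline is a natural fit for a small state machine that rejects out-of-order steps, which is the core of what such a tool would automate. A minimal sketch; the stage names come from the flow above, and the API itself is hypothetical:

```python
# Stages of the purchasing flow, in order.
FLOW = ["quote", "po", "pro_forma_invoice",
        "final_invoice", "payment", "receipt"]

class Purchase:
    """Walk one purchase through the pipeline, rejecting any
    out-of-order step so nothing gets paid before it's invoiced."""
    def __init__(self):
        self.state = FLOW[0]  # every purchase starts as a quote

    def advance(self, step):
        expected = FLOW[FLOW.index(self.state) + 1]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.state = step
        return self.state

p = Purchase()
p.advance("po")
p.advance("pro_forma_invoice")
```

The hard part of the real product is not this loop but attaching documents, approvals, and payment-terms clocks (e.g. Net 30 due dates) to each transition.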

Add government red tape on top of this and it can take a week or more to get a COTS part ordered from a vendor that sells all of their stuff online. Heaven help you if the vendor isn't in the approved vendor list, that's another week to get all the compliance paperwork to them and get signed. It's agonizing when you need something quick but still need to go through all the hoops because the item is technically government property.

I was working on a solution addressing this problem during my undergrad inside our university incubator (in India), although I left it for unavoidable reasons. Do you think the problem is big enough to be worth exploring again?

The problem grows in scale as organizations grow in size or spending increases and is applicable across most industries.

The biggest challenge is finding a solution that fits workflow needs, has the necessary features, and is user-friendly enough to actually be adopted.

There appear to be dozens of choices for invoicing software, both in general and targeted at specific verticals.

What makes all of the existing offerings not usable?

You are right: there are many solutions, and the existing offerings are often usable. However, organizations have difficulty with the implementation.

Two major problems around proactively managing an organization's spend culture:

1) Entropy: the pain around the problem is not acute; it is a slow descent into disorder. The problem gets harder to solve as organizations grow.

2) Chosen solutions are not adopted by the team and data around company spending is lost.

Solutions need to fit the current workflow or have a dedicated champion with the authority to ensure adoption.

What part of the process is painful?

Please send me an email, same issue in my firm, keen to discuss. I may have a solution.

I’m currently working on a solution for this.

In healthcare billing with patient insurance companies, hospitals and doctors are contractually prohibited (and sometimes legally barred) from sharing how much they get paid per procedure unit (RVU) by insurance cos. However, if you were a third party for the hospitals and had access to the billing info of a metropolitan area, you could create some kind of price-comparison system for all of the hospitals.

Will someone please convince the trucking industry that "communicating" by swapping text files over FTP in an absolutely incomprehensible, proprietary format is simply no way to live?

Especially with the recent push by the fed to put electronic logging systems in every truck, this system is absolutely ripe for disruption. Downside is you'll be fighting entrenched companies like IBM for ground.

I work in the trucking (telematics) industry. I've dealt with a decent amount of FTP file transfers, but most stuff I've seen has been API-based in some form (SOAP, REST, etc).

Can you give a few more specific examples? Are you in the industry now? Working at a carrier?

I work in intermodal. Whenever one of our larger customers wants to give us a load to move, we use X12 EDI. This is standard throughout at least the intermodal industry, though from what I gather that may be true of over-the-road as well.

An example 204 EDI (Load Tender) looks like this:

    N1*PF*XYZ CORP*9*9995555500000
    N3*31875 SOLON RD
    N1*SH*XYZ CORP*9*9991555550000
    N3*5555 TERMINAL RD
    N1*ST*1 EDI SOURCE*93*9990055555
    N3*31875 SOLON RD
    N1*ST*1 EDI SOURCE*93*9990055555
    N3*55555 5TH AVE
I haven't properly parsed it, but I believe that's going from Cleveland to Mayfield. One of those L11 segments is probably a reference number. There's no MS1 segment, so it's likely over-the-road? Anyway, it's not exactly descriptive or even human-readable...

A reply accepting a load looks like this:

These are commonly exchanged as text files over FTP sites.
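For what it's worth, the segment structure in samples like the one above is regular enough that a naive parser is only a few lines. This is a sketch only: real X12 declares its own element/segment delimiters in the ISA envelope (and, as noted below, nobody follows the spec anyway), so this assumes `*` elements and newline or `~` terminators, which matches the 204 sample:

```javascript
// Naively split X12-style text into { id, elements } segments.
// Assumes '*' element separators and newline/'~' segment terminators.
function parseSegments(raw) {
  return raw
    .split(/[~\n]+/)
    .map(line => line.trim())
    .filter(Boolean)
    .map(line => {
      const [id, ...elements] = line.split('*');
      return { id, elements };
    });
}

const sample = 'N1*PF*XYZ CORP*9*9995555500000\nN3*31875 SOLON RD';
const segments = parseSegments(sample);
// segments[0] -> { id: 'N1', elements: ['PF', 'XYZ CORP', '9', '9995555500000'] }
```

The parsing is the easy part; the pain is that the *meaning* of each element position varies by trading partner.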

Some of our more forward-thinking, larger customers are considering moving to AS2, which I believe is sent over HTTP vs FTP. A cursory Google search doesn't really turn up any clear examples on AS2, which doesn't exactly comfort me, but at least there's an RFC[0] for it, whereas for the X12 spec you have to pay[1] to see certain parts of it.

Not that anyone follows the "spec" anyway. We code special handling for every single one of our customer's EDI transmissions.

I wish everything was REST, or at least JSON. That would be 10x easier. Instead we spend weeks going back and forth on silly things like what a 07 means in the ATS segment, or what character to use for line endings (wish I was kidding -- we've been blocked for two months on the line ending character).

What's more, with the ELDs in all our trucks, customers are increasingly wanting GPS updates. I'd love to offer them a streaming socket with GPS data -- it's completely feasible considering our ELD backend. Instead everyone is wondering how we can send updates in 15-minute increments over FTP, especially when these transactions are often batched in 5-minute loops on both ends in the first place.

It kills me a little. We could be doing so much more. I can't believe we aren't pushing for real time. I can't believe five to fifteen minute batching loops are acceptable.

[0]: http://www.ietf.org/rfc/rfc4130.txt [1]: http://www.x12.org/x12-work-products/x12-edi-standards.cfm

Good old EDI. I last worked with it more than 15 years ago. I used to have several thick, bound EDI specs for each message my team was working with on our desks. After a while you get really good at counting the number of separators...

For small messages and loads, it's entirely reasonable to work towards near-real-time processing. A possible reason why some EDI folks you talk to might be reluctant to think in that direction is that EDI payloads can run upwards of tens or hundreds of megabytes. Yes, text. Instead of trucks, think ship cargo manifests. That could blind-side them.

Thanks for the reply! That certainly does look like a pain in the ass. Being on the telematics side specifically, we deal less with EDI stuff, but I'm sure down the road I will have to, and I agree that a structured API would be much better.

If you don't mind me asking, what ELD are you running? My company specifically builds one of those, amongst the rest of the productivity suite necessary for a driver to do their work.

We started out using XRS. We run almost 100% independent contractors. Started out putting our own tablets in all the trucks. We discovered drivers can run up enormous data bills when they figure out how to circumvent the MDM... It apparently was also a pain to physically install XRS (I don't know the details of that). We also have nearly 50% turnover per year. The whole situation was really untenable.

We've since switched to GeoTab and a BYOD model. The GeoTab devices are a lot cheaper, so a contractor walking away with one isn't that big of an issue. Rollout was much smoother this time.

I would really like to pick your brain a bit more if possible. We directly compete with both XRS and GeoTab, so getting insight into your fleet's decision making process would be super helpful if you're willing.

Also could potentially talk about our ELD offering (amongst a bunch of other stuff) if you're interested.

darrin [at] platformscience.com

I work for a stone distributor and we have issues with trucking, but had never delved into it.

This was interesting to read.

Do you have more sources for gaining insight into how this all works? I'd love to take a crack at parts of this.

If you Google '204 EDI specification' you should find a lot of random specs in PDF out there for various companies. 214 will show you some of the status updates too. Stuff like this: http://www.shipfsp.com/media/pdf/it/EDI_214_X12_V4010.pdf

I've found this EDI notepad useful for parsing flat files: https://www.liaison.com/products/integrate/edi-notepad/

And this npm library actually parses EDIs pretty well too: https://www.npmjs.com/package/x12

> I wish everything was REST,

REST and X12 EDI are orthogonal concerns: REST can use any format for resource representation, and the X12 EDI formats are a data representation.

X12 EDI can be used in a REST way (one of the two mechanisms in healthcare standard operating rules for certain HIPAA-mandated X12 transaction is RESTish—the HTTP+MIME method—though there is also a SOAP method.)

They're probably using EDI via FTP because the entire logistics industry is still using EDI via FTP. EDI is still the dominant way retailers, suppliers, and shippers transmit information.

Well, I guess journalism needs a way to be profitable again, and creators in general need a way to sell their work without requiring long term subscriptions or ads. But given the many, many companies who've tried to fix this issue (by allowing users to pay for certain bits of content via microtransactions or bundling subscriptions together ala Blendle), I'm not sure what the answer would be.

I also feel game development needs a way for creators to commission help with various aspects of the process too. Oh sure, there's the odd forum where you can pay for graphics assets or music, but what if your problems are code related? Or game design based? It's a lot harder to request that sort of thing online, let alone find a way to pay for it. Where can I say, hire a level designer or game programmer independently of a studio?

As far as I can tell, nowhere, which makes it awfully hard when I'm stuck and just need a bit of help to finish a mostly complete project.

Anyone who solves that would get a lot of my money, I'll say that much.

What you describe were by far my favorite kind of jobs on Upwork: people who have an idea of what they are doing but are stuck on individual questions.

Upwork is far from optimal, but you can easily find talent for quick questions there.

Secure internet for people who travel frequently or work from coffee shops. Filter the “free” internet connection through a TOR router and protect your network/browsing.

TOR is not meant for security, it's meant for anonymity.

If you are logging into services that can identify you, checking your personal email over a TOR connection, or doing work over a TOR connection, you're putting yourself and your company at risk.

Anything wrong with just using a VPN?

No, not at all. I view it similar to the classic Dropbox discussion. There were existing alternatives, but Dropbox took off because "it just worked" without having to use FTP and Linux. In this case, you eliminate the VPN step, by linking the router to your devices and using that single device to connect.

Secondly, with VPN, I first have to connect to the open network in order to activate the VPN. I also need to do it for each device I want to connect (phone, computer, tablet).

A huge problem with CRMs is the lack of staff engagement. A company will spend $30 million to customize their Salesforce workflow (or their SAP workflow, or any other workflow or CRM tool) but the staff will hate it and so the investment seems wasted. That’s why Natural Language Processing seems like it could be a win for this space. A salesperson should be able to write a quick text message on their phone, and that message should be parsed by an NLP script and then put into Salesforce. The promise of this idea, as well as the problems, I detailed here:


just wrote a blog post on this exact topic for the biopharma industry: https://newbio.tech/blog/bio_charts.html

It's a $600B industry that is in decline because its traditional R&D engine is sputtering out, and big pharma has been amazingly acquisitive the last five years to replace off-patent blockbuster drugs (more IPOs and big M&A than software over the last 5 years despite getting 1/5 of the venture funding).

tons of really interesting new tech for startups to explore: synthetic biology, cell and gene therapy, bioelectronic medicine, many many others

Digital marketing and programmatic advertising - TRAFFICKING. Everyone hates it. It's 100% required but no one has solved the issue. It can eat up so much time, and if you make a mistake it can cost valuable data.

What do you mean by this? As in, people hate doing marketing or they don't understand how to advertise? Plus, a mistake can cost valuable data? Where are you losing data?

Not OP, but worked in the field for a time. Trafficking usually means configuring your ad in the campaign management system. What are your targeting parameters, what are your tracking tags, uploading the ad itself, entering in lots of custom information that while conceptually similar across ad platforms usually has different names and often has to be manually entered. A lot of platforms do offer APIs of some sort to help with bulk campaign/ad creation, but there's often no "one stop shop" to be able to set up a google campaign and facebook campaign at the same time. There are some companies working on this, I think usually referred to as (or in conjunction with) "marketing automation".

A mistake can mean - misconfiguring your target (wasting money on ads that won't give you an ROI), misconfiguring your 3rd party tracking (letting data like conversions go unaccounted for, or not having your auditing tags setup, meaning you show ads to fraudulent users that you otherwise wouldn't have to pay for), etc.

Another ad guy here. The problem is best practice on one platform doesn’t equal best practice on the other. Additionally, there are many ad formats that don’t overlap on multiple platforms. Even GDN vs FB is a huge disparity. I would be interested in the product though if one existed.

Alternatively, one place to manage creatives and language as well as targeting for each campaign might be helpful as we use google sheets, excel and Trello for this now.

Do large agencies have systems for this? Or is everyone using Excel?

What this guy said. It's a very manual process.
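The "one stop shop" idea floated above boils down to defining a campaign once and mapping it onto each platform's own field names and units. A hypothetical sketch; the platform payload shapes here are invented for illustration and are not the real Google or Facebook APIs:

```javascript
// One canonical campaign definition...
const campaign = {
  name: 'Spring Promo',
  dailyBudgetUsd: 100,
  geo: ['US', 'CA'],
};

// ...mapped through per-platform adapters. Field names are made up,
// but the shape of the problem is real: same concepts, different names
// and units (e.g. micros vs dollars) on every platform.
const adapters = {
  platformA: c => ({
    campaign_name: c.name,
    budget_micros: c.dailyBudgetUsd * 1e6,
    locations: c.geo,
  }),
  platformB: c => ({
    name: c.name,
    daily_budget: c.dailyBudgetUsd,
    targeting: { countries: c.geo },
  }),
};

function traffic(c) {
  return Object.fromEntries(
    Object.entries(adapters).map(([platform, adapt]) => [platform, adapt(c)])
  );
}

const payloads = traffic(campaign);
```

As the ad folks note above, the catch is that best practices and ad formats don't map one-to-one across platforms, so a thin normalizer like this only covers the overlap.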

An app for installing and maintaining mixed fleets of LoRa/Sigfox/... IoT sensors.

When installing these at scale in existing buildings you have to be able to send out local workforce to properly install and activate thousands of sensors, as well as maintain them afterwards, without prior training. It’s one of those things that sounds easy on the surface but is riddled with complexity, like how to register which sensor is installed where in a foolproof way, or how to easily locate faulty sensors for replacement.

What tools are used to maintain these now?

what's the market for this? how many people are installing iot fleets?

The price of sensors is dropping and starting to reach the point where the cost of retrofitting buildings is outweighed by savings from more efficient building use. That trend is going to cause most buildings to be retrofitted. You can do things like occupancy detection of every workplace, which is very valuable to building managers.

There’s plenty of competition in people selling the sensors, providing connectivity, or doing data analysis, but I’m not aware of any solutions for installing the damn things which aren’t tied to a vendor.

I am working on a content AI platform, Preadr. Preadr brings you the Internet's finest stories/content. It is a content discovery platform that helps you discover quality content that is relevant to you.

The Problem

There are currently three main ways we discover an ever-growing amount of content on the web: news, social networks, and search. There is a fourth category that is missing: relevance—a break from the noise on the Internet to discover what's relevant to us.

News delivers what’s happening in the world right now. Social networks let us know what’s happening with our friends. Search is great at finding the needle in the haystack. But how do we discover things from around the web that are new and relevant to us?

Incentives on existing platforms are such that new and entertaining content wins. We need a better system that can filter the signal out of the noise.

The Solution

We’re building Preadr to tackle the relevance problem and bring forward quality content: a platform that helps you discover the most relevant content based on your interests, for both leisure and learning.

Every day we analyze an ever-growing number of new links and create a storyline of the most relevant ones. We curate content from the most trusted sources on the internet and let our algorithms do their work to filter the relevant from the non-relevant. Since quality is not limited by the format of content, we offer a mix of different formats, e.g. articles, videos, podcasts, etc.

ctrl+f "construction" = 0 results.


Small construction companies are still in the stone age. PlanGrid and Submittal Exchange exist, but not much else is popular.

Textura is owned by Oracle, and everything else is owned by Trimble and Autodesk.

There's a plethora of attempts at field document management and timecards, but zero great medium-to-large-business ERPs. Procore is like half an ERP, without an accounting system.

There is a huge untapped thirst for something that "just works" for labor productivity tracking and document management.

Construction is one of the places where I think an enterprise blockchain could actually apply better than a traditional database. Imagine a construction project with one blockchain, and every general contractor, sub, and vendor participating: shares, payments, todos, Gantt charts, drawings, the model itself. They could all access the database from whatever supported client their firm uses (think email clients all working with each other) but on the backend be working on one shared distributed database. I think you could turn down the bad-actor security a bit, similar to https://azure.microsoft.com/en-us/blog/announcing-microsoft-...

Engineer behind REBIM here. I've spent the last one and a half years developing an AEC SaaS platform that includes

* A project-based document management system that has baked-in version control.

* Issue and Task trackers.

* Soft realtime features such as notifications when models are converted, when anyone comments on an issue or task you've logged and a realtime chat system.

* A browser-based model viewer with the ability to:

* Federate multiple models from various project disciplines into one scene.

* Take screenshots of the scene, mark them up and log issues and tasks on model assets with the marked up screenshots straight away.

* Associate documents with model objects.

* Hold conversations on model objects.

* Store / review feeds from the built counterparts of modelled assets.

Video tutorials: https://www.youtube.com/channel/UC8xrkI2ZaSm-5s_aJnnGpeA/vid...

API documentation: https://app.rebim.co/static/docs/index.html

Intro for small to medium sized design studios: https://rebim.co

Intro for enterprise customers: http://rebimenterprise.com/

We've only just started beta testing this February but you're welcome to sign up for an account at https://app.rebim.co

I can be reached on lukebrooks [at] azurelope [dot] com if you need any assistance and we would also love to hear your thoughts on REBIM if you have any suggestions for improvements!

Some sort of enabler for content. I work in content marketing (not click bait shit but helping b2b companies tell their story without hiring a full design/marketing/dev team).

Our process and system is super efficient: recording video + sending notifications to team members to edit, add captions, strip audio for your podcast, set up your podcast, set up your Alexa flash briefing, etc.

But it takes hours to do all of this if you're on your own, and that's ONLY if you know how to do it all. Content is the black box most people have no idea how to deal with. If you don't pay our agency to do it for you, you are kind of out of luck.

We sell a book on our process now and sell about 50 copies a week. These people are validated and want to learn how to do it, and are willing to pay to learn.

It only makes sense to build the platform that automates this process for these people and offer it to them. They've already paid to learn. Might as well offer the platform to do it.

I would love to check out the book. Can you mention it here?

Patch distribution for games. Every desktop game does its own thing, with its own CDN, incompatible with other engines.

I don't really see a market for that. Most desktop games these days are handled by store fronts like Steam, GOG, itch.io, Origin or the Blizzard Launcher that already handle updates for you. But it's certainly an interesting topic and itch wrote about it a while ago: https://amos.me/blog/2017/efficient-game-updates/

From a technical side of things:

- data management
- faster analytics software
- test and learn optimisation
- faster model deployment

From a business side of things:

- loyalty program/rewards
- pricing optimization
- financial services for underserved customers, including entrepreneurs, families, millennials, freelancers, etc.

I'm creating something for loyalty rewards, but most small businesses seem apathetic: too busy for anything new.

Yes, that's typically the case for small businesses. You'll find much better luck mid-market.

Open hardware and software for PLCs for manufacturing.

Along with this, we need a better way to program robots. I program Fanuc i200D and that whole industry makes my blood boil. Want more than 200 variables in your program? Shell out $10k.

A high-level programming language for programming robots with safety in mind would be amazing. RoboDK is the only one doing this, and they still suck.

If you have arrays, why not implement a Turing machine (and a compiler targeting it) so that you can have more variables without shelling out? Or some other model of computation.

Does the Fanuc need to be constantly reprogrammed? Or is it a rare occurrence?

I have worked in process and equipment R&D for ten plus years, off and on:

What I want is a PLC:

1) A PLC which has a hard real-time process and soft real-time processes. Beckhoff and B&R do this; other PLCs do as well.

2) I want to do the hard real-time programming in Rust or Ada, or "safe/restricted C"... Rust and Ada are so much more expressive than Structured Text.

3) I would also like a simple API allowing deterministic, hard real-time communication between the real-time control domain and a soft real-time domain, so that higher-level languages can be used for control problems. And I would like the PLC to support one of the high-level languages: CLISP or Racket, F#, Julia, Python, Elixir... don't care which it is. This is actually doable on Beckhoff, but the tight connection between hard and soft real-time was not really there... and Beckhoff runs Windows. Would prefer Linux, VxWorks, or QNX.

Integrating with 20-year-old PLCs and robots is such a pain. Most people we work with have stacks upon stacks of abstraction to keep them safe and give them a sane wrapper for the functionality of the robots. So many labs have a homebrew, monstrous Python script made by a grad student who left two years ago, which everyone uses and no one can maintain.

The PLC/robotics/industrial automation space is often ten to twenty years behind in software best practices, with things like source control being completely alien to many vendors. Documentation can also be hard to come by, buried behind layers of paywalls and service agreements.

Of course, installations are usually expected to function for ten to twenty years at a minimum...an order of magnitude greater than anything your typical js-framework-of-the-day considers.

Even just allowing us to patch the servers would do wonders... a lot of this crap is running on 20-year-old operating systems.

No one is saying to use the latest javascript framework for programming robots.

I want to see a good identity and user manager SaaS oriented towards smaller customers. There used to be Stormpath, which was pretty amazing, but they got bought by Okta, which is less friendly to small teams and more enterprise oriented.

I'm working on https://github.com/fiatjaf/accountd.xyz, which is basically a universal association of profiles (identified by a username) with accounts on "silos" like twitter, github, trello and email addresses in general. You can use it either for user login or to post-association of accounts with profiles. There's no documentation, but you can see it working on https://sitios.xyz/ and https://sitios.xyz/trello.

Do you think it is interesting somehow?

This is not quite what I had in mind, but it's quite interesting. One recommendation: don't use a username; that approach has a number of flaws you could easily avoid here.

Doesn't auth0 fill this role?

Ah yes, possibly; looked at them about a year ago when their product was still incomplete, will have to check them again.

But in any case it would be good to have some competition in that space.

Are you talking about something like AWS Cognito, or Firebase Authentication?

Like userlist.io ?

Not necessarily an idea for a startup, but I want a Chrome plugin that changes the click behavior of an email address. If I click on the domain part of the email address, I want it to open that domain's website on a new tab.
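The core of such a plugin is tiny: decide whether the click landed on the domain half of the address, and build the URL to open. A sketch of that pure helper (the chrome.* / DOM wiring is only commented, and `offsetOfClick` is a hypothetical placeholder):

```javascript
// Given an email address and the character index that was clicked,
// return the domain's URL if the domain part was clicked, else null.
function domainUrlFromEmail(email, clickIndex) {
  const at = email.indexOf('@');
  if (at === -1 || clickIndex <= at) return null; // clicked the local part
  return 'https://' + email.slice(at + 1);
}

// In a content script you'd attach this to mailto: links, roughly:
// link.addEventListener('click', e => {
//   const url = domainUrlFromEmail(address, offsetOfClick(e));
//   if (url) { e.preventDefault(); window.open(url, '_blank'); }
// });
```

Mapping a mouse event back to a character offset is the fiddly part; the rest is a manifest and a content script.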

I work as a fitness instructor and I want a holistic control panel to monitor my customers' health and programs. I want them to speak to each other and share, and to see their historical data too.

HealthifyMe is doing something along similar lines, but they are based out of India right now.

A startup in the multimedia field would basically be tooling for creators: innovation in the sense of making things easier to use, a jackknife agency that can develop specific tools per department.

Replacement for - Login with Facebook.

The music industry as a whole. Who owns what song, what percentage, what role (writer, publisher), who recorded it, who collects for which industry, who collects mechanical royalties, performance royalties, would I make more money with ascap, bmi, sesac or gmr? How about this publisher, vs this other one? Tell me how often the music I own is played on radio, spotify, etc. Was my royalty here calculated correctly?

Modular, template-based responsive HTML email builder with version control, comments, and approvals.

Our marketing department sends 10+ campaigns per week, and each goes through multiple changes and compliance approval. It's very time-consuming, especially when someone needs to touch the code.

This looks interesting. Is there any solution that you are already using?

> especially when someone needs to touch the code

Do you not generally touch the code? Is some kind of WYSIWYG editor used to create an email or is the code human written?

Right now we're using an in-house app to make edits to emails initially built in Dreamweaver.

So: HTML email template (un-inlined CSS) > merge Word content in Dreamweaver > upload to custom web app > app uses Premailer to inline the specified CSS > make changes to the content via CKEditor (which messes up the HTML in many cases) > export in Pardot-ready format.

The app versions the content changes and allows comments as well easy email tests. We do 20+ revisions per campaign so that’s a necessity.

This was built 5 years ago and still works, but it's getting inadequate for the complexity of responsive emails.

I’ve looked at this: https://beefree.io/ but it does not allow the creation of custom content modules. So looking into building a new version of our own, again.

It’s a pain but nothing compared to debugging email markup.
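The Premailer step in the pipeline above, for anyone unfamiliar, moves rules from a `<style>` block onto matching tags as `style="..."` attributes, because many email clients strip style blocks. A toy illustration only: real inliners parse CSS and HTML properly and handle full selectors, while this handles bare tag selectors:

```javascript
// Toy CSS inliner: lift `tag { ... }` rules out of a <style> block and
// apply them as inline style attributes on matching tags.
function inlineTagStyles(html) {
  const styleMatch = html.match(/<style>([\s\S]*?)<\/style>/);
  if (!styleMatch) return html;

  // Collect rules keyed by bare tag selector, e.g. { p: 'color: red;' }.
  const rules = {};
  for (const m of styleMatch[1].matchAll(/([a-z0-9]+)\s*\{([^}]*)\}/g)) {
    rules[m[1]] = m[2].trim();
  }

  // Drop the style block, then rewrite matching opening tags.
  let out = html.replace(styleMatch[0], '');
  for (const [tag, css] of Object.entries(rules)) {
    out = out.replace(new RegExp('<' + tag + '(?=[ >])', 'g'),
                      `<${tag} style="${css}"`);
  }
  return out;
}

const email = '<style>p { color: red; }</style><p>Hello</p>';
// inlineTagStyles(email) -> '<p style="color: red;">Hello</p>'
```

Debugging which email clients respect which of those inlined properties is, as noted above, the real pain.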

A new operating system that allows peripherals like network cards, monitors, SATA disks, Bluetooth, speakers, fMRI scanners, etc. to be used interchangeably between machines, locally and over the net.

Edit: I'm in the computing industry. Have you heard of it?

There's Plan9.

API management and API gateways for regulated industries like banking, finance, government, etc., with an eye on governance, IAM, auditing, etc.

All solutions out there are horrible....

I work on product at TIBCO. Mashery plays in the broader API Management space, but we haven’t traditionally targeted banking/govt/heavily-regulated verticals. From Day 0, Mashery has been a multi-tenant SaaS with locally deployable federated gateway and many buyers in these verticals get jittery about SaaS, multi-tenancy, etc.

However, that SaaS-led model is changing fast and we’d love to hear about your pain with single-tenant, on-prem solutions available today. Would you be open to a no-strings-attached chat? mashery-pm<at>tibco<dot>com

Sure, no problem.

I will write an email tomorrow.

Out-of-wallet challenge questions or knowledge-based authentication for developing countries. There is a lot of money to be made there. LOTS OF MONEY.

Could you expand a little bit?

I'm from Venezuela, it is not a developing country right now. But it may be in the future.

Half-joke, but a real problem: quality documentation for AWS. I'd pay good money for this.

that’s called hiring a consultant... just sayin

A messaging/texting device with a long battery life and decent keyboard.

Try a Blackberry or Nokia device.

In my industries, all of them.

for example...
