
Dotscience is shutting down - lewq
https://dotscience.com/blog/2020-05-19-dotscience-is-shutting-down/
======
lewq
CEO here. Happy to make introductions to the team for anyone who is hiring. We
have valuable residual knowledge about building a product in the MLOps
platform space and achieving product-market fit, and I hope that this might be
helpful to someone else.

~~~
wtvanhest
I cannot imagine shutting down a company with a team. Must be really hard.
Your note was refreshingly succinct, with genuine thank yous to the people
that helped.

I don't have any advice or anything, just wanted to let you know that anyone
that has ever tried something and failed understands how painful it is, and
everyone else won't. Best of luck to you and your team as you all move
forward.

~~~
lewq
Thank you

------
lewq
Btw [https://mlops.community](https://mlops.community) (which we started) will
live on; Demetrios, Chris, and Dan are kindly keeping it running.

If you're interested in MLOps, come and join our weekly online meetups,
Wednesdays at 9am PT / 5pm UK - it's a great community, it's growing quickly,
and now it's vendor neutral too :)

------
tsieling
I appreciate how this was handled on the homepage: direct and looking out for
your team.

~~~
jorams
> on the homepage

What do you mean by this? I don't see any indication of them shutting down on
the homepage, only on the blog.

~~~
mikeyouse
I got this popup from their homepage:

> _Dotscience_

> _Thanks for checking out dotscience, unfortunately due to the current
> situation we have had to close our doors and are no longer operating. You
> may however enjoy checking out the MLOps community to find out more about
> all the tooling and best practices happening right now in this ever changing
> landscape._

> _Here are some relevant links to the MLOps community: Slack, Youtube, Weekly
> meetups._

~~~
jorams
Huh, that's interesting. I even tried a private browsing window without any
form of blocking, but I don't get a popup at all.

~~~
Wistar
I got the pop-up, but only on my first visit to the home page; subsequent
visits or a refresh don't trigger it. Firefox on Windows, with NoScript set to
temporarily allow all.

------
bagrow
Thoughts on how the economic situation will affect machine learning and data
science in industry more generally?

~~~
gk1
If AI/ML was just part of R&D, then it's likely to get downsized or cut as
companies refocus on their core products. If AI/ML _is_ or _is meant to be_
powering the core products or services, e.g. at banks, retailers, and
manufacturers, then those teams are actually more important than ever: ML
usually reduces labor and overhead, so the faster they can get it into
production, the faster they'll lower their operating expenses.

The challenge for dotscience was that there are _so many_ players in this
space that it's hard to stand out and win deals. See:
[https://zdnet3.cbsistatic.com/hub/i/r/2019/07/17/b17497a0-84...](https://zdnet3.cbsistatic.com/hub/i/r/2019/07/17/b17497a0-844e-42b7-9dde-9020902c0b46/resize/1200xauto/bb71fe6d39deaa6ae7f0d06ce64cc841/big-data-landscape-2019-v7.png)
(These landscape charts are always made to look complicated on purpose, but
the point is clear.)

------
nojito
It's really revealing that companies that look so amazing on paper just don't
make money at all and have to constantly resort to outside money to stay
alive.

~~~
fxtentacle
Your comment is worded in a slightly mean way, but I don't disagree. I work on
autonomous 3D navigation. Clicking on their "Autonomous Vehicles" solution I
see:

- model management solutions

- enormous quantities of data

Well, I have docker containers for my data, and docker containers for the
underlying TensorFlow versions, and docker containers for model source code
that went into large-scale training. All of that is coordinated using a git
repo which has some shell scripts to execute the correct model with matching
data on a compatible TF image. So if someone says "I need you to re-run last
week's model", I check out the script from that time and run it on an empty
server.

I honestly don't get what MLOps is or why I would need it.

~~~
antonvs
How many other ML people do you work with? Do they all use the same system?
Are you only working with Tensorflow and no other tools?

Generally things like MLOps are targeted at providing systems for people that
either don't know how, or don't have the time, to set something up themselves,
especially when the data science team is more than just one or two people, and
when they're using a range of tools that need to interoperate in various ways.

Something like Kubeflow, for example, does a lot more than what you describe.

~~~
fxtentacle
2 others, and yes, everyone has access to the central git repo and to our
shared private Docker repository.

And no, we use TensorFlow and Chainer, but both are Python frameworks. Plus
NumPy, of course.

I like that Kubeflow has an introduction video, but I find it quite odd, too.
They talk a lot about how they will make things simpler, but then I learn that
I'll need to run Kubernetes on my laptop, on my servers, and potentially in
the cloud.

Plus they use irritating marketing phrases like "you can just focus on your
model" or "let Kubeflow handle the abstraction of running on X". Most deep
learning models nowadays are memory-limited, so improving the model may well
mean optimizing GPU memory usage. And after trying to port a working model
from Ubuntu + 1080 Ti to Google Cloud + V100 and/or Google Cloud TPU, I
distrust anyone who would treat those significant hardware differences as
"just an abstraction".

So at the very least, I'll need to enforce consistent GPUs and Operating
Systems among all servers, just to make things run OK everywhere.

~~~
streetcat1
So MLOps is just part of a solution, which I think is the reason why it is
hard to sell.

What I think might work is AutoML combined with model ops, or rather auto
model ops.

Or even better: auto data management -> auto pre-processing -> auto ML -> auto
ops.

~~~
fxtentacle
I think a service that rents out GPU-equipped bare metal Ubuntu + Docker
servers by the hour would be what I'd want to use.

If they also offer a private Docker repo, fast S3-compatible storage and some
pre-built images with Jupyter preinstalled, that might be the entire ML
pipeline that I need.
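Most of that wish list can already be approximated with off-the-shelf pieces.
A hypothetical docker-compose sketch, using MinIO as the S3-compatible
storage and the community jupyter/tensorflow-notebook image as the pre-built
Jupyter environment (GPU passthrough and the private registry are left out
for brevity):

```yaml
# Hypothetical sketch of the self-serve ML stack described above.
services:
  notebook:
    image: jupyter/tensorflow-notebook:latest  # Jupyter + TensorFlow preinstalled
    ports:
      - "8888:8888"
    volumes:
      - ./work:/home/jovyan/work               # notebooks persist on the host
  s3:
    image: minio/minio:latest                  # S3-compatible object storage
    command: server /data
    ports:
      - "9000:9000"
    volumes:
      - ./minio-data:/data
```

Renting the GPU-equipped bare-metal host by the hour is the one piece this
doesn't cover; everything else is a `docker compose up` away.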

------
ljvmiranda
How will this affect the MLOps Community webinars? Afaik dotscience played a
big role in supporting it.

It’s a good and vibrant community, and I’m learning a lot from the webinars as
a practitioner. It’d be sad to see it go as well.

~~~
danthebaker
The community and webinars will live on! :-) Demetrios
([https://dots.ci/demetrios/linkedin](https://dots.ci/demetrios/linkedin))
will continue to coordinate this.

------
kinghuang
Oh, man. That's really unfortunate.

How will this affect dotmesh?

~~~
lewq
Dotmesh will remain open source and as the original author I'm happy to help
maintain it. Are you using it or interested in using it?

~~~
kinghuang
Interested in using it. I've kept an eye on dotscience and dotmesh for quite
some time. At work, we've only very recently ramped up Data Science and ML to
a point where tools like these would be really helpful.

I loved all the demos of dotscience I've seen at conferences. I'm saddened
that we'll never get to try it, but glad to see that dotmesh will live on!

