Turi Create simplifies the development of custom machine learning models (github.com)
179 points by codezero on Dec 8, 2017 | 52 comments



Very excited for this - I've used GraphLab Create a few times for building recommender systems - drop-dead simple API, supports explicit & implicit data (1-5 stars or purchased/didn't purchase), and can leverage content features (product category/department, color, size, manufacturer) to help with sparse data and cold starts.
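
For reference, the same kind of recommender in the newly open-sourced Turi Create looks roughly like this - a minimal sketch, with file names and column names made up for illustration:

    import turicreate as tc

    # Explicit ratings (1-5 stars) or implicit interactions both go into an SFrame.
    ratings = tc.SFrame.read_csv('ratings.csv')   # hypothetical columns: user_id, item_id, rating
    items = tc.SFrame.read_csv('items.csv')       # hypothetical columns: item_id, category, color, size

    # Side information passed as item_data is what helps with sparse data and cold starts.
    model = tc.recommender.create(ratings,
                                  user_id='user_id',
                                  item_id='item_id',
                                  target='rating',     # omit target for implicit data
                                  item_data=items)

    recs = model.recommend(k=10)   # top-10 items per user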


I wish more comments were like this. Clear, concise, “I like $foo, here’s why in a few words”.


For someone just getting started with ML but with good iOS experience, what's the best way to learn how to use this on iOS? The documentation mentions it can be used for audio, but has no examples related to it. Any suggestions as to how it can be used for audio samples?


For almost everything related to getting started with Turi Create, I recommend the Coursera University of Washington course. It gives a really good overview of how to use it. That said, it also doesn't cover audio.


Thanks! I'll take a look at that course and see if it helps with audio.


I would have expected a larger announcement for a release like this. It seems to come from Apple's acquisition of Turi in mid-2016 [0]. From what I can tell, they are open sourcing most of that product with new integrations to Core ML.

Will be interesting to see how Apple plans to position this against existing ML solutions with easy-to-use Python APIs like scikit-learn, Keras, etc.

[0] https://techcrunch.com/2016/08/05/apple-acquires-turi-a-mach...


Isn’t this more like Clarifai and comparables? scikit-learn and Keras are still at a lower level of abstraction.


seeing as Apple has gathered basically zero goodwill in the ML community, I expect their staff will be the only people to use most of it


That’s a bit harsh. It looks great for iOS at least. Much easier than developing with TensorFlow or PyTorch.


Interesting comment; what are your creds to speak on behalf of the ML community? Or are you a stupid AI chatbot?


It's really nice to see Apple releasing this, and it looks like it is still actively being developed, too.

I used it while taking the University of Washington ML courses on Coursera and really liked how easy it was to use, so I was bummed when Apple bought them and it looked like it was just going to disappear.


I am gonna have an aneurysm. It relies on CUDA, yet you can't get a recent Apple machine with a proper desktop Nvidia GPU. No, I don't want an external GPU box. What the heck are they running this on internally?

Actually, don't tell me. Just sell me one.


I'm going to be a bit facetious here, please forgive me – are most machine learning frameworks run on desktop hardware at scale? If not, then, I kind of get why Apple creates a framework that doesn't work on their consumer devices.

I mean, do people complain that DeepMind doesn't run on a Chromebook?

I'm sorry but your argument seems totally misplaced. I'd love to hear a case for it though – Apple hardware is very powerful and does a lot, and I wish it did more – I'm sure we all do, but why are we letting that distract from them contributing to open source and releasing great tools?


It doesn't require CUDA. You can run it on a MBP with no discrete GPU. It had previously been available under the name GraphLab Create, then went closed source briefly after the Apple acquisition, and is now back to open source, it looks like.


I assume they just use a Linux box, as I expect most do in AI engineering/science nowadays. But it gives some hope that Apple finally sees the light again regarding reasonable hardware configs geared at professionals.


It looks like the same group that is behind MXNet recently released TVM [1], which lets you use CUDA, OpenCL, Metal, etc. as the target for tensor operations. I wonder if that means support for AMD cards will happen in the near future?

1. http://www.tvmlang.org/2017/08/17/tvm-release-announcement.h...
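
To give a flavour, the examples in the release announcement look roughly like this - a sketch from memory, so treat the exact names as approximate:

    import tvm

    n = tvm.var('n')
    A = tvm.placeholder((n,), name='A')
    B = tvm.compute(A.shape, lambda i: A[i] * 2.0, name='B')

    # Bind the loop to GPU threads, then pick the backend via `target`.
    s = tvm.create_schedule(B.op)
    bx, tx = s[B].split(B.op.axis[0], factor=64)
    s[B].bind(bx, tvm.thread_axis('blockIdx.x'))
    s[B].bind(tx, tvm.thread_axis('threadIdx.x'))

    fmul = tvm.build(s, [A, B], target='metal')   # or 'cuda', 'opencl', ...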


If they ever add support for AMD GPU in this ML lib, I will seriously consider buying the next-gen Mac Pro.


coremltools works with existing pretrained models. It converts existing ML models, so you don't have to train one to use one. I assume this project just makes that packaging easier.
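
For example, converting an existing pretrained Keras model goes something like this with coremltools - a rough sketch, with model and file names made up:

    import coremltools

    # Convert a trained Keras model to a Core ML .mlmodel for use on iOS.
    coreml_model = coremltools.converters.keras.convert(
        'my_pretrained_model.h5',          # hypothetical existing trained model
        input_names='image',
        class_labels=['cat', 'dog'])       # hypothetical labels
    coreml_model.save('MyClassifier.mlmodel')

Turi Create seems to cover the other direction too - its models have an export_coreml() method that writes an .mlmodel directly.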


I feel your pain. AMD-only GPU config finally got me to ditch the MBP for work. I tried to use my 2014 model as long as I could.


I’m curious why you couldn’t use the 2014 model anymore? I still use mine with the 750m and it works fine for testing the CUDA programming I do. Really hope there’s one with an Nvidia card by the time I decide to upgrade.


Crazy story.

There is an electrical issue which cannot be fixed easily because it is now out of warranty. I get continuously mildly shocked most times I touch the keyboard. I still use it at home with a docked setup (external keyboard, mouse), but it is a total no-no when on-the-go. Docked, it is great -- can even connect three monitors (2 via DP and 1 via HDMI).

The electrical issue started as a mild annoyance manifesting as a slight tickle in the fingers and some vibration, but when using it for many hours, my pinky started getting paralyzed and would stay that way until the weekends when I didn't use the laptop. At first I thought it was my pillow, sleep posture, or something. Only later did I realize it was long-term mild electrocution adding up...

I tried all the electrical re-baselines at the Apple store, but the next set of fixes was so expensive I might as well buy a Razer, which I did...


I have that on mine too. My experience is that it only happens if you use the two prong cord. If you are on a grounded outlet, the effect won't happen.


That sounds like a grounding issue. I’ve had that with various MBPs over the years; bad on some outlets, not on others.


Funny you say that. I (literally) just first-boot started my Late 2012 Mac mini running my new eGPU box with an RX 580, after a few years of dealing with Apple dual-display limitations, and I see this post. This should certainly give more life to this maxed-out quad-core i7 config. I'll have to check this out.


Turi (GraphLab) is an acquisition from last year. Can't remember if they had GPU support back then.


I can’t imagine an AI startup without GPU support.


It works without CUDA, but it will be slow. That being said, many workloads wouldn't run on a normal desktop machine anyway and will need to be run on a cluster somewhere.


> What the heck are they running this on internally?

based on the documentation, most definitely running it using WSL on Windows 10.


The pain is real.


Thank you, Apple. Is anyone from Turi here? I can't find the notebooks and video tutorials that were hosted on your website; are they going to be released?

If you're not familiar with GraphLab, check the Coursera link: https://www.coursera.org/specializations/machine-learning


AI is getting abstracted more and more; it's like how programming languages are today.


I'm curious if you see this as a good thing, a bad thing, or a bit of both? I did lots of enterprise programming from 1999 to ~2004 and had to struggle with stupid stuff like creating wire protocols or FTP modules. Working in tech is so much fun now, as I can do the actual work without worrying about foundational stuff. And those who like to do foundational stuff can work for one of the vendors who create it.

In the 1990s in college, it was worse; I sometimes had to write my own data structures, etc., because there wasn't a good standard utilities package.

I'm pretty tired of all the grunt work I need to do these days as a machine learning practitioner. I'd love some abstraction, or at least automation, of the rote work. Keras is great at some of this, but I'd love a lot more of it.

I'm VERY INTERESTED in hearing everyone's ideas on this; it's a big pain point for me, to the extent that I'm actually trying to build a solution around it.


To me, deep learning architectures are what CPU architectures were yesterday. Abstraction brings speed and standardization over customization, and most people don't need a lot of customization. So to answer you: yes, you will get a lot more tooling in the near future; the only thing stopping it is how fast AI changes. With the speed of progress, if someone discovers a brand-new brilliant idea that obsoletes the way you train a system, then you could start back at ground zero.


To me the grunt work is getting data and, for supervised learning, getting labelled data. Algorithm implementation and design are fairly straightforward and lots of fun. Of course, getting really good results in a domain is hard work - but again, that's the fun bit. Then there are the domains and problems that can't be treated with off-the-shelf algorithms... well, good luck!


Could you elaborate on what you feel is currently grunt work? Do you mean the difficulty in building and debugging complex models or something more abstract?


A small correction: artificial neural networks (ANNs) are being abstracted away, not AI. Currently there is no common language to describe AI.


Yes, some of the platforms abstract away ANNs; I especially like how Keras does this. However, the pipeline and lifecycle (where one often spends 90% of the work) aren't abstracted away. Thoughts? What would you want to black-box, what would you want to automate, what would you leave openly exposed?
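
For a sense of the abstraction level I mean: a toy Keras network (hypothetical layer sizes) is just a few lines, while everything around it - data collection, labelling, deployment, monitoring - stays manual:

    from keras.models import Sequential
    from keras.layers import Dense

    # The entire network definition and training setup:
    model = Sequential([
        Dense(64, activation='relu', input_shape=(20,)),   # hypothetical input size
        Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
    # model.fit(X_train, y_train, epochs=10)   # given data you already prepared yourself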


The pipeline is being abstracted too. E.g. AWS SageMaker.


What is the license around this? The main Turi site still talks about academic and non-commercial use only.



I guess this was Ruslan Salakhutdinov's talk at NIPS earlier today?


Are there any other similar projects that are similarly easy to use?


I've used Keras to mostly automate ANNs and then drop the trained models into TensorFlow on Android. I wouldn't classify it as easy. However, it used to take months to do this; nowadays it takes days. Still incredibly painful and mostly an undocumented black art.


If one wants to start from zero experience with neural networks, but with expertise in the application development process, what resources would you recommend to start learning?


Ng's deeplearning.ai is good for getting a solid background in how it all works. If you want to just jump into building things, I'd highly recommend http://course.fast.ai.


The fast.ai courses are designed for engineering people coming to ML. They are excellent.


If you're looking to learn about the maths behind neural nets and really understand how they work, I'd recommend Andrew Ng's deeplearning.ai course. It starts from scratch and will give you a deep understanding when you're done.


I'm mostly looking for something like "Excel for ANNs".


Do you mean 'Excel for solving machine learning problems', or do you actually want to design ANN architectures with an Excel-like tool?

If it's the second case, have you ever done that task without such a tool?

Because while there are pain points (e.g., keeping track of the shape of your data), there seem to be a lot more critical problems to fix than that.


OK - Excel, even for non-ANN work, gets tricky when you have many rows (like 10^6); most of the tricky business with ANNs is when you have >10^8 rows of data. Why would doing this with an Excel-type tool be good?


> "excel for ANN"

I think this won't work well - blindly using a complex tool that has many hyperparameters you don't care about.


I'm working on exactly this; I'd love to chat about your needs for such a tool - drop me an email (see my profile).



