Ask HN: Best gaming laptop for machine learning/deep learning research?
19 points by allenleein on June 17, 2018 | 21 comments
I need to travel a lot, so I'm thinking about a high-end gaming laptop like the Razer Blade... any thoughts, please?



Running machine learning jobs on a laptop essentially turns your laptop into a desktop.

Let's say you get suitable hardware, possibly with an external GPU enclosure, and launch a 24-hour training job (which is not that long)... and then what? You can't really close the lid and take it anywhere without stopping that process. The workflow becomes unwieldy and inconvenient, forcing compromises between running the ML jobs and your own mobility. And what about when you're done with proof-of-concept experiments and need to train a larger model on all the data available, which might take your laptop, say, two weeks?

Get a decent "conventional" laptop. It will run small jobs quite well, even on a CPU; for anything beefier, connect to a dedicated system remotely, whether owned or rented. That's the way to go.


If you're going to travel a lot, I would recommend getting a lightweight laptop and using a remote server with a GPU.

You can either access GPUs on demand or rent a server by the month. Hetzner[0] offers a GeForce 1080, 64 GB RAM and a quad-core i7 for ~$115/month (+ setup fee).

[0] https://www.hetzner.com/dedicated-rootserver/ex51-ssd-gpu


This.

Other options could be:

1. Get an external GPU with an enclosure that you can attach to your laptop when needed.

2. Build a GPU enabled desktop computer, leave it connected to the internet somewhere (home or office) and access it via a VPN connection.

These, together with renting from Hetzner, are the options I would consider/advise.


Wouldn't this require a constant internet connection?


Maybe not constant, but certainly "readily available".

Don't get me wrong now, I understand that this might not be feasible or comfortable. But OP did not actually describe his/her needs that much, so we are all speculating.

Some good questions that OP could answer might be:

1 - Do you need to do your computation on the spot, i.e. gather data and perform computations in "real time"?

2 - When you say "travel", what do you actually mean? Do you "travel" like "I am always on a train/plane/bus", or "I move from one site to another constantly, but when I'm there I can access a desk and a stable internet connection", or "I move from one Airbnb to another", or something completely different?

3 - What are your security/confidentiality requirements?

4 - What are your constraints (weight, connectivity, budget, operating costs)?

5 - What are your preferences in terms of tradeoffs?

6 - What can you compromise on? And what do you value the most? Speed? Computing power? Mobility? Energy efficiency? Operating costs?

OP should really answer some of these questions, then we can all come up with better solutions.


Tangentially related (mobile GPU): https://www.youtube.com/watch?v=Rjty3I3Cdg8

In this video Linus Sebastian is using a Razer Blade because that is the laptop he chose as a daily driver, but OP could use any laptop compatible with the enclosure.

I would like to add that using external GPUs is nothing new, and there are people who have successfully attached high-end external GPUs even to old laptops like the glorious ThinkPad X220 (via an ExpressCard adapter).


For the computer-at-home setup, it depends. For you on the road, no.

You could script everything and have the computer at home pull down new work from AWS/Dropbox every so often, run it, and upload the results; a rough sketch of that loop is below.

For an office/rented server, it will probably be online all the time.

For you, you just need to remote in or upload files; the job then runs remotely and you don't need to stay connected. Later you connect again to pull down the results.
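
A rough sketch of that loop in Python, assuming a hypothetical synced ~/Dropbox/ml-jobs folder and a train.py script on the home machine (both placeholders, not anything the commenters specified):

    # poll_and_run.py - hypothetical polling loop for the home machine
    import subprocess
    import time
    from pathlib import Path

    JOBS_DIR = Path.home() / "Dropbox" / "ml-jobs"        # synced job specs (assumption)
    RESULTS_DIR = Path.home() / "Dropbox" / "ml-results"  # synced back to you automatically

    while True:
        for job in sorted(JOBS_DIR.glob("*.json")):
            out = RESULTS_DIR / job.stem
            out.mkdir(parents=True, exist_ok=True)
            # run the (placeholder) training script against the job spec
            subprocess.run(["python", "train.py", "--config", str(job), "--out", str(out)])
            job.rename(out / "job.done.json")              # mark the job as processed
        time.sleep(600)                                    # look for new work every ten minutes

Cron or a systemd timer on the home machine would do just as well as the sleep loop; the point is only that you never have to be online at the same time as the job.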


Buy a dumb terminal. A very cheap, lightweight laptop that can handle a web browser, ssh, and ideally drive an external monitor or two. Then rent time on a cloud service of your choice.

Deep learning is very power hungry. Pretending you can do it on a battery is a fool’s game.


You can take a look at the P71 which has a serious set of hardware (if you're not going to be too far from a plug at any given time).

    GPU: Quadro P5000
    RAM: 64GB DDR4
    CPU: Intel Xeon E3-1535 or Intel Core i7-7820HQ
https://www3.lenovo.com/us/en/laptops/thinkpad/thinkpad-p/Th...


I have the P50, and it is so heavy that carrying it around all the time has literally damaged my back. It's fine if you don't walk everywhere in a city with it in a backpack or messenger bag.


There's also the P52s which is lighter (4.39lbs instead of the P50's 5.6lbs). You give up the nice screen, gpu, and Xeon but you do save some weight.


Like most commenters, my strong advice is to go for a light laptop with long battery life, and set up servers on AWS/DO/Linode as per your requirements. You will always be able to destroy the server instances to save a significant amount of money, and you will always be able to add/remove resources depending on your needs. And you will never have to worry about power supply. Can you imagine how efficient it will be when you kick off time-consuming tasks before a flight and have them done on arrival? ;)
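
As a small sketch of the "stop paying when you're done" part, assuming the boto3 AWS SDK and a placeholder instance ID and train.py script, the training run itself can stop its EC2 instance the moment it finishes, so compute billing stops too (EBS storage is still billed):

    # stop_when_done.py - sketch: stop the EC2 instance once training finishes
    import subprocess
    import boto3

    INSTANCE_ID = "i-0123456789abcdef0"   # placeholder, not a real instance
    REGION = "us-east-1"                  # assumption

    # run the (placeholder) training script to completion
    subprocess.run(["python", "train.py"], check=True)

    # then stop this instance so you stop paying for compute
    ec2 = boto3.client("ec2", region_name=REGION)
    ec2.stop_instances(InstanceIds=[INSTANCE_ID])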


It's cheaper and easier to have a workspace in the cloud than to have a laptop that consumes more power than a kettle.

I just spent the past few weeks setting up distributed TensorFlow on Kubernetes, and I can say that the amount of power you have at your disposal is phenomenal. I was able to do hyperparameter sweeps in the blink of an eye. The best part is that you can delete the cluster when you don't need it anymore. I can't think of a more cost-effective setup.
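
To illustrate the sweep part, a minimal sketch (the container image, train.py and its flags are placeholders, and this assumes a recent enough kubectl that has kubectl create job) is to submit one Kubernetes Job per hyperparameter combination:

    # sweep.py - sketch: one Kubernetes Job per hyperparameter combination
    import itertools
    import subprocess

    learning_rates = [1e-2, 1e-3, 1e-4]
    batch_sizes = [32, 64, 128]

    for i, (lr, bs) in enumerate(itertools.product(learning_rates, batch_sizes)):
        subprocess.run([
            "kubectl", "create", "job", f"sweep-{i}",
            "--image=myregistry/trainer:latest",   # placeholder image with your code baked in
            "--", "python", "train.py",
            f"--learning-rate={lr}", f"--batch-size={bs}",
        ], check=True)

The cluster schedules the nine jobs across whatever GPU nodes it has, and kubectl delete jobs --all (plus deleting the cluster itself) cleans everything up afterwards.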


Would a laptop be optimal for such things? Wouldn't you be better off loading data onto a home server, where you can have a more powerful CPU and multiple GPUs?


Like others mentioned, I'd go with a laptop with long battery life and remote in. The main advantage of using a Razer Blade or similar laptop would be doing live demos of algorithms, likely after training them, but that is a pretty niche application.


Get the one with 8 NVIDIA Tesla V100 GPUs clustered with NVLink from Amazon; actually, rent it, and only when you need to train a model. It'll be cheaper.
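
If you do rent one of those machines, spreading a Keras model across all eight GPUs is mostly a one-line change with tf.distribute.MirroredStrategy (TensorFlow 2.x API; the tiny model below is just a placeholder):

    # multi_gpu.py - sketch: data-parallel training across all local GPUs
    import tensorflow as tf

    strategy = tf.distribute.MirroredStrategy()    # mirrors the model on every visible GPU
    print("replicas in sync:", strategy.num_replicas_in_sync)

    with strategy.scope():
        # placeholder model; build your real model inside the scope as usual
        model = tf.keras.Sequential([
            tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # model.fit(x_train, y_train, batch_size=1024)  # each batch is split across the GPUs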


The one you like developing on. ThinkPad for Linux, Mac for macOS. Don't skimp on RAM.

Rent GPUs in the cloud or buy a GPU box that you can run jobs on.


Wouldn't something with a Vega M or Nvidia MX150 chip in it be the best bet? Something like the XPS 15, maybe. There's also the Nvidia Quadro in the likes of the ThinkPad P series, but I'm not sure what the performance is like.

The other option is to get a fairly modern laptop with Thunderbolt 3 and an external GPU enclosure to house a card that does the heavy lifting.


Get a light, cheap laptop with a great display - maybe a Chuwi Lapbook 12.3 if you really want to go #lowend - and spend the $2000 you'll save by not buying a ghastly expensive laptop on cloud resources to do your research right, and at a scale you'll never be able to achieve with a GPU or three.


Have you considered getting a laptop that lets you export PCIe lanes (I think they do that with Thunderbolt these days, not exactly sure) so you can use the external GPU of your choice?


The Dell Precision 5530 is XPS 15-like and has a dGPU option.

Razer has well-known quality and support issues. Asus and MSI both have strong laptop lines, but again, support and quality are hit and miss.



