Running machine learning jobs on a laptop essentially turns your laptop into a desktop.
Let's say you get suitable hardware, possibly with an external GPU enclosure, and launch a 24-hour training job (which is not that long)... and then what? You can't really close the lid and take it with you anywhere without stopping that process. The workflow becomes unwieldy and inconvenient, requiring compromises between running the ML stuff and your personal mobility. And what about when you're done with proof-of-concept experimenting and need to train a larger system on all available data, which might take, say, two weeks on your laptop?
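One way to soften the close-the-lid problem above is to checkpoint the run so an interrupted job resumes where it left off instead of starting over. A minimal sketch, assuming a plain-dict training state saved with pickle (the names and file layout are illustrative, not from any particular framework):

```python
import os
import pickle

def save_checkpoint(state, path="checkpoint.pkl"):
    # Persist the full training state so the run can be resumed later.
    with open(path, "wb") as f:
        pickle.dump(state, f)

def load_checkpoint(path="checkpoint.pkl"):
    # Resume from the saved state, or start fresh if none exists.
    if os.path.exists(path):
        with open(path, "rb") as f:
            return pickle.load(f)
    return {"epoch": 0}

def train(total_epochs, path="checkpoint.pkl"):
    state = load_checkpoint(path)
    for epoch in range(state["epoch"], total_epochs):
        # ... one epoch of real work would go here ...
        state["epoch"] = epoch + 1
        save_checkpoint(state, path)
    return state
```

It doesn't make the laptop any more mobile, but at least a suspended machine costs you only the time since the last checkpoint, not the whole run.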
Get a decent "conventional" laptop. It will run small jobs quite well anyway, even on a CPU; and for anything beefier, connect to a dedicated system remotely. Whether it's owned or rented, that's the way to go.
If you're going to travel a lot, I would recommend getting a lightweight laptop and using a remote server with a GPU.
You can either access GPUs on demand or rent a server by the month. Hetzner[0] offers a GeForce 1080, 64 GB RAM and a quad-core i7 for ~$115/month (+ setup fee).
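Whether monthly rental beats on-demand comes down to utilization. A back-of-the-envelope sketch; the on-demand hourly rate below is an assumed placeholder, since real prices vary a lot by provider and card:

```python
MONTHLY_RENT = 115.0        # flat rate for a dedicated GPU box, USD/month (per above)
ON_DEMAND_PER_HOUR = 0.50   # assumed on-demand GPU rate, USD/hour (placeholder)

def cheaper_option(gpu_hours_per_month):
    # Compare pay-per-hour cost against the flat monthly rent.
    on_demand_cost = gpu_hours_per_month * ON_DEMAND_PER_HOUR
    return "monthly rental" if on_demand_cost > MONTHLY_RENT else "on-demand"

# Hours per month at which the two options cost the same.
break_even_hours = MONTHLY_RENT / ON_DEMAND_PER_HOUR
```

At these (assumed) rates the break-even is 230 GPU-hours a month; below that, on-demand wins, above it the flat rental does.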
Maybe not constant, but certainly "readily available".
Don't get me wrong now, I understand that this might not be feasible or comfortable. But OP did not actually describe his/her needs that much, so we are all speculating.
Some good questions that OP could answer might be:
1 - Do you need to do your computation on the spot, like gather data and perform computations in "real time" ?
2 - When you say "travel", what do you actually mean? Do you "travel" like "I am always on a train/plane/bus" or "I move from a site to another constantly but when I'm there I can access a desk and a stable internet connection" or "I move from an airbnb to another" or something completely different ?
3 - What are your security/confidentiality requirements ?
4 - What are your constraints (weight, connectivity, budget, operating costs) ?
5 - What are your preferences in terms of tradeoffs?
6 - What can you compromise on? And what do you value the most? Speed? Computing power? Mobility? Energy efficiency? Operating costs?
OP should really answer some of these questions, then we can all come up with better solutions.
In this video Linus Sebastian is using a Razer Blade because that is the laptop he chose as a daily driver, but OP could use any laptop compatible with the enclosure.
I would like to add that using external GPUs is nothing new, and people have successfully attached high-end external GPUs even to old laptops like the glorious ThinkPad X220 (via an ExpressCard adapter).
Buy a dumb terminal. A very cheap, lightweight laptop that can handle a web browser, ssh, and ideally drive an external monitor or two. Then rent time on a cloud service of your choice.
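With the thin-client setup above, the one trick is making sure long jobs survive the ssh session (tmux, screen, or plain nohup all work). A sketch that just builds the remote command string; the host and script names are hypothetical:

```python
def remote_train_cmd(host, script, log="train.log"):
    # nohup plus a trailing '&' keeps the job running after the ssh
    # session closes; stdout/stderr go to a log you can tail later.
    remote = f"nohup python {script} > {log} 2>&1 &"
    return f'ssh {host} "{remote}"'

cmd = remote_train_cmd("gpubox", "train.py")
```

Kick the job off before you close the lid, then `ssh gpubox tail -f train.log` from wherever you land next.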
Deep learning is very power hungry. Pretending you can do it on a battery is a fool’s game.
I have the P50, and it is so heavy that carrying it around all the time has literally damaged my back. It's fine if you don't walk everywhere in a city with it in a backpack or messenger bag.
Like most comments, my strong advice is to go for a light laptop with long battery life, and set up servers on AWS/DO/Linode as per your requirements. You will always be able to destroy the server instances to save a significant amount of money. You will always be able to add/remove resources depending on your requirements. And you will never have to worry about power supply. Can you imagine how efficient it will be when you run time-consuming tasks before a flight and get those done on arrival? ;)
It's cheaper and easier to have a workspace in the cloud than a laptop that consumes more power than a kettle.
Just spent the past few weeks setting up distributed TensorFlow on Kubernetes, and I can say that the amount of power you have at your disposal is phenomenal. I was able to do hyperparameter sweeps in the blink of an eye.
The best part is that you can delete your cluster when you don't need it anymore.
I can't think of a more cost effective setup.
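For anyone unfamiliar with the term, a hyperparameter sweep is just a search over a grid of training settings; the cluster merely runs the combinations in parallel. A minimal serial sketch, with a made-up grid and objective standing in for a real training run:

```python
import itertools

def grid_sweep(grid, objective):
    # Try every combination of hyperparameters, keep the lowest score.
    keys = list(grid)
    best_score, best_params = float("inf"), None
    for values in itertools.product(*grid.values()):
        params = dict(zip(keys, values))
        score = objective(params)
        if score < best_score:
            best_score, best_params = score, params
    return best_score, best_params

# Toy objective standing in for "train and return validation loss".
score, params = grid_sweep(
    {"lr": [1e-3, 1e-2, 1e-1], "batch_size": [32, 64]},
    lambda p: abs(p["lr"] - 1e-2) + p["batch_size"] / 1000,
)
```

On Kubernetes each `objective(params)` call becomes its own pod, which is where the "blink of an eye" comes from.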
Would a laptop be optimal for such things? Wouldn't you be better off loading data onto a home server where you can have a more powerful CPU and multiple GPUs?
Like others mentioned, I'd go with a laptop with a long battery and remote in. The main advantage of using a Razer Blade or similar laptop would be doing live demos of algorithms, likely after training them, but that is a pretty niche application.
Get the instance with 8 NVIDIA Tesla V100 GPUs clustered with NVLink from Amazon; actually, rent it, but only when you need to train a model. It'll be cheaper.
Wouldn't something with a Vega M or Nvidia MX150 chip be the best bet? Something like the XPS 15, maybe. There's also the Nvidia Quadro in the likes of the ThinkPad P series, but I'm not sure what the performance is like.
The other option is to get a fairly modern laptop with Thunderbolt 3 and an external GPU enclosure to house a card to do the heavy lifting.
Get a light cheap laptop with a great display - maybe a Chuwi Lapbook 12.3 if you really want to go #lowend - and spend those $2000 you'll save from not buying a ghastly expensive laptop on cloud resources to do your research right, and at a scale you'll never be able to achieve with a GPU or three.
Have you considered getting a laptop that lets you export PCIe lanes (I think they do that with Thunderbolt these days, not exactly sure) so you can use the external GPU of your choice?