Visual Studio for Mac (microsoft.com)
664 points by runesoerensen on Nov 14, 2016 | 413 comments



Since this was an accidental post (now removed) in advance of the actual announcement and release, we've buried this story. That way we can avoid treating the actual announcement as a duplicate when it happens.


So Microsoft is treating the Mac more seriously as a professional platform while Apple is treating it less seriously? I'm not saying this in a snarky way; I mean it literally as a change of corporate strategies in both companies. Microsoft seems to be saying, If you are a pro mainly using the Mac for professional work, we want to do a better job of empowering you, and Apple seems to be saying, If you are a pro mainly using the Mac for professional work, you need to get used to the idea that we are deemphasizing your market--no hard feelings.


If we're comparing Apples-to-Apples (seriously, pun not (originally) intended), Apple continues to rev the Xcode IDE every year, adding more features. Most recently, they added a nifty visual memory debugger[1] and have taken another stab at device & certificate provisioning.

[1]: http://useyourloaf.com/blog/xcode-visual-memory-debugger/


You're right, Apple is revving Xcode, but feature-wise and bug-wise, Xcode needs serious TLC, especially on the Swift side. The regressions introduced every version don't help, and tools like Interface Builder, while seemingly helpful, usually make development within teams worse. I still use Xcode every day, but it definitely needs hardening.


There goes the good ol' anti-Interface Builder rant again. The real problem is that Interface Builder seems too easy to use, while in fact there's real depth and challenge to using it well, just as with doing everything in code.

- Don't throw all your screens into one file; you can even use one screen per file, just like with xibs

- Use code to style items and create controls you use more than once

- Have controls styled in code render dynamically in Interface Builder, so you won't end up with "ghost town" storyboards; everything stays visible at a glance

Unless you're working at "Facebook scale" Interface Builder in the hands of an expert will get you very far.

Would you throw all your code into one big 8kloc controller? Of course not, but somehow people manage to cram every screen into the same .storyboard file, just because they can. Then they complain about merge conflicts, which isn't really a surprise given that they're managing an 8kloc file.

Would you set the background, border, font and color of every button every time you use it in code? Of course not; you specialize a button class. But somehow, once people start using storyboards, they select every button manually and set those properties time and time again, when they could use a specialized class that renders in Interface Builder exactly as it will look in the app.
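A minimal sketch of that specialized-button idea (the class name and style values here are hypothetical, not from the parent comment): marking the subclass @IBDesignable makes Interface Builder render the code-defined styling live on the canvas.

```swift
import UIKit

// Hypothetical styled button; @IBDesignable tells Interface Builder
// to compile and render this class on the storyboard canvas.
@IBDesignable
final class PrimaryButton: UIButton {

    override init(frame: CGRect) {
        super.init(frame: frame)
        applyStyle()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        applyStyle()
    }

    // Called by Interface Builder when rendering the designable.
    override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        applyStyle()
    }

    // One place to change the look of every instance, in code or in IB.
    private func applyStyle() {
        backgroundColor = .blue
        setTitleColor(.white, for: .normal)
        titleLabel?.font = .boldSystemFont(ofSize: 16)
        layer.cornerRadius = 8
        layer.borderWidth = 1
    }
}
```

Dropping a plain UIButton onto a storyboard and setting its custom class to PrimaryButton then gives the same appearance in IB and at runtime, without re-entering the properties per button.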


I've been using Xcode since iOS 2, and once Apple introduced IB for iOS, I was all for it. There are a bunch of challenges with IB that fall outside the solutions you proposed.

How do you fix these issues?

- Fixing misplaced views when transitioning between retina and non-retina screens, or even different retina screen resolutions.

Reasoning: When I worked at Amazon this was extremely annoying. You didn't even have to touch the storyboard; just opening it made tons of misplaced views show up. This is a problem with any version control system, because the XML changes show up in git even though nothing actually changed.

- Rendering Snapshots

Reasoning: I use snapshot tests to verify all views via unit tests. You can use IB to capture the view and load it programmatically, but you end up having to load the whole storyboard just to render one view controller.

- Setting all properties via IB

Reasoning: When I set up a button or a view, if half of my properties are in IB and half are in code, how do you determine what goes where? Apple did add IBDesignable, but wiring that up just so you can click a dropdown is more complicated than setting the property on the object directly (which renders correctly in snapshots, never suffers from misplaced views, and keeps property configuration in one place).

The teams I've worked on aren't that big, but I can say that the teams I've been on that don't use IB have worked a lot faster than the IB teams. You may be a lot better at IB than I am; I only stopped using it for my projects a year ago, after about four years of using it.
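The snapshot-test overhead described above can be sketched roughly like this (storyboard and identifier names "Main" and "ProfileVC" are hypothetical): instantiating even a single view controller still requires parsing the whole .storyboard file.

```swift
import XCTest
import UIKit

// Sketch of loading one view controller from a storyboard for a
// snapshot test. Note UIStoryboard(name:bundle:) loads the entire
// storyboard just to produce this one controller.
final class ProfileSnapshotTests: XCTestCase {
    func testProfileViewRenders() {
        let storyboard = UIStoryboard(name: "Main", bundle: .main)
        let vc = storyboard.instantiateViewController(withIdentifier: "ProfileVC")

        // Force the view hierarchy to load and lay out at a fixed size.
        vc.view.frame = CGRect(x: 0, y: 0, width: 375, height: 667)
        vc.view.layoutIfNeeded()

        // Render into an image that a snapshot library could diff
        // against a stored reference.
        let renderer = UIGraphicsImageRenderer(bounds: vc.view.bounds)
        let image = renderer.image { ctx in
            vc.view.layer.render(in: ctx.cgContext)
        }
        XCTAssertGreaterThan(image.size.width, 0)
    }
}
```

By contrast, a view controller defined in code (or in its own xib) can be constructed directly, with no storyboard parsing at all.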


> Fixing misplaced views when transitioning from retina / non-retina screens or even different retina screen resolutions.

No solution. Non-retina users should carefully commit :(

> Rendering Snapshots

I don't think I understand the problem. Nothing is stopping you from rendering just one of the view controllers alone?

> Setting all properties via IB

I'm doing more and more in code nowadays, including constraints that are also rendered in IB. Makes it easier to change things like ratios or heights all across the app.

Interface Builder then glues it all together and I can throw in some one-off items.
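The "constraints in code, shared across the app" approach mentioned above might look like this (the Layout enum and its values are illustrative, not from the comment): ratios and heights live in one place, so changing a constant changes every screen.

```swift
import UIKit

// Hypothetical shared layout constants; editing one value here
// updates every view that uses it, across the whole app.
enum Layout {
    static let margin: CGFloat = 16
    static let buttonHeight: CGFloat = 44
}

// Pin a button to the bottom of its superview using the shared constants.
func pin(_ button: UIButton, toBottomOf view: UIView) {
    button.translatesAutoresizingMaskIntoConstraints = false
    NSLayoutConstraint.activate([
        button.leadingAnchor.constraint(equalTo: view.leadingAnchor,
                                        constant: Layout.margin),
        button.trailingAnchor.constraint(equalTo: view.trailingAnchor,
                                         constant: -Layout.margin),
        button.heightAnchor.constraint(equalToConstant: Layout.buttonHeight),
        button.bottomAnchor.constraint(equalTo: view.bottomAnchor,
                                       constant: -Layout.margin)
    ])
}
```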


IB is still not a good tool when there are a lot of people working on a project because the files do not merge easily.

I've had several people at Apple tell me they generally don't use interface builder internally.


> I've had several people at Apple tell me they generally don't use interface builder internally.

I heard the same claims about Swift. I have no idea what to believe.


does it support reverse debugging yet?


What I want to know, as somebody trying to learn C/C++: is Xcode good for somebody who wants to try their hand at bare-metal stuff? I always hear how bad it is outside of app dev, but can it make me productive compared to a CLI environment or just a text editor with nifty add-ons?


I've written a lot of C++ code in Xcode. Xcode is a pretty poor IDE in general but it works reasonably well for C++. IMO things like autocomplete and a graphical debugger are enough of a productivity boost that I wouldn't want to use a plain editor.


Yes, and the ability to find call references (who is calling this function, where is it used) from the drop-down menu is really helpful. I miss the ease of access to that sort of functionality in Visual Studio.


Maybe I'm not using the same version of Visual Studio as you are, but Find All References is right there on the context menu when I right-click in VS.


Yes, it is there for me, but the functionality is different in Xcode: it finds all references when you open the menu, so you see the results straight away. One less click (and perhaps one less hidden "Find Symbol Results" pane popping up in VS).

That was what I was highlighting, not the fact that Visual Studio somehow doesn't have that functionality.


Also shift+F12


I would recommend Qt Creator if you just want to get pure C++ work done. It has less focus on app development and way better highlighting (semantic highlighting thanks to clang), refactoring and debugging.


Instruments, included with Xcode, is a really good tool. I used Instruments for profiling and finding memory leaks, and I think it's great... except for the lack of reverse debugging, which is pretty much necessary these days.

I recommend this video for some good examples of reverse debugging with gdb: https://www.youtube.com/watch?v=713ay4bZUrw


I have used both Xcode and a text editor with a terminal for C/C++ development. In my opinion both work equally well, but I would advise the text-editor-and-terminal route, because it forces you to learn how build systems, LLDB, Valgrind and the like work. That knowledge is essential when the IDE does something unexpected and you need to fix it.


Xcode is not your only IDE option on macOS for C/C++. Decent ones: Eclipse CDT, Qt Creator, CLion.


It's slightly better than a text editor, but not by much


> If you are a pro mainly using the Mac for professional work...

This is an extremely dishonest assessment, implying that only "developers" do professional work. The Mac is used by engineers, CAD/CAM users, designers, illustrators, artists, musicians, etc., and reducing it to developers alone is disingenuous.

I for one am happy with the new Mac, and there are several people who've found it great for "pro" use.


Weirdly, of everyone on your list, software developers are the only ones the new MBP can serve. It's got plenty of power to run Xcode or a web dev environment.

Engineers, CAD users, musicians, and 3D artists all need more power than it provides, and increasingly need CUDA cores.


>..musicians,..

I'm running a Logic setup on my 2012 MBP, and I've never felt limited by its capabilities. I've been running Macs for music production for almost 20 years, and in my experience they have all at some point not had quite enough in the bag to let me do what I wanted of them. This is the first one (now four years old) that has stayed ahead of my needs. I can run multiple copies of Massive and rows of Waves plugins, no problem.

I have hit limits with real-time 3D stuff, like gaming and Houdini, but music production is still well within its capabilities for me.


The people who make film/game soundtracks need a LOT of RAM and fast disks, because they're using terabytes of samples in each project. Also, Massive has pretty low requirements.

The main gripe musicians have with Apple is each new OS X release breaking their setup every year. The sandboxing in El Capitan was particularly disastrous.


It sounds like you're basing this argument on hearsay and not personal experience (shocker). I have various audio setups with different Macs and interfaces, and no OS update has broken anything in my setups since the 10.6 era.


I don't think that's fair. Even major audio companies were warning customers not to upgrade to Sierra for months due to compatibility issues. Here's iZotope's page about compatibility with their audio plugins - if you bought it more than a couple of years ago, it won't work on Sierra:

https://www.izotope.com/en/community/blog/product-news/2016/...

Personally I had to spend a few hundred dollars upgrading my audio software that worked in Mountain Lion so it would run on El Capitan. I'm still running into a few glitches in places.


That page indicates what versions of those plugins are certified by the vendor to run on Sierra. That doesn't mean that older versions won't run, it just means they haven't been certified to do so.

I've been running software on Sierra that is only certified by the developer for Snow Leopard through Yosemite, and it still runs just fine for my needs. Apple puts a lot of effort into binary compatibility.


My personal experience disagrees with you. Waves, Native Instruments, and others, have had various issues over the years with OS X upgrades. I have projects spanning several years that forced me to wait a revision or two before upgrading to ensure they would continue to load and mix down as intended.


> because they're using terabytes of samples in each project.

This is beyond hyperbole.


Not true, I work with projects where each minute of video is 8 GB. Multiply that with 100 or 120 and you are pretty much at 1 TB.


Just a wild idea here, but maybe a laptop isn't the best solution for you?


That's fair. I definitely have done a lot of sample-based and multi-track audio work and again have never hit that limit (on this MBP). However I do agree that if you rely on 3rd-party plugins there is a crapshoot every year as to how long you have to wait before compatibility reaches 100% and you can upgrade your system without breaking previous mixes or set ups.


If you are an accountant even an underpowered PC is likely adequate.


Why do you assume that software developers don't need more power and/or CUDA cores, too? Development does not only consist of web development.


How many developers need more capacity than the new MacBook Pro but still less than the largest laptops offer? The few non-gamers I know who are still CPU/GPU-limited are using clusters of machines because no single machine is large enough and in most cases it's significantly cheaper to rent capacity for the few times when they need it rather than pay up-front for something which will be idle a fair percent of the year.


I'm not sure I'm understanding what you are suggesting here.


I see three potential groups of buyers:

1. People whose needs are satisfied by almost any laptop with an SSD and semi-recent CPU/GPU.

2. People who need the absolute top-end system but can fit their work on a single machine (e.g. if you had a model which uses 20GB of RAM, a 16GB machine is unsuitable but a 32GB machine is fine).

3. People who have so much data / computation that no normal computer can handle it.

My gut feeling is that #1 covers most of the market and the question is really how many people fall into group #2 but not group #3, especially in the context of laptops where the ceilings are smaller on both the Mac and PC side. I would further expect that a fair number of the people in the second group are not running those workloads 24x7 and thus have a practical, often cheaper, option of renting an hour of time on AWS/Google/Azure/etc. when they need to do something and get the results faster rather than leaving their laptop running for a day or two — even the best laptop GPUs are smaller capacity than what you can rent on a server.


"Web developers" should compile a browser (like Chromium) every so often, to remind themselves of the massive gap between what they do and what "developers" do. Those CUDA cores and extra CPU are very important to non-web developers.


I've never needed a CUDA core at all, but then again I'm only a lowly kernel and network stack developer.


He said plenty of power


Pretty sure folks in the other professions mentioned are also lamenting the missing successor to the Mac Pro, and the loss of not-yet-outdated ports on the MBP (especially since they probably have more peripheral devices than developers).


I'm not sure how you got that out of his post.


The mac users in science are also legion


Sometimes for reasons beyond my understanding: I've seen many people bent over a 15" MB playing with MRI images in matlab/spm/... It's hard to imagine something with worse posture, worse ergonomics, or less efficiency.


The Mac has Word, OsiriX, FreeSurfer, etc., but I agree about posture.


Not sure why you got downvoted for what is, imo, an OK reply. Only OsiriX is strictly OS X, and even though I haven't used it I'm pretty sure there are viable Linux/Windows alternatives for it and the other software you mention. Anyway, next time I see someone in a situation like the one I pictured, I'll just ask why they chose the MacBook.


OsiriX is the only FDA-approved software of its kind (MRI, 3D reconstruction, database), as far as I know.

The Mac is the only platform where you can run FreeSurfer and Word natively.

I didn't notice the downvotes..


I'm a developer and plan to get the new MacBook Pro. I just don't use function or escape keys that much. So, it seems like it's a subset of developers that are bothered by this.

I used Windows when I was young, then switched to Linux for 10 years professionally, and finally I switched to the Mac about 5 years ago. I find it much easier to use day-to-day than Linux. Especially using multiple displays in different offices.


The "oh, I can live with this" moment for me with the new Mac was remapping caps lock to escape. The new keyboard layout is not as bad as I expected it to be. Not as good as I wanted, but not as horrible as HN and other places warned.


Meanwhile, a "pro" user who's actually used the new Macbook Pro weighs in:

http://www.huffingtonpost.co.uk/thomas-grove-carter/one-prof...

Spoiler alert: he doesn't agree with the assessment of HN users who haven't actually used it.


That article is a typical example of "if it suits me, it must suit everybody," and your post is a good example of the no-true-Scotsman fallacy: "no one who _really_ used it dislikes it."

All I really know about this is how I feel about it, and I must admit that I'm going back to the PC world when the time comes to replace my current MBP. The offerings in the PC world aren't perfect for me, but they suit me better. I bought my MBP because at the time it was actually the cheapest machine offering all those features at a high build quality. I never became dependent on OS X, and I'm pretty confident I can migrate fully to Linux. So I guess I'm not _really_ a professional Mac user.


That article is a typical example of "If it suits me, it must suit everybody."

99.9%+ of criticism of the new MBP has been of the form "Without ever interacting with one, I can tell it is unsuitable for me and therefore is unsuitable for anyone, anywhere, in any professional purpose, ever".

no-true-Scotsman-fallacy of "Noone who _really_ used it dislikes it."

More like "people are pre-emptively concluding, without ever having so much as been in the same room as a new MBP, that it is the antithesis of everything they need from a computer".

Which is, to put it bluntly, idiotic. I've suggested in the past that this feels less like "I have legitimate criticism of this product" and more like "I hate the manufacturer, always have hated and always will hate the manufacturer, and see this as a convenient cover for venting my hatred of the manufacturer". Notice how much of the criticism veers quickly away from specific aspects of the product and into "this is classic Apple", "this is how Apple treats users", "this is what's wrong with Apple", "Apple abandoning a key segment again", etc.


> 99.9%+ of criticism of the new MBP has been of the form "Without ever interacting with one, I can tell it is unsuitable for me and therefore is unsuitable for anyone, anywhere, in any professional purpose, ever".

While there is some truth in that, it is not what I took away from the discussion. Most people complain about the following:

* "I expected more."

* "I expected the price for the same specs to drop or at least stay constant but not to rise."

* "I cannot use this machine to do the work in the way I do it currently. This and that port is missing."

While there is a great deal of hate towards Apple, the comments I've read here are not driven by hate but by disappointment. The old MBP lineup was very good for these people. They like them very much. Some love Apple, some don't care but they all agree that the old hardware is very solid for a reasonable price.

I really do think the new MBPs are good machines for >90% of current MBP users. A bit more expensive, but not too unreasonable given that most are locked into the Apple ecosystem. Some may need adapters for things like digital cameras, projectors, monitors, USB sticks, keyboards, etc., but that doesn't really matter to a true Apple customer; most of the time the machine is used without those devices.

What Apple should worry a bit about, though, is that the top 1-3% of users are now looking for other hardware. But who am I to worry about Apple's strategy? They probably don't need those few power users anyway in their future business model. I don't even hold Apple stock currently. My old MBP still runs fine, and I'm not locked into their ecosystem; I can move to the greener lawn at any time.


I don't care about the Touch Bar, and while I'm miffed about losing MagSafe, I can deal with shelling out extra for accidental-damage insurance. Personally, my disappointment is with the specs, which no time with the machine will change.

The Mac I'm currently using was purchased 3 years ago with 16GB of RAM, and if I replace it I will be stuck with the same capacity. I imagine there are a lot of "pro" market segments that are well served by 16GB or less, though. I'm hoping the next revision gets a >16GB option and is released before I need to replace this one.


Lately, my main machine has been a mid-2014 MBP with 16GB of RAM, and I recently purchased the new MBP with 8GB of RAM.

Comparing them side by side is a very strange experience. _Technically_ it should be slower, but it doesn't "feel" slower. At times, though, there is a bit of stutter I can't put my finger on: am I just looking for an excuse to say it's slow, or was the loading time for opening this project always this slow?

In the end, I believe the 8GB variant is suitable for most folks, while I myself will upgrade to the 16GB version. That said, I've opened plenty of large projects on the new laptop and tested speed comparisons of everyday tasks: transcoding videos, opening million-row (maybe an exaggeration) Excel docs, and web dev work with 2 VMs running. Overall, performance is fairly on par with my old machine.

All of that being said, I do believe the machine is a touch expensive. I ordered mine from Amazon when they had some ridiculous pricing and won out, so I don't feel bad offloading it on Craigslist to get the 16GB version.


Thanks, that's a helpful review. It has me wondering if it might benefit from the hardware improvements enough that swapping would be noticeably faster.


I'm thinking along the same lines. If you're interested in other tests/simulations, I can run them side by side and report back. Today I've run 2 VMs, about 20 tabs, a large project loaded into an IDE, and two shell sessions without any noticeable slowdown.


I appreciate the offer, thanks. If you have a desktop VM (Windows or OS X) laying around I'd be interested in how responsive they are with enough of them running to make things interesting. They tend to be the most demanding/least tolerant VMs I have to deal with. But don't bother unless you're really into it.


My hypothesis was that the computer "runs" better because of faster RAM and a faster processor, but because the system has 8GB, it can't be pushed __too__ much.

Testing:

- 1x headless Linux VM: 1 CPU core, 512MB RAM

- 2x MS Windows 7 VMs with display: 1 CPU core and 512MB RAM each

- IDE consuming ~987MB RAM

- 6x Chrome tabs open

- 3x Safari tabs open

- Apple Mail.app

- Other misc software running in the background

Physical memory: 8GB. Memory used: 5.96GB. Cached files: 2.03GB.

Each VM added increased swap. With one headless VM, swap was at ~50MB. Adding a Win7 VM with display brought it up to 251MB; adding a third VM with display brought it to 550MB.

CPU usage peaked at ~70% when adding VMs, with some delays in response when browsing simultaneously.

With all VMs running, mucking around in a VM and running minor tasks in the background (compressing and decompressing junk data) brought usage to about 30%; CPU usage never peaked past 50% without additional load.

Conclusion: I'm actually really happy with the laptop. The whole dongle hell really doesn't exist; in fact, I was able to remove cables from my desk. Before, I had to plug in one power cable, one Thunderbolt/DP, and one USB; now all three go into a single dongle with one cable to the computer.

For an external HD, I've been using the Samsung SSD with USB3 to USB-C for about a year, so that made life easier.

Prior, I had to carry an ethernet adapter for remote work, which was replaced by an ethernet adapter of a different kind.

General USBs, thumb drives, etc are plugged into my monitor (which has a hub) just as before, no difference.

If this laptop started at $200 less, I'd say this is a very adequate laptop for work purposes, including running VMs.


Should be fun.

I have an MS Win 7 VM for IE testing. It's downloaded from MS, so there won't be any variance. I'll run a few of them today and get back to you.


I've spent the last 4 years doing development on a 2012 MacBook Air with 8 gigs of RAM. It's been totally fine, except when I've got a million Chrome tabs open. (Declaring bankruptcy and closing them all at once feels great, though.)

The posted article is about a Mac version of Visual Studio. Coincidentally, Visual Studio runs as a 32-bit process and hence can only make use of 4 gigs of RAM total: https://blogs.msdn.microsoft.com/ricom/2015/12/29/revisiting...

The article is worth reading. They have (correctly) kept asking "why does VS need more RAM than that?" and just optimize the code when the footprint grows.

And I'm genuinely confused by all these people complaining that 16 gigs of RAM isn't enough. If you have a laptop today with 16 gigs, have a look: do you actually run out of RAM while working? (And if so, what on earth are you running?) It seems genuinely hard to fill 16 gigs without Chrome or Slack running. Look at all the stuff you can fit in that much RAM: https://www.zdziarski.com/blog/?p=6355

I'm also a big fan of pushing app developers to fix their cruft. Maybe in 2016 it's not OK to have apps that suck up as much RAM as possible. Maybe app developers shouldn't write super-inefficient software just because next year we'll have bigger computers anyway. Maybe if you're writing software (any kind of software) that really does need more than 16 gigs of RAM to work effectively, you should fix your shitty code instead of demanding everyone buy new computers. The Atari 2600 had 128 bytes of RAM and played all sorts of cool games. The original Xbox had 64MB of RAM and ran Halo. Maybe it's not Apple's fault that your fancy 3D graphics program can't work properly in "only" 16 gigabytes of RAM. (Especially given there's 2 gigabytes/second of SSD bandwidth available on those new machines. Yummy!)

I love the fact that the new machines are small and portable. The hardware is more than capable of doing everything I need it to do. The only barrier to all day battery life now is crappy software.


Talking about VS for Windows, that's the maximum footprint of the VS main process. If you use one of the WP8/W10M/Android emulators, you need an additional 1/2/3 GB for the VM (plus overhead). Throw in some browser tabs, git (in VS15 it will be in its own process instead of eating up the main process's memory), some .NET Native/LLVM, and the OS itself, and you'll find that having 16 GB gives you quite some comfort.


I, too, miss the good old days of splitting bytes into nibbles. Sometimes I program a microcontroller just to feel the walls moving in.

Software today is designed the way it is because developer time matters more than hardware specs. Getting the software running and onto the market is much more important than memory footprint. Very few companies worry about hiring an assembly programmer to squeeze out 10% more performance; they don't even use C/C++, because it's that irrelevant. Before those programmers even finish, the market has moved on and the product is obsolete.

People want the RAM because it is cheap and they can put it to good use for their data mining, video editing, virtual machines or whatever.


The atari 2600 had 128 bytes of RAM, and played all sorts of cool games.

I assumed at first that this must be a typo for "128 kilobytes". But no, you are completely correct: 128 bytes. Wow!

Atari 2600 Teardown: https://www.ifixit.com/Teardown/Atari+2600+Teardown/3541


I can't say I can really disagree with anything you've said. Good points all around. I mentioned how I use it elsewhere but I also mentioned that it may in part be a goldfish effect where I'm just being sloppy and using whatever is there and more.

Speaking of Chrome... I know of at least one browser that can avoid keeping every tab active and running but despite the cost, those tradeoffs Chrome makes lead to (in my opinion) a snappier experience.

On a related note, I know using Safari can dramatically improve battery life and may have some improved resource usage/performance characteristics, but I've never been able to get fully used to it without getting frustrated. It feels like death by a thousand cuts. For example, I can't tell if the dev tools are much worse or I'm just not understanding them the way I do Chrome's and Firefox's.


VMs! We need the memory for VMs! Not all developers are "web developers"; some of us make the real things those web devs take for granted.


The RAM you were using 3 years ago is a lot slower than the RAM in the new MacBook Pros. So there is that benefit, at least.

32 GB is a lot of ram. What do you do where you need that much ram in a laptop? Is that the limiting factor in performance for you versus another component? Have you considered a desktop/tablet combo or anything like that?


To be fair I don't bump up against it daily so this isn't some kind of deal breaker. Just disappointing.

Mostly virtualization (server and/or desktop operating systems) but sometimes decent sized datasets (which really aren't too bad from SSD) and occasionally those things combined with software that is... written poorly. I used to use both a desktop and laptop but it was more trouble than it was worth. I might have to revisit but I'll miss having those workloads local.

I have no doubt I'm an outlier and I certainly don't expect Apple to change anything. I'm not really resentful, just disappointed.


I see.

I was really just curious because for the longest time I got by with just 4gb on a MacBook Air until I switched to the MacBook and 8gb. I run VMs and other software on it but never really ran into a RAM issue.

I think there is hope, though. They'll put 32GB of RAM in eventually. The explanation was that 32GB uses too much power, and of course the rebuttal is "stop making it so thin", which I sympathize with. On the other hand, I do like thin and light computers.


I'm sure part of the problem is I grow into my available RAM and storage like a goldfish. I'm sure some attention to optimization or constraints would make things less limiting but it's nice not having to think about it and just do stuff.

I like thin and light computers too. I have a MacBook Air I use to browse the web and I love it. Sometimes I pine after a career in web dev since it would handle it like a champ.


IIRC the explanation is really "this is what Intel's stuff is supporting, and we're stuck with what Intel supports", and it would've been potentially another year or more of no MBP refresh if they waited for Intel to get there.


> IIRC the explanation is really "this is what Intel's stuff is supporting, and we're stuck with what Intel supports"

All of the available CPUs support 32 GB; the i7s in the 15" model even support 64 GB.

[0] http://ark.intel.com/products/91156/Intel-Core-i5-6360U-Proc... [1] http://ark.intel.com/products/88967/Intel-Core-i7-6700HQ-Pro...


They support > 16GB if you use DDR4. LPDDR3 is limited to 16GB.

Given that Apple maximises for long battery life/power efficiency/thin-ness (where heat = bad), I think they made the right decision.

There are very few people who genuinely need more than 16GB of RAM in a laptop computer.

Would I get 32GB if it was available? Yes. It would make my work a little easier (multiple VM environments), but it's hardly the end of the world on 16GB.


Yeah. I mean, look: they could design a laptop for the vast majority of their users, or they could design one that is best for <1% of their users. It's a clear choice.


There are a lot of use-cases for >16GB of RAM, but I'll share mine specifically. I develop network appliances for a "medium-to-large" enterprise. Specifically, these network appliances provide BGP, stateful packet filtering, and the other network services provided by our company's products.

When working on these appliances, I tend to spawn hundreds (close to 1,000, but not more on my laptop due to RAM) of VMs, and each VM has between 4 and 48 virtual networks. Then all the appliances begin working: advertising and responding to BGP updates, setting up and tearing down VPN tunnels, and other test scenarios.

Right now, when I want to do this for a network above size <N>, I can't use my laptop. I end up having to provision hosts in one of our data centers just to get my work done. If my MBP had 20, 24, or 32GB (best!) of RAM, that wouldn't be the case. Maybe in 4 or 5 years 32GB won't be enough either, but right now I'm only concerned with the immediate. If the MBPs had grown in maximum memory (like other laptop vendors' models), this would have been a great improvement to my workflow and allowed me to keep it local.

There are probably tons of more common use-cases for wanting all that RAM out there, but that one in particular is mine.


Yeah I can see that (though keep in mind that this is a very fringe case). But on the other hand I'm not complaining that my MacBook can't run any game available at 60 fps.

If I want to do that, I get a desktop. I think that has been a common theme for a long time. Power - desktop. Portability - laptop. We're asking Apple to make laptops as powerful as desktops. It just won't happen, unfortunately.


I interacted with one last week. That interaction confirmed everything I said earlier about the MBP (horrible keyboard, bad choices about ports, etc.).

I've been around the consumer tech industry a long time. I've shipped a considerable amount of consumer computing gear and am very familiar with the design process and the many tradeoffs that happen when you take something from a cool idea to heavy in someone's hands. One of the places I did this was Apple, in fact.

Apple seems to have decided to shift the market for the MBP by making tradeoffs that don't target any of the professionals I know.

I also know that (a) my predictions about the hardware were correct, and (b) none of the professionals I know plan to buy one, beyond the one or two samples we're getting into our group just to make sure we're making the right decision.

And now we're looking at Linux laptops in a serious way.

Professionals, yup.


Anecdotally, I know that I and most of my friends would never buy one at this price point for just the basic model. But we'd be fine with it if our workplaces bought it for us and paid for all the expensive cords to make a multi-monitor setup possible.


> More like "people are pre-emptively concluding, without ever having so much as been in the same room as a new MBP, that it is the antithesis of everything they need from a computer".

Many of the complaints are about plain old hardware specs, especially RAM and the USB-C connectors. You don't need to hold a MacBook in your hands to understand how much RAM it has and how many dongles you'll need to buy. Same for complaints about the price.


The keyboard is missing keys I use and the specs are unimpressive. I don't need to sit down with it to know this.


At least it is a refreshing counterpoint to the "if it doesn't suit me, it can't suit anybody" sentiment that I've seen a lot of.


I am in a similar situation... I bought my current MBP and will use it until it dies more than likely, but not sure if I'd buy another Mac. The touchpad on the MBP is second to none, which is what kept me this time but there's a massive premium there.

I use Mac, Windows and Linux daily... and honestly the Mac has the oddest UI of the three for me. Having bash is pretty nice, as is Homebrew...

I'm really hoping MS puts similar effort into a Linux version, since I know I'm not the only one going that direction... I've been considering it on my MBP...


I just bought a Magic Trackpad 2 and am using it with Windows 7. It requires the purchase of a $10 utility to get all the fancy stuff (scrolling, right click), but it's not bad.


Got a link to said utility? I'd love to have one at work (Windows)... Currently using a mouse; the Apple trackpad is so good, though, that it's the only device I don't miss a mouse with.


Agreed, I love my MacBook Pro (2015 edition). However, if I have to replace it in a few years, I'm not buying the current MacBook Pro; instead I'll look for a comparable Windows laptop and put Linux on it. (If only there were a real MacBook competitor out there.)


Something like the Dell XPS line or the Clevo P650RE may serve you well; the Clevo can fit up to 64GB of DDR4 RAM, a GTX 1070 and a 4K screen.

The only thing it doesn't have is the battery life of the MBP, but if you want raw power, you may be plugged in most of the time anyway.


No way Apple doesn't offer a 32GB option next cycle.


Except he's coming at it with a strong bias:

    No matter what you think the specs say, the fact is the software and hardware are so well integrated
      it tears strips off "superior spec'd" Windows counterparts in the real world.
    This has always been true of Macs.
Having used both lines of machines for many years, if I want raw power and 'specs' I use the PC.


His argument falls short the second you want to use non-optimized software. He even touches upon this: "I understand people need to use programs from other developers, but at some point they need to play catch up." This is rather difficult to take seriously. So you can efficiently do what Apple says you can do, but nothing else? That might work in his line of work, but it doesn't for the rest of the world.


For his use case there is a pairing of software and hardware optimized to work with each other.

In "professional" situations this is not, as I understand it, particularly unusual. And yet the unsuitability of the new MBP for "professional" use cases has been widely assumed on HN. It's interesting to see someone who actually has one of those use cases and has actually used the new MBP, weighing in to say "it works, and here's why". Not least because the level of hardware/software cooperation Apple can muster is a selling point for him, but has been ignored by all the "I'm a touch typist whose workflow consists exclusively of function keys, the touch bar makes this a worthless toy to me" noise coming from HN.


While that is not unusual, it's not what the MacBook Pro has been known for since the transition to Intel CPUs. So it fits some specific use cases at the expense of every other. And if they had made the hardware faster, all use cases would have benefited.

All those people with a different use case are right to be annoyed by that. But just as this piece completely lacks an unbiased opinion, so does the noise coming from HN (and the tech community in general).


I've also had the feeling he was strongly biased throughout the whole article


If you still feel the bias, you're not _really_ an Apple customer.


I think you conflated customers with disciples


I bought an MBP in April this year; it was my first Mac. I am quite happy with it, the Retina screen and touchpad are really good, but I don't think my next one will be an MBP.

Windows now has the Windows Subsystem for Linux, and it is actually quite good for Linux development on Windows (I manually updated it to 16.04). I don't need Cygwin anymore, and it compiles to ELF format. I don't know if Microsoft will keep it, but if the goal is to attract developers who deploy on Linux, it might attract all the ones who cannot migrate to Linux due to proprietary apps or don't want to tweak their system.

I recently tried to use Linux (Ubuntu 16.04 and 16.10) as my primary desktop (once again; I think I've tried every year since 2000) but still failed at getting two screens supported with two graphics cards, a Bluetooth headset connected, and sound through HDMI. There's no killer app for me on the Mac (I don't use GarageBand or Final Cut Pro); maybe I'll miss the viewer that is nice for PDF page re-ordering and PDF merging (I could not find a free equivalent on Windows).


>I’m an editor at Trim Editing in London, where we cut high end commercials, music videos and films.

So he's not a programmer, and not someone who was using the function keys in the first place, based on that article. Furthermore, he claims it's "faster than editing on any Windows system" because Final Cut Pro X is integrated so well with the hardware that he doesn't need more memory or CPU. Sorry, but of all the applications he could've chosen, claiming that a Windows box with more memory and a better CPU would be slower is... asinine. Fanboy alert.


> "A 'Professional' should be defined by the work they deliver and the value they bring, not their gear."

This is absolutely silly. A professional blacksmith can't work without a proper anvil. There really is something to be said about having proper tools for the job. You can't drive in a nail with a shoe, you'll need a legit hammer.


This seems suspicious:

> No matter what you think the specs say, the fact is the software and hardware are so well integrated it tears strips off “superior spec’d” Windows counterparts in the real world.

Could the statement be any broader? What are the "superior spec'd Windows counterparts in the real world" he's compared it to? Also, he's using Final Cut Pro, which happens to be an Apple product; if that's faster due to integration, how does that help anyone using non-Apple development software, which I presume is the majority of MacBook Pro users? I edit a lot of photos and I don't use any Apple software for it.


As I understand it, the new MacBook Pro has not been generally delivered yet. If he has had one for a week, it must be through some special Apple program. So of course he likes it; otherwise he wouldn't have received one.


Apple is not saying anything like this; this is just a bunch of whiners twisting things, like they did after every Mac upgrade.


>WRONG. The removal of a dozen keys from an already gimped keyboard is decidedly anti-developer.

I am a developer and I could not care less about function keys.

Why the duck would developers need function keys? Even for Vim, the age-old advice is to remap Esc so that you keep your hands on the home row.

A flexible multi-touch strip of context-aware keys can do many more things -- e.g. map debugger step moves when I'm running an IDE, trigger builds, show the SCM status of the currently opened file, etc.


If I'm debugging in an IDE or through a browsers developer tools, I absolutely use the function keys to step through/into/out of code.

What should I do instead? Serious question.


I would presume that if the application hasn't been updated for the Touch Bar, it would default to displaying Esc, F1, F2, F3, F4, etc.


Still though, at the moment I can step through code without really thinking about it, partly because I know where the keys are based on how the keyboard feels, combined with the tactile feedback of pressing the buttons.

I would worry that with no physical presence on the keyboard, I would spend a lot more time looking at the keyboard figuring out where the function key I need is than actually getting things done.

I would be keen to see a review from somebody who uses the new Macbook Pro professionally as a developer to see if this is as much of an issue as I imagine it to be.


> I would worry that with no physical presence on the keyboard, I would spend a lot more time looking at the keyboard figuring out where the function key I need is than actually getting things done.

Exactly. In Eclipse, F5 steps into a function call, F6 steps over it, F7 returns from the current function, and F8 resumes execution, and mistakenly pressing the adjacent key can be frustrating (although, with time-travel debuggers, this issue might be alleviated?).


Interestingly, when Lenovo did the ThinkPad T430 series, they removed the spacing between the F-keys (in order to fit Esc into the same row). Such a tiny change, yet how much usability it destroyed: suddenly it was impossible to use the F-keys by touch.

With T440, the spacing was back.


As a touch typist, can you please explain how that would work for me?


It's the same argument from the BlackBerry vs. iPhone days. Maybe not great for you. The question is: for most people, are configurable F keys more useful than touch-typable F keys?


As a touch typist, can you please explain how function keys are particularly relevant to touch typing?

Stepping through the debugger is not typing, and function keys change role according to the selected app anyway. And when they are system function keys (brightness, volume, etc) they are even more irrelevant to typing and/or touch typing.

Besides, there's nothing particularly hard about finding a touch based F6 key compared to a physical F6 key. A key's position (which won't change) gives more of a clue than the key's boundaries.

Heck, it's called touch typing -- a touch strip doesn't sound that alien to it.


You can feel the boundaries of physical keys, unlike virtual ones on a touchscreen. The nubs on F and J serve a similar aligning purpose as, and enhance the functionality of, the interkey gaps on the function key row.

> Stepping through the debugger is not typing,

I disagree. E.g. when you're deciding to step in vs. step past vs. step out vs. run etc.; finding the right key is extremely important.

...and I challenge anyone to hover their fingers over the respective keys continuously for hours without touching them, losing their alignment, or unnecessarily tiring the muscles of their hand.


Thanks, you answered the question perfectly. I never look at the keyboard whilst typing normally. I have transitioned to using Visual Studio 2015 in the last year, and am now also developing my muscle memory of the function keys.

I use a Logitech G910 keyboard (love those Romer G switches now, even though it took a while), and e.g. setting a break point with F9 is easy as it's the first key of the third block. For a debugging session, the rest follow - I can rest my fingers on the buttons and just step through / skip over etc - no need to look at all, the focus staying on the code.

Actually knowing when I press the button too is extremely important; there's no mistaking the action on a physical keyboard.

I also have an X1 Carbon laptop, the first gen. Fantastic keyboard (and thanks to getting the i7/8GB/256GB version, which was rather outstanding back then, it still serves me well today, even though the 8GB is getting limiting). In its 2nd iteration, they went for capacitive function keys, much to just about everyone's chagrin. Thankfully, Lenovo listened to feedback, and in the 3rd generation the function keys are back to normal, i.e. the same as mine. If they bring out a 32GB model by the time I feel I need to upgrade, I'll probably look at another one (in 1-2 years).


There are many applications that do use F-keys for shortcuts.

Not only debuggers, like others mentioned, but also some popular file managers (Windows: Far, Total Commander; Linux and OS X: Midnight Commander).

When using these applications, I can copy files using F5 - and I know it is F5 without looking, because it is in the middle and has an empty space to the left. Similarly with F8 (delete) - in the right region, has space to the right.

With a touch strip, you pretty much have to look away from the screen, down onto the strip.


>When using these applications, I can copy files using F5 - and I know it is F5 without looking, because it is in the middle and has an empty space to the left. Similarly with F8 (delete) - in the right region, has space to the right.

Well, no such space on the physical keyboard I'm using now. Not after F5, and not after F8.


You are an exception, then :)

Yes, there are such keyboards; among the popular ones, the ThinkPad [TX][245]30 series for example, and the previous rMBP too. However, most keyboards do have the spacing.


Actually, Xcode has been updated for the Touch Bar and has some useful code editing commands, but it does not show any debugging commands in the Touch Bar. Perhaps it will in the future.

It is possible to show F keys by holding the fn key, and you can configure the Touch Bar to always show F keys by default.


Are you kidding me? How can you not see that this is nothing more than an arbitrary habit that you've grown comfortable with? Imagine the function keys had never existed to begin with. Don't you think we would have come up with a different way of stepping through code with a debugger? I understand that it's annoying to have to change your habits, but you're a developer for crying out loud: Your job is literally to change how other people do work, to make it more efficient, easier to learn and so on. We all know that our users often resent us, because we change how they have to do their work. But we do it anyway, because we believe deeply (and mostly rightly) that the benefits of progress outweighs the short term annoyances of having to change habits. But when we're the ones who have to change, hoo-boy, suddenly the sky is falling. Give me a break.


Well...pretty much everything you do is an arbitrary habit that you've grown comfortable with.

Sleeping in a bed is an arbitrary habit that you've grown comfortable with, why not sleep on the floor, or in the bath?

The fact is that when developing and debugging, the function keys represent the most efficient way of stepping through code, and a part of this is to do with their physical presence on the keyboard.

I know that I don't have to use them, there are other ways to achieve the same thing, but those things have always been there and I choose the function keys because they are the best option.


> Sleeping in a bed is an arbitrary habit that you've grown comfortable with, why not sleep on the floor, or in the bath?

Because, contrary to what you say, sleeping in a bed is not just an arbitrary habit. The bed is a special-purpose piece of furniture, optimised for sleeping in. The use of F-keys to step through a program is OTOH simply an accident of history. The F-keys were chosen because they were there. Had they not been, some other solution would have been invented using the keys that were there, and (this is my point) that solution would have been just as good!


I must admit, I rarely use the function keys as they seem to be different everywhere... Mainly from years of VS usage, actually...

But I use the escape key hundreds of times a day... not having that as a dedicated key will hinder me as much as removing the backspace key would. I mean, nobody needs to go back, and you can just use the mouse with cut...


>Sleeping in a bed is an arbitrary habit that you've grown comfortable with, why not sleep on the floor, or in the bath?

Because your back will hurt, so not that arbitrary after all.

You will have no adverse effects of using an alternate method to step in the debugger.

>The fact is that when developing and debugging, the function keys represent the most efficient way of stepping through code

Citation needed. They are at best a random accident. Any other keys or key combos could be used.


> The fact is that when developing and debugging, the function keys represent the most efficient way of stepping through code, and a part of this is to do with their physical presence on the keyboard

Emacs users are blinking skeptically.


> an arbitrary habit that you've grown comfortable with

You mean a workflow?

Who are you (or anyone else) to decide what works best for me? Am I not capable of making my own decisions? Do I really need a hardware company making those choices for me?


>Who are you (or anyone else) to decide what works best for me? Am I not capable of making my own decisions?

Most people aren't -- from politics to personal finances and relationships, there are tons of bad decisions everywhere one looks. (Including my decision to answer this comment some would say -- heh).

We have schools, best practices, guidelines, standards etc, to try to enforce some good decisions upon people.

That said, if one feels strongly about it, there's always the decision NOT to buy such a laptop.


> That said, if one feels strongly about it, there's always the decision NOT to buy such a laptop.

And many people are making that decision, so what seems to be your problem with this?


>And many people are making that decision, so what seems to be your problem with this?

No problem with this.

My problem is that they frame it as if their personal habits/users are universal, and a computer that doesn't cater to these is inherently bad (as opposed to just bad for them).


> Who are you (or anyone else) to decide what works best for me? Am I not capable of making my own decisions? Do I really need a hardware company making those choices for me?

Did you not read beyond my first sentence?

Who are we as programmers to decide what works best for our users? Were the clerks at the bank not capable of deciding for themselves if their pen and paper workflows worked better for them than the computer programs we made to replace them? The typographers of yore were almost certainly more comfortable and faster using a linotype machine than this newfangled desktop publishing software that we invented. I simply cannot wrap my head around people in our profession who kick and scream because the march of progress once in a while makes their lives a tiny bit uncomfortable for a short while.


> Who are we as programmers to decide what works best for our users?

Where did I say that? You seem confused.

And your argument regarding publishing and banking software is a strawman intended to shift the focus away from the real argument - that of choice. Forcing a change on my workflow can have very real effects on my ability to generate income. Why should anyone be ok with that?


> Who are we as programmers to decide what works best for our users?

Where did I say that? You seem confused.

I am saying that we as programmers force people to change their habits all the time. We do it to in the name of efficiency and progress. We eliminate workflows, we make entire jobs redundant. We of all people should be able to recognise that even though change is uncomfortable, it is inevitable, and mostly for the better.

> And your argument regarding publishing and banking software is a strawman intended to shift the focus away from the real argument - that of choice

Please. Even if we pretend that you don't still have the option to use a third party keyboard, or buy one of the Macs that still have the f-keys, what about the people who would prefer the new touch bar to the f-keys? What about their choice?

> Forcing a change on my workflow can have very real effects on my ability to generate income

I'm sorry, but that's ridiculous. You are not going to feel a very real effect on your ability to generate an income simply by being forced to learn a different set of shortcut keys to step through a debugger.


>Forcing a change on my workflow can have very real effects on my ability to generate income. Why should anyone be ok with that?

How is that different to any workflow used (and could be preferable) by millions of people that's deprecated due to new software programs?

Not to mention software that entirely kills their job and their ability to generate income from doing it altogether?


Well, in this case, it's not deprecated. It's deprecated by one computer manufacturer. There are more than enough other computer companies still willing to sell you a keyboard layout like the one you've been used to for the last 30+ years.


> Who are we as programmers to decide what works best for our users?

The problem is, say you're an iOS developer, you get no choice; you HAVE to run a Mac and be at Apple's mercy.

Other programs usually have decent alternatives, or you can customise them to suit you.


> The problem is, say you're an iOS developer, you get no choice; you HAVE to run a Mac and be at Apple's mercy.

Firstly, it's not true that you don't get a choice. Apple still makes laptops with f-keys. And you can always plug in a 3rd party keyboard.

Secondly, and more importantly: of all the options Apple doesn't give you (and there are an infinite number of them), this one is so minor. What, other than this being how you are used to it, are the important reasons for using specifically the F-keys to step through a debugger? What is wrong with any of the other keys?

I agree of course that change merely for the sake of change is not a good idea, but surely, surely we can all recognise that Apple did not make this change on whim, simply to try something different?


You can use Visual Studio on Windows to write iOS apps with Xamarin or Cordova, using a network-attached Mac solely as a build server, without ever having to use it directly (except for updating stuff, via VNC).


It kind of is... this reminds me very much of moving from a BlackBerry to an iPhone. Sure, the features of the iPhone were great, but for my (at the time) primary purpose of using the device for sending emails, it was a MASSIVE step backwards. I could touch type nearly as fast on a BlackBerry as on a regular keyboard. Moving to a touchscreen meant I had to look at the screen while typing. It slowed me down tremendously, and IMO was a massive step backwards in usability.


It's an interesting comparison, but IMO not that apt. Going from a physical keyboard to a touch screen, something is definitely lost (even though much is also gained). The removal of the function keys will at worst force people to memorise different hotkeys (any reason why the number row could not serve the same purpose exactly as well?), and at best it will make providers of IDEs and other productivity software revisit old assumptions and improve the usability of their software.


They aren't just function keys. Losing the tactile feedback of volume and screen brightness, while not THE END OF THE WORLD, will very quickly become an every-day annoyance for me. In order to... have a contextual touchscreen that forces me to take my eyes off the monitor to use?

Sorry, I'm just not seeing the draw. I'm also struggling to buy into the "not everyone is a touch typist" excuse. Anyone under the age of about 40 has had a typing class. Anyone under the age of about 25 (who is using a computer for their job/attending college) knew they were going to spend the rest of their life using a computer and probably paid attention.


>They aren't just function keys. Losing the tactile feedback of volume and screen brightness, while not THE END OF THE WORLD will very quickly become an every day annoyance for me. In order to... have a contextual touchscreen that forces me to take my eyes of the monitor to use?

Most people "take their eyes off the monitor to use" the function keys anyway. Given this, the function-row strip will finally be more usable and more obvious for lots of other uses besides volume and brightness (things that people adjust at best a dozen times a day).

Most laptop users are neither programmers nor touch typists who use the function row 2,000 times a day. Nor does being a "pro" user mean you are either of them. A graphic designer might not be a touch typist or care for the F row, but he is a professional. Same for a doctor, an architect, a musician, a videographer, a CEO, an accountant, etc.


> Sorry, I'm just not seeing the draw

"Sorry, but I'm just not interested in this motorized wagon you've invented. It's noisy and ugly. Could you please just go and invent me a faster horse instead?"


So you're comparing a touch bar, which has already been done before and met with the exact same pushback, to a radical new form of transportation? Fanboy alert.

This is the next Apple Watch in the making.


>This is the next Apple Watch in the making.

So, a product that went from non-existent to being the #1 sold item in its niche, the #2 in overall watch sales, and outsold competitor smartwatches 10 to 1, becoming a multi-billion dollar thing?

And all that in its first 2-3 years (it took longer for the iPod to become ubiquitous after its 2001 introduction), and while not, of course, being expected to become the next iPhone anyway...

Yeah, some failure.


No, I am making the observation that most users are conservative and resist change, even for the better. And that most people can't recognize progress even if it hits them in the face.


> any reason why the number row could not serve the same purpose exactly as well?

Then you lose the ability to change the code while you're debugging it, or end up with a modal UI. No thanks.


Do modifier keys (control, alt, command, fn, shift) fall under your definition of modal UI? If so, how do you deal with the fact that your keyboard doesn't (I'm guessing) have dedicated keys for cut, copy, and paste? I think its safe to assume that everybody uses those significantly more often than they step through a program with a debugger.


Imagine the function keys never existed.. okay! That made it all better.


If you're using an MBP at the moment, then you're already pressing the fn key to use the function keys for anything other than the media stuff they've been designed around for years. Your workflow is exactly the same; you're just using soft keys instead of physical ones.


Or change the preferences so the function keys work as expected and you have to use the fn key to access the media options. I probably use the function keys 100x more often than the media ones.


There is an option on Macs (and basically every other computer) to flip that, so Fn is required for media functionality instead.


You can change that setting, and I imagine most developers who rely on function keys do, so no I'm not already doing that.


Well then it's still exactly the same? You can either press fn or change your preferences. You opted for the latter. Do that again?


So the touchbar has a setting where it behaves precisely like the current MBP? Esc, row of 12 keys, which are either f1-f12 or media with a setting, and are toggled between with fn? That's strictly worse if I only ever use it like that because it's the same but with soft keys, but sure that's basically the same.


No you don't, because this is a Mac, and macOS doesn't use the F keys ever.

If you're using Windows, the F keys appear on the Touch Bar. /story


The debug step keys will show up in the touch bar? If they are not stupid enough to swap them around sometimes, then they should stay in the same position.


>What should I do instead? Serious question

Use the strip bar in debugger keys mode, or function keys mode for non-updated apps?

Or use printf statements (seriously -- I never advocated for much debugger use, unless it concerns very focused stepping. A lot of people step all around and examine everything and anything for ages with no clear idea of what the bug might be).


Use Fn and numbers as function keys and you're done.


More than a few developers have spoken up on the issue here on HN. The escape key is commonly used, as are many other function keys. Other vendors have rolled out similar gadgets to apple and they've been rejected by the development community at large - which apple would have known because they don't implement things like that willy nilly.


Since we're talking about what is basically a rebrand of Xamarin Studio: the current release of Xamarin Studio for Mac uses the function keys for stepping in the debugger.


Well, now they can use the touch bar to show those functions depending on context. Isn't that better overall for usability rather than having to remember keybinds for every action?


I usually don't have to _remember_ keybindings; I use them out of muscle memory, feeling the actual keys with my fingers and using the larger gaps between the function-key groups on a desktop keyboard to find the right one quickly without averting my eyes from the screen. No touch bar can replace that kind of efficiency.


Maybe, but for the average user who doesn't have keybinds committed to muscle memory, the touch bar is going to be a real boon.


So... the common non-pro user that would look down on the keyboard anyway?


If I've got to constantly look between the screen and the keyboard to make sure my finger hasn't wandered off "step into" and onto "continue", then that feels much worse for usability.


It seems like it would be better in the short term (don't have to learn the bindings), but worse in the long term if it's something you do often (no way to find the keys' edges without looking every time you press the key, with the occasional glance down again to correct for drift).

Whether that's "better overall" or not depends on individual usage patterns. Personally, I don't really stray from the alphanumeric part of the keyboard often, and I can rebind esc, so the Touchbar's mostly a moot point for me. My strongest reason to dislike it is that it adds yet another set of potential points of failure for the machine without providing (for my purposes) much benefit.


For those apps you can just enable "function mode", if it's not already enabled by default.


> they've been rejected by the development community at large

I remember the time when everybody was getting one of these: https://elitekeyboards.com/products.php?sub=pfu_keyboards,hh...

Apple tinkering with the function keys has, it seems, made the development community rediscover its love for IDE function keys.

It also seems that people claim to confidently touch-type debugger instructions, despite the key you want being literally squeezed between two keys that will ruin your debugging session and cost you the next 10 minutes.

I think the developer community at large is prone to overreaction.

That said, I do think some people are going to be impacted. The touch bar is a downgrade for people who need to use Windows in a VM or via Boot Camp. Some IDEs are especially F-key happy; Eclipse users will be slightly worse off.

I still type on an old MBP with physical F-keys and they are actually awful for touch typing: they are smaller, evenly spaced (not grouped) and not aligned with the lower key rows. If you are serious about touch-typing potentially workflow-ruining key combinations like the debugger ones, you must already be using an external keyboard.


In Terminal, I have mapped common commands that I use all the time into macros assigned to function keys. For my Ruby workflows, I have various rake commands a single key press away. For when SSHed into Linux servers I have stuff like "sudo apt-get update && sudo apt-get upgrade" ready in a split instant. I tell you, it really is a party trick at work.
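For anyone wanting to replicate this, one way to do it is with readline macros in `~/.inputrc`, so the bindings work in bash and any other readline-based program. This is only a sketch: the escape sequences below are the common xterm-style F-key codes, and they vary by terminal (press Ctrl-V followed by the key in your terminal to discover yours); the bound commands are the ones from the comment above.

```shell
# ~/.inputrc - readline macros bound to function keys.
# Escape sequences are the usual xterm codes; adjust for your terminal.
"\e[15~": "rake\n"                                        # F5: run the default rake task
"\e[17~": "sudo apt-get update && sudo apt-get upgrade\n" # F6: update a Debian/Ubuntu box
```

Reload the file in a running bash session with `bind -f ~/.inputrc`. Note that readline macros are local to the shell, so the F6 binding types the command into whatever is focused - including an SSH session.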


You know, the touch bar would have been awesome if they had added it above the function key row.

And used the second-layer functions (volume, brightness, etc.) as the default for the touch bar.

That way the touch bar is useful and the functionality of the function keys is kept. I would have bought the new MBP if that's what they did.

Now I'm going to get a dell :/


>Now I'm going to get a dell :/

Give it a year or two, and most PC laptops will come with a touch bar.


I care. I don't remember that advice, and I never have trouble relocating my fingers in under a second.


So you'd rather press F5 to build and F6 to debug instead of tapping "Build" or "Debug"?


Absolutely. You shouldn't be looking at the keys anyway, and that becomes a lot harder if you don't have tactile feedback.


I don't know if it's just the usual whining of people complaining about new things.

From the outside (I'm too young; the only Mac I've ever used was the Macintosh Plus - bragging, with 4MB of RAM - while I was a child), it really seems like Apple has dropped the ball for professionals, both developers and graphics people.

There are now equally well designed PCs (laptops and desktops) from other manufacturers, Windows has a lot of support, if you want to use Linux, the kernel now supports a wide range of hardware.

In the meantime, Apple doesn't have a good desktop offering, and their laptops seem gimmicky to me while not offering a performance edge over their competitors.

A few months ago I thought about buying a (my first) Macbook and waited for their announcement, now I'm looking the other way.

And around me, I'm the guy people go to ask when they have a computer related purchase to make.


> equally well designed PCs

That bit is debatable.


I agree, PC manufacturers haven't quite caught on that less is more (at least aesthetically).

I see machines like the HP Spectre and feel they are still somewhat overdesigned.


I don't know where you've lived the past 20 years, but these are maybe the most classic arguments against Macs in business.

Macs have never been "for pros" if you go that way. So I guess it confirms that it's not the company that shifted its focus; maybe there are just more potential customers expecting Apple to go more "pro".


They may be classics, but are they wrong?

Yes, Apple has never targeted businesses, but they did target individual professionals at some point. Video and sound editors used to work on Macs, and are now moving to Windows. It's a classic, but is it wrong?


I guess as always it will mostly depend on your use case.

However, this recent announcement from IBM might emphasize the need to reconsider these years-old arguments to take account of new facts: https://www.jamf.com/blog/debate-over-ibm-confirms-that-macs...


I knew about that (and Google's Macs).

But for standard businesses with relatively technologically inept people, when you need Active Directory, you just need it.


Microsoft announces VS for mac, HN thread turns into I hate macs rant. Sigh.


I feel so strange about this.

On one hand, Microsoft has stepped up its attention to individual professionals (e.g. designers, developers, etc., more or less like Apple 15-20 years ago) while Apple seems to be forgetting them.

On the other hand, Microsoft is infamous for privacy issues and for its forced upgrades and updates, whereas Apple emphasizes privacy.

I like MacBooks as laptops, but on the software front Apple seems to be resting on its laurels, and developers seem to use MacBooks either because of Unix or because there's more money in the App Store than elsewhere.


People aren't going to do development on a tablet or a smartphone anytime soon.


Don't speak too soon.

"Why would you buy a PC anymore?" - Tim Cook on the iPad Pro launch.

Don't be surprised if XCode for iPad Pro is a thing soon. It's becoming clear he sees iPads as the future of computing; it's all he uses personally, and developing its own apps is one of the last things the iPad actually can't do.


You suggest you will develop any serious software without access to a file system? And on a 10-inch screen?

Also, developers like to use their own tools. That doesn't fit well with the "you will do as you are told" approach of Apple.


and only $150 for their external keyboard with touchbar!



iPad Pro using MOSH into a Linux box running tmux and vim. A great development environment for me - 11 hour battery life, built in LTE connection, huge screen that I can layout how I want, easy to carry around with me and I can use it for drawing and sketching UI concepts for my clients before I start on any work.

I don't use it 100% of the time (I also have a desktop machine) but it works extremely well and offers me things that a laptop cannot.

EDIT: It also doesn't have an Escape key - but I use Ctrl-C in VIM as I don't have to move from the home position then
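A minimal sketch of that workflow, assuming a remote host (here the illustrative name dev.example.com) with mosh-server and tmux installed, and an iOS SSH client that supports mosh:

```shell
# Connect with mosh (survives roaming, sleep and flaky LTE) and
# attach to a persistent tmux session, creating it if needed.
mosh dev.example.com -- tmux new-session -A -s work
```

If the connection drops, rerunning the same command reattaches to the same session, with vim and everything else still running where you left it.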


> People aren't going to do development on a tablet or a smartphone anytime soon.

My setup proves you wrong: Nexus 7, bluetooth keyboard, a Debian chroot with git, vim, python and apt. I take that with me to sketchy places instead of my laptop. I think a 10-inch tablet could be more comfortable to read, but I love the Full-HD screen and pocketability of the 2nd-gen N7. I find it amazing that 3 years later, no other Android tablet has a comparable screen resolution.
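For the curious, the core of such a setup is just a bind-mounted chroot. A rough sketch, run as root on the device - the rootfs path is illustrative, and it assumes a rooted device with a Debian root filesystem already unpacked (e.g. built with debootstrap on a PC):

```shell
# Enter a Debian chroot on a rooted Android device.
ROOTFS=/data/local/debian        # illustrative path to the unpacked rootfs

mount -o bind /dev "$ROOTFS/dev" # expose the host's device nodes
mount -t proc proc "$ROOTFS/proc" # mount a fresh procfs for the chroot
chroot "$ROOTFS" /bin/bash -l    # start a login shell inside Debian
```

Once inside, apt, git, vim and the rest behave as on any Debian box, sharing the tablet's kernel and network.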


While I agree with you there is that thing for trying Swift on iPad.


It's a toy for learning, nothing more.


It could be an MVP.


It is, for learning basic Swift. For developing you want to use a setup that allows you to handle thousands if not hundreds of thousands of lines of code. A single-process touch interface isn't that, I think.


Yeah, Microsoft's recent attitude toward development tools is excellent. They open-sourced a lot of code, released Visual Studio Code, and integrated Windows with a bash shell. I assume Apple is becoming a less dev-oriented company, seeing that they are making light of pro tools like the Mac Pro and the function keys.


While I agree... MS DevDiv must be an interesting place to be right now. The Linux subsystem for Windows sucks so bad, and the Windows containers for Docker feel half-baked as well... use the msys bash that comes with git instead.

That said, I'm happy to see this and do hope to see a similar effort for Linux, as many Mac users are starting to move on. This may well be an indication that the next VS on Windows will be based on the MonoDevelop base, should they want to unify that.

I've been very happy with VS Code for my needs all the same. Can't recommend it enough for js/Node.


For those of us that only care about XCode and Swift/Objective-C frameworks, Apple is doing just fine.


And those of us heavily dependent on UNIX tools just install homebrew and are fine as well. It's actually better than it ever was.

I have no idea what these self-appointed "pros" actually do. Their work seems to involve an awful lot of swapping hardware components, attaching peripherals in a jurisdiction that frowns upon adapters, and pressing ESC.


That's quite an exaggerated presumption. Office for Mac has been around for years. VS Code will more than likely give Sublime a run for its money, as they are very similar on the Mac platform. Just because Microsoft ports VS, it doesn't reveal anything other than giving developers who prefer Macs another IDE choice. You can get an MBP without function keys, and just because Apple has decided to evolve its approach to the keyboard doesn't mean anything beyond that. It's very interesting that this has become such a hot topic in the dev community. Apple isn't abandoning devs, pure and simple.


Office for Mac is also a shadow of the Windows version. Excel is still single-threaded! Madness.


Do most developers really need anything with a CPU more powerful than an iPhone 5 to build apps or web pages? In most cases if they do they're doing it wrong.


Yes, in many cases. Building is very CPU- and memory-intensive. Many times you have to run local servers to debug against. I often find myself running local servers, building my apps, running my IDE, creating assets in Photoshop, and more, all at once.


It helps. No reason to artificially limit processing power available. If I spend seconds more waiting for my Go command line program to compile, or Rails to boot up, or a complex web page to load, or my VMs to respond, or Elasticsearch to boot etc. etc. it all adds up. It's time that I can spend doing other stuff.


True, but we were all using most of that same technology when a core2 duo was a top end laptop, and it's not as though productivity has gone through the roof just because we can get an i7 laptop.

Mobile CPUs are approaching mobile Core 2 Duo performance.


Until you need to work with something that requires babel/gulp/etc builds, it's fine.

The second either is added to the project, you will need a much more powerful computer.


True those are slow. But I think most projects could compile a lot faster if the dependencies were compiled into a separate bundle once a day and the app code into its own bundle when it changed.


yes.

It's not needed if you're building a JavaScript app with Sublime Text.

But compiled environments such as .NET take a lot of CPU and memory. It's not uncommon to have 2-4 Visual Studio instances open, which can take up 10 gigs of RAM. Couple that with continuous build/unit test frameworks and 5-10 VS add-ons and you need a powerful machine to keep it responsive.


I'd argue that some of what you describe should probably be an anti-pattern, or is at least a byproduct of very fast, cheap CPU power. Emacs has worked for years on much lower-end hardware and has been used to build some very elaborate systems.

These days a typical developer laptop is 5x more powerful than the system the code will be used on (mobile device, virtual server, etc.)


I do not disagree with anything you said. The only reason for powerful developer workstations is that each second of delay is compounded when you use your machine as a tool. I do use lightweight tools, but Visual Studio is still extremely powerful and not easily replaceable (I've been using it every day for the past 10 years). Visual Studio is NOT fast, especially when you add ReSharper into the mix. You could go on arguing about which IDE is best and whether ReSharper is necessary. The fact is that lots of people still use it, and a powerful machine is needed today to make it responsive.


Macbook pros haven't been for 'pros' for at least 4 years: http://daringfireball.net/linked/2016/11/08/reddit-mbps


I think Microsoft is saying that they want the market, they want their products running on all platforms, and not that "Mac OS is a serious platform".


Um, Apple also makes an IDE.


Page has gone. Other people (buried in the comments) have posted the google cache of it, but here is it again for visibility:

https://webcache.googleusercontent.com/search?q=cache:Vk2On-...

Maybe a Feature Request for HN, would be for a 'alt' link (that mods update) as part of the clickables under the post title?


Since popular HN submissions will often hug the site to death, a nice feature would be to automatically check the top 3 caches (archive.org, Coral, Google) immediately on submission, before the page goes live on HN. If the cache doesn't 404, the content could be quickly parsed to check it matches the submission and, if so, automatically include the cache link at the top of the submission page.

This would save people manually posting these all the time and would, in many cases prevent the case where it becomes impossible to retrieve a cache, because nobody thought to access one before the slashdot effect occurs. Or, as in this case it seems, the article is pulled.

It would also be nice if, 24-48 hours after submission, the only cache link remaining is archive.org (if they have the page), so the content is retained permanently as-submitted. It's rare, but sometimes a page will be updated so the comments no longer make sense.

It would also be nice to include a link history in the same area (have requested this before), in case the original submission is changed by the mod. Usually when this happens the notice is the top comment, but sometimes it isn't and the discussion can be quite confusing as a result.


Or, they could just have a live link AND a cache link and let people click what they want. A story with a cache link is likely to be upvoted more, encouraging people to do it.


I wasn't suggesting only displaying the cache link, but rather providing a list of alternate links (if available - not all sites allow archival, or the cache might be stale) so they are always there at the top of the page.

It's not exactly clear what is best netiquette regarding linking because larger sites that rely on advertising and can handle the traffic will welcome more clicks, but smaller sites would rather be cached. Better for HN not to encourage or discourage either, but give both options and let the readers decide.


That webcache link is throwing a certificate error in Chrome for me. I can't go there due to HSTS.

Here's an internet archive link that works for me: https://web.archive.org/web/20161114070745/https://msdn.micr...


Strange, not sure if this is a bug or if MS published too early and retracted it?

Edit: Thanks for the tip BTW. I had linked to this from a blog post, but have now removed the dead link (https://unop.uk/cross-platform-native-mobile-app-development...).


Almost definitely the latter. It was tagged `Connect()` which is a MS conference happening later this week. It's probably going to be announced there.


Ah OK, it was probably under embargo and got posted early by mistake. I've been to Connect(); before in a time zone ahead of the US and nothing gets announced in the morning until the US wakes up and the embargo is lifted.

Edit: FYI Connect(); is November 16th-18th starting 9:45am EST https://connectevent.microsoft.com


So as of now it is a rebranded and polished Xamarin Studio - hopefully they have improved its usability, as in the past it was pretty lacking compared to VS 2013/2015.


As I understand it, Visual Studio for Mac is simply a re-branded Xamarin Studio and will continue to be. It includes the improvements they had planned for the next release but I doubt they will do a rewrite.

I actually prefer Xamarin Studio over Visual Studio (on Windows) in some respects. For example, the Xamarin.Forms XAML previewer is much better. Looking forward to a full designer.

P.S. I've got a four part blog post series this week on Xamarin.Forms. Starting with this today: https://unop.uk/cross-platform-native-mobile-app-development...


Visual Studio for Mac is a re-brand of Xamarin Studio. However, perhaps more importantly than the software itself, opening the Visual Studio brand to the mac environment signals Microsoft's commitment, yet again, to meeting developers in their chosen environment. Xamarin Studio is also based on MonoDevelop, so maybe this could accelerate a VS linux port, if they're so inclined.


I just started looking into Xamarin Forms and couldn't figure out whether it was just blindly typing XAML or whether there was a way to preview it. Thanks for the posts.


Yeah, for now you have to type the XAML but a designer is coming. There is a preview of a previewer but I couldn't get the one in VS to do anything. The XS previewer did work for me though. Make sure Android Studio / ADK is updated if only iOS is working.


I've only tried in VS so far. I'll check XS. Thanks.


I think Microsoft will end-of-life Xamarin Studio and convert it to the Visual Studio brand, with support for just OSX and Linux.


I wouldn't bet my money on a Linux release


You should, Microsoft will probably release an MS Linux distro soon. (Besides the Debian or whatever they now have for Azure users, I mean.)

MS is going all Oracle-y on us, withdrawing from the consumer market as fast as they can, and heading for the green pastures of the corporate market, where fat happy companies are there to be milked. They are sick and tired of fighting for consumer money one penny at a time.

(XBox is probably not long for this world either.)


The massive investment in the surface line disagrees with you.


.NET Core is for running apps on Linux. For a large chunk of the server market, that is plainly a requirement these days.

But most people writing the apps which then run on Linux, develop them on Macs (and the point of WSL is to give them a reason to consider Windows for this).


If you were just doing general .NET work, you'd probably be better off going with JetBrains' Project Rider on Linux than waiting.


+1 for Project Rider. The EAP is pretty good at this point, it supports debugging for .NET Core and decompiles dependencies when using Go To Definition.

It still has a few bugs and hangs up on me once or twice a day. I think they should have those ironed out soon though.


Really? They seem to be opening up quite a bit more with regards to Linux (.NET Core, WSL, etc).


To me it looks like they are giving people running linux the minimum tools to run that stuff, but at the same time not giving enough to encourage devs to stay on Linux (WSL goes in the opposite direction, its purpose is to help devs move to Windows). But this is all IMHO, take it with a grain of salt.

By the way, since it's based off MonoDevelop, you can just use it on Linux.


I've been using Visual Studio Code on Linux to do some TypeScript development. It's actually pretty nice, and works really well in a general JavaScript environment (npm, etc).


We have been waiting for years for a simple Xamarin Studio for Linux, even just supporting Ubuntu and letting the other distros manage the packages themselves... The reality is that they don't seem to want to bring Xamarin to Linux at all; sadly, Miguel was very clear about that over the years.


IIRC he said they discontinued Xamarin for Linux because of a lack of demand


Thats what this move is. VS for Mac is a rebranded Xamarin Studio.


I'm glad to see more Microsoft dev tools on other platforms, but don't lose sight of why this is happening. Microsoft is shifting its business to the cloud. They make their money off Azure and other services. In other words, they are making their money mainly off of developers now, and it's in their best interest to get on the good side of devs, which is why they suddenly have a vested interest in open-sourcing tools and helping Mac/Linux. Given the love and lavish praise I see heaped on Microsoft in every thread whenever they do something, it's clearly working. I'm not saying don't praise them when they do something good, but don't be deceived into thinking they are doing it out of good faith.


> don't lose sight of why this is happening. Microsoft is shifting their business to the cloud.

Why is this a bad thing? Microsoft is a company, they exist to make money. A huge new market for them is 'Cloud' and they're doing everything they can to make that as appealing as possible.


If you want to be the tallest tree in the forest, you can take in the most water and sunlight to be taller than everyone else, or you can chop down all the other trees so you're the only one standing.

Back when they made money from selling operating systems, they were definitely doing some tree chopping with some of their practices.


Did they say it was bad?


No business operates out of "good faith". For a long time, people treated Google like it was a nonprofit with all of the talk about making the world a better place, rather than recognizing it as, first and foremost, an ad company. As you point out, making people like your company is just good business, and successful businesses do that.


What's the long-term business goal for Azure lock-in? If .Net runs as well on Linux as it does on Windows, then there really is no reason to use Azure over any other cloud provider like AWS where generic CentOS or Ubuntu boxes are no different than their Azure counterparts.

Back when .Net was Windows-only, they gave it away because the goal was that developers would pay a lot of money for MSDN, GUI apps on Windows, SQL Server, Office and SharePoint integration, etc. But .Net Core is mostly server-side, so I'm having trouble figuring out why they'd bother giving away VS to Mac users without them being forced to run on Azure in production.


> so I'm having trouble figuring why they'd bother giving away VS to Mac users without being forced to run on Azure in production.

It's the trickle up theory of modern development. Get folks using your languages and tools, and if you make it easy to integrate with your cloud services in those tools, that's where the revenue comes in.

It would probably be an interesting number: how much revenue various cloud providers generate from folks who just forgot to shut down their VMs.


Because AWS treats the cloud as infrastructure. Microsoft treats the cloud as infrastructure + software & services, and they have superior software. Using their tools means exposure to their software, which means that a lot of people will choose to pay for other software.


Literally every business does this. Apple, Google...


Most interesting part for me:

"""For the functional programmers among you, it includes excellent F# support, powered by the same F# compiler used in Visual Studio."""

I've heard that F# is great from multiple people I trust a lot (and a quick cross check showed it does indeed look very cool) so I might give it a try once this is released.

I do some C# development (Unity Engine stuff) on my Powerbook so this is also good news (MonoDevelop is fine but I'll obviously test VS for Mac)


F# is well supported by Visual Studio Code (with the Ionide plugin) too if you want to try it sooner.


This will essentially be a rebranded Xamarin Studio and tweaked to look more like VS for Windows. And Xamarin is basically based off MonoDevelop.


I started converting some c# code to f# as a learning exercise about a month ago. After a few fits and starts, something clicked and I am amazed at how much I like it. I have been following the articles on "F# for fun and profit" and highly recommend it.


OCaml guys don't seem to like F#, as its implementation differs from what many in the FP world believe is "correct", but any form of FP on .NET is a good thing, I think.
