One of the many inspiring stories on HN. The methods sound great, the company sounds great, and I'd want to either work for them or start my own company like that.
But I'm a student, and I guess I should finish my studies rather than drop out. Despite the great tales of Bill Gates & co, it seems like the right thing to do. It also gives me a chance to look behind the scenes of "real life" companies. Internships.
And that's what I want to ask about: All companies seem so traditional... There is no space for the methods described in all these HN stories; no time for innovation. "The customer wanted these features last week, do you think we have time for [you name it]?"
So I don't know what my question really is... It's just, these stories sound so good, but during internships, it turns out they're no more than stories. Or are they? Or is it an American thing or something?
In addition to finishing your studies, consider working for a while at one or two startups in entry-level roles (as opposed to internships). Most interns I've seen were not given enough responsibility and insight into the company to really see the mistakes that were made or the decisions that led to success. Even if the startup fails (and most do), you get to see first-hand how easy it is to screw things up.
FWIW, it took me 15 years of working for others before I felt like I was prepared to start my own company. YMMV.
Much of what you hear on HN regarding agile and modern startup practices is a case of William Gibson's aphorism: "the future is already here, it's just not evenly distributed."
I've worked in startups that fit the HN stories but that also became classic failure stories leading to their demise (agile development and a few good customers, but no Steve Blank-style customer development, for one).
I've also worked in large companies that don't have time for innovative processes and approaches beyond "the way they've always done things". But they do often have good executives or managers who can provide air cover for pockets where new approaches thrive. Those pockets tend to be fragile to politics, however.
The algorithm I've used is pretty simple:
a) Hunt for a place you can learn from
b) otherwise, hunt for a place open-minded enough that you can learn by leading and making mistakes
c) goto a)
I guess I've been in your situation as a CS student. I dreaded the thought of working for one of the big Dutch IT companies or consulting firms and their string of boring and/or failing projects. None of the Dutch start-ups appealed to me, I didn't feel ready to start my own, and I figured I could learn more from success than from failure.
If you're studying computer science at university, I can really recommend getting an internship at a major American company. It's very useful to get a peek into the kitchen of hugely successful companies. They can be a bit tricky to get into from Europe, and some universities give better access than others. I did an MSc in Parallel and Distributed Computer Systems at VU Amsterdam, where an internship at Google, Facebook, or Amazon is almost a standard part of the program. Of course, you do have to pass the interviews yourself, and work very hard in general. Most of the TUs also have pretty good access to Google and Microsoft internships.
I did a 5-month internship at Amazon right after I finished college, and it was a fantastic experience. Amazon is a start-up-like environment, with small, autonomous teams churning out new services and features all the time. You typically get a good deal of freedom to plot your own course and rethink the way things are done, if you dare. "Disagree & commit", they like to call it. There are tons of really smart and experienced people to learn from, very interesting problems that force you to really understand distributed systems top to bottom, and some brilliantly simple solutions to problems such as large-scale software deployment. One of the best parts of the experience is figuring out how the business works and why things are done in a certain way. I was part of launching a new Amazon Web Service called CloudFront during my time there, and came up with a couple of new concepts that are still in use today. Different companies have different cultures, but I know many people who had similarly good experiences as interns at Google or Facebook. The experience does depend heavily on the team you end up in, but also on how well-rounded and adaptive you are by then. Take your time.
I can also really recommend studying and finishing Computer Science. University is a unique chance to develop independent, analytical thinking together with your peers and to build up a solid background in the field, which will give you a huge advantage throughout your career. Remember that for every dropout millionaire, there is someone like Larry Page, Sergey Brin, or Jeff Bezos with a prestigious cum laude degree outsmarting them in the long run. Speaking of which, try to aim for cum laude, not because anyone cares, but because working hard on hard topics makes you smarter.
After the internship, I went on to do a PhD in cooperative self-driving cars, because I found that a hugely interesting problem, but I kept a part-time software development job at Amazon.
btw, in case you didn't notice, Mendix is originally a Dutch company, so you might not have to look very far.
I have seen many Java-based process modelers fail to live up to the promise of letting "business users" create workflows without writing any code. It appears, however, that Mendix may have figured out how to make it work. I think it has to do with the entire closed-loop solution (develop, deploy, monitor, app store, SVN integration, domain model management, DSLs, etc.). Without those supporting pieces all working together, developers would inevitably need to be involved, thus stunting business-user adoption.
Agile is mentioned quite a lot in this article, but something doesn't feel right. It is rather hard to conquer some ground in the Enterprise world as an Agile start-up. I don't think big companies are ready for this.
agile is particularly hard once you have paying enterprise customers on any kind of SaaS - they demand stability because they have live, costly processes running on top of your stuff. any bug, regression, or UI change will be met with hostility (down to legal).
that kills a lot of agility, as QA suddenly becomes way more important - iterating on something that isn't working right is simply not possible. and getting rid of technical debt becomes a very interesting exercise.
as long as you're building that plane on the ground, in the hangar, it's technical nerdvana. but once it's in flight, with paying customers who pay for eternal uptime, you suddenly realize why SaaS is freaking hard - and pure agile does not cut it (pure as in what's out there; I'm sure someone here will point out that it applies perfectly in theory).
Very true. Right there now. We have a lot of huge clients and are an enterprise SaaS. The killer for us is not having automated QA up front. The manual, monkeys-with-typewriters approach is killing us.
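For what it's worth, even a thin layer of automated tests beats the typewriter monkeys. A minimal sketch in Python with pytest (the business rule is invented for illustration; nothing here is from our actual product):

    # A toy regression suite: the kind of check a manual tester would
    # otherwise repeat by hand before every release.
    import pytest

    def apply_discount(price: float, percent: float) -> float:
        """Invented business rule standing in for real SaaS logic."""
        if not 0 <= percent <= 100:
            raise ValueError("percent must be between 0 and 100")
        return round(price * (1 - percent / 100), 2)

    def test_discount_happy_path():
        assert apply_discount(100.0, 15) == 85.0

    def test_discount_rejects_bad_input():
        with pytest.raises(ValueError):
            apply_discount(100.0, 150)

Run something like that on every commit and regressions surface before the enterprise customer (and their legal department) ever sees them.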
They were not transforming any enterprises; they were selling working software. That is a much simpler problem. Most companies do not care about your internal methodologies; they only care that you deliver what they need.
This does raise an interesting point. Maybe this is just me, but have others been noticing a great disillusionment with IT? I can bash IT to anyone I know and no one ever defends it. If what the people I'm talking to say is true, IT departments everywhere are focused on preventing change and delivering things without value.
Regarding your latter point, I think this comes from people's exposure to technology increasing every day.
I've built and operated a helpdesk for a 150,000 employee organisation with a high security requirement.
People bash the IT guys because they think they understand it, but they don't.
Users really are absolute fucking criminals most of the time purely due to either ignorance or arrogance.
IT departments are primarily focussed on protecting people from each other, from the internet, and from making massive fuck-ups. Every user who comes whinging to the helpdesk over a trivial issue or a brick wall is usually unaware that the wall is there for their own protection. It is probably the thing stopping that secure document from pissing off out of the building on a USB stick.
The 150k-member organisation in question eventually had to lock its PCs into custom-built frames with a hole for a finger to turn them on, all USB ports epoxied shut, MAC address locking on all switches, oodles and oodles of cable ties, and body searches on entry and exit to stop USB sticks, phones, etc. being taken in.
That's the front line for you. They need defending and I stand up for them regularly.
The above dystopia is in fact why most people hate IT. It's not the front line; it's the policies, such as epoxying USB ports for data loss prevention, that are reviled. Some organizations really do require high levels of DLP - national security for one; production support for secure managed services is another. Others are utterly draconian for questionable cost/benefit reasons.
It should not be considered criminal to use documents on multiple devices. Thankfully, saner heads are prevailing as bring-your-own-device policy management and app containerization become mandatory now that the CEO and the VP of marketing insist on using their iPads.
The incentive structure of an IT worker is similar to that of a government bureaucrat: When in doubt, ban.
Because the one time something happens, you're gonna hear that the systems were 'improperly secured.' Then they'll call in external auditors fresh out of school working from an over-photocopied checklist to tell you how incompetent you are.
Personally, I'm happy with trends like BYOD etc. If the CEO loses data because he wanted email on his iPad (and there is absolutely nothing wrong with that), then IT won't be blamed for it.
> Personally, I'm happy with trends like BYOD etc. If the CEO loses data because he wanted email on his iPad (and there is absolutely nothing wrong with that), then IT won't be blamed for it.
Yes, it will. For the same reason "the government" is blamed when someone eats a lump of poison out of a can labelled "poison".
I'm a former Big-4 sysadmin. I wrote a comment some time ago explaining things from "our" side. I'll go see if I can dig it up.
Edit: Found it. A thread called "Why everyone hates the IT department." I'll copy and paste to save y'all the trouble.
-----------------------------------------
I was a sysadmin for a financial services firm, and then for a Big-4 accounting firm, so I may have some perspective:
1. "For starters, I was not allowed to use my own equipment. They rattled off some gibberish about security and support even though my mail client supports SMTP and I can read Office files just fine. Also, for an agency focused on security their insistence on using Windows XP was baffling."
The first rule of System Administration [0] is to start every host in a known state. The reason for this is predictability - as the sysadmin, you know what to expect: certain software is installed, certain settings are configured, etc. You simply CANNOT manage systems at scale without this approach. Without it, testing, upgrades, etc. are shots in the dark.
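To make "known state" concrete, here's a toy sketch (Python; every package name is invented) of the kind of drift audit a standard baseline makes possible:

    # Baseline audit: with a known starting state you can mechanically
    # detect drift; without one, every host is a one-off mystery.
    BASELINE = {"mail-client": "6.5", "office-suite": "11.0", "av-agent": "4.2"}

    def audit(installed):
        """Return a host's deviations from the approved baseline."""
        issues = []
        for name, version in BASELINE.items():
            have = installed.get(name)
            if have is None:
                issues.append("missing: " + name)
            elif have != version:
                issues.append("version drift: %s %s != %s" % (name, have, version))
        for extra in sorted(set(installed) - set(BASELINE)):
            issues.append("unapproved: " + extra)
        return issues

    print(audit({"office-suite": "11.0", "games-pack": "1.0"}))
    # ['missing: mail-client', 'missing: av-agent', 'unapproved: games-pack']

Multiply the "unapproved" line by a few hundred machines and you see why nonstandard installs get treated the way they do.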
Another issue is ownership. Let's say you get company email on your personal BlackBerry. You're mugged, and the device is stolen. A competent administrator will immediately issue a WIPE command from the BlackBerry Enterprise Server, so that all data is erased. But wait! Those pictures of your daughter's recital were on the phone. They're gone, and you're gonna be pissed. With a company-owned device, the expectations are different.
A friend of mine once had to re-image the laptop of a senior executive. Turns out, the only pictures of her daughter's high-school graduation were on it. A year later, he's still having to feed her stories about ongoing efforts to retrieve the data...
Regarding Windows XP, OS upgrades are not to be done lightly. They require very extensive regression testing of all Line-of-Business apps. Believe me, there isn't an IT guy there who doesn't want the upgrade to Win 7, but after a time in this business, you learn to tread carefully, as information systems can break in all sorts of subtle ways and management doesn't want to hear it.
2. "Secondly, I was told I could only use a certain browser because of another incoherent argument relying on "security." Interestingly, nothing was done to keep me from putting a pocket version of Firefox on a flash drive and connect to my own secure proxy. This is because the IT guys had no idea such wizardry was even possible."
Again, standards. When their enterprise web-based ERP system that's been tested in IE6 and works fine breaks when you're using Chrome 15.0.874.121 m, who's gonna get the call? Oh that's right. IT. Multiply that by a few hundred machines, and your network is unmanageable.
There exists technology to control USB drives, but in most organizations, it won't fly - they're simply too convenient. Besides, how do you expect the VP of Sales to load his iPod? Definitely shame on your IT guys for not blocking outbound connections to your own proxy at the firewall.
3. "Thirdly, for some reason print jobs were routed out of the office to a data center in San Angelo and then routed back to the printer down the hall. Printing a single page was non-deterministic and painful, never mind my final reports. This was a result of some state mandate about consolidating IT."
This sounds like the state's fault. They were probably sold a solution that promised centralized tracking / routing of print jobs, based on parameters such as job submitter, color vs monochrome, time of day, printer availability etc. Not IT's fault.
4. "And then there was the time I tried to install Notepad++ to do some minor dev work (they hired a CS undergrad to do financial work so I thought I'd do more than estimate results). 2 weeks later I was approved to use a similar text editor on the grounds that I already have a task bar to manage multiple documents - a tab bar is completely unnecessary and Notepad++ requires further scrutiny. 3 weeks later I had a Perl interpreter."
I was a sysadmin, and I had to get written, signed approval from my Manager, the Security Manager, and the CIO to install any non-standard application. Pleasant? No. Necessary? Yes - everything needs to be documented; otherwise these things spiral out of control quickly.
5. "At my current job they forced me to let them change the root password on my issued machine to something they knew and I didn't. I get why you do this: because you cannot fully trust people and I dealt with sensitive data; fine. Afterward I re-installed my OS and set my root password back. I don't understand why they don't think these things through."
Don't take this personally, but sysadmins hate people like you - you make unauthorized changes and make our lives difficult. Your IT guys sound incompetent though - why weren't BIOS passwords in place to prevent booting from CD? And since you say root password, sounds like you were in a UNIX / Linux shop. If you were in an AD environment, after reinstalling, you wouldn't have been able to join your computer to the domain without domain administrator credentials.
I understand that the situation sucks, but there are good (at least for a particular meaning of good) reasons why it is so. There is room for improvement on both sides.
Whew. Felt good to get all that off my chest. No hard feelings?
[0] Tom Limoncelli - The Practice of System & Network Administration.
--------------------------------------------
And my reply to a comment he made further downthread:
"I can't imagine it would take more than a cursory look at Notepad++ to vet it."
This, right here, is the issue. You simply don't know what is involved. "I can't imagine why you need to draw blood to see if I have AIDS, Doctor."
The specific program is not the issue. IT in the enterprise is all about centralized/standardized management and configuration. Every deviation from the standard is gonna increase the burden of maintenance.
How it normally works is that there is a single base OS / apps specification, and defined optional applications. EVERYTHING is tested against these, and guaranteed to work. This allows deployment against a global fleet to go smoothly. It's all about known quantities, which allows them to quantify everything.
Please remember that no IT guy wakes up in the morning and says "Yay! How can I make gatlin's life suck today???" It's more like, "Oh shit, last month's patches are not being installed on those PCs in accounting. I wonder if it's the SCCM [0] client?" You find out 8 hours later that there's a conflict between $NON-STANDARD APPLICATION and the SCCM client. Have fun scouring TechNet and forums to find a solution. These are the things that fill us with dread.
I'm not saying it's right. Just want you to see the other side.
[0] Microsoft System Center Configuration Manager
What you say has strong kernels of truth. The problem I have is the unstated conclusion, "therefore users must suffer". There are many mitigating approaches that DON'T require making the users' life hell.
Before I begin, please understand I am not directing my criticism at you personally but rather the policies that you are defending.
To your points:
1. Large-scale, heterogeneous environments have approaches available, like desktop virtualization or app containerization, that let apps be packaged in a self-contained way, so there's less need for full-OS control.
Secondly, your remote-wipe scenario makes no sense: if the device is stolen, you WANT your personal stuff wiped too! The root issue is highlighted by your re-imaging scenario: lack of regular, automated network backups. But that has very little to do with ownership and control, and everything to do with backup service quality and user training.
2. You're absolutely right about compatibility, but you'll notice the dodge from the original point. "Security" was given as the reason for the use of a certain browser, not "Standardization". Which we both know is horseshit - IE6 has long been deprecated, unsupported, and a massive security hole, for example, to the point that it's just budgetary negligence to keep insisting on that version. But again we get to "why hate IT": it's not standardization that's the real reason, it's lying to your customers to cover up poor service.
4. I've run ops for desktop environments in the multiple tens of thousands of seats, and I have rarely seen that level of sign-off required for non-standard applications except in highly secured (classified) contexts. It's draconian and unnecessary. Also note the misdirection: draconian approvals have little to do with "this requires documentation" and everything to do with preventing change. Of course exceptions require documentation, and at least line-manager approval, but requiring senior approval is a big red neon sign that the actual policy of the organization is "fuck change" rather than "embrace change and manage risk".
Finally, to your point:
"The specific program is not the issue. IT in the enterprise is all about centralized/standardized management and configuration. Every deviation from the standard is gonna increase the burden of maintenance."
I agree with the first part. The latter part is where I say "that's the cost of doing business". The problem we have with much of old IT today is an operations-penny-pinching "change is bad and our processes are designed to prevent it" mentality rather than a balance between delivering new value and keeping it running, where "change is the norm and our processes are designed to manage it".
Look man, I get where you're coming from, but after dealing with senior business execs for a long while, I feel IT-as-is is untenable. Without a rethink, the abject hatred that many business people have towards IT will grow and rogue systems will flourish. It's like the PC days of 1983 again. The price to be paid by technology professionals for this acrimony will be large, similar to what mainframe professionals went through.
Thanks for a thought-out comment. I particularly agree with your conclusion:
> after dealing with senior business execs for a long while, I feel IT-as-is is untenable. Without a rethink, the abject hatred that many business people have towards IT will grow and rogue systems will flourish.
At one of my former jobs, the CEO hated IT, and didn't hesitate to let us know it, either. I was told through the grapevine that he felt it was the one area of the company he couldn't control.
Fantastic response, and I do defer to your more extensive experience :)
IT is mostly rewarded for preserving what is and for fighting politics... not for delivering new value. This varies from place to place, of course, but it's effectively due to the knowledge gap between "the business" and "IT", which leads to a misunderstanding of the vast spectrum (10x or more!) in how well IT organizations perform.
The closest analogy is with auto manufacturing in the 70s having to face the Toyota product development and production system. It was utterly foreign and took decades for the top companies to figure it out. Now look at IT, which is at most organizations only one of many contributing departments like marketing or operations. Is it any wonder CEOs have no idea they could do things differently?
Especially when you have hundreds of thousands of people whose salary DEPENDS on the CEO not figuring this out (I'm looking at the big system integrators, enterprise software vendors, and managed service providers here). True story - If you try to (say) fire IBM out of a data centre, are you prepared for the president of IBM to call the chairman of your board of directors to subtly question the competence of your CIO? Who does your board trust more, a brand with 100+ years of exposure, or John Smith the CIO? You had better have prepped the CEO, CIO, and CFO on your plans and get them on-side, so they understand. Could you imagine doing this with a complex specialized topic like Agile/Lean, Cloud, Open Source, etc? It is like convincing your grandmother to drop AT&T or Bell South for VOIP, or convincing them that AOL actually isn't the internet. Not impossible but requires a lot of patience and trust.
As a Chairman or CEO, I would straight up sack someone who, on their own initiative and without warning, began a "big bang" change to the company's entire IT DNA without some sort of consultation.
Progressive rollout over 5 years? Local trial programs? Experimental work? All fine. Go ahead. Report back.
Bet-the-company without so much as a peep? Please leave.
That wasn't my point. Of course the board and CEO should be consulted. That's a far cry from having to prepare them for snipers. Vendor politics is poison.
While I am skeptical about a continuous release cycle, Agile methodologies work well for Enterprise software. I've used them at all of my Enterprise gigs, including selling multi-million-dollar-license, mission-critical sales compensation software.
There are two things that should not be conflated with Agile: Software Quality and Willingness to Accept Change
In many ways, Agile can make life better for both if you have a SaaS rather than an on-prem business. For example, when building the business plan for Callidus Software's foray into SaaS, I surveyed Customer Service departments at a number of Enterprise companies and asked one simple question:
How many of your open tickets are fixed in the most current release?
The answer was staggering: 40-70%
So, ensuring customers are always on the latest release can eliminate 40-70% of trouble tickets, with a commensurate increase in customer satisfaction. Sounds like Agile can help.
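The back-of-the-envelope math, with an assumed queue size purely for illustration:

    # If 40-70% of open tickets are already fixed in the current release,
    # keeping every customer on that release removes them from the queue.
    open_tickets = 1000  # assumed, for illustration
    for fixed_share in (0.40, 0.70):
        remaining = open_tickets * (1 - fixed_share)
        print("%.0f%% fixed in latest -> %d tickets remain" % (fixed_share * 100, remaining))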
The need for SaaS comes about because Enterprise software has the nasty reputation of needing an expensive re-implementation with each new release, at least for On-Prem. With SaaS, you can do more frequent and less traumatic releases.
Agile works extremely well for Enterprise if you do it right.
Can you try to explain a bit more what "doesn't feel right"?
And yes... as a startup that does agile, cloud / PaaS, and MDD (Model-Driven Development) you have a hard time conquering ground in the Enterprise world. You need to convince people of 3 new things...
The main lesson: don't focus on these "things"; focus on the value. You're not selling them tech, but a solution to a real-world problem: they need an app as soon as possible, because time-to-market is key.