I’m with you here. Once you get the hang of it, you miss even the most awkward object pipelines once they’re gone and you’re back to passing around one-dimensional strings.

The memory overhead of sets of objects and their properties versus vanilla strings is significant, though, and easy to bump up against when working with large datasets. It's best to keep all of the processing for a dataset in a single pipeline if you can, as in the sketch below. But that's the tradeoff.
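Rough sketch of what I mean (file and column names are made up); the second form only ever holds one row's worth of objects in flight instead of the whole dataset:

    # Materializes the whole dataset as objects before doing anything with it:
    $rows   = Import-Csv .\big-dataset.csv
    $active = $rows | Where-Object { $_.Status -eq 'Active' }
    $active | Export-Csv .\active.csv -NoTypeInformation

    # Keeps everything in a single pipeline, streaming row by row:
    Import-Csv .\big-dataset.csv |
        Where-Object { $_.Status -eq 'Active' } |
        Export-Csv .\active.csv -NoTypeInformation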




Personally, I only use shell scripting for simple tasks, not anything that would run into a memory limit. If it requires more heavy lifting, I'll write it in a programming language like C#. Nothing we do in production relies on shell scripts; they exist only as shortcuts for our workflow.


I think that's a good rule of thumb, but it isn't universally applicable. PowerShell is used heavily with Microsoft sysadmin/identity/mail software and with other vendor software that manages or integrates with those MS products. If you're processing tens of thousands of AD objects, you can easily hit memory issues. You can either have your staff learn a few best practices and keep using the wealth of existing tooling and knowledge around PowerShell, or write a bunch of custom .NET code for every simple integration or job they want to run against the objects in their directory.
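To make the "few best practices" concrete, here's a hypothetical example assuming the ActiveDirectory module: only request the attributes you need and keep the objects streaming instead of collecting them all first.

    # Collects every user object, with every property, in memory up front:
    $users  = Get-ADUser -Filter * -Properties *
    $noMail = $users | Where-Object { -not $_.mail }

    # Streams objects as they come back and asks only for what's needed:
    Get-ADUser -Filter * -Properties mail, department |
        Where-Object { -not $_.mail } |
        Select-Object Name, SamAccountName, department |
        Export-Csv .\users-missing-mail.csv -NoTypeInformation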


Personally, we use Linux boxes for everything remote, and I just develop on Windows locally.


Yeah, the memory usage of the pipeline is real. Often you need to use foreach instead to save memory, or good old-fashioned array chunking.
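Something like this for the chunking, if anyone wants it (the batch size and the fake data are arbitrary stand-ins):

    # Fake data standing in for whatever large collection you already have:
    $items = 1..20000 | ForEach-Object { [pscustomobject]@{ Id = $_ } }

    $batchSize = 5000
    for ($i = 0; $i -lt $items.Count; $i += $batchSize) {
        $last  = [Math]::Min($i + $batchSize, $items.Count) - 1
        $batch = $items[$i..$last]

        # Process the batch, then let it fall out of scope before the next one:
        $batch | Export-Csv ".\batch-$i.csv" -NoTypeInformation
    }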


Exactly. If you really have to, you can take advantage of PowerShell's ability to call .NET libraries directly and force immediate cleanup/GC of variables, but it's super hacky, and I've always found pipelining to work better once you get the hang of processing data in flight.
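For reference, the hacky version looks roughly like this (Get-SomethingHuge is a made-up stand-in for whatever expensive call produces the big collection):

    $results = Get-SomethingHuge   # hypothetical cmdlet, stands in for the expensive call
    # ... do whatever you need with $results ...

    # Drop the reference and ask .NET for an immediate collection:
    Remove-Variable -Name results
    [System.GC]::Collect()
    [System.GC]::WaitForPendingFinalizers()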



