Tracking units is a thing that I rarely have a use for (as a non-engineer), but when I do, I’m totally mystified by how bad almost every language is at it. F# is the only actual programming language I found that even tries. It does a pretty decent job, but even it has some disappointing holes (e.g. not being able to understand g and kg as scaled versions of the same dimension).
I guess the overlap between physicists/engineers and type system/PLT people isn’t large enough.
When I was a physics student, I would have loved a spreadsheet that tracks units and automatically does conversions as needed. There is another huge thing this enables: tracking uncertainties. It would be awesome if you could write not just "10", but "10 cm ± 1 cm". Then the spreadsheet could do error propagation! Even the simplest case of non-correlated errors would be useful. Also, you could have functions like "weighted mean" that take the errors into account. We used to do these calculations manually with extra columns and a few macros, but it was always error-prone.
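A minimal sketch of the non-correlated case, using Python's `uncertainties` package (just to illustrate the propagation such a spreadsheet would do; the numbers are made up):

```python
# Values carry a standard deviation; uncorrelated errors propagate
# automatically through arithmetic (combined in quadrature where appropriate).
from uncertainties import ufloat

length = ufloat(10.0, 1.0)   # 10 cm ± 1 cm
width = ufloat(4.0, 0.5)     # 4 cm ± 0.5 cm

area = length * width
print(area)                  # nominal area with the propagated uncertainty
```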
Also, why is there no CAD program that allows me to enter errors? When I try to model real objects, say a room, the lengths never add up 100% and I always have gaps once I've gone around the room. It would be great to say all these measurements are ± 1 cm, this one is ± 5 cm, and these angles are 90° ± 1°. And then it would do least-squares, and give me a really accurate model, or tell me where I have to measure again.
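A rough sketch of what such a solver could do under the hood, assuming a simple four-wall traverse (all names and numbers here are invented for illustration): weighted least squares with a loop-closure condition.

```python
import numpy as np
from scipy.optimize import least_squares

# Measured side lengths (cm) and interior corner angles (deg), with their sigmas.
lengths = np.array([402.0, 298.0, 405.0, 301.0])
len_sigma = np.array([1.0, 1.0, 5.0, 1.0])
angles = np.array([90.0, 91.0, 90.0, 89.0])
ang_sigma = np.array([1.0, 1.0, 1.0, 1.0])

def residuals(params):
    L = params[:4]          # adjusted lengths
    A = params[4:]          # adjusted angles
    # Walk the perimeter: the heading turns by (180 - interior angle) at each corner.
    heading, pos = 0.0, np.zeros(2)
    for li, ai in zip(L, A):
        pos += li * np.array([np.cos(np.radians(heading)),
                              np.sin(np.radians(heading))])
        heading += 180.0 - ai
    # Weighted deviations from the measurements, plus the loop-closure error.
    return np.concatenate([
        (L - lengths) / len_sigma,
        (A - angles) / ang_sigma,
        pos / 0.1,          # closure enforced as a stiff residual
    ])

fit = least_squares(residuals, np.concatenate([lengths, angles]))
print(fit.x)                # adjusted lengths and angles that close the loop
```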
For a while I toyed with writing an extensible spreadsheet engine/app that would allow extensions to do that kind of thing - maybe I should dust that idea off again! ;-)
Seems like many people started on this but never finished it, me too. I got carried away trying to speed up my custom grid control, and borked it up when I tried to use threads :-P.
I wanted to write something "spreadsheet"-like that would work as a normal desktop app, as a component in a server application, and on the command line as well.
Combine that with all of the other grandiose plans (writing extensions in WASM) and it's no wonder that I wasn't making too much progress... interesting though!
I built a Go library that would track units + errors when I was doing my undergrad... but never published it anywhere and lost the files, sadly. Anyways, I agree this would be neat.
Julia has the Unitful package [1], which does a decent job. It's quite ergonomic, and Julia's design helps with making existing code/libraries play nicely with Unitful-encoded units/values.
> I guess the overlap between physicists/engineers and type system/PLT people isn’t large enough.
C++ has supported units, as a library, since the early 90s. The Barton and Nackman book had the initial implementation [1], and it was picked up years later by Scott Meyers [2].
I’ll second the utility of “pint” for anything unit or quantity related in Python.
I’m literally in the middle of writing a rate-limited, continuously running service wrapper around some existing one-shot task code. I initially thought about just parsing the rate definition out myself, because it’s pretty simple to just split on “/“ or “per”, but after thinking through the usage scenarios I ended up using pint to convert anything it can understand as a frequency.
So anything of the form "x / time unit" or "x per time unit" gets converted to the appropriate scale for use by the rest of my code. A whole bunch of string-matching code replaced by one line (two if I count the defensive unit type check to prevent false-positive parsing problems) that looks as simple as this…
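(Not the commenter's actual line, but a sketch of the shape this takes with pint; the variable names and the hertz-based dimensionality check are my own assumptions.)

```python
import pint

ureg = pint.UnitRegistry()

rate = ureg("30 / minute")                                # parsed from e.g. a config string
assert rate.dimensionality == ureg.hertz.dimensionality  # defensive check: must be a frequency
per_second = rate.to("1 / second").magnitude              # 0.5
```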
I've recently written a units library for Nim [0]. It's still WIP, but it's already proven extremely useful for me as a physicist.
Thanks to Nim's strong type system and metaprogramming features, it allows for a fully compile-time design, without any runtime overhead (in the form of special unit objects or the like; everything is a `distinct float`).
In addition, thanks to Nim's Unicode support, the code even looks nice!
A more complex use case (I can link more if desired): [1]
The Elm programming language ecosystem has the great "elm-units" library, which prevents one from mixing illogical units, and it produces logical unit combinations:
Unit conversion is not the issue. It's strongly typing the values themselves so that you don't actually plug in a value in feet that was thought to be in meters. This means each numeric value must have its own type (or equivalent strong compiler- or runtime-backed constraints).
So instead of float, we have (eg) float_meters, float_feet and so on.
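In Python, for instance, you can get a cheap approximation of this with `typing.NewType` and a static checker (just a sketch; the names are made up):

```python
from typing import NewType

Meters = NewType("Meters", float)
Feet = NewType("Feet", float)

def feet_to_meters(x: Feet) -> Meters:
    return Meters(x * 0.3048)

def wall_area(height: Meters, width: Meters) -> float:
    return height * width

wall_area(Meters(2.5), feet_to_meters(Feet(12.0)))  # OK
# wall_area(Meters(2.5), Feet(12.0))                # mypy rejects this mix-up
```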
Julia has beautiful support for adding units to your programs. The nice thing about it is that you can plug unitful numbers into any other package and do math on them.
Sympy [0] is the major Python package I go to for symbolic calculation; it’s been more than enough for what I’ve needed (except for a few shortcomings I found in higher-dimensional symbolic geometry, but I’m not going to fault them for not having my own weird esoteric use cases covered).
How so? Every MATLAB code base I've seen rolls its own unit computation. The symbolic toolbox costs money, and I've seen it on university installations only. The unitConvert command is rather new. Neither mechanism is native to MATLAB, and both have awkward call patterns.
If you have a language that supports tagged types and can overload operators, you can basically get there. I think languages like Idris and Julia (with the right packages) should be able to support this.
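A bare-bones Python sketch of the operator-overloading route (purely illustrative; real libraries like pint do far more): values carry a dict of dimension exponents, and multiplication combines them, so (m/s) * s comes out as m without any per-combination code.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    value: float
    dims: tuple  # e.g. (("m", 1), ("s", -1)) for metres per second

    def __mul__(self, other):
        # Add the dimension exponents and drop any that cancel to zero.
        d = dict(self.dims)
        for unit, exp in other.dims:
            d[unit] = d.get(unit, 0) + exp
        dims = tuple(sorted((u, e) for u, e in d.items() if e != 0))
        return Quantity(self.value * other.value, dims)

speed = Quantity(3.0, (("m", 1), ("s", -1)))
duration = Quantity(10.0, (("s", 1),))
print(speed * duration)   # Quantity(value=30.0, dims=(('m', 1),))
```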
This is rather a math-specific operation (type conversion on math operators), not within the scope of a general-purpose language. A lot of dependently typed languages can achieve this naturally, and most languages with ADTs can simulate it.
Math is kind of a core use case for, really, almost any programming language, and is central to much of what computers are used for (as is hinted at in their name).
Though dealing with this aspect of units is less “math-specific” than central to the relation between math and real-world meaning, which is arguably even more important than abstract math in computing applications.
Yep, it allows you to tag something as metres per second, but it doesn’t automatically make multiplying it by a duration yield a distance without writing custom code. F# can do that, and it’s rather cool. There are dimensional analysis libraries for other languages, but none baked in that I’m aware of.
When I see this sort of thing I always want to know something that is seldom made clear: what is actually implemented so far?
The video begins "This is Atlas, a concept for a new engineering IDE". (To me, that strongly suggests that rather little is actually implemented yet, and that what we're about to see is a lot of canned demos, because otherwise it would be a new engineering IDE, not just a "concept" for one.) But then most of what follows, and pretty much everything on the webpage, says that Atlas does this or that, not "will do".
And the trouble with "concepts" is that unlike actual implementations they aren't much of a guide to what you're ever actually going to get. It's easy to think "X would be cool" and say "Atlas does X", but some Xs are much more realistically implementable than others, and how interesting Atlas actually is depends on which of those Xs are ever actually going to be implemented.
But reading the page and watching the video I have literally no idea where the project is on a scale from "we wrote down some things it would be cool to do and rigged up a demo that pretends to do them" to "everything here is already implemented, though there are a few gaps and bugs here and there".
This is a little odd to me - I generally prefer to write my math in code if it's going to be executed. I find my preferences to be pen and paper (or whiteboard) > LaTeX, so perhaps I am not the target demographic here.
We must be evil twins in opposite universes. I've always wanted to be able to code the equations the way they look on paper. I actually am quite annoyed that most languages won't let you use Greek characters as variables. To me, it just makes it easier to spot errors.
Julia lets you use all sorts of Unicode characters as variable names and even as operators. It works quite well in the REPL, since it supports a TeX-like input method for Unicode characters. It's probably my favorite feature in the language, since it makes the code much more concise and expressive. (I've always hated having to step down from the mathematical beauty of LaTeX-typeset equations into the ugly reality of spelled-out Greek-letter names in a codebase that didn't support Unicode.)
However, input method support in editors isn't as straightforward, which can become a limiting factor if you have to work on a Julia codebase that uses Unicode characters and your editor doesn't make it easy to insert those characters.
The problem with Julia is that it only supports what Unicode supports directly, which means you cannot have x, y, z as subscripts for coordinate variables, because Unicode, for some weird reason, does not have the full alphabet in subscript or superscript form. Also, when I first dived into Julia I found quite a few Unicode symbols that, when used as function names, led to all kinds of weird compiler errors. Kind of defeats the purpose.
A language hoping to make this complete would have to do special formatting that allows super- and subscripting of any Unicode symbol. But that won't work in a normal terminal...
I mean I guess that makes sense for a mathematics-oriented language. Generally, literate coding would encourage you to come up with friendlier variable names as software engineers aren't mathematicians per se and code is meant to be accessible to more than just academics/mathematicians.
In particular, I see a lot of this done in machine learning python code. Lots of "z", not "zeta", etc. Pretty tough to understand as a non-researcher.
It seems backward to try and understand a known math technique or fact from code when there are likely pedagogical presentations of the same material. And if we're talking about novel math, it seems more than fair to ask that the reader is comfortable with the domain.
Can you imagine someone learning math by contemplating the sum of effects in a program? Heaven forbid there's any emphasis on performance whatsoever.
Trying to abbreviate Greek variables ends up causing a lot of issues too. It's common to have both "zeta" and x, y, z in the same scope, as well as a capital and a lower-case zeta. And maybe a zeta prime, and some subscripts. Unless you plan well ahead, it's hard to be consistent across all parts of the code and be concise at the same time.
Trying to come up with more descriptive names isn't practical as the variables often have very complex definitions, and it'd be impossible to explain in a variable name.
In general, that depends on the programming language, though, and not everything in math notation is easy to parse; a lot of it relies on precedence rules. Math notation is far from optimal in this respect, and I guess you just have experience reading math.
Mathematical notation is not perfect, but it does a pretty good job of choosing clarity when ambiguity might be a problem. For a basic example, the fraction bar for division makes groupings obvious, where a single '/' in a long string of text can get visually lost. Of course it's understandable that most programming languages would shy away from special notation like that in favor of plain text, but it gives mathematical notation a strong clarity advantage, in my opinion.
Same here: writing code seems to map better to how you would step through solving the equations by hand. You can see how the equation “works”. But it may be that I just haven’t gotten my mind wrapped around the concept of programming with equations yet.
This is a competitor to MathCAD which is used for things like structural calculations. I've used it for traction drive calculations. Generally you're implementing equations that are defined in papers and it's really nice to be able to make them look roughly the same.
In MathCAD you can annotate them with actual snippets from the papers too. Definitely beats coding for some specific applications.
You prefer to write code as code, which makes sense as it skips the translation step - but technical code is the implementation of the math, so any given equation can represent multiple possible code implementations.
From that perspective, it might make sense to separate the two, with good tooling. Since tooling generally isn't good, it's just a hassle.
This reminded me of http://engineerjs.com/, a hobby project we did with some friends many years ago. One can write JS code extended with unit support, linear algebra, complex numbers, create plots, share scripts, create libraries with interactive documentation support (this one never went public), etc. Everything runs in the browser on the user side. The test version (test.engineerjs.com) even has Google Drive support and can read/write files on it (although Google changed something in the meantime, and it started complaining about not being a verified application). The idea was that everyone could create libraries that others can import and use directly. It doesn't have this nice GUI though.
I've done a lot of napkin math in NaSC [0]. Unfortunately it was buggy and often crashed, so I had to stop using it. I'd love something like this that could also be checked into a repo and used to "compile" into constants that can be used in code. For example: plan the resources required to run for `demand(requests)`, load that into code, and run that to scale replicas up/down or something.
Plugging my own site, https://calcula.tech. It's not meant to compete with something like Atlas, but I like to think it's great for napkin math, even though it's still quite early in development.
Looks like fun. I like it. Right now my tool of choice for calculation is Jupyter/Python, and it would be interesting to try both alongside one another on a moderately difficult problem.
I think this kind of thing is called a notebook, isn't it? Not an IDE, since it's not for development and doesn't seem to integrate anything. I can imagine it would make a nice plugin for an IDE, though, for people doing that kind of work.
This looks neat. Writing tip for the person speaking (or captioning): starting a sentence with "so" makes you come off like...something I don't have the word for, but it's bad. Don't do it. Condescending? Demeaning? It's like you don't think your audience is capable of following. I had to stop watching after four sentences in a row started that way.
Or we could not police people's language. "So" is a way to start sentences; like anything else, it can be overused. Just like "like", etc.
There's a good podcast episode from John McWhorter on this "So" phenomenon. [0] TL;DL: It has some interesting applications (including functioning as a discourse marker), and isn't going away anytime soon.
Like it or not, perception matters. Take accents as a similar and more obvious case: rural accents will often cause you to be looked down on, and so some find it extremely valuable to deliberately adjust their accent (in some lines of work, it’s pretty much essential to hone a certain accent). So it’s often worth at least contemplating such feedback, if you care about being taken seriously. (But don’t get too hung up on it if you’re just a normal person, either.)
Just to emphasise this the more clearly: I’m talking about facts of life here, not ideals.
This whole convo is reminiscent of the "filler word" phenomenon - e.g. especially if I'm feeling relaxed, I'll say "like" a lot (I lived in CA for a decade, grew up/went to school in Boston) and I've seen that especially people from Texas will get really hung up on it (ok, big state, like CA, I have double-digit individual data on this from a skew of states), and I thought about applying an audio plugin to filter out my "like"s since these convos were in a video game. I'm not sure if I'm right about the "Texas doesn't like the CA slang" hypothesis, but I find it fascinating that people get so hung up on it and expect me to adhere to their expectation of how I should behave. I'm just saying "like" instead of a filler word or "empty air" - the latter of which the critics advocated for. I guess they might just be advocating for open air for others, and I'm overthinking the whole thing.
That doesn't have anything to do with Texas; like, overusage, like, of "like" is, like, incredibly annoying. Growing up in the midwest, we were taught to not use filler words/sounds like "uhh" or "um" in general.
I get the mindset, I just don't hold that view. I also judge people that get so hung up on it, just as they judge me for saying it. Functionally, saying "like" when bridging two parallel thoughts seems reasonable enough to me.
My use of the word “so” there occurred naturally and I deliberately left it there. However, it’s a different sense of the word “so”: it’s concretely meaning therefore, rather than just being a meaningless filler.
> My use of the word “so” there occurred naturally
What was unnatural about the original use? English isn't exactly standardized; there are plenty of dialects in which "so" is a common way to start a sentence. It might not be part of what you perceive as the prestige[1] dialect, but it's still natural.
Nothing. jchw was just querying whether I’d put it in deliberately. I didn’t; it was the wording that flowed out of my fingers (and then I decided to keep it).
Engineers (at least in my locality in California) tend to prefer direct, casual language and to a great degree even have a grudge against overuse of formal language and formalities. We love things that speak to us in our day to day language.
I suppose if you're pitching a product to lawyers, then sure, bring on the notwithstanding hitherto aforementioned gibberish and I'm sure they'll feel right at home thereinafter. :)
One usage of it that I find particularly annoying is in answer to questions. For instance:
"Where did you go to lunch?"
"So, we headed downtown..."
I think it's aggravating because it feels evasive, as if the person is pretending they're not answering a question and are instead just starting a new story.
It's very common in American English as well, and it functions the same way, as a filler. I'm not sure where the OP got the sense that it was condescending.
It's common in the Northern California tech industry. People will prefix questions by saying "Question" or statements with "So". It's sort of phatic, like a header that prepares the listener for what's coming. No one would take offence at it there.
However, I agree that it is out of place in a demonstration video since people are looking for the info in the first place and already know what's coming.
Would you mind sharing where you're from so I know which region I should be careful to not offend people by speaking like this?
This is really cool. Is the main use case for something like this over Wolfram Alpha being able to write the equations directly? Are there any other hidden features here that I can't get on Wolfram Alpha?
Something I think would be cool for a tool like this is to export code in various languages. Work out your algorithm in a more "mathy" tool, but then spit out plain C#/F#/Python/Javascript to implement it in an app. Definitely a niche need, but I've had times where I think this would save a ton of time. This could also be useful where you have engineers who need to work similar to what the video shows, but then coders who need to implement it in business software etc.
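Parts of this exist at the expression level already; Sympy's code printers, for example, can emit C or plain Python from a symbolic expression (a small sketch, not a whole-algorithm exporter; the example expression is made up):

```python
import sympy as sp

x0, v0, a, t = sp.symbols("x0 v0 a t")
position = x0 + v0 * t + sp.Rational(1, 2) * a * t**2

print(sp.ccode(position))   # C expression, e.g. x0 + v0*t + (1.0/2.0)*a*pow(t, 2)
print(sp.pycode(position))  # plain Python equivalent
```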
The problem here is that each language has different features and different strengths and weaknesses.
For example, I would have very different implementations of, say, the A* algorithm in C++ vs Python, to the point that they would not translate into each other very well.
All those things, except units, can be done in Matlab. You can just drop a file into your workspace and they'll parse it!
They also have fantastic code generation support, so your "data cleanup" is actually reproducible.
And Mathematica is really good at typing equations.
The state of the art is done in other languages now, specifically Python with numpy and extensions, R, or Julia. There was a time for proprietary systems like this but I think it has passed.
Sure, in the fields of computer science, data science, etc. However, this is not the case in more traditional engineering fields (electrical, mechanical, etc.) which tend to teach MATLAB throughout undergraduate education and subsequently rely on it throughout graduate education, research, and industry.
MATLAB has some significant benefits over the FOSS alternatives, such as its first-party commercially-supported extensions/libraries (known as "Toolboxes"), and commercial support for the core product itself.
I suspect Matlab will have a long tail in engineering education, for a number of reasons: 1) It's effectively free for educational use thanks to generous site licenses. 2) A lot of curricula have been developed around it. 3) Indisputably, it works.
But outside of academia, let's face it, the most widespread "engineering IDE" is Excel. Relatively few engineers continue to use math and equations after they graduate. A lot of their calculations can be done in Excel, or within their CAD software. The biggest threat to Matlab is the automation of engineering calculations.
As a result, it's more mixed, with the people who are actually doing quantitative work coming from a variety of fields including math and physics, but also from areas that have no prior loyalty to Matlab such as biology and statistics.
For instance in my department, any deeply quantitative work is done by a very small handful of "math people" who may or may not have "engineer" job titles, but whose degrees are in math or the physical sciences.
I once sold a program for a large network optimization problem in Excel: the sewer planning calculations for an entire district, a few million homes in northern Italy. Excel was the best tool for the job, integrated into a CAD package.
Safety? Wrong planning could lead to some problems, as seen in poorly planned cities (e.g. Houston), but I guess they are largely ignorable. "Common Mistakes in Planning of Sewer Networks and STPs" https://www.springerprofessional.de/common-mistakes-in-plann...
My district was properly planned, and Excel was the best tool for it. I would not recommend doing something similar by hand.
There's a lot of engineering that just isn't at that level of critical. If it is, then they do it in their CAD software. If it's safety related, there might be just one person in the department who handles that kind of stuff.
Now, I have a relative who is a retired nuclear power plant operator. They did things a bit more formally there. ;-)
Many universities are moving to Python for their engineering education.
I believe there are two reasons: 1. The popularity of Python for ML in particular; many students want to learn general skills that are relevant beyond a narrow "engineering domain". 2. Somewhat surprisingly, a push from industry, where licences make up a significant chunk of the budget. I know several R&D engineers who switched over to Python because they got annoyed that licences were not available when they wanted to work. This has changed the overall perception of Python in these companies (which previously regarded Python as a "hobby" project).
Regarding the toolboxes, in my experience quite a few are of extremely poor quality despite significant costs. To give you one example, the instrumentation toolbox is so buggy that we essentially have to open and close instruments every time we want to send commands (otherwise we would get random freezes that often could only be fixed with a complete computer restart). This adds significant time: when we moved from MATLAB to Python for our instrument control, the measurement time for a single measurement (of which we would often need >100 in one experiment) went from >5 min to less than 30 s.
Yes and no. I work at an engineering/R&D firm. It's a mix. If you check research in those areas, it tends to use, at least by my estimation, newer software packages typically based on Python or C++.
I'm aware there are existing tools that see heavy use but that does not make them state of the art or cutting edge, merely sufficient. And yes, that can be enough. But for new work? Why throw in behind these guys?
I took civil engineering. Our introductory computer science course was in JavaScript, but coursework later was in Fortran (which I didn't mind) and then Python. There was also some MATLAB for labs, but the programming was mostly Fortran. One of my professors was doing some research on spring and damper systems, and the models were all in Fortran as well.
Actually I'm interested in any tool that saves me from writing code. I much prefer working at the symbolic level because as soon as I open a regular IDE I start thinking about my programming language instead of the problem I'm working on. I don't program full time any more so it's not a subconscious process, and really I would like the computer to do the work of translating my symbolic concept into syntactically correct language. If it works well, I don't really care that it's proprietary.
> There was a time for proprietary systems like this but I think it has passed.
spoken, truly, like someone that has never worked in a [^software] engineering firm. civil/electrical/chemical engineering firms all uniformly rely on matlab, comsol, solidworks, autocad, etc.
Why would you say that? OSS dominating seems to me to be a phenomenon largely restricted to the software industry, probably because we like to have a hand in our own tools and can contribute.
Outside of software and data science, is there any industry where the software landscape is dominated by the OSS players?
> Outside of software and data science, is there any industry where the software landscape is dominated by the OSS players?
Academic publishing, maybe. Many of them rely on LaTeX, especially the mathematics journals, but even this is changing if I understand correctly—more journals are accepting submissions in Microsoft Word format.
True, but LaTeX only exists because of computer science researchers (largely Leslie Lamport from what I understand.) They had a need, and built a tool to meet that need, and the rest of the industry benefited.
There's less overlap in needs between software and mechanical engineering, for example.
Even though I am not an engineer as such, when I develop or wonder about something I have to use Numi, and visualisations are always DIY. Really excited about the Atlas block exchange part; I want a library of common math/physics functions & values so I don't have to google.
Why the editorial in the title (currently "Atlas, a (hopefully) better engineering IDE")?
The page title is "A better engineering tool". Hard to say exactly what it is... a competitor to Mathematica maybe? But it's not an IDE in the way most software developers would understand an IDE.