Lon Lat Lon Lat (macwright.com)
336 points by dbaupp on Feb 6, 2022 | 222 comments



Lat/Lon is really the standard. The fact that some software internally represents it as Lon/Lat doesn't mean it's not a settled issue. If you look up the coordinates of a location, they're always given in Lat/Lon. You can't rewrite all the books and change all the maps to make it nicer for programmers.

As the linked post says, Lon/Lat is generally easier to deal with b/c it matches X/Y (and the North/Up way we look at maps). But you still have annoyances. For instance, Lon goes 0-360 but Lat goes -90 to 90. This is also mathematically inconvenient.

Add on top of that the X-Y coordinate on images generally have the X flipped and starting at the top left corner. So changing to Lon/Lat doesn't fix everything.

What I personally lean towards now is converting everything on read-in to a South/East coordinate system so it matches the flipped X-Y of images (like GeoTIFFs) and just always working in that system. Image manipulation, drawing to screen, output etc. - those systems/libraries I can't really modify myself. Everything else I can manipulate in whatever coordinate system I want. So it makes sense to choose the most convenient. Plus only dealing with positive coordinates is a big plus.
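To illustrate the idea (a rough sketch in Python; the function names and the exact axis origins are my own assumptions, not anyone's standard):

```python
def to_south_east(lat, lon):
    # Hypothetical all-positive grid: row 0 at the north pole, growing
    # southward like image rows; col 0 at the antimeridian, growing east.
    row = 90.0 - lat               # 0 (north pole) .. 180 (south pole)
    col = (lon + 180.0) % 360.0    # 0 .. 360, always positive
    return row, col

def from_south_east(row, col):
    # Inverse mapping back to plain WGS84 lat/lon
    return 90.0 - row, col - 180.0
```

Everything stays positive, and the row axis runs the same way as pixel rows in a GeoTIFF.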

That all said, I'm a total noob and I have no "geospatial" background (just writing some software to deal with rain data right now), so this isn't pro advice. I'd just be curious what others think.


While it is common to casually use the lat/lon order, the lon/lat order was decided on as a matter of international treaty via ISO. This was inconsistent a long time ago even in ISO standards, but was resolved to the lon/lat format many years ago, is required for government systems, and is essentially settled. People may not like it but that is arguing with the weather. Everything is lon/lat from here on out and all new systems are designed this way.

For people that actually work in geospatial, there is no advantage to the lat/lon order and the lon/lat order has been the standard for decades already. Arguing about the order of polar coordinates is fighting a battle that was literally settled decades ago.


When I look up ISO, I find ISO 6709, which specifies latitude before longitude by default.

https://en.wikipedia.org/wiki/ISO_6709#Order,_sign,_and_unit...

EPSG:4326 also uses lat/long: https://epsg.org/crs_4326/WGS-84.html


The vast majority of software that stores geographic information conforms to ISO 19125, which specifies lon/lat order. If you use PostGIS, for example, you are using ISO 19125. While conflicting ISO standards such as 6709 infamously exist, which is confusing for people that don't deal with geospatial data on a routine basis, the dominance of ISO 19125 for storage has strongly influenced all downstream representations.

Many early standards conflicted (e.g. coordinate order) or were under-specified (e.g. polygon winding). The Open Geospatial Consortium (OGC), whence many of these ISO standards came, sensibly made the decision many years ago that all future standards should treat these in a consistent and fully-specified way. Because of the pervasiveness and vast quantity of data stored as ISO 19125, adopting the ISO 19125 conventions was the least costly way of resolving these conflicts.

I've been working with GIS data from every type of industry for a long time, and the only place I ever see lat/lon order is in user interfaces. At the data and software level it is always lon/lat for compatibility and interoperability.
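As a small illustration (a hedged sketch; the helper is made up, but the lon-first order in WKT follows the ISO 19125 / OGC Simple Features convention described above):

```python
def point_wkt(lon, lat):
    # Simple Features WKT: x (longitude) comes first, y (latitude) second
    return f"POINT({lon} {lat})"

# Stockholm, roughly: lon 18.07 E, lat 59.33 N
print(point_wkt(18.07, 59.33))  # POINT(18.07 59.33)
```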

There is little to be gained by re-litigating the lon/lat coordinate order, since no value is gained by changing it (the choice is arbitrary), virtually all modern software systems use the same conventions at this point, and in many cases these conventions are codified in law across many countries.


Unless I'm misreading, ISO 19125 (2004) is later than ISO 6709 (1983), so using 19125 as an authority is questionable to say the least.

Isn't the reality that a bunch of software engineers wrote their software to use lon/lat because x, y is the intuitive ordering in the context of a cylindrical projection way of thinking, and ISO 19125 codified that common software practice without anyone really thinking about it or the conflicts it created?

That's hardly definitive, particularly when it contradicted ISO's own earlier standard (which codified the long-standing common communication practice of lat/lon). ISO standards themselves are weak without logical and practical commonplace usage backing them up, because the standards are paywalled.

Your experience, and the table in the OP article, shows that lon/lat has more usage in the software world, but that's not sufficient. There's enough confusion, and enough major players defecting from that recent "standard" of 19125, that the issue should be looked at again, re-litigated, and re-standardized. Everyone outside the narrow field of GIS software engineers and data wranglers expects lat/long. The issue has never been litigated at all outside of that insular world.

It seems like standardizing on the common practice (outside of the digital technical realm) of lat/lon is the right way to go:

The software world has the advantage that if there was consensus to change to lat/lon universally, they could patch it once, convert any data files once, and be done. The software world has much less inertia.

In contrast, the non-software world can't be readily converted to lon/lat. It's much more difficult to change people's everyday usage. Look at the mess caused by Americans using MM/DD/YYYY, or English-inspired use of Imperial over metric. It's difficult to convince people to change even when there are significant problems with the historical choice, and objective reasons for why they should change.


You are arguing with the weather. The use of lat/lon is obsolete in the same way XML is obsolete. This was already settled when I was learning about geospatial two decades ago. You are asserting "common practice" that isn't actually common practice. ISO 19125 conformance requirements are ubiquitous; I've never seen a requirement for ISO 6709 conformance in a real application. That ship sailed a long time ago.

You are also grossly underestimating the scale of the lon/lat installed base. The lat/lon diehards are the inconsequential and insular part of the market by comparison.

There is no practical way to "patch" the countless exabytes of lon/lat currently sitting in cold storage. These are the largest data sets that exist, and it would be exceedingly unrealistic to expect the world to reformat this data, which they've been happily using up to this point, because a few people don't want to remember that the order is lon/lat. You might as well decree that everyone use XML to store their data while you're at it.

I literally don't care, because it doesn't actually matter. No amount of wishful thinking will cause a return to the imagined golden age of lat/lon.


EPSG:4326 doesn't really use lat/lon. Sure, it's in the spec, but that part is irrelevant. Everything is stored in lon/lat for any relevant format when the projection is EPSG:4326. There have been some unfortunate decisions lately to make common APIs expect y,x if the projection is EPSG:4326 and x,y for pretty much everything else, but that's a backwards compatibility break that's going to wreak a fair amount of havoc.

It's also worth pointing out that a lot of this is conflating display order in UIs (which usually is lat/lon, often in DMS) with storage order and API conventions (which are usually lon/lat).
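GeoJSON is probably the clearest example of this split: RFC 7946 fixes the on-the-wire order as [longitude, latitude], no matter what a UI shows. A minimal sketch (the feature contents are made up):

```python
import json

feature = {
    "type": "Feature",
    # GeoJSON mandates [lon, lat] order, even though a UI would
    # typically display the same point as 59.33 N, 18.07 E
    "geometry": {"type": "Point", "coordinates": [18.0686, 59.3293]},
    "properties": {"name": "Stockholm"},
}
wire = json.dumps(feature)
```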


Thus the need for the One True Standard, which surely will be perfected this time around.


Just prefix the Longitude with L, for Longitude, that will clear up any ambiguity


Yep. Maybe use a small l for latitude. Judicious font choices should ensure little room for confusion.


You say it like it's a done deal, but virtually every book, map, paper, and UI disagrees with you.

Outside of the programming bubble we're not in a world where you ever need to think twice about the order. I don't go to Wikipedia and think "hmm I wonder which order they wrote their coordinates". I don't need to check the publication date of a map or book. I don't see that ever changing

And as I explained the benefits are kind of half-assed. If you want a more intuitive internal system then I think you can do better. You should full-ass it and re-zero it to whatever makes sense in your situation (or even change the units, or remap to go from degrees to the 0 UINT_MAX range and have your unsigned values automatically wrap around) - but keep the API in standard notation.


> or remap to go from degrees to the 0 UINT_MAX range

Some systems do this internally, it has advantages beyond the wraparound. They still export standard ISO compliant data format though since that is what everyone knows.


Congratulations, you’ve just (re)invented projections.


Sure. That kinda misses the point but you can think of it that way. And the mirror universe projection of lon/lat is not a great projection but you're free to choose what you want.

But maybe don't expose your projection in your API?


Well, I need to be projection aware, as there are at least 6 that I'm using on a regular basis. (WGS 84/EPSG:4326, Web Mercator, ITM (old and new), an equal-area one, and a few local-area lat/lon subsets.)

None of them are more correct, they’re just more convenient.


> remap to go from degrees to the 0 UINT_MAX range and have your unsigned values automatically wrap around

:-/


It was a random thought I had today. Please tell me why it's a bad idea haha :))

A lot of weird bugs happen at the edges/meridians. In a South/East system the edges are at the poles and in the middle of the Pacific Ocean. So you already eliminate weird southern hemisphere blow ups. Using wraparound would solve the Pacific problem I guess

Maybe unsigned long.. for some extra precision..

EDIT: unfortunately since I'm doing everything in Clojure unsigned math is kinda clunky :(


Sub-cm resolution everywhere with only 32 bits.

> Using wraparound would solve the Pacific problem I guess

I worked with such a coordinate system years ago, and there's definitely some advantages (fixed resolution, speed on some processors, memory usage) and some disadvantages (not all software is wraparound-aware and needs to be ported; debugging outside of decimal degrees is harder for me, but I know some mathematicians who are more used to radians than anything else - perhaps you get used to it).
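The napkin math behind the "sub-cm with 32 bits" claim, assuming a spherical earth of mean radius:

```python
import math

EARTH_RADIUS_M = 6_371_000.0            # mean radius; spherical assumption
step_deg = 360.0 / 2**32                # one 32-bit count, in degrees
step_m = math.radians(step_deg) * EARTH_RADIUS_M   # worst case: at the equator
# about 0.93 cm per count, and finer for east-west steps away from the equator
```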


Thanks for the sanity check! Yeah, my napkin math was showing a very high resolution without any floats needed. I'm happy to hear it's been done before. It's going to ultimately come together as a GUI (it's a set of scripts so far) so fortunately I don't need to interface with any other software.

The debugging aspect may be a bit annoying I'll admit. Before printing you'd need to constantly convert your u_ints to degrees to make sense of them.

The main problem I'm seeing so far is that lon spans 360 and wraps around cleanly, but lat spans 180.. and when you go past the pole you don't actually wrap anywhere (you don't wanna jump to the other pole :)). What actually happens is that you end up on a different longitude. I'm not sure how to work around that. But then maybe you don't need to b/c that scenario shouldn't come up - while crossing the Pacific will. At the moment I'm primarily shooting to select rectangular regions and then extract data (like elevation or precipitation) in those regions. Such a region could cross a meridian/equator - but it wouldn't make sense for it to go past the pole.

On the JVM unsigned code is going to be a lot clunkier to write, but I'm hoping in the end it'll save me a ton of time debugging edge cases
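A sketch of the longitude half of that scheme (plain Python simulating uint32 with explicit modular arithmetic; the helper names are mine). The point is that the antimeridian seam in the Pacific disappears because the ring wraps for free:

```python
U32 = 2**32

def lon_to_u32(lon_deg):
    # map [-180, 180) onto the full 32-bit ring
    return round((lon_deg + 180.0) / 360.0 * U32) % U32

def u32_to_lon(u):
    return u / U32 * 360.0 - 180.0

delta = round(0.2 / 360.0 * U32)             # a 0.2 degree eastward step, in counts
crossed = (lon_to_u32(179.9) + delta) % U32  # step across the antimeridian
# u32_to_lon(crossed) is about -179.9: no special casing at the seam
```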


I've seen lat being half as large as long done two ways.

1) Double the lat resolution and use two different scaling factors for converting each axis to degrees (less fun to debug, but hey, free 2x resolution - or repurpose the LSB for something else)

2) Allow intermediate results on latitudes to temporarily enter the no-man's land between 90-180 (and the southern equivalent), then manually wrap back.
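Option 2 might look something like this (my sketch; the fold only, the matching 180-degree longitude flip would be handled alongside it):

```python
def fold_lat(lat_deg):
    # Fold an out-of-range intermediate latitude back into [-90, 90].
    # Going 10 degrees past the north pole leaves you at latitude 80
    # (on the opposite meridian; the longitude flip is done separately).
    lat = (lat_deg + 180.0) % 360.0 - 180.0   # normalize to [-180, 180)
    if lat > 90.0:
        return 180.0 - lat
    if lat < -90.0:
        return -180.0 - lat
    return lat
```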

> On the JVM unsigned code is going to be a lot clunkier to write

Wikipedia has a comparison: https://en.m.wikipedia.org/wiki/Binary_angular_measurement

Having x-y=5 and y-x=-5 can be quite handy, but there's also some concerns about integer overflow in C/C++.


I actually think it's a really neat idea. It gives off a hacky vibe, but as long as it's well-documented it should work well.


Do you have a citation for that? The only standard I can find is 6709.

https://en.m.wikipedia.org/wiki/ISO_6709

By government systems do you mean US government systems?


ISO 19125 is the governing standard for representing geospatial data. This is used in government systems globally, which is the basic purpose of ISO standards, and in many countries (e.g. Western Europe) government systems are required to conform to these standards. This standard is ubiquitous in all of the software tooling for dealing with geospatial data.

Because the storage and interoperability of data is specified to use lon/lat, it strongly influences things that may not be specified by standard. Human user interfaces can do whatever they want, but software systems expect everything in lon/lat order.


Thanks.

There does still seem to be controversy on this point in tooling though? Popular tools like proj and pyproj now confusingly return different ordering for a different CRS, and a diff between UI conventions and code etc is really unfortunate and likely to cause bugs, so it doesn’t seem completely settled. e.g.

https://github.com/pyproj4/pyproj/issues/225 https://github.com/OSGeo/PROJ/commit/6a7e24dce79f93b73f4919f...


Those issues are specific to cartography, which used a mess of non-standardized conventions for centuries. Like with calendars, there is an almost unbounded set of edge cases when writing software to support it. Geospatial data from modern systems, which rounds to "all data", is virtually always presented as EPSG:4326 + ISO 19125 so the experience is pleasantly consistent, and as a practical matter you can't avoid this representation.

The vast majority of geospatial data is not intrinsically cartographic in nature and the primary use case is not making maps even if maps are sometimes used as a presentation layer. Consequently, cartographic conventions are correctly treated as legacy interfaces that require conversion into a modern standard; it wouldn't make sense for a modern software system to speak legacy standards natively.

Most modern geospatial data systems use an internal spatial reference system derived from WGS84 (not even straight EPSG:4326) that is optimized for efficient data processing. Remember, the vast majority of this data is primarily consumed by machines and virtually all of the data is required to be interoperable with machines. There is a very thin long tail of people using myriad legacy cartographic formats but they don't influence how geospatial data is handled because it is such a negligible percentage of the computing done on geospatial data these days.


I work as contractor for EUROCONTROL. 90% of the time, the usual order (Lat/long) is used in software. When this is not the case, it is a good indicator that I need to be careful about the conventions used to represent all data. I have already encountered the reversed wind direction and more frequently the use of m/s instead of knots (for wind speed).


Aren't m/s the correct unit for wind speed? https://en.m.wikipedia.org/wiki/Wind_speed

> Meters per second (m/s) is the SI unit for velocity and the unit recommended by the World Meteorological Organization for reporting wind speeds

> Since 2010 the International Civil Aviation Organization (ICAO) also recommends meters per second for reporting wind speed when approaching runways


This is one of those amusing areas where folks learn that SI units have only been adopted for elementary topics. Storage in a hard drive? Not SI.

Kind of interesting, as well, to consider this when talking about lat/lon. Do those attempt to be SI?


>Do those attempt to be SI?

Time for SI is still seconds.

Which isn't base-10, yet is SI. So depends on if you accept adopted SI, or want to jump on the base-10 SI clock/time bandwagon.

And lat/lon is based on angles, not distance, with lon coming from time (delta time between solar noon in two locations).

But the SI unit(less) for angles is radians.

So... no, but it depends on how you interpret multiple SI inconsistencies.


Right, I knew that not all SI units were base ten. But I also knew that lat/lon is in degrees, not radians. I also saw the Wikipedia section that knots are still used in industry rather heavily due to their relation to nautical miles. Also not SI.

I recall that astronomical units are also not SI.

Mentioning computers is just a cheap shot, I admit. Still, it is valid.

Cooking is an odd one. The old units that were largely defined in thirds are quite useful. Weight is, of course, more reliable for baking, but you can go very far at home quantities with cups and spoons.

To be fair, I'm a large believer that the units are arbitrary and whatever you learned will be good. Such that if you learned SI, it had advantages off the bat. But I am in less agreement that they have an intrinsic advantage.


Knots and nautical miles are interesting because the nautical mile was originally based on latitude: one minute (1/60 degree) of latitude was 1 nautical mile.

So an airplane traveling due north at 120 knots would cover 2 degrees of latitude per hour.

Most of the US Customary and British Imperial units actually have similar logical definitions or derivations, but they aren't regularly taught anymore.
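The arithmetic is pleasantly trivial, since a knot is one nautical mile per hour and a nautical mile was originally one minute of latitude:

```python
NM_PER_DEG_LAT = 60.0   # historical definition: 1 nautical mile = 1 minute of latitude

def lat_degrees_per_hour(speed_knots):
    # due-north ground speed expressed as latitude change
    return speed_knots / NM_PER_DEG_LAT

print(lat_degrees_per_hour(120))  # 2.0 degrees of latitude per hour
```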


Right. My point was more that in industries where there is some advantage to keeping a non-SI unit, they are wont to do so without major external pressure.

So, knots persist because lat/lon persists. Home cooks persist with imperial in some places because nobody cares to reprint all recipes and measuring devices. Astronomical units because at that scale... nothing scales. And computers, because binary won. (Curious to consider if ternary had been the winner...)

I confess I am actually personally moved by some of the intuitive arguments for older measurements. Usually very physically based and very in tune with numbers actually used in an industry. It is odd to think of a sixteenth-inch wrench, but it is just the natural result of dividing by two, four times, after all. (That is, you have a measuring rod; put a midpoint on there. Four times. Now, do the same for millimeters?) (Granted, in the age of computers, any measurement is much easier to do at the machining level.)


16ths and even 32nds are common wrench increments.

>Usually very physically based

Land records in the US are all feet, acres, furlongs, arpents, sections, and townships.

That's not changing.

And they all divide easily from townships of 36-square miles to 10-acre quarter-quarter-quarter-sections to 66x660ft-acres.

66 ft is a Gunter's (surveyor's) chain = 100 links

10 chains = 660ft = 1 furlong = 1/8 mile

It's a brilliantly thought out system, but most people promoting SI won't take the time to see the benefits.

Personally, I find it much easier to make blunders with SI, because a decimal shift one place over isn't always an obvious mistake.


Sounds like you and I would be in violent agreement. :D


A typical case where the governing body decides one thing but in practice the since-always convention is still preferred, I guess? At least this is my experience with dealing with reqs from customers within aviation.


Well, if the aircraft speed is in knots (which is generally the case), you'd rather have wind speeds in the same unit.


It's easier to store and process everything in SI units and then convert to the user's preferred unit in the presentation layer.
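For wind speeds that pattern is a one-liner, since the knot is defined as exactly 1852 m per hour (a hedged sketch; the function name is made up):

```python
MS_PER_KNOT = 1852.0 / 3600.0   # exact by definition: 1 kt = 1852 m / 3600 s

def display_speed(speed_ms, unit="kt"):
    # everything is stored in SI (m/s); convert only at the presentation layer
    return speed_ms / MS_PER_KNOT if unit == "kt" else speed_ms
```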


As a rule, yes, but not necessarily and not everywhere. The important part is to have a coherent set of units, which usually is going to be SI units, but not always. If the user's preferred unit is never going to be SI, does it make sense to base your program on SI units?

For example, aviation deals heavily in flight levels (multiples of 100 ft). If flight levels are a first-class concept in your program, then you're already using non-SI units below the presentation layer. At this point you've established that some altitudes in your program are expressed in feet, and it might be a better idea to use feet for altitudes everywhere rather than introducing a lot of unit conversions. Or it might not be.


I took it as in the presentation layer, but I can see where you're coming from. In that case I would agree with you, and I'm not sure why GP would complain about that.


Internal representation isn't the issue. Software can internally represent coordinates any way that's consistent (and so bug-free).

The conflict is about interfaces, specifically interfaces using unlabeled number pairs, between software and humans, or software and other software. It's unfortunate that anyone would implement software with lat/long inputs/outputs, and think that they have the privilege of defining that order however suits them; what they obviously should do, and should have done, is take a moment to think, "How have people communicated earth coordinates in the past? Are there any standards that might be applicable?" and used that order. Of course, if tradition and standards were bad for some objective reason, that might justify it. But not just because some programmer wants to export their way of dealing with coordinates in x, y coordinate order from a cylindrical projection. That's not sufficient reason to upend tradition and introduce confusion and ambiguity in cases of |longitude| <= 90°.


I think you are thinking about this in the wrong way.

Sure Lat/Lon is the common presentation format for this particular coordinate system.

Right now in Sweden the time is 10:41 (it would be great if it was a couple of hours later, then I could say it's 14:41 to demonstrate the 24 hour time format). Yet, in software, I would represent that as time in UTC. Only when presenting to the user would I convert that to the users time zone.

My last name contains the letter "ö". In software, I would use a Unicode string internally, then when writing out I would encode that to UTF-8. (20 years ago, I would have used an old character encoding called ISO/IEC 8859-1 or something like that, but you get my point).

For some damn reason I still don't understand, the decimal separator in Sweden is the comma and not the period. Still I would represent numbers internally as an integer or maybe a float, and then when printing to the user I would convert that to "123,4" (123.4) or something like that.

In Sweden, WGS84 is not the only common coordinate system. There are many others: SWEREF and SWEREF TM for example. Yes, internally, depending on use case, I would probably use a representation of WGS84 as reference, then convert that to present to the user...

This is how I think about coordinates.
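The same shape in code, roughly (my sketch; the Swedish formatting details are assumptions, not a spec):

```python
def format_coord_sv(lat, lon):
    # Internal data stays lon/lat with '.' decimals; only at this edge do
    # we flip to the familiar lat/lon display order and swap in the
    # decimal comma (using ';' between the pair to avoid ambiguity)
    return f"{lat:.4f}; {lon:.4f}".replace(".", ",")

print(format_coord_sv(59.3293, 18.0686))  # 59,3293; 18,0686
```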


> For some damn reason I still don't understand, the decimal separator in Sweden is the comma and not the period.

I'm curious to know the history of why (some?) Euro countries went with the comma and the Anglo world went with the period. Some details:

> In France, the full stop was already in use in printing to make Roman numerals more readable, so the comma was chosen.[13] Many other countries, such as Italy, also chose to use the comma to mark the decimal units position.[13] It has been made standard by the ISO for international blueprints.[14] However, English-speaking countries took the comma to separate sequences of three digits. In some countries, a raised dot or dash (upper comma) may be used for grouping or decimal separator; this is particularly common in handwriting.

* https://en.wikipedia.org/wiki/Decimal_separator

ISO seems to say use a comma:

* https://en.wikipedia.org/wiki/ISO/IEC_80000#Part_2:_Mathemat...


(Fellow Swede here). I really want to agree with you, but in practice, the difference between "internal" and "external" is not always clear. For example, if I was producing a printed document in Swedish, I would definitely use decimal commas. (By the way, the time now is not 10:54 — it's 10.54, if you are to believe "Svenska Skrivregler" :-) ) However, when printing output in a terminal window, a comma means that I can't copy-paste that number into that interactive Python session in another window. Coordinates as lat, lon means I can't copy-paste into PostGIS. Makes one long for those LISP systems of the legends, where the system kept track of where things on screen came from, so if the program printed a coordinate and you copied it from the terminal, it was copied as a coordinate, not as text...


So this is sorta above my pay grade but I think this is a tangential issue.

Even if you are reading exotic data in some parochial format you internally probably wanna use one consistent format. If you have some massive data sets and don't wanna convert on read-in then there are ways around that (you can set up a memoization system to convert and cache on an as-needed basis). This format should be intuitive and convenient. I proposed a South/East format.. but you're free to choose whatever you want here. But the point is that Lon/Lat is likely never a good choice at this juncture. It still has issues and you generally can do better.

At the interface, the default we've arrived at as a society is Lat/Lon WGS84 :) The merits here are irrelevant. If you want to support other i/o formats then you may. That really depends on your use cases - but they shouldn't map to whatever format you've decided on internally. And even if they did, you probably shouldn't be using Lon/Lat internally anyway.


What you use internally really depends on the application requirements. Different projections have different properties. Depending on the area you need to cover and the questions you need to answer you may need to use multiple projections and to match performance requirements you may need to store all of these projections so you don't always need to convert on the fly


> In software, I would use a Unicode string internally, then when writing out I would encode that to UTF-8.

I don't understand what distinction you're drawing here? UTF-8 is Unicode. In what way would you be modifying it at the presentation layer? (Unless you're dealing with true UI code, and are saying "I would map the characters to font glyphs according to the UTF-8 standard".)

I know UTF-8 isn't the only way of encoding Unicode codepoints, for what it's worth. I'm just struggling to see how you would be using just 'Unicode', as opposed to a particular encoding, at the storage layer. It's still just bits and bytes.


I think it would be more accurate to say that UTF-8 is a Unicode Transformation Format which by its name is logically distinct from Unicode itself. There are good reasons to store and process Unicode in UTF-8 format internally in many cases, but UTF-32 / UCS-4 would probably take over for internal processing if it weren't for memory usage and efficiency issues.
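Concretely, in Python terms (where `str` is the abstract Unicode type and bytes only appear at the boundary):

```python
s = "ö"                       # an abstract Unicode string (code point U+00F6)
encoded = s.encode("utf-8")   # concrete bytes, chosen at the I/O boundary
assert encoded == b"\xc3\xb6"           # two bytes in UTF-8
assert s.encode("latin-1") == b"\xf6"   # one byte in ISO 8859-1
assert encoded.decode("utf-8") == s     # round-trips losslessly
```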


Hey, I've been working on geodata in several contexts for about 5 years now and I completely agree with you! You may be a "noob" but your observations match mine to a T.


Thanks for the sanity check

I have a dirty prototype I need to rewrite soon so I was considering a coordinate system change. It'd make debugging much easier if the values drawn and the values of the coordinates had the same "directions"

I was a bit apprehensive to just yolo my own internal coordinate system bc it's a design decision you end up having to live with


I tend to roll with GeoJSON and the language's equivalent of a map/dict.


>Add on top of that the X-Y coordinate on images generally have the X flipped and starting at the top left corner. So changing to Lon/Lat doesn't fix everything.

Oversimplified... but:

Most raw/original geo-referenced imagery in the US are projected in a SPCS (state plane coordinate system) in survey-feet, which aren't defined in x,y but n,e (Northing, Easting) or sometimes e,n. This order isn't as universal as x,y, for various reasons. (UTM in meters is also common for larger areas.)

SPCSs aren't all the same type of projections, so Lambert conic (parallel defined) vs Transverse Mercator (meridian defined) (the two most common) means the "first" measurement traversed isn't always the same. (On a sphere, if I tell you to go 100 miles north, then 100 miles east, you get to a different point than 100 miles east, then 100 miles north.)
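A quick sanity check of that parenthetical claim (my own sketch, assuming a perfect sphere of mean radius; "east" means moving along the local parallel):

```python
import math

R = 6_371_000.0   # mean earth radius in metres (spherical assumption)

def go_north(lat, lon, dist_m):
    return lat + math.degrees(dist_m / R), lon

def go_east(lat, lon, dist_m):
    # along a parallel, metres-per-degree shrink with cos(latitude)
    return lat, lon + math.degrees(dist_m / (R * math.cos(math.radians(lat))))

d = 160_934.0     # ~100 miles
a = go_east(*go_north(40.0, -100.0, d), d)   # north first, then east
b = go_north(*go_east(40.0, -100.0, d), d)   # east first, then north
# same final latitude, but the longitudes differ by a few hundredths of
# a degree: the eastward leg after going north runs on a smaller circle
```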

Being grid systems, this shouldn't matter, but surveyors who used to do manual transformations would get in the habit of the order they regularly used.

So "flipped" is a matter of perspective.

It's one of those things that's a tell for someone's educational background and localisms. Surveyor/Geographer/Geodesist vs Scientist/Mathematician/Software Developer.

Did a programmer-first person develop a particular file format, or did a cartography/geography-first person? Or some combination of a compromise?

>I have no "geospatial" background

Checks out.

If you're a learn-by-doing person, I recommend looking up the transformations for a few common CRS/datums/projections and going through the equations manually, either with a quick script or in excel. Check/compare to results from the NGS tools page. The transform steps and math will reveal where order of operations matter.

https://www.ngs.noaa.gov/ https://geodesy.noaa.gov/NCAT/ https://geodesy.noaa.gov/INFO/datums-transformations.shtml


Err, sorry I have a typo in my original post. The Y is "flipped" (not X). I feel a Northing Easting system should appeal to both surveyors and mathematicians b/c it matches the X-Y plane we're used to in math class. But unfortunately it doesn't match display coordinates which conventionally start in the upper left corner. If I output say an SVG and it's got some weird artifact or things are cropped strange due to some off-by-one errors then it's much easier to debug if your internal geographic coordinates are aligned to the final display coordinates. You don't need to do any mental translation and offsetting to figure out how the values correspond between the two systems

In fact I go so far as to output SVGs with degrees values (degrees relative to the upper left point) and then there is a global scale factor for the final image to fit whatever size I want - so the image units are in real degrees - this makes life a lot easier.

After reading all the responses here, my argument would be that you should use whatever internal representation is most convenient programmatically - and then you expose in your API or UI or whatever a system based on other criteria (though it's hard to see how lon/lat would ever be sensible here). Your internal and external systems don't have to have any relation.

As you've deduced my problem space isn't too concerned with projections at the moment. Both precipitation and elevation data come in a GeoTIFF (ie. lat/lon grid). In fact doing any kind of projections would distort my grid and greatly complicate my life :) But it's something I will keep in mind. When you start looking at large areas then you effectively have different data densities at different ends of the region and this is not great (poleward things have disproportionately more data). Thanks for the extra information and references - it may come in handy eventually.


> For instance Lat goes 0-180 but Lon goes -90 to 90

Surely this is several typos? Longitude, which marks east-west, must span 360 degrees (-180 to 180, perhaps? although 0 to 360 is just as reasonable). Latitude, marking north-south, spans only 180 degrees, and therefore is presumably -90 to 90 (so that 0 is the equator).
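Which suggests the obvious sanity check any program reading unlabeled pairs should do (a trivial sketch):

```python
def valid_wgs84(lat, lon):
    # latitude spans 180 degrees, longitude spans 360
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

# a first value of 118.9 can't be a latitude, so the pair must be lon/lat
assert not valid_wgs84(118.9, 32.1)
assert valid_wgs84(32.1, 118.9)
```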


Cool.. So now I can finally remember. Longitude is LONGer because it goes all the way around.. LATitude I dunno.. is like a latte filling up?


Latitude

Altitude

Vertical

If I ever need to confirm which is which, I use the mnemonic above. I usually imagine latitude corresponding to rings on the earth moving up and down along the surface. This image is triggered by thinking about "altitude" in the context of latitude for me.

The latitude defines which of these "rings" the point is on.

https://cdn.britannica.com/07/64907-050-7ACA69C8/Facts-paral...


Try thinking of it like latitude being like a ladder's rungs. Though this does have the horrifying side effect that sometimes you'll pronounce it laddertude in your head.


I always say "Latitude? More like flatitude."


lat is swedish for lazy, so I remember it as laying down, ie horizontal


The lateral line on a fish goes from head to tail. :)


LATitude is from the LATin for sideways.


yes, typo. Thanks for catching that. edited it to match


Graphics libraries and windowing systems almost universally map (x, -y) - that is, Y=0 is at the top of your screen and Y=MAX is at the bottom, making Y coordinates backwards from conventional plotting graphs. X is standard. But talking about it is hard; even graphics programmers I know regularly make the same polarity mistakes one time out of ten, like your post originally had lat and long reversed in the ranges (but used them correctly elsewhere).

Just be glad we don't universally use Minecraft XYZ coordinates.


right, an internal rezeroed south east system makes it match the graphics system. So pixel positions and coordinates covary and are always positive. It makes debugging a million times easier

More radical would be to map degrees onto the 0..UINT_MAX range and have your unsigned values wrap around naturally at the boundary. But it's nice to have degrees for debugging, and degrees aren't tied to a type so you can have arbitrary precision. I also think you'd still have corner cases to deal with (but I'll have to think about this some more)
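As a sketch of that wraparound idea (a hypothetical scaling, not any particular library's representation): map the 360 degrees of longitude onto the full 32-bit unsigned range, and modular arithmetic handles the antimeridian crossing with no special cases.

```python
# Hypothetical sketch: longitude scaled onto the full uint32 range so that
# crossing the antimeridian is just modular arithmetic, no special cases.
UINT_MAX = 2**32  # one past the largest uint32 value, used as the modulus

def lon_to_u32(lon_deg: float) -> int:
    """Scale degrees (any real value) onto [0, 2^32)."""
    return round((lon_deg % 360.0) / 360.0 * UINT_MAX) % UINT_MAX

def u32_to_lon(u: int) -> float:
    """Back to degrees in [0, 360)."""
    return (u % UINT_MAX) / UINT_MAX * 360.0

# Stepping 0.2 degrees east from 179.9 crosses the antimeridian cleanly:
pos = lon_to_u32(179.9)
step = lon_to_u32(0.2)
crossed = u32_to_lon((pos + step) % UINT_MAX)  # ~180.1, i.e. -179.9
```

The quantization step is 360/2^32, about 8.4e-8 degrees (roughly a centimeter at the equator), so the rounding cost of the fixed-point representation is small.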


Why not signed?


yeah, good question. My limited experience has shown me that you end up coding in assumptions that make the sign flip problematic :). There are all those infamous cases of systems crashing on planes and ships when they cross the equator. Ideally you'd test in all four quadrants of the globe but this ends up being cumbersome.

There is also the convenience of having your coordinates change in the same way as your display coordinates. So if something displays off of where it should then you can easily diagnose the problem. Everything is always in units of degrees relative to the top left corner (with maybe a scale factor) - whether you're looking at the whole globe or just a region. With a signed coordinate you'd be relative to the image center which is not how your display works. You need to track offsets and image widths/heights. Debugging gets nastier and you have off by one problems and whatnot. That's just been my experience so far


That's what I'm saying, you map the negative latitude and longitude to negative numbers


It is striking that the top response to a thoughtful, nuanced article about coordinates written by an expert in geospatial systems is someone who boldly declares "X is the standard" and concludes with "I'm a total noob".


Indeed. I appreciate the self-deprecation of the person recognizing they aren't an expert but are in the space anyhow.

Usually Dunning-Kruger is present.


> Lat/Lon is really the standard. [...] That all said, I'm a total noob and I have no "geospatial" background (just writing some software to deal with rain data right now) So this isn't pro advice.

The standard is X/Y+Spatial Reference. (the SR is usually out of band) If your spatial reference maps X to "latitude" and Y to "longitude", then so be it, but in general, I would expect a spatial reference which is applicable to an area that doesn't include the poles to map +X to East-ness and +Y to North-ness. (Spatial references near the poles get... interesting...)

If your data doesn't have a spatial reference, then you don't have data, you have numbers.

There are several reasons for this. First of all, there's the whole XY plane and all that. Secondly, there's also the fact that lat/lon/height coordinates, if you wish to preserve the right hand rule, translate to +north/+east/-up or north/east/down[0] coordinates, where elevations above sea level are negative, and the more you climb some mountain, the more negative your Z coordinate becomes. On the other hand, lon/lat/height becomes east/north/up, and the right hand rule is preserved.

Source: this is my day job.

https://en.wikipedia.org/wiki/North_east_down
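The right-hand-rule argument above can be sanity-checked with a bare cross product (a toy sketch using plain tuples for the unit axes):

```python
# Toy check of the handedness argument using a cross product: a triple
# of unit axes (a, b, c) is right-handed exactly when a x b = c.
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

east, north, up = (1, 0, 0), (0, 1, 0), (0, 0, 1)

# lon/lat/height maps to east/north/up: a right-handed triple.
enu_is_right_handed = cross(east, north) == up
# lat/lon/height maps to north/east/?; the third axis must point DOWN
# to preserve the right hand rule:
ned_third_axis = cross(north, east)  # (0, 0, -1)
```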


> "Lat/Lon is really the standard." [citation needed]


ISO 6709 for starters.


I learnt something today. It does indeed appear to be a nearly universal standard. Thank you!


I think culturally lat/lon is the standard - it is what looked right to me before I started working with geocoords.

Cartographically, in terms of standards, that seems to be the case. I don't think because libraries have been written which are backwards, that there should be some popularity contest - that's too small a bubble, just makes programmers sound like entitled asshats...

Even a govt. standard should not be the measuring stick - govts change policy with the wind, or different govts. The lon/lat programmers with the pitchforks are the upstart rebels here...

Lat/Lon


It's not just about internal representations. Web APIs and geospatial data formats specify them either way. You can't get away from the ambiguity unless you carefully choose the API and data format to match the order you like. (and pray that anyone providing you with data is aware of the choice made by your preferred formats)


Just a small detail, but longitude goes from -180 to 180. Or, if you use the hemisphere abbreviations, 180W to 180E.


The numbers in British National Grid references are eastings then northings, so lat lon isn’t universal in national mapping standards.


yeah, lat/lon/alt is more common, but I like lon/lat for the same reason you do (you can plot lon=x, lat=y and get a crappy Mercator projection). I also often work with stereographic coordinates a lot which uses x-y, and that also colors my perception.


lon=x, lat=y is not Mercator at all, it gives the "equirectangular" projection. This has none of the nice properties of Mercator (angles are heavily distorted).
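The difference is easy to see numerically. Both projections share x = longitude, but the y formulas diverge away from the equator (these are the standard textbook formulas, in radians):

```python
import math

def equirect_y(lat_deg: float) -> float:
    # Equirectangular ("plate carree"): y is simply proportional to latitude.
    return math.radians(lat_deg)

def mercator_y(lat_deg: float) -> float:
    # Mercator: y = ln(tan(pi/4 + lat/2)); stretches toward the poles.
    return math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))

# Near the equator the two nearly agree; at 80 degrees latitude Mercator's
# y coordinate is far larger, which is the area explosion near the poles.
lo = (equirect_y(1.0), mercator_y(1.0))
hi = (equirect_y(80.0), mercator_y(80.0))
```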


yeah, you're right! but for showing a relatively small part of the world, it doesn't matter (and in fact, usually I'm plotting things in Antarctica where Mercator's area explosion is an anti-feature).


The more you zoom in, the better Mercator should be. That's a big part of why Web Mercator exists: there are no other options beyond Mercator if you want a projection where north is always up and right-angle street intersections look correct at any latitude. You can't notice the scale difference when zoomed in somewhere.

Going super close to the poles should be a challenge for both Mercator and equirectangular projections, but at least Mercator won't distort shapes at the same time (and neither is globally accurate for area).


To add a bit more detail: the amount you have to zoom in for shapes to look right will increase with absolute latitude. So the equator pretty much always looks good on Mercator, Europe looks good if you're looking at the continent scale, but then in Greenland and Antarctica you'll need to be zoomed in to a regional or city-size, or even closer in. So when it's really important to get a broad-scale view near the poles, map makers will switch to something like a polar stereographic projection or similar. (I've personally just used a transverse mercator projection (whatever UTM zone is most convenient) when doing data analysis near the poles.)


plays around in QGIS

yeah, you're completely right!


My point was that simply flipping lat/lon to be lon/lat is a cheap fix that still leaves you with problems.

What I propose is to go "all the way" and make an internal positive/unsigned South/East coordinate system that naturally matches the display coordinate system (the inverted X-Y). It's sufficiently different that you'll never get confused and flip things at API boundaries. It's also sufficiently weird that you won't expose it to others :)
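A minimal sketch of that scheme (the helper name and region bounds here are my own, purely for illustration): measure coordinates in degrees from the region's north-west corner, so they grow in the same directions as image pixel coordinates (x rightward, y downward) and are always non-negative inside the region.

```python
# Sketch of the rezeroed south/east idea: degrees east and south of the
# region's north-west corner, covarying with image pixel coordinates.
def to_southeast(lon: float, lat: float, *, west: float, north: float):
    return (lon - west, north - lat)  # (degrees east, degrees south)

# Region spanning 90W-80W, 40N-50N; a point near Chicago at (-87.73, 41.84):
east_deg, south_deg = to_southeast(-87.73, 41.84, west=-90.0, north=50.0)

# At a scale of 100 px per degree this is directly a pixel position:
px, py = east_deg * 100, south_deg * 100
```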


Well, display coordinates are different from typical plot coordinates in any standard plotting package (of course, many plotting packages also support GIS, sort of making this a moot point). Yeah, if you're going to display coordinates you'll have extra problems.


It's not a moot point because you wanna debug stuff :)

If things are looking weird or wrong on your display then you need to go back to your internal values to inspect what's wrong. When the internal values covary with the display values this makes life significantly easier

Not to mention things have a tendency to blow up when you start testing in the southern hemisphere haha

That said.. I don't personally use any magic plotting packages. I just generate my own graphics with SVG. It's quite easy and everything is debuggable


lat also sorts before lon, alphabetically. So easy to remember.


Reminds me of two days of bug fixing, after our Python mapping tool suddenly returned bogus data, without any change in the code base.

Turns out, we did not pin the pyproj library [1] and they introduced a switch from lon-lat to lat-lon order, per default.

It was possible to retain the old behavior by setting the `always_xy` parameter (which defaults to `False`) to `True`. I saw hundreds of similar reports; who knows how many users were affected. Conclusions: a) always pin your dependencies, b) explicit is better than implicit.

Besides, I have a personal, subjective preference for lon-lat, but I understood the lat-lng order to be the officially accepted norm.

[1]: https://pyproj4.github.io/pyproj/stable/api/proj.html


That is awful. They should have changed the method name with such an abusive change.


Looked it up, just to prevent the pyproj team from getting blamed for this: the change in behaviour was actually introduced in the PROJ library, which is used by pyproj. This is the specific merge with some additional discussion [1].

The basic motivation was that PROJ was changed to correctly follow the axis order of EPSG codes. Anyway, it was quite confusing at that time, but over the years working with geographical data made more sense to me and I am now always aware of the specific order.

See also an extended discussion here [2]

[1]: https://github.com/OSGeo/PROJ/pull/1182

[2]: https://pyproj4.github.io/pyproj/stable/gotchas.html#axis-or...


And it looks like indeed, like proj, pyproj updated the major version number when this happened at least. Still think the API should have changed...

edit: it does seem like PROJ DID introduce a new header file with a new API[0] and eventually deprecated the older one sometime later, handling the transition much more gracefully. I guess pyproj didn't do the same thing?

[0]: https://proj.org/development/migration.html#api-migration


Ah, well here comes the problem - if your dependency changes a major version but your code seems to continue to run correctly, i.e. tests pass, how much more time do you need to spend to investigate the cause for the major version change? I think this example exposes the futility of semantic versioning.


Indeed, this was entirely our fault and, as I said, we should have correctly pinned at least the major version. Our tests did not catch the specific issue, because all the code worked - just the output looked scrambled. There are tests that could catch this issue, but we did not anticipate it and therefore had no tests prepared.


At the time of encountering them, I dislike such issues. Stress, downtime, frustration.

But a few weeks later, I love them. They are a great opportunity to learn how well a test suite is setup. How it can be improved, cleaned up, refactored, or just kept the same.


> Ah, well here comes the problem - if your dependency changes a major version but your code seems to continue to run correctly, i.e. tests pass, how much more time do you need to spend to investigate the cause for the major version change?

You need to investigate the reasons for a major version change before updating the dependency and running tests any time one occurs, to see if any documented behavior you are expecting has changed.


Yeah, PROJ handled this correctly, introducing a new header file and eventually deprecating the old one after a while, making it hard to do the wrong thing. But for some reason pyproj must have kept the same interface across the breaking change?


In general I agree but I'm making an argument that a non-functional change for you may be a functional change for your users.


I think Latitude historically comes first when spoken or written because of the rule of ablaut reduplication, which makes it more natural for the 'a' sound to come before the 'o' sound when using phrases like this. For instance, 'bish, bash, bosh' sound natural/correct, but 'bosh, bash, bish' sounds wrong.

I get the order wrong when working with geojson with embarrassing frequency...

http://www.macmillandictionaryblog.com/a-hotchpotch-of-redup...


There's also the notion that determining the longitude is much harder than determining the latitude. Figuring out latitude is easy if you can measure angles to stars or the sun at noon, and is something that people have been doing for a very long time. Figuring out the longitude requires an accurate clock and was something that had a lot of strategic value in the seventeenth and eighteenth centuries. To the point where multiple kings and governments at the time created rewards to incentivize scientists of the day to work on that problem.


More importantly, it needed to be accurate even when jumbled around in a storm crossing an ocean. Which ruled out for example pendulum like clocks.


We should rename them to langitude and lotitude. Much clearer!


I doubt this holds up in other languages though.


Cannot tell about all other languages, but it does hold up in Russian: широта (latitude) coming before долгота (longitude), 'i' before 'o'


How does a clock sound in other languages? In Swedish: tick tack


Tick tock.. But I’d argue it’s not the same thing. The ’tick’ sound comes first, it’s onomatopoetic.


Is it not just the way we prefer to order it in our heads?


French : tic tac


Indian: tick tick


Tamil (in India):

tik tik tik

Citation: https://en.m.wikipedia.org/wiki/Tik_Tik_Tik_(1981_film)


In every language I know the words are essentially the same...


In German, they are called Längengrad and Breitengrad.


I think we should just ask GPT-3 what is the correct order.


Well, there's Earth Centered Earth Fixed, which is what you actually get from GPS systems. Latitude and longitude are just for human-readable I/O. ECEF has the Z axis through the poles, the XY plane through the equator, and the X axis in a plane that goes through Greenwich.

This is converted to latitude and longitude from GPS using the WGS-84 standard, at least for the western world. Russia uses the PZ-90 reference frame for GLONASS. China uses the BeiDou coordinate system. There's also an obfuscated latitude and longitude used by China for public consumption, GCJ-02. This introduces an error of 100 to 700 meters. Not that anyone is fooled; the obfuscation algorithm is known.

But everybody uses the same ECEF 3D coordinates, and combining satellite data is done in that frame. So if you need precise locations, there's something to be said for working in ECEF.
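For concreteness, the standard geodetic-to-ECEF conversion on the WGS-84 ellipsoid is only a few lines (these are the textbook formulas; a sketch, not a substitute for a proper geodesy library):

```python
import math

# WGS-84 ellipsoid constants (standard published values).
A = 6378137.0             # semi-major axis, meters
F = 1 / 298.257223563     # flattening
E2 = F * (2 - F)          # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Geodetic latitude/longitude/height -> ECEF (x, y, z) in meters."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

# The equator/Greenwich intersection lands on the +X axis, one
# semi-major axis from the center:
origin = geodetic_to_ecef(0.0, 0.0)  # ~ (6378137.0, 0.0, 0.0)
```

Note the inverse (ECEF back to geodetic coordinates) is the harder direction; it is usually done iteratively or with a closed-form approximation.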


I am with you, https://en.m.wikipedia.org/wiki/Earth-centered,_Earth-fixed_... would be my preference, too. Especially if we’re talking internal representation. My background is having to track objects that fly or go below the surface. Though as others have mentioned, asking “is this object on the ground currently” is a tiny bit complicated.


I've spent many hours looking for open-source geo data sets, but don't recall finding any that preserve the raw ECEF. Anyone know of any?


Yeah but usually people want to do something silly like place a point on the surface of the Earth :).


But we want to do more than that, and dealing with elevation can be surprisingly complex and fragile.

Imagine you are building an offshore wind turbine. You want 20m of clearance between the sea and turbine blade. That means 20m above highest possible tide. Tidal data is only available for a port 200 miles away and will be different for you based on local gravity. So you use modelling from satellite based gravity data to make an assumption. And then give instructions to a ship out at sea who has a contractual obligation to place the turbine within 30cm of the required position. Which coordinate system do you use? And when you are modelling a microwave link from that turbine to an onshore location on the horizon that is surveyed using a local datum what coordinate system do you use?

In reality geocentric coordinates are unlikely to be used. But maybe it would be easier if they could be. The world is not flat and it is not easily modelled as a sphere. A set of coordinates always carries with it a set of assumptions that may not work with your particular task. Geocentric coords can remove some of those assumptions.


> In reality geocentric coordinates are unlikely to be used. But maybe it would be easier if they could be

I agree. When objects interact with each other, having everything as (x,y,z) makes many things a lot easier.

So in your example, ideally, you send the ship the (x,y,z) coordinates. Internally, we would have to worry about tides and figuring out exactly what the coordinates should be, but crew installing it would know exactly where to place it.


Not even counting the tides, sea level is not flat across the globe. We live on a spheroid with varying densities and other things, which goes around a bigger sphere and has a big sphere next to it. In the end the real world is quite messy.


Yeah, it is actually silly if you don't have a decent DEM, though you can get away with a geoid approximation if you only want to render a pretty picture for someone navigating to the nearest ATM.


fortunately, between SRTMv3, ASTER, ArcticDEM and REMA, a free DEM is available for practically the entire world (maybe missing some subantarctic islands?).


The ordering of lat/lon is not the only tricky part of working with geographic coordinates, sadly. "Between" the pair of numbers and a real point location on earth, there are two levels of abstraction used in the definitions of latitude and longitude: 1. The physical surface is modeled by a geoid, a surface which approximates the mean sea level over the oceans and its continuation under the land masses. 2. The geoid is approximated by a reference surface, a geometric shape that is simpler to describe (in closed form, with few parameters). For example, an ellipsoid like WGS84 (the "84" is the year of the introduction of this standard) is commonly used.

Furthermore there is the distinction between "geodetic" latitude ϕ (without qualification), the angle between the normal and the equatorial plane (this should be given with a specification of the ellipsoid), and "geocentric" or "spherical" latitude θ (or ψ, q, ϕ′, ϕc, ϕg), the angle between the radius and the equatorial plane.

Quoting from https://en.wikipedia.org/wiki/Latitude :

"The importance of specifying the reference datum may be illustrated by a simple example. On the reference ellipsoid for WGS84, the centre of the Eiffel Tower has a geodetic latitude of 48° 51′ 29″ N, or 48.8583° N and longitude of 2° 17′ 40″ E or 2.2944°E. The same coordinates on the datum ED50 define a point on the ground which is 140 metres (460 feet) distant from the tower.[citation needed] A web search may produce several different values for the latitude of the tower; the reference ellipsoid is rarely specified."

Whereas confusing lat/lon order usually is noticed as a bug because some points end up in an ocean or close to the poles, use of the wrong datum (reference surface) leads to more subtle errors (as the above example shows) and is therefore more likely to go undetected. What I recommend is to use test cases with well-known points and check in your GIS processing pipelines that a few test locations are still where you expect them to be at certain check points in your process.
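A minimal sketch of that recommended check (the helper name is my own): assert that a well-known point survives your processing pipeline. The values are the WGS84 Eiffel Tower coordinates from the Wikipedia quote above, stored lon-first.

```python
# Pipeline sanity check: a well-known point must come out where it went in.
EIFFEL_WGS84 = (2.2944, 48.8583)  # (lon, lat) on the WGS84 datum

def check_known_point(transform, tol_deg=1e-4):
    lon, lat = transform(EIFFEL_WGS84)
    assert abs(lon - EIFFEL_WGS84[0]) < tol_deg, "longitude drifted"
    assert abs(lat - EIFFEL_WGS84[1]) < tol_deg, "latitude drifted"

# A stage that should be a no-op in WGS84 passes; a stage that silently
# swapped the axes (or quietly changed datum) fails loudly:
check_known_point(lambda p: p)
```

A datum mixup on the order of the 140 m ED50/WGS84 offset quoted above (roughly 0.0013 degrees of latitude) is well above the 1e-4 degree tolerance, so this kind of check catches it.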


Each datum -- WGS-84, ED50, ... -- specifies one particular reference ellipsoid, one model of the earth's surface.

> Commonly an ellipsoidal model is part of a more encompassing geodetic datum.

https://en.m.wikipedia.org/wiki/Earth_ellipsoid

A brief history of datum-ellipsoid pairings can be found in the section Historical earth ellipsoids.

For WGS-84 datum+geoid, an authoritative reference: http://www.unoosa.org/pdf/icg/2012/template/WGS_84.pdf


Surprisingly colorful account of ED50's origins:

https://www.gim-international.com/content/article/european-d...

Toward the end is info on ED50's relationship to WGS-84.


> whether -87.73 is the longitude or latitude

Is negative longitude even legal?

Jokes aside, it is always (lat, lon). Latitude is a universal value, while longitude was not until just about 100 years ago: every country had its own zero meridian. When finding coordinates we are first measuring the sun's angle above the horizon at local noon, which gives us the latitude. Longitude is just the time difference between zero-meridian noon and local noon.


No, it's not always that way. There are plenty of libraries mentioned all over this thread that do it the other way, just like in graphics you can't just assume you're in a left or right handed system, and in relativity you can't assume time has the negative sign in the metric. It might be "right" by someone's definition, but it's not universal, so it has to be documented (and there are always extremely important cases you'll run into where people have made the "wrong" choice). Even when it comes to matrix math there are textbooks that treat them as (row, column) and others that go (column, row), it's always best to be explicit.

And yes, negatives make sense and are "legal", by which I mean they pick out locations in a meaningful way and should not be considered out of bounds. What you probably mean is that they are not canonical, and by convention people usually wrap them to points within a certain range. Doing this wrong is a common source of bugs in maps.
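One common canonicalization (a small sketch; the modulo trick is a standard idiom, and its edge case is exactly the kind of bug alluded to above): wrap any longitude into the half-open range [-180, 180). Note the antimeridian itself comes out as -180, not +180, which is worth documenting explicitly.

```python
# Canonicalize longitude into [-180, 180). Works for any real input,
# including multiple wraps; 180 itself maps to -180 (half-open range).
def wrap_lon(lon_deg: float) -> float:
    return ((lon_deg + 180.0) % 360.0) - 180.0

examples = {
    190.0: wrap_lon(190.0),    # -170.0
    -540.0: wrap_lon(-540.0),  # -180.0 (one and a half turns west)
    180.0: wrap_lon(180.0),    # -180.0 (the edge case to document!)
}
```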


Latitude and longitude are geometric values. Programming libraries are secondary abstractions to Earth geometry (third degree abstractions, to be precise, there's WGS84 or other abstraction in between).

Negative longitude, negative latitude and even latitude modulo above 90 degrees will produce a point on the globe, but library-wise I would not allow it to pass input validation.


Programming libraries are also things that need to do math on data points (in a way that programmers themselves can reason about). One of the reasons many of the math focused libraries prefer [lon, lat] data is because it is (close to, in some projections) an "intuitive" approach to [x, y] order and brings the geographic abstraction closer to the Euclidean geometry abstraction to make it easier for the programmers to reason with (regardless of which projection abstractions are in between).

(Needing to do a deeper dive into some of Leaflet's code at one point gave me a somewhat greater appreciation over why there is such disagreement between [lat, lon] and [lon, lat] formats. Abstractions are hard, and when you know you already need a crazy projection abstraction or three, sometimes the simplicity of [x, y, z] Euclidean approximations feels like "home" for as much as you can reason in it and expect the projection math to "just work" with it as long as you are consistent.)


Worse, I've seen a lot of code that just uses the variable name "ll" as an array of two numbers and make no attempt to clarify which is which. Saved a whole 4 characters by not making it "latlon". Then you get to guess what units they're in.


If it’s a variable there’s a reasonable chance it’s used more than once and thus saves more than four characters.


This is also why I avoid indentation; programming is all about saving those precious bytes like it’s 1988.


On top of this, I feel like `ll` is quite verbose and memory intensive. A simple `l` would suffice with half the space.


`l` needs practically twice as many hole punches as `a` for "array" in my EBCDIC punch cards and fewer punches is less likely a card gets shredded in the reader.


Maybe your text is grey cause saving a few chars on variable names doesn't save any RAM. And also confuses sober-you from reading your own code.


Named values are better than tuples for almost everything, but especially for tuples where order might be ambiguous. Name your values, their names might be abbreviated but they won’t be ambiguous.


The problem is that named values often come with significant performance penalties in terms of storage. With data points in the billions it becomes significant


I think that’s less of a problem than often claimed.

For formats such as csv, parquet or a SQL database that store the names of fields only once, overhead is constant, regardless of number of items.

So, it only can be a problem for formats such as xml or json that repeat the names of fields in every record.

Those happen to be formats with variable record length, so you can’t index into such files; the only way to process them is in their entirety.

If so, and storage size is a problem, you can compress the files. Your typical LZW variant will, if the files are large enough, eventually encode both the "(lat=" and ",lon=" parts as single codes of (typically) 12 bits each (that will happen after about the fifth occurrence of such strings, so fairly soon). That's 24 bits of overhead per item. Significant, but if you use xml or json, chances are you're already giving up way more by storing floating point values as text strings.

So, that leaves json or xml files that each store only a few items per file. With a typical file system block size of 8 kB, those already give up 4 kB on average per file.
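The claim that repeated field names mostly compress away is easy to check with the stdlib (a quick sketch with DEFLATE via zlib rather than LZW, and made-up coordinate data; not a benchmark):

```python
import json
import zlib

# Ten thousand JSON records with repeated "lat"/"lon" keys:
records = [{"lat": 41.84 + i * 1e-4, "lon": -87.73 - i * 1e-4}
           for i in range(10_000)]
raw = json.dumps(records).encode()
packed = zlib.compress(raw)

# The repeated key strings all but vanish; what remains is mostly
# the entropy of the number texts themselves.
ratio = len(packed) / len(raw)
```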


> If so, and storage size is a problem, you can compress the files

I'm literally in the process of processing 100 GB of compressed spatial json and it's not enjoyable


That’s a fair point. In my opinion, more languages should support something like TypeScript’s labeled tuple elements[1] for use cases like that.

1: I wasn’t able to find this in the main docs for some reason? Only the release notes. https://www.typescriptlang.org/docs/handbook/release-notes/t...


Yeah, it's a API communication issue that tools like Typescript are designed to solve with "type information". Typescript's labeled tuples are a relatively recent addition, but an extremely welcome one especially here in this specific [lat, lon] versus [lon, lat] communication problem.


It’s also super handy for preserving parameter names when dealing with rest/spread in both runtime and static type positions, and even for naming new ones in static types for composition.


Reading this gives me warm feelings. In the past, when I had the pleasure of working with geospatial data on a few projects, I tried to make a point of naming the struct holding the data in the same order as we'd received it: "LonLat" or "LatLon". My hope was it would help the team avoid silly bugs from flipping them by mistake. (colloquially, no matter the order the coordinates were specified in the data, we'd tend to say "Lat-Lon")


Lon Lat makes more sense when the programmer organizes the way they think about geography by timezone or east/west hemisphere first; or if they consider a typical cylindrical projection and think to represent that very human-centric representation of earth's surface as x, y: x, being longitude, would be specified first. The first thing you know when given Longitude is approximately how out of sync the target is in their day/night cycle, and more broadly whether it's the eastern or western hemisphere.

Lat Lon makes more sense when the programmer organizes the way they think about geography in a more astronomical or climate-centric way first, by sun exposure. The first thing you know when given Latitude is north/south hemisphere, what season the target is in, and roughly (although depending on land masses and bodies of water and terrain) what the climate is probably like.

Lat/Lon is the traditional and historical standard way of expressing location. Why are programmers treating it as if it's a new, unsettled question, and deciding for themselves which order to use for their software?

Interesting that both WMS and WFS changed to lat/long in later versions of their specs. Maybe more people could take the hint.


Having a nice modern language helps. I'm maintaining a small library with some geospatial algorithms and geojson support for Kotlin called geogeometry.

One nice feature with Kotlin is using extension functions and extension properties on type aliases. I use this a lot in my library. So, I can represent points using a DoubleArray, which is a specialized primitive array type.

I defined a typealias PointCoordinates = DoubleArray, and then have extension properties defined on it like:

val PointCoordinates.latitude: Double get() = this[1]

The compiler inlines all of that of course so it's all simple array manipulation. But it looks like an object. And I get to avoid the longitude/latitude confusion. Another nice language feature that I use a lot is named parameters on functions. Nothing worse than somebody calling f(lat,lon) when they should have called f(lon,lat). Much less confusing if you call f(latitude: lat, longitude: lon).
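The same guard exists in other languages. In Python, for instance, keyword-only parameters (everything after a bare `*`; the function name here is hypothetical) turn the ambiguous positional call into a hard error instead of a silent coordinate swap:

```python
# Keyword-only parameters: make_point(lat, lon) is a TypeError, so callers
# must spell out which argument is which.
def make_point(*, longitude, latitude):
    return (longitude, latitude)

p = make_point(longitude=-87.73, latitude=41.84)  # fine, self-documenting

rejected = False
try:
    make_point(-87.73, 41.84)  # positional call: rejected at runtime
except TypeError:
    rejected = True
```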


The issue really exists only at I/O boundaries, where if there is a format mismatch you have to convert, which incurs some costs (CPU mostly, and memory bandwidth). Most languages can inline the data such that there isn't much overhead in accessing the raw bytes: arrays with aliases for indexes (like you did, or in JS maybe I'd do something like `const LAT = 0, LON = 1; point[LAT]`), C-family languages packing the bytes into tight structs where field names act as aliases to the memory location, Haskell's UNPACK pragma, etc.

Btw, given that you have a PointCoordinates type, why not f(p: PointCoordinates)?


The issue is code readability while dealing with memory efficient arrays of DoubleArrays, which are used by geojson to represent geometries.

You are right about the function call :-). Actually you can also do fun PointCoordinates.f() {...}.

The reason I have function calls with latitude and longitude in there as equivalents for their point coordinate variants is just convenience. If you have some other library using its own point representation, having to first convert to my representation before you can use my functions is a bit verbose. And of course, people into OO programming usually get a bit carried away reinventing their own Point classes. It's one of the reasons I stayed away from using object hierarchies in this library.

The geojson classes are an exception to this and were actually bolted on fairly recently so I can support geojson compatible serialization/deserialization.


I’ve ended up in the middle of the ocean more times than I’d like to admit. Mixing up lat/long really reminds you how much of the world is water.


I wonder how much time has been wasted by devs who get the order wrong. I know I've been guilty of it so many times. It's really frustrating, but I like that someone took the time to write down which order goes for which software- this is a nice reference.


Aside from this, there are annoying abbreviations inconsistencies for named fields too, where some APIs will use "lon", while others will use "lng".


The author says neither is right, but clearly prefers lon, lat ordering.

"Geographical tradition favors lat, lon. Math and software prefer lon, lat."

But why do math and software prefer lon, lat? Unlike endianness, where it appears little-endian is the better choice at the hardware level, I'm drawing a blank why one ordering is better for computation.


This part of the FAQ links the author's explanation of his preference.

> Do you have a preference? Yes I do: longitude, latitude.

http://macwright.org/2016/07/15/longitude-latitude-is-the-ri...

The bullet points:

* Almost every geospatial format puts longitude first

* Almost all open source software uses longitude, latitude ordering

* longitude-first is the equivalent of XY ordering


One reason that comes to mind is that (longitude, latitude, altitude) is right handed, and we tend to prefer right handed coordinate systems.


I agree with the point, but software has at times frustratingly used left-handed coordinates for drawing on screens. DirectX and browsers' use of `z` come to mind.


When plotted on a mercator-style map, longitude corresponds to X and latitude corresponds to Y, with X, Y being the usual order for coordinates. (That's my guess, anyway.)


The X and Y you describe are also conventions based on putting north at the top of a map, and really only map one-to-one in Mercator projection anyway.


Well, by standard convention longitude is labeled θ and latitude is labeled φ. People think more about θ because it also appears in two-dimensional polar coordinates.

But in math you care far more about labeling the angles correctly than about which order you list them in.


It is computed, that eleven thousand persons have, at several times, suffered death, rather than submit to break their eggs at the smaller end.


My pet peeve is that it wasn't named "lonitude" (no 'g'). Then both would be equally wide in code.


What about "width" and "higth"? And "topp" and "left"?


Top/left aren’t the same. Width/height are similar problem but still… when you’re writing geo code you write latitude and longitude all day long and the misalignment is maddening. Pet peeve, I know.


Actually in the original language (Latin), the "a" in latitude was long, which is lost in the modern transcription.

You can align them by restoring the vowel length and writing "laatitude" & "longitude" :-)


related: there are a truly shocking number of ways to write down “a representation of an axis-aligned bounding box of a region of an image”, and every flipping time it’s a list of four floats; heavens no, it can’t be anything with names!

is it:

  [xmin ymin xmax ymax]
  [xmin ymin width height]
  [xcenter ycenter width height]
  # all of the above but in normalized [0,1] not pixel coordinates
and so on and so forth and i swear i’ve seen at least a few deficient formats that use nonstandard (for image space) coordinate axes so the origin is somewhere else.
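All three layouts above can be normalized behind one named type; a sketch (the `BBox` class and constructor names are made up, pixel coordinates assumed):

```python
from dataclasses import dataclass

@dataclass
class BBox:
    """Named axis-aligned bounding box; field names kill the ambiguity."""
    xmin: float
    ymin: float
    xmax: float
    ymax: float

    @classmethod
    def from_xywh(cls, x, y, w, h):
        # [xmin ymin width height] layout
        return cls(x, y, x + w, y + h)

    @classmethod
    def from_cxcywh(cls, cx, cy, w, h):
        # [xcenter ycenter width height] layout
        return cls(cx - w / 2.0, cy - h / 2.0, cx + w / 2.0, cy + h / 2.0)

    def as_xyxy(self):
        return (self.xmin, self.ymin, self.xmax, self.ymax)
```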


While lat/lon (or lon/lat) is intuitive, it is not an ideal system for addressing a location on a 2-sphere. A degree of longitude spans a different distance depending on latitude, and there are discontinuities at the poles. I wonder what system we would use if the poles were densely populated.



Nothing springs to mind. The problem is that location on any surface is a 2 dimensional parameter. However, two coordinates before any kind of "processing" define location on a plane. All spherical coordinate systems are therefore isomorphic to map projections, where the coordinates are simply a cartesian location on the flat map. And no map projection avoids singularities.


Probably something a lot more like the Earth-centered, Earth-fixed coordinate systems [1] that are for instance the native reasoning tools of things like GPS services.

[1] https://en.m.wikipedia.org/wiki/Earth-centered,_Earth-fixed_...


> Math and software prefer lon, lat.

I had always thought math prefers lat, lon ...well not lat (which is the altitude angle) but polar angle. But anyway, this order: polar, azimuth. Turns out I have been getting all my spherical coordinates math from physics, which uses polar, azimuth. But math uses azimuth, polar.

https://en.wikipedia.org/wiki/Spherical_coordinate_system

But then again: where does math even use spherical coordinates that isn't physics-related?


Order aside (I stick to geojson convention myself), I also prefer 'lng' to 'lon' due to the pronunciation. Longitude has an 'ong' sound, not an 'on' ...


There is no controversy IMHO. [Lat,Lon] is just different coordinate system of the same space as is [X,Y].

It never happens that you try to generate {X} from {Lon} alone.

    import math

    def LatLon_to_GoogleTiles(lat, lon, zoom):
        lat_rad = math.radians(lat)
        n = 2.0 ** zoom
        xtile = int((lon + 180.0) / 360.0 * n)
        ytile = int((1.0 - math.log(math.tan(lat_rad) + (1 / math.cos(lat_rad))) / math.pi) / 2.0 * n)
        return (xtile, ytile)
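For completeness, a sketch of the inverse (tile indices back to the lat/lon of the tile's north-west corner), assuming the same Web Mercator tiling scheme; the function name mirrors the one above but is hypothetical:

```python
import math

def GoogleTiles_to_LatLon(xtile, ytile, zoom):
    """NW corner of a Web Mercator tile, as (lat, lon) in degrees."""
    n = 2.0 ** zoom
    lon = xtile / n * 360.0 - 180.0
    # Invert the Mercator y formula: y = (1 - asinh(tan(lat))/pi) / 2
    lat = math.degrees(math.atan(math.sinh(math.pi * (1.0 - 2.0 * ytile / n))))
    return (lat, lon)
```

At zoom 0, tile (0, 0) starts at longitude -180 and the Web Mercator latitude limit of about 85.05 degrees.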


I’m shocked this kind of post is written, and there are so many comments, yet no one has mentioned spatial references. Not a single time. When you understand the underlying complexity of spatial references/datums/coordinate systems the deviation in library choices isn’t surprising.

Aside: If you are using a modern language (like C#) you can just use parameter labels and avoid this whole mess by being explicit each time. Underscores as separators in numeric literals are also pretty neat if you ever need to hard-code a scale or lat/lon for some reason.
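Python supports the same two tricks as the C# ones mentioned above: keyword-only parameters force call sites to name each value, and underscores work in numeric literals (the `make_point` helper here is just an illustration):

```python
# The bare * makes latitude and longitude keyword-only: callers must
# name them, so argument order can never be silently confused.
def make_point(*, latitude, longitude):
    if not -90.0 <= latitude <= 90.0:
        raise ValueError("latitude out of range")
    if not -180.0 <= longitude <= 180.0:
        raise ValueError("longitude out of range")
    return {"latitude": latitude, "longitude": longitude}

# Underscore separators in numeric literals work in Python too:
EARTH_RADIUS_M = 6_371_000

# make_point(59.33, 18.07) would raise TypeError -- positions aren't allowed.
stockholm = make_point(latitude=59.33, longitude=18.07)
```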

Some of the libraries under complaint only support WGS84 (basically what anyone using GPS thinks of when they say lat/lon). If you have never seen a coordinate from anything other than GPS or Google Maps, then it would seem strange that someone would expect coordinates in x,y.

Other libraries support a wide variety, including projected (rather than geographic) coordinate systems where x,y is more appropriate (WebMercator, for example). Libraries like Google Maps never expose the underlying xy coordinates, but others will.

State plane coordinates, for example of widely-used xy system: https://en.m.wikipedia.org/wiki/State_Plane_Coordinate_Syste...

More info on projected/Cartesian coordinates vs geographic/spherical coordinates:

https://www.esri.com/arcgis-blog/products/arcgis-pro/mapping...


Latitude and longitude are, in fact, real-world data items. As a guy who sometimes uses them for stuff like store-finders, I'm lucky to live in a region with latitude about 45 (temperate northern hemisphere) and longitude about -70 (west of the prime meridian).

So, every time I get the lat/lon order wrong around here, I get a brief trip to Antarctica.

It's important when hacking this kind of data (or any data) to develop at least some sense for what it means. If there's an error in the data, you want to be able to think, wait, that isn't right, that's in the Atlantic someplace east of Cape Hatteras (or whatever).

And, of course, when using map products like USGS quads or UK Ordnance Survey maps we use those mapping agencies' coordinate systems, be they US State Plane projections, Universal Transverse Mercator, or whatever.

I inherited one app for USA use where the original developer decided the longitude values should be positive rather than negative. Wait, what? Kazakhstan? Must be wrong.

Like any physical measurement, lat/lon makes some kind of physical sense. Unlike many measurements, lat/lon has some constraints. For example you know a priori a latitude of +130° is bogus.


I found a bug once in one of our geospatial pipelines. We were removing all points with invalid coordinates, i.e. those with abs(longitude) > 90 and abs(latitude) > 180. Problem is, it's longitude that goes from -180 to 180 and latitude that goes from -90 to 90. We never noticed because all the points we were interested in were in Africa. It's a really easy mistake to make.
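The corrected check is a one-liner; a sketch of it, with a note on why the swapped version slipped through:

```python
# Corrected validity check: longitude spans +/-180, latitude +/-90.
def is_valid_coordinate(lat, lon):
    return -90.0 <= lat <= 90.0 and -180.0 <= lon <= 180.0

# The swapped version only flags points beyond 90 degrees of longitude;
# every point in Africa has |lon| <= 90, so the bug stayed invisible there.
```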

I always say "lat/lon"; it sounds better in English. Just like click-clack, zig-zag, criss-cross, ding-dong, King Kong. It feels like one of those rules but I can't put my finger on it: https://www.bbc.com/culture/article/20160908-the-language-ru...

On the other hand, if thinking about them as x and y then x is longitude so it naturally comes first.

In the military I learned the rule lon then lat and was taught to think of an elevator: first you step sideways in to the elevator (lon), then you go up or down (lat).


The formats that do Lon-Lat support different SRSes ( https://en.wikipedia.org/wiki/Spatial_reference_system ). Those are actually X-Y, not Lon-Lat


English also has a preference for producing lists with vowel sounds made with the tongue at the back of the mouth first, followed by those made towards the front. "lat" comes first under this unwritten rule.

"fi fy fo fum" "friend or foe?" etc


Slightly off-topic, but does anyone else have to check each time which of the words "latitude", "longitude" corresponds to which measure?

Why aren't we using some self-explanatory terms?

I propose "rotitude", "iclinitude". Guess which is which.


I prefer the certitude of vertitude and horitude myself. Remarkably the only name clash is with pole dancers in LA.

https://www.google.com/search?hl=en&q=vertitude


Mnemonic: think of latitude and longitude as parts of a ladder. Lines of LONGitude are the LONG bars that run from South-to-North and lines of latitude are the rungs (East-to-West).

h/t https://www.geographyrealm.com/remember-difference-latitude-...


Mnemonics aren't any more self-explanatory, they just rely on clumping extra arbitrary concepts, in the hope that something will stick.


Well you got it backwards.


No, longitude lines indeed run north-to-south (and, therefore, the longitude number measures how far east-to-west you are).


I read it wrong, as in latitudes are north-to-south gradations. Wasn't thinking of level lines.


I used to get it wrong. But at some point I started to remember that latitude tells something about the climate at a location and then it clicked for me. Or if it was that latitude determined which stars cover the night sky. Either of these. They are more permanent, longitude is just a matter of time.


The range of longitude is longer (-180 to 180, total 360), latitude (-90 to 90, total 180).


Make a mnemonic for yourself. The one I've heard is latitude -> ladder.


They are self-explanatory, if you understand Latin.


Broad and long? Not self-explanatory.


This just drives me nuts. Lat/Lon. That's the answer.

In any kind of graphic design work, if you're communicating with printers, pre-press, etc., you say Width x Height. Like paper size: 8.5" x 11". But once in a while, a client requests something that's described in height x width. Usually the giveaway is that billboards aren't 17 feet tall.

But in some cases, if they're from the South, I know I have to ask them three times to make sure they're telling me width and height. The third time they'll get that width is the left to right size.


> This just drives me nuts. Lat/Lon. That's the answer. In any kind of graphic design work if you're communicating with printers, pre-press, etc. you say Width x Height.

To me that would make (lon,lat) preferable, then? Longitude would be the width/left-right/x-axis coordinate, and latitude be the height/up-down/y-axis coordinate?

Or maybe I misunderstood, and you were just highlighting that every discipline has its own standard, as opposed to stating that geographical coordinates should match the printing and other domains?


My favourite story about this is from when I worked at an agritech company. We were working with two systems using different orders, and we discovered a high-street store had the same issue when we found a bunch of their stores just off the Seychelles on Google Maps: https://thoeni.io/images/hnlondon/pundstretcher.png


Whenever I worked with lat/lon pairs in the past I stumbled over those issues at some point. That's why at my last startup we used Geohashes everywhere. It's one string and so much more convenient to handle. There are libraries to convert from Geohashes to lat/lon pairs for every platform for those times you need them. You can even shorten them when accuracy isn't needed or you'd like to group similar locations.
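For anyone curious, the encoding itself is just a bisection that interleaves longitude and latitude bits into base32 characters; a minimal sketch (not one of the production libraries mentioned above):

```python
# Standard geohash base32 alphabet (no a, i, l, o).
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat, lon, precision=11):
    """Encode lat/lon to a geohash string by interleaved bisection."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    chars, ch, bit = [], 0, 0
    use_lon = True  # a geohash starts with a longitude bit
    while len(chars) < precision:
        if use_lon:
            mid = (lon_lo + lon_hi) / 2.0
            if lon >= mid:
                ch, lon_lo = (ch << 1) | 1, mid
            else:
                ch, lon_hi = ch << 1, mid
        else:
            mid = (lat_lo + lat_hi) / 2.0
            if lat >= mid:
                ch, lat_lo = (ch << 1) | 1, mid
            else:
                ch, lat_hi = ch << 1, mid
        use_lon = not use_lon
        bit += 1
        if bit == 5:  # flush one base32 character per 5 bits
            chars.append(_BASE32[ch])
            ch, bit = 0, 0
    return "".join(chars)
```

Shortening the string just truncates it, which is why nearby locations share prefixes.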


I feel like mixing up your lat and lon is almost a rite of passage for working with geodata - everybody does it once and (hopefully) learns from the mistake


I have had the pleasure of fixing this silly bug in my weather app after switching location autocomplete providers. I didn't even notice it for a while, but Stockholm hitting 20 Celsius in winter is unusual.

It is common to see lat/long, but in fairness long/lat makes more sense as it reflects x/y axis.


In Cartesian coordinates we use x,y notation so lng,lat seems like the more familiar convention for math students.


I agree with this, which is why I prefer lon/lat (x, y). But I think technically they aren't Cartesian, which is why we have projections that map the spherical surface onto X,Y coordinate planes.


In the same way that most people put the horizontal before the vertical for continuous cartesian stuff, but as soon as it's discrete, all hell breaks loose:

– Some like to think like a matrix then, usually with row, column.
– Some like to keep the continuous convention.
– Some like to split the difference.


I was always taught in school that you "crawl before you climb" - so on graphs x is crawling, y is climbing, on maps lat is crawling, lon is climbing.

Which then leads to the confusion when doing spatial stuff, as it swaps between a lot of JS libraries (for showing) and backend systems (for querying)


> I was always taught in school that you "crawl before you climb" - so on graphs x is crawling, y is climbing, on maps lat is crawling, lon is climbing.

What does this mean?


Horizontal coordinate comes before vertical


But latitude goes vertically on most maps. And they write "lat is crawling".


Are you a Tolkien dwarf? Lon is crawling, lat is climbing on regular human maps.


I’ve done a fair amount of geospatial work, and been caught by this sometimes. A standard would be nice, but in general it’s something you catch pretty quickly in development and can fix easily. “Why doesn’t this pin show up in the correct place? Oh, yeah, let me reverse those variables.”


There actually is a standard. Lat, lon is the correct order

https://en.m.wikipedia.org/wiki/ISO_6709


There are other things like this. For instance, between WMS versions 1.1.1 and 1.3.0, the order of the coordinates in bounding boxes changed, and you choose a projection with the "CRS" parameter instead of "SRS".


Lon/lat it should be. First because of x/y, but also because in astronomy it is right ascension, declination. (Obviously it is just a matter of convention, but certainly for an astronomer lat/lon is very unnatural)


Since nobody has commented yet, in 3D land, Unreal the game used y for depth and z for height.

So to this day in UE5 y is depth and z is height. This causes fun inconsistencies with all other 3D apps.


If you're trying to point things, this is right up there with the azimuth and elevation debate. Some industries/sciences use an ElAz convention, and some prefer AzEl.


Working on a project doing geo lookups I found that storing by lat, lon made it easier to do bounded checks as distance by latitude is effectively consistent.


Do you have that the wrong way round?


Oops, yes I do. Thanks!


Haven't we reached a point in programming with sufficient abstractions and syntax such that ordering of parameters shouldn't matter?


Unfortunately, keyword arguments aren't everywhere yet. But you could pass in a struct.
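In Python, for example, a NamedTuple gives you named fields and an explicit construction order in one place (the `Coordinate` type here is illustrative):

```python
from typing import NamedTuple

# Illustrative struct: fields are named, so order mistakes surface as
# explicit attribute access rather than silently swapped coordinates.
class Coordinate(NamedTuple):
    lat: float
    lon: float

london = Coordinate(lat=51.5074, lon=-0.1278)
```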


I'm a simple man. A comes before O, so Lat Lon.


I don't care which order, but for the love of all things precious, please tell me the order before you give me 60,52.


As a side note, in btree based database indexes, it's not the same to have a composed index on lat-lon than lon-lat.


Lat Lon is how it's always been. I didn't know people used anything else. If you're using Lon Lat you've learned something historically incorrect.

> Do you have a preference? > Yes I do: longitude, latitude.

The author is just wrong here... and is writing this post to try to pretend that there is more difference of opinion than there actually is.


> Geographical tradition favors lat, lon.

Geographical tradition and the English language. If you say "longitude and latitude", it sounds off, like "white and black" rather than "black and white".

> There's some consensus growing around longitude, latitude

Naturally, some dweebs trying to "fix" things get it backwards.


And images and matrices are often indexed (row, column), so clearly lat lon is right!


Chilean prefer lat/lon?


Agreed.

lon, lat is x, y.



Lon lat, like x, y

Who says "y, x"?


> Who says "y, x"?

Row-major 2D arrays. Matrix indices in math.


Grrr. I've never seen a map sideways though



