
Why OpenCV uses BGR color space (2015) - rcshubhadeep
https://www.learnopencv.com/why-does-opencv-use-bgr-color-format/
======
cesarb
> E.g. in Windows, when specifying color value using COLORREF they use the BGR
> format 0x00bbggrr.

Wait a little bit... isn't Windows little-endian [1]? Following the link in
that sentence:

> typedef DWORD COLORREF;

> The low-order byte contains a value for the relative intensity of red; the
> second byte contains a value for green; and the third byte contains a value
> for blue.

Given that Windows is and has always been little-endian, the byte order of a
COLORREF is red first, then green, then blue, that is, the order is RGB. So
the reason for BGR can't be Windows' COLORREF.

[1] "A decision was made VERY long ago that Windows would not be ported to a
big-endian processor."
[https://blogs.msdn.microsoft.com/larryosterman/2005/06/07/th...](https://blogs.msdn.microsoft.com/larryosterman/2005/06/07/the-endian-of-windows/)

~~~
kosma
That's precisely how little endian works: first memory location is LSB.

    
    
        #include <assert.h>
        #include <stdint.h>
    
        int main(void) {
            union {
                uint32_t l;
                uint8_t b[4];
            } u = {
                .b = { 0x12, 0x34, 0x56, 0x78 }
            };
            assert(u.l == 0x78563412);  /* holds on little-endian */
            return 0;
        }

------
kalenx
Well, not only is it quite long just to say "BGR was used a lot back then";
even the backstory is wrong:
[https://www.snopes.com/fact-check/horses-pass/](https://www.snopes.com/fact-check/horses-pass/)

~~~
girvo
The article said that as well, so I’m unsure what your complaint is.

~~~
kalenx
No, the post about OpenCV and BGR talks (for about 80% of its length) about
how everything we know in transportation, up to the Space Shuttle, is linked
to horse width.

Which, as pointed out in the link I provided, is simply false (or at the very
least a gross oversimplification).

~~~
yreg
The author stated that the story is "not-so-true" and copied from Snopes, but
he didn't include the debunking.

~~~
ballenf
One of the key points of the "debunking" seems to itself make a logical error:

> The eventual standardization of railroad gauge in the U.S. was due far less
> to a slavish devotion to a gauge inherited from England than to the simple
> fact that the North won the Civil War and, in the process, rebuilt much of
> the Southern railway system to match its own

But the article earlier states that the North won the Civil War _because_ of
the logistical superiority of its rail lines that had settled on using the
English standard gauge. Even if the South had settled on a single alternative
gauge, they would have been unable to easily acquire rail equipment from
abroad without added expense.

Snopes' attempt at debunking achieves the opposite in my reading.

------
liuliu
Before deep learning was a thing, most computer vision tasks were done either
in greyscale or in a more suitable colorspace (HSL, YUV, etc.). Thus, you most
likely called cvCvtColor first thing after extracting the frame, and it didn't
matter whether the input was BGR or RGB.

Only with deep learning did people start to use RGB directly, and this became
something more interesting.

~~~
h4b4n3r0
For DL the colorspace is irrelevant as long as it's consistent between
training and inference. I.e. if you train your model in RGB, it better be
receiving RGB and not something else from the video feed or input images.
Interestingly, models can also be trained in YUV (including its downsampled
variants like YUYV), which can make them a tiny bit more efficient, both
because there's less data, and because most camera chips can output YUV raw
data directly. The net will learn whatever transform it needs to learn to make
sense of the data.

~~~
dosshell
Interesting. Why isn't the "raw" sensor format, for example Bayer BG8, used
more in deep learning? Wouldn't it contain slightly more accurate information?
And if it were used, we could skip the Bayer -> RGB conversion completely.

~~~
lovelearning
Public raw image datasets are rare, probably because most cameras, except
DSLRs, don't support storing raw images.

Also, the nature of the inference may not require high fidelity raw images
since they're going to be down-sampled anyway. Unless research suggests that
raw images improve inference metrics, nobody's going to invest effort into
building a raw image dataset. One notable recent exception to this is the
"Learning to see in the dark" paper on low light image enhancement[1] - they
found that training on raw images gives much better enhancement than JPEG.

The data availability problem may start getting solved in the near future,
because mobiles (at least Android) support storing and processing raw images,
and on-device learning and inference capabilities on mobiles are getting
better all the time.

[1]:
[https://www.youtube.com/watch?v=qWKUFK7MWvg&feature=youtu.be](https://www.youtube.com/watch?v=qWKUFK7MWvg&feature=youtu.be)

~~~
dosshell
Interesting link! Yeah, I guess JPEG destroys a lot of information in low-
light images.

> probably because most cameras, except DSLRs, don't support storing raw
> images

This may be true for consumer cameras but not for machine vision cameras. Most
color machine vision cameras do support the Bayer format. Basler, FLIR (Point
Grey), IDS and so on all support the Bayer format. If you roll your own camera
you definitely have to deal with the Bayer format from the sensor you are
using.

And if you do not want any compression on your data-set images the Bayer
format also takes less space than the RGB format.

I'm not arguing it is a good idea, I'm just interested if it works :)

~~~
lovelearning
That's useful information about industrial machine vision cams, since I don't
have any experience with them.

You are right about custom cameras - in fact, I have some camera modules for
experimenting with surveillance, night vision and machine learning, and some
of their datasheets mention registers that, when set, output Bayer format
images. Machine learning in surveillance is still a rather under-serviced
market, but it's potentially a rich source of raw image datasets.

Raw format training and inference definitely work. The obstacle is only data
availability.

------
jcranmer
Somewhat off-topic, but I just hate that Snopes article. It's wrong, and
rather off-base.

Functionally speaking, the main disconnect in the article is the difference
between loading gauge and track gauge. 1435mm, or 4'8.5", may be the standard
track gauge, uniform between most countries, but the standard loading gauge
varies quite dramatically between different countries. The UK has a positively
tiny loading gauge (about 2.5m), while the "standard" EU loading gauge is
3.1m, the US is slightly wider at 3.2m, and Sweden opts for a roomy 3.4m.

The original notion that the SRB has to be "slightly" larger than the 1435mm
track gauge is ridiculous, since as you can see, the loading gauge is over
twice the width of the track itself. Of course, there's also room to play
tricks here if you need to. Schnabel cars can laterally offset the loads
they're carrying by up to a meter or so, which gives you some extra height to
play with if you're dealing with a double-track tunnel with an arched roof.

The other point, completely omitted, is that there is a rather big difference
between rail carriages and horse-drawn carriages. On rail carriages, the cars
sit above the wheels, while horse-drawn carriages sit between the wheels.
Something like this:

    
    
      bbbbbb    bbbbbbbbbbbbbbbbbb
          b    b          b   b
        w b    b w     w  b   b  w
        w bbbbbb w     w  b   b  w
        waaaaaaaaw     waaaaaaaaaw
        w        w     w         w
        w        w     w         w
    

(Note that this is one of the reasons that the standard gauge was chosen over
a broader gauge).

As for why standard gauge was chosen, well, keep in mind that a) there was a
lot of variance in track gauge in the 1800s, and b) the actual tolerance of
track gauge was pretty damn massive. A track gauge differing by as much as two
inches or so wouldn't have impeded the ability to transfer rolling stock.
Stephenson built the first practical steam locomotive on a line that was 4'8",
and so people who wanted to reuse his locomotives would have been drawn to a
gauge around 4'8"-4'9". Gauges around 5' and slightly larger were selected in
part to prevent interconnection between different railroads (basically, the
same concept as vendor lock-in).

Contrary to the Snopes article, the South mostly standardized on 5' gauge, but
it didn't have as dense a network as the North did. The South eventually
converted to 4'9" gauge (the Pennsylvania Railroad's gauge)--in 1886, 20 years
after the Civil War. Conversion thence to 4'8.5" took much longer, but was
much less dramatic because 4'9" and 4'8.5" are within gauge tolerance.

------
phkahler
So because of legacy _and_ unwillingness to change.

------
childintime
The article makes it sound as if moving OpenCV to RGB is as difficult as
moving the US to the ISO metric system. Neither is impossible. If Europe
manages to abolish national currencies, the US surely can abolish imperial
units (and drive faster ;).

But I guess that will only happen under pressure from Chinese overlords. I'm
sure the day will come, so why wait?

------
faragon
Maybe the "Roman horse" here is that BGR is RGB in little-endian format. It
also has lots of advantages: little-endian pixel handling allows using
similar code for 24 and 32bpp (rr.gg.bb.rr.gg.bb vs rr.gg.bb.xx.rr.gg.bb.xx),
so you can loop over pixels with offset += pixel size.

------
CardenB
TL;DR because it used to be popular among camera engineers and other people
who manipulated image data early on.

~~~
avian
This begs the question: why was BGR popular among camera engineers while RGB
seems to be common in software?

~~~
smallstepforman
Big endian vs little endian hardware?

~~~
olliej
That’s my understanding - a lot of the low power devices were big endian as
for whatever reason big endian and risc seemed heavily correlated for decades,
similarly the early mainframes and whatnot were big endian which is why a lot
of network protocols use big endian formats.

~~~
taneq
Probably the popularity of the Motorola chips?

~~~
olliej
Seems plausible - I just couldn’t recall who was dominating embedded chips at
the time

------
barrystaes
Never mind the horse's arse! It's 4 feet, 8.5 inches because of humans' thumbs
and feet! _chuckle_ (= 1.4351 meters)

