
How to Build 1 Bit of RAM Using Transistors - ImGameDeving
https://avrillion.com/stf/363/How-to-Build-1-Bit-of-RAM-Using-Transistors
======
userbinator
I've always wondered why most people seem to draw flip-flops with the crossed
wires and both gates pointing the same way, when I think this representation
makes it far clearer:

[https://i.imgur.com/cwZe7Zf.png](https://i.imgur.com/cwZe7Zf.png)

When both inputs are low, the NORs are equivalent to NOTs and you can see they
form a storage loop. When one input is high, it forces the loop into the
corresponding state.
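That storage-loop behaviour can be sketched in a few lines of Python (a
settling-loop toy of my own, not a timing model; the gate and signal names are
mine, not from the image):

```python
def nor(a, b):
    return int(not (a or b))

def sr_latch(s, r, q, q_bar):
    """One settling pass of a cross-coupled NOR SR latch."""
    for _ in range(4):  # iterate until the feedback loop settles
        q, q_bar = nor(r, q_bar), nor(s, q)
    return q, q_bar

# Set: S=1, R=0 forces the loop into Q=1
q, qb = sr_latch(1, 0, 0, 1)
# Hold: S=R=0, the NORs act as inverters and the loop keeps its state
q, qb = sr_latch(0, 0, q, qb)
# Reset: S=0, R=1 forces the loop into Q=0
q, qb = sr_latch(0, 1, q, qb)
```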

That said, I'm disappointed this article doesn't show the transistor-level
schematic, because trying to read from a breadboard is extremely difficult.

~~~
osamagirl69
The reason it is not usually drawn that way is that it is a cardinal sin to
draw logic gates 'backward' on a logic schematic. It is the schematic
equivalent of 'goto' and makes schematics confusing to follow. For a simple
inverter in a flip-flop it isn't so bad, but inside a larger schematic it is
better to stick with the best practice of signals flowing left to right for
all symbols.

PS - For what it's worth, that image has a transparent background, so when the
imgur viewer displays it you get black traces on a black background.

~~~
userbinator
The whole point is that the signals _can't_ all flow left to right in any
reasonably nontrivial design, because such a design will almost always have
feedback, and at its fundamental level that is how static memory works.

Even if you draw both gates facing the same way, there is feedback and you
still need to follow the signals the other way; but instead of simply turning
one gate in the direction its output is actually going, and showing that
structure more clearly, you introduce the extra ugliness and confusion of
crossing signals.

~~~
osamagirl69
Sorry I may have spoken inaccurately. Wires can carry signals right-to-left
(as you mentioned - this is necessary in any circuit containing feedback) but
the _symbols_ should be drawn left-to-right in a digital logic schematic.

Certainly there are different fields that follow different rules, for example
in schematic representation of feedback systems the feedback blocks are often
drawn right-to-left. They get away with it because their schematics are
generally much simpler--usually a dozen or so blocks, compared to hundreds or
thousands in a nontrivial digital circuit schematic.

Also - I am not sure I understand your comment about 'extra ugliness and
confusion of crossing signals'. Flipping the inverter backwards does nothing
to remove the signal cross, it just moves the cross outside of the region you
showed. Note how one of the inputs to your flipflop is now on the right hand
side--in most cases the crossing will reappear when you connect the rest of
your circuit.

------
leggomylibro
I once made a single-transistor latch by accident. It acted as a single bit of
memory and retained its value for weeks until I got bored with the project.

I had been making magnetic snap-together circuits, so I had a bunch of small
PCBs with simple 2- and 3-pin footprints and holes that I soldered neodymium
disc magnets into.

I put a big TO-220 N-fet on one of them, and stuck it to a laminated
whiteboard so that the magnets stuck without shorting together, then I hooked
it up to an LED as a simple high-side switch.

When I bent the transistor so that its metal plane rested against the magnetic
whiteboard, its gate would latch after briefly tapping either V+ or ground to
the magnet which was connected to the pin. When the transistor's metal plane
was perpendicular to the board, it didn't latch. Disconnecting and
reconnecting the LED didn't perturb the 'saved' value, and neither did
removing power overnight. And the same thing happened with a similar P-fet
connected as a low-side switch.

It probably wasn't a "real" latch; it was a very over-sized transistor with
low gate capacitance, and I didn't try it with something like a 3904. I think
it might have had something to do with the principles behind nonvolatile
ferroelectric RAM, but I never did get to the bottom of it.

[https://en.wikipedia.org/wiki/Ferroelectric_RAM](https://en.wikipedia.org/wiki/Ferroelectric_RAM)

~~~
ip26
FWIW I'm told that decades ago latches were implemented as a tristate driver
followed by an inverter or buffer. The source & drain cap, along with gate &
wire cap, acted as the memory.
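That charge-on-a-node idea can be sketched as a toy model (my own
illustration, not any real part's behaviour; the class and parameter names are
hypothetical):

```python
class DynamicLatch:
    """Toy dynamic latch: a tristate driver charges a node capacitance,
    and a buffer/inverter with a threshold reads the stored charge back."""
    def __init__(self):
        self.node = 0.0  # stored charge, normalized 0..1

    def write(self, bit, enable):
        if enable:  # tristate driver actively drives the node
            self.node = float(bit)

    def leak(self, factor=0.9):
        self.node *= factor  # charge leaks away unless refreshed

    def read(self):
        return int(self.node > 0.5)  # buffer's switching threshold

d = DynamicLatch()
d.write(1, True)   # stores a 1 on the node capacitance
for _ in range(10):
    d.leak()       # without refresh, the stored 1 eventually decays to 0
```

The decay in `leak()` is why this style of storage (as in DRAM, per the reply
below) needs periodic refresh.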

~~~
Taniwha
That's how DRAMs work today

~~~
dreamcompiler
And that's why DRAM is so much denser than SRAM. DRAM takes [about] 1
transistor per bit; SRAM takes roughly 8.

~~~
userbinator
Four or six transistors are more common:

[https://en.wikipedia.org/wiki/Static_random-access_memory#Design](https://en.wikipedia.org/wiki/Static_random-access_memory#Design)

------
noncoml
If you are a software person looking to learn a few more things about
electronics, I recommend Ben Eater's videos on YouTube.

Here is one where he explains the SR latch:
[https://www.youtube.com/watch?v=KM0DdEaY5sY](https://www.youtube.com/watch?v=KM0DdEaY5sY)

Also the lectures by this amazing MIT professor:
[https://www.youtube.com/watch?v=AfQxyVuLeCs](https://www.youtube.com/watch?v=AfQxyVuLeCs)

~~~
kwoff
Also [https://www.nand2tetris.org/](https://www.nand2tetris.org/) and "Digital
Design and Computer Architecture" by Harris and Harris go into building a
computer from the ground up.

------
johnklos
The author's abuse of the apostrophe hurts :(

~~~
caymanjim
Couldn't make it past the list.

------
ampdepolymerase
An even more interesting exercise would be to implement a DDR5 driver circuit
for the 1 bit of RAM. Typical DDR5 interfaces take hundreds to thousands of
lines of Verilog/VHDL, so quite a few transistors will be needed.

~~~
anticensor
You need at least two words of data to have a meaningful DDR...

------
ipunchghosts
Isn't this just a flip flop?

~~~
metaphor
Almost. It's a gated D latch, which is level-sensitive and behaves
asynchronously, as opposed to a D flip-flop, which is an edge-sensitive
primitive.
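The behavioural difference can be sketched in Python (my own toy models, not
gate-level circuits):

```python
class DLatch:
    """Level-sensitive: output follows D whenever the enable is high."""
    def __init__(self):
        self.q = 0

    def tick(self, d, enable):
        if enable:  # transparent while enable is high
            self.q = d
        return self.q

class DFlipFlop:
    """Edge-sensitive: output captures D only on a rising clock edge."""
    def __init__(self):
        self.q = 0
        self.prev_clk = 0

    def tick(self, d, clk):
        if clk and not self.prev_clk:  # rising edge detected
            self.q = d
        self.prev_clk = clk
        return self.q

latch, ff = DLatch(), DFlipFlop()
# With the clock/enable held high, the latch keeps tracking D,
# while the flip-flop holds whatever it captured on the edge.
for d in (1, 0):
    latch.tick(d, 1)
    ff.tick(d, 1)
# latch.q == 0 (followed D back down); ff.q == 1 (held the captured value)
```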

~~~
Florin_Andrei
Well, flip-flops can be edge-triggered or level-triggered. They're both flip-
flops.

~~~
metaphor
In uni, I learned digital logic from Brown and Vranesic[1] who explicitly
differentiate gated latches from flip-flops, the latter being defined as:

> A _flip-flop_ is a storage element based on the gated latch principle, which
> can have its output state changed only on the edge of the controlling clock
> signal.

I also just pulled out my Fairchild Pocket Designer Guide (published circa
1985; inherited from a retired former colleague) which explicitly
differentiates 74/54 series flip-flops from latches both in section and
symbology. So there's at least 35+ years of industry convention without even
citing a standard.

To cite one industry standard, from ANSI/IEEE Std 91-1984[2] § 4.2.1:

> _Cm should be used to identify an input that produces action, for example,
> the edge-triggered clock of a bistable circuit or the level-operated data
> enable of a transparent latch_

...or from § 5.9:

> _The symbol for a bistable element (for example, a flip-flop) does not
> contain a general qualifying symbol. ... When a bistable element is
> controlled by a C input (Symbol 4.3.7-1) it is necessary to indicate whether
> this element is a latch, or an edge-triggered, pulse-triggered, or data-
> lock-out bistable._

In fact, symbol 5.9-2 labeled "D-type latch, dual / Part of SN7475" is
distinct from symbol 5.9-7 labeled "Edge-triggered D-type bistable / Part of
SN7474"...the former being what the blog discusses.

So no, this designer begs to differ.

[1] 3rd Edition, § 7.7 _Summary of Terminology_

[2]
[https://doi.org/10.1109/IEEESTD.1991.81068](https://doi.org/10.1109/IEEESTD.1991.81068)

~~~
PhantomGremlin
You're correct. A flip-flop responds to an edge, a latch responds to a level.

However, let's go back more than 35+ years, to a simpler time. To the 1960s.
To the dawn of the TTL era. Texas Instruments made a device called the 7473.
It was a J-K Flip Flop. But it responded to a pulse, not an edge. Look at the
function table in the datasheet:
[https://www.ti.com/lit/gpn/sn54ls73a](https://www.ti.com/lit/gpn/sn54ls73a)

As a kid trying to teach myself TTL I never did understand WTF was going on.
And this screwy behavior got fixed when TI did the 74LS73.

The data sheet makes clear the limitation, but either that text didn't exist
back then, or I just didn't grok the significance of it. To wit: _For these
devices the J and K inputs must be stable while the clock is high._

So you're correct for at least the 35+ most recent years. :)

But there couldn't be rules without exceptions. :)

~~~
metaphor
> _However, let's go back more than 35+ years, to a simpler time. To the
> 1960s. To the dawn of the TTL era. Texas Instruments made a device called
> the 7473. It was a J-K Flip Flop. But it responded to a pulse, not an edge.
> Look at the function table in the datasheet:
> [https://www.ti.com/lit/gpn/sn54ls73a](https://www.ti.com/lit/gpn/sn54ls73a)_

The 7473 next-state truth table in this datasheet is symbolically misleading;
the specified timing constraints on p. 4 make it a lot more clear, and it's
consistent with IEEE Std 91 terminology cited.

To be sure, 7473 is indeed an _edge-sensitive_ device; IEEE Std 91 references
this as the pulse-triggered flip-flop--a.k.a. master-slave flip-flop--and its
description on page 1 of the referenced datasheet corroborates this (my
emphasis):

> J-K input is loaded into the master while the clock is high and _transferred
> to the slave on the high-to-low transition._

In other words, the internal output of master stage is opaque, while slave
stage output Q/Qnot does not change until the falling edge, which is quite
distinct from the output behavior of a latch.
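That master-to-slave handoff can be sketched behaviourally (a toy of mine, not
a gate-level model; it follows the 73A-style rule where the J/K values present
at the falling edge win, and ignores the original 7473's ones-catching quirk):

```python
class MasterSlaveJK:
    """Master stage follows J/K while CLK is high; slave stage (Q) only
    updates on the high-to-low clock transition."""
    def __init__(self):
        self.master = 0
        self.q = 0
        self.prev_clk = 0

    def tick(self, j, k, clk):
        if clk:  # master loads while the clock is high
            if j and not k:
                self.master = 1
            elif k and not j:
                self.master = 0
            elif j and k:
                self.master = 1 - self.q  # toggle
        if self.prev_clk and not clk:  # falling edge: transfer to slave
            self.q = self.master
        self.prev_clk = clk
        return self.q

ff = MasterSlaveJK()
ff.tick(1, 0, 1)  # clock high: master loads a 1, Q still opaque
ff.tick(1, 0, 0)  # falling edge: Q becomes 1
```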

> _To wit: For these devices the J and K inputs must be stable while the clock
> is high._

Reading the datasheet further, the 73A variant apparently improved upon the
original 73 design by allowing for input change _after_ the rising edge (i.e.
while clock state was high) so long as the specified t_su = 20ns min setup
time before falling edge was satisfied. Also observe the 73A's 0ns min hold
time after falling edge in conjunction with no min CLK low pulse duration;
this clearly allows for much faster operating speeds by exploiting clocks with
greater than 50% duty cycle. In contrast, the 7473 was capped at less than
15MHz = 1/(t_whmin+t_wlmin) = 1/(20ns+47ns) per specified timing constraints.
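As a quick sanity check on that arithmetic (using the datasheet figures quoted
above):

```python
# Worst-case max clock rate implied by the 7473's minimum pulse widths
t_wh_min = 20e-9  # min clock-high pulse width, seconds
t_wl_min = 47e-9  # min clock-low pulse width, seconds
f_max = 1 / (t_wh_min + t_wl_min)
print(f"{f_max / 1e6:.1f} MHz")  # just under 15 MHz
```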

P.S. Props for teaching yourself TTL as a kid. I recall my pops (who's in his
50s now, if that's any indication of my age) once tried explaining clocks to
me as a "computer literate" teen and that went waaaay over my head at the
time. The magical allure of it all ultimately led to the whole EE thing today.

------
wbillingsley
An online version for the lazy :)

[https://theintelligentbook.com/circuitsup/#/latches/1/2](https://theintelligentbook.com/circuitsup/#/latches/1/2)

------
jesuslop
You can do it with two transistors: [https://en.wikipedia.org/wiki/Flip-flop_(electronics)](https://en.wikipedia.org/wiki/Flip-flop_\(electronics\))

------
slicktux
I recall making this for a CS lab; it was fun!

------
alangarf
A capacitor is 1 bit of memory.

~~~
emptybits
Being an analog component, a capacitor's capacity to retain a value would not
be measured in bits, would it?

Read/write floating point values! Also, design for leakage. ;-)

~~~
boomlinde
A transistor is of course also an analog component.

------
unnouinceput
Four transistors? I can do it with only 2; it's called a bistable circuit:

[https://en.wikipedia.org/wiki/Flip-flop_(electronics)](https://en.wikipedia.org/wiki/Flip-flop_\(electronics\))

~~~
dang
It's great to add relevant information, but in the future, could you please do
it in a way that greets and expands on what you're replying to, rather than
one-upping or putting it down? The two styles of responding have opposite
effects on discussion: one opens it up for further exploration, while the
second constricts it or closes it. In improv, that is called "blocking":
[https://improwiki.com/en/wiki/improv/blocking](https://improwiki.com/en/wiki/improv/blocking).
You probably didn't mean it that way, but intent doesn't communicate itself.
Since a comment's impact on future discussion is determined by how others hear
it, the burden is on the commenter to disambiguate [1].

The value of an HN comment is its impact on current and future discussion, or
(to put it in a pseudo-technical way) the expected value of the subthread it
forms the root of [2]. I've been struggling for a way to explain this that
doesn't sound smarmy (like "be nice" or "tone"), since it's not about being
nice. It's about what leads to richer improvisation and curious conversation,
which is what we're trying to optimize for here [3].

Edit: elsewhere in this thread are some great examples of opening-up
responses:

[https://news.ycombinator.com/item?id=22798780](https://news.ycombinator.com/item?id=22798780)
("even more interesting...")

[https://news.ycombinator.com/item?id=22798746](https://news.ycombinator.com/item?id=22798746)
("I once...")

If you ask yourself and sense into what kinds of responses such comments
invite, you'll get the spirit of what we're going for. I don't mean you
personally—I mean all of us. This is a community project.

[1]
[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20burden%20disambiguate&sort=byDate&type=comment)

[2]
[https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...](https://hn.algolia.com/?dateRange=all&page=0&prefix=false&query=by%3Adang%20%22expected%20value%22&sort=byDate&type=comment)

[3]
[https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...](https://hn.algolia.com/?dateRange=all&page=0&prefix=true&query=by%3Adang%20curiosity%20optimiz&sort=byDate&type=comment)

~~~
unnouinceput
Please see my response to other comment and stop chasing me. Jesus Dang, it
feels like you have something personal with me lately. If you want a date,
just ask for it instead.

~~~
dang
It isn't personal. I don't remember your username. That's because there are
too many interactions to remember them all; it looks like I've posted hundreds
of comments since the last time I replied to you.

