Hacker News
Ask HN: What's the best resource for learning modern x64 assembly?
492 points by ssklash 11 days ago | 109 comments
I've found lots of resources for x86, but considerably fewer for x64. Pentester Academy has a promising course on it, along with shellcoding, but I'm not sure what else is out there.





If you basically don't know how to code in assembly, learn 16-bit x86 with whichever method you find; it won't be wasted. You can extend most of that knowledge from 16 to 32 bits by substituting the register names ax, bx, cx, dx, si, di, bp, sp with eax, ebx, ecx, edx, esi, edi, ebp, esp, and extend that to 64 bits with rax, rbx, rcx, rdx, rsi, rdi, rbp, rsp. Learning 16-bit x86 will have you learn about segment registers, but that isn't wasted either: protected mode consists of quite a lot of complications built on segment registers, virtual memory addressing (which you can mostly ignore as an applications programmer, as opposed to a kernel or driver programmer), and interrupts.
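To see how directly the knowledge carries over: the three register widths are just different views of the same registers. A quick sketch with GCC/Clang inline assembly (AT&T syntax, x86-64 target assumed) shows the parallel:

```c
#if defined(__x86_64__)
#include <stdint.h>

/* The same addition against the 32-bit and 64-bit views of the registers:
 * addl works on the e-prefixed (32-bit) names, addq on the r-prefixed
 * (64-bit) ones.  GCC/Clang extended asm, AT&T syntax. */
static uint32_t add32(uint32_t a, uint32_t b) {
    __asm__("addl %1, %0" : "+r"(a) : "r"(b));
    return a;
}

static uint64_t add64(uint64_t a, uint64_t b) {
    __asm__("addq %1, %0" : "+r"(a) : "r"(b));
    return a;
}
#endif
```

The instruction mnemonics and operand rules are the same; only the width suffix and register names change.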

I wrote a book for learning x86 programming, published in 1994, back before the "real digital age", written to be the best didactic text possible (I had learned the hard way, so I wrote what I wish I could have read 10 years earlier). Over 20 reprints, recommended in Spanish-speaking universities everywhere (yeah, I know, if I had known then what I know now I'd have written it in English). It was discontinued 10 years ago or so. They asked me to revise it for the modern world, and it didn't really make sense: rewriting it for what assembly language is and is used for in today's world would be a ton of work, because the difference is qualitative, and just adding an extra chapter explaining the 32-bit and 64-bit changes so the publisher could stamp a "2010 edition" logo on the cover would be scamming people, which I don't want to do. Here is a link to a scanned copy of the original: https://www.dropbox.com/s/sz6rinfhyc8sai6/Lenguaje%20Ensambl... . It could prove useful if you can read Spanish.

Honestly: I learned Z80 assembly first, in the '80s, and then switched to x86 very easily. Learn whatever assembly language first; what's hard is learning about registers, flags, memory... and if you learn that well, you can switch to another architecture quite easily.


>Learn whatever assembly language first

On this front, I can highly recommend these two resources, preferably in this order for someone totally new to assembly:

NAND to Tetris, a course that will have you build an emulated general-purpose CPU from first principles even with no prior knowledge. You'll learn exactly about registers and memory by making them. I even recommend this to non-hardware people because the way they divide each layer of complexity is great practice even in software. https://www.coursera.org/learn/build-a-computer

Microcorruption, a series of incrementally difficult MSP430 (an easy-to-understand 16-bit instruction set) exploitation exercises in the browser: https://microcorruption.com

These are both geared towards being a gentle introduction to assembly and CPU architecture principles. They don't touch certain facts of X86/64 processors like pipelining or variable-length instructions, but IMO those are best left until you're comfortable with the basics.


> NAND to Tetris

The book version is also great, and suitable for self-study if you prefer to learn that way: https://www.nand2tetris.org/book


I also recommend Xeno Kovah's OpenSecurityTraining courses on YouTube, some of which are specifically dedicated to assembly. The audio quality can sometimes be pretty bad, but the information is good. Though they try to obfuscate things a little bit, these are clearly workshops given to researchers at Mitre, the CVE project maintainer.


A little warning: Microcorruption may no longer be monitored at this time, and account registration is not automated.

A hacker I closely follow tried to play Microcorruption during a Twitch broadcast and, to my great disappointment, he was unable to because his registration never got a response.


Luckily their "Hall of Fame" is open for anyone to see. And some of the accounts in there are pretty obvious junk accounts with the usual junk passwords. Would probably take like 5 tries to "brute force" your way in, which given the context of the site would even seem fair game.

I agree that, in spirit, just about nobody would care.

It should probably at least be explicitly noted for anyone considering doing so: I'm sure that's still super illegal.


.. in the United States

That's fair.

In any case, I should probably write here that I have since tested account creation with another email and received the activation link right away.


I also learned Z80 first, and I can say categorically that learning 16-bit x86 is NOT a good idea. To do anything useful in 16-bit code, register starvation is a constant problem, and segmentation issues are genuinely hard.

I submit that if x64 had existed early on, no one would have bothered with higher-level abstractions. x64 is actually -pleasant- to code in natively.


I also started with the Z80, and then moved to Intel. Due to the history of the Zilog engineers there is a lot in common, and it definitely helped.

One thing I haven't seen people mention yet is actually using assembly language; just people pointing at documentation. To learn assembly, like anything else, you have to use it. I'd suggest writing a toy program to sum string-lengths, or do maths.
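A classic toy exercise along those lines is strlen via the string-scan instruction. Here's a sketch using GCC/Clang inline assembly on x86-64 (AT&T syntax; the libc version is of course faster and portable, this is purely for practice):

```c
#if defined(__x86_64__)
#include <stddef.h>

/* strlen via repne scasb: scan forward from rdi looking for al == 0,
 * decrementing rcx once per byte examined (including the NUL). */
static size_t asm_strlen(const char *s) {
    size_t count = ~(size_t)0;      /* rcx counts down from "infinity" */
    __asm__("repne scasb"
            : "+c"(count), "+D"(s)
            : "a"((char)0)
            : "cc", "memory");
    return ~count - 1;              /* bytes scanned, minus the NUL */
}
#endif
```

Working out why `~count - 1` is the answer is exactly the kind of register/flag reasoning that makes these toy programs worthwhile.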

One of my own recent projects was to write a "compiler" to convert reverse-polish mathematical expressions to assembly language:

https://github.com/skx/math-compiler

It's been a few years since I touched assembly, and even when I did I largely ignored the floating-point stuff, so this was a handy refresher.


Started with 6502 (pretty much the only way to get any reasonable speed/working was Assembly).

Switching to the 8086 was so awkward: it had a built-in div instruction, the rep prefix, but most importantly so many bits: 16. I routinely used ah/al and the like.

Learning assembly is quite easy in its own right; the ability to write optimal code and routinely beat compilers is a whole other story. Writing the inner loops (like grep's) is what's usually left to assembly on modern processors.


Great book, Jon! Pretty much the only material in Spanish at the time. I remember it from my years in Uni.

Thanks for sharing this. I speak Spanish, and this seems very helpful to me.

There's no magic anywhere and 99.1% of it is documented.

IA-32e/AMD64 is basically an extension of IA-32/x86. What you're looking for is to understand the difference between real mode, 386 protected mode, PAE, paging and long mode because the registers, instructions, addressing and sizes of structures differ.

You can learn all of this by reading the Intel software manuals and digging into it yourself.

Tools you need (for non-Windows):

- gdb/lldb

- assemblers: yasm, nasm, binutils (contains "as")

Optional tools:

- IDA Pro

- Virtualization/emulation such as VMware Fusion/Workstation (because it supports being a gdbserver), VirtualBox or QEMU

- Intel's CPUID app

You also need good references like:

- the Intel manual set (a giant PDF or a collection of several) https://software.intel.com/en-us/articles/intel-sdm

- https://sandpile.org (fairly current)

- https://ref.x86asm.net (outdated but useful)

Another helpful exercise is writing a toy operating system in Rust or only assembly. https://osdev.org has many resources and guides.
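Leafing through the manuals pays off quickly. For instance, the long-mode check that Intel's CPUID utility performs can be done in a few lines with GCC/Clang's `<cpuid.h>` wrapper (a sketch: extended leaf 0x80000001, EDX bit 29 is the LM flag):

```c
#if defined(__x86_64__) || defined(__i386__)
#include <cpuid.h>

/* Ask the CPU whether it supports long (64-bit) mode: query extended
 * leaf 0x80000001 and test EDX bit 29.  __get_cpuid returns 0 if the
 * leaf isn't supported at all. */
static int has_long_mode(void) {
    unsigned int eax, ebx, ecx, edx;
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx))
        return 0;
    return (edx >> 29) & 1;
}
#endif
```

The same pattern works for any feature flag in the SDM's CPUID tables (SSE, AVX, etc.), just with a different leaf and bit.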


In addition to IDA, I highly recommend looking into Ghidra. It's open source, so you can peek under the hood and see how it does things.

https://github.com/NationalSecurityAgency/ghidra


Yep. It's a FOSS-quality disassembler; good if you don't have IDA (I wish). It supports maybe 10% of what IDA does in total, but it has some other niceties that IDA doesn't (think: the disassembler equivalent of LLVM). Ideally, you'd have both. For those who don't know, IDA disassembles basically every unclassified, commercially available architecture and executable/library/object container format (think: the disassembler equivalent of GCC, if it were an IBM or Green Hills product :-P). It also has a built-in scripting API, in Python(?), IIRC.

An interesting semi-abandonware static disassembler that was really good is Sourcer 8.01 from V-Communications. I think it supports up to the Pentium II/Pro/III, which would include real- and protected-mode DOS/DOS-extended/Win16/Win32 EXEs and COMs, IIRC. It did a lot of memory typing and clever analysis long before IDA existed, and it's still interesting for retro computing.

https://www.vetusware.com/download/Sourcer%208.01%208.01/?id...


You might be interested in this talk from the creator of IDA, Ilfak Guilfanov, explaining how he used Sourcer and how its shortcomings led directly to him creating IDA.

https://m.youtube.com/watch?v=hLBlck1lTUs


It's way better than IDA Pro. I have used the Pro version with the Hex-Rays decompiler. Ghidra is legitimately better. I wouldn't recommend IDA to anyone at this point because I don't see it having much of a future (sorry not sorry).

It's worth learning the Ghidra Scripting API because you can write ad-hoc scripts with Jython syntax to automate tasks/do custom analysis.


I was using Ghidra for one of my, uhh, research projects earlier, and I actually found that the decompiler from Hex-Rays still does better in some cases. Ghidra sometimes times out and fails to decompile. Maybe I'm not using it correctly? I haven't begun exploring what kinds of plugins it has; maybe that would help?

For me, Ghidra is better at disassembling C++ than Hex-Rays, but Hex-Rays is better with C-style binaries,

thanks to Ghidra's better RTTI analyzer.


It depends massively on platform, architecture, and programming language. Hex-Rays is probably still ideal for embedded and obscure platforms. Ghidra's decompiler seems optimized for ARM and amd64 userland consumer-level stuff that the NSA was most interested in hacking.

> IA-32e

I’ve never seen this name before. Where does it come from?

AMD call it AMD64. Intel call it Intel 64.

Why do we need even more names for the same thing beyond these two?!

x86_64, x86-64, x64, AMD64, amd64, Intel 64, EM64T, all mean the same thing!


It's an older reference but it checks out.

When Intel was developing Itanium they named the new architecture IA-64 and retroactively named their 32-bit x86 line IA-32.

After AMD released "AMD64" Intel started copying the AMD 64-bit extensions. IA-32e was a short lived name and Intel started using "Intel 64" to refer to the 64-bit extension of x86 and continued using IA-64 for the Itanium VLIW CPU line.

https://en.wikipedia.org/wiki/X86-64#History_2

After several years of denying its existence, Intel announced at the February 2004 IDF that the project was indeed underway. Intel's chairman at the time, Craig Barrett, admitted that this was one of their worst-kept secrets.

Intel's name for this instruction set has changed several times. The name used at the IDF was CT (presumably for Clackamas Technology, another codename from an Oregon river); within weeks they began referring to it as IA-32e (for IA-32 extensions) and in March 2004 unveiled the "official" name EM64T (Extended Memory 64 Technology). In late 2006 Intel began instead using the name Intel 64 for its implementation, paralleling AMD's use of the name AMD64.


I wonder where the x86-64, x86_64, and (the most strange of all) x64 names came from.

I think we should call it AMD64, since that's what the people who actually designed it wanted it to be called.


> I wonder where the x86-64, x86_64, and (the most strange of all) x64 names came from.

The "x86-64" name is the original one, and came from AMD themselves: https://web.archive.org/web/20000817071303/http://www.amd.co... (and "x86_64" is obviously an alias for where a hyphen is not an allowed character, like identifiers on many programming languages).

The "x64" name came from Microsoft, probably due to file name length limitations (this was before Windows XP unified the Windows 9x and Windows NT lines).

IIRC, the "AMD64" name came later, probably to distinguish it better from Intel's IA-64 (Itanium).


Wild guess, naming it '32bit extended' allowed Intel to still refer to Itanium as the 'real 64 bit' back in those days.

Intel didn't want the x86 to be 64bit.. they wanted the world to switch to Itanium for that. I figure a lot of the naming mess can be traced back to Intel's marketing.


Flat Assembler is actually easier to get your head around early on than any of the *asm variants IMO: https://flatassembler.net/

Assuming you are interested in learning not just assembly but how to use it in conjunction with a high-level language (almost always C/C++):

Background:

--- Matt Pietrek's "Just Enough Assembly Language to Get By" - http://bytepointer.com/resources/pietrek_asm_pt1.htm

--- Hongjiu Lu "ELF: From The Programmer's Perspective" - http://beefchunk.com/documentation/sys-programming/binary_fo...

Books:

--- Computer Systems: A Programmer's Perspective (3rd Edition) by Bryant & O'Hallaron - https://www.amazon.com/Computer-Systems-Programmers-Perspect...

--- Modern X86 Assembly Language Programming by Daniel Kusswurm - https://www.amazon.com/Modern-X86-Assembly-Language-Programm...

--- Low-Level Programming by Igor Zhirkov - https://www.amazon.com/Low-Level-Programming-Assembly-Execut...


Here is a book on computer architecture that has a good section on x86-64 assembly language. Please note that I have edited this comment to reflect a change suggested by a child comment: this book's third edition introduces x86-64 assembly very well.

https://csapp.cs.cmu.edu/


Excellent recommendation, but why the 2nd edition? The home page[0] defaults to the latest one which is based on x64 from the get go.

[0] https://csapp.cs.cmu.edu/


Came here to recommend the same. You can find the 15-213 course videos online. I have done the course and can’t recommend it enough. Do the labs, sincerely. You’ll learn a lot!

First result in DDG for the book title ;-) But the third edition is the one I like. Thank you.

Working through this now (again) and it's excellent. A few observations: do the labs (Google them); they are even better. The lectures are also available on YouTube, and the recitations on something I found called Panopto. They are mostly repetition from the book, but nice for reinforcement.

Thanks for the recommendation. I hadn't come across this one before. But wow, I'd forgotten how expensive textbooks can be.

I picked up the international edition on Amazon a while back for $20 or so, significantly less than what they’re asking for now. I don’t see that one on amazon today, but I’ve seen that edition on other sites. I’d look for it.

I haven’t finished the 3rd edition, but I made it about 3/4 of the way through the 2nd edition and loved it. I picked up the 3rd specifically for the x64 material.


If by the international edition you mean global edition, then you need to know that it has problems. See [1] and [2].

[1] https://news.ycombinator.com/item?id=22287045

[2] http://csapp.cs.cmu.edu/3e/errata.html


Honestly? Learn old-style 8086 assembly, then 386 code. The new stuff may be architecturally simpler in many ways, but it sits as an edge case on top of a very thick historical stack that seems completely insane if you look at it a priori.

But the early CPUs were actually quite simple! At the time "CISC" was a good thing because it provided straightforward mechanisms for expressing things (e.g. "push", "call", load a struct field with an offset...) that real world programmers needed to do all the time.


Most 64-bit processors only implement 48 bits of virtual addressing and at most 52 bits of physical. So you might have a 64-bit virtual address, but it gets crunched down to your real physical memory, which might only need e.g. 36 bits for 64 GB of RAM.

You rarely actually use 64-bit integers anyway; so much of the processing is on strings, which use 8- or 16-bit units.

Furthermore, your Intel x64 processor still contains the features first introduced with the first IBM PC's 8088, which in turn was conceptually compatible with the 8008 and 8080. When you try to understand a modern Intel or AMD CPU, you need to be aware of the legacy of 40+ years of backward compatibility. Once you understand the historical decisions, the SSE and AVX architectures start making more sense.
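A quick, platform-specific way to see the 48-bit point from C: on today's x86-64, a canonical virtual address must have bits 47..63 all equal to bit 47, so user-space pointers (on an OS like Linux, which keeps user space in the lower half) have their top 17 bits clear. A sketch, not something the C standard guarantees:

```c
#include <stdint.h>

/* A canonical x86-64 address: bits 47..63 are a sign-extension of bit 47.
 * Shifting right by 47 leaves those 17 bits, which must be all-zero
 * (user half) or all-one (kernel half, 0x1FFFF). */
static int is_canonical(const void *p) {
    uintptr_t top = (uintptr_t)p >> 47;
    return top == 0 || top == 0x1FFFF;
}
```

Try it on the addresses of your own stack and heap objects; they all land in the lower canonical half.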


I think segmented memory really hampered my ability to learn 8086 assembly well. It really made a mess of just about everything.

It... kinda didn't though. I mean, yes, it seems complicated when you look back and ask "why can't it just have been a flat space", etc... But the same point persists: you have to look at what came before.

Take a look at the complexities of 8080/Z80 or 6502 addressing modes, or the kind of tricks "big" 16-bit architectures like the PDP-11/70 were playing to stretch addressable memory. The 8086 was a breath of fresh air! Your code could be separate from your stack and from your heap, and all three could be a full 64K without any crazy bank switching or copying! And all you had to do was set up 4 segment registers and then ignore them. And everything you were used to running did so in a clean, unconstrained environment.

Really, read that 8086 datasheet again (it's like 8 pages), it was great stuff at the time.


I never thought of it that way back in those years. But that's how it works. Great perspective.

It also carries over to OS implementations. For example, many OSes designed for the 386 failed to implement demand paging, for which the CPU had some great support. I believe Windows continued using segmented addressing for backward compatibility with earlier releases of the OS.

It really depends on your starting point.

If you know how to write low-level C (i.e. with direct Win32/POSIX API calls), that's a good start. If you don't know what that means, you need to master this first. So much of assembly exists to support higher-level programming (things like segments, indirect references, system calls, etc.), so it pays to know the WHY of all of it (e.g. do you know what a frame pointer is, why it's useful, and why it can sometimes be omitted?). Learn this stuff first.
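To make the frame-pointer question concrete: GCC and Clang expose `__builtin_frame_address`, which lets you watch frames come and go without writing any assembly. A sketch (behavior depends on the compiler keeping frame pointers, hence the hedging):

```c
/* __builtin_frame_address(0) returns the current function's frame
 * pointer (rbp on x86-64, when the compiler keeps one).  Two active
 * functions have two distinct frames, hence two distinct addresses. */
__attribute__((noinline)) static void *callee_frame(void) {
    return __builtin_frame_address(0);
}

static int frames_are_distinct(void) {
    void *callee = callee_frame();            /* the deeper frame */
    void *mine = __builtin_frame_address(0);  /* this frame */
    return callee != 0 && mine != 0 && callee != mine;
}
```

Compiling this with and without -fomit-frame-pointer and reading the gcc -S output is a nice way to see what "omitting" actually changes.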

Once you have a good handle on C, you need to start learning how operating systems and memory management work, at least the "client side" of them accessible from userspace.

Then you might want to dip your toe into how CPUs are built, with pipelines, registers, caches, all of that.

If you've mastered those things, gcc -S will provide everything else you need (another comment suggested this).

I learned this stuff in a traditional university computer engineering program. It helped a lot. But for context, this question feels a little like, "can someone explain quantum physics"? It's a huge topic but only like 5% is the actual thing you're asking about, the other 95% is the conceptual machinery (calculus, probability, mechanics) it's built on. Mastering all the other stuff is actually the hard part. For all intents and purposes, assembly is just a notational convenience for expressing program structure in the only way a CPU can understand it. At least 90% of the difficulty is understanding how CPUs work, and how to think at sufficiently low level that you can express human-useful work at such a low level of abstraction.

It would also be nice to know why you want to know this. Writing a device driver is going to be different from writing an operating system, which will be different from writing tight numerical loops in assembly. And in any case, dollars to donuts you won't be able to beat a modern optimizing compiler performance-wise.

Hope this helps.


How can I get started learning the low-level C stuff, with direct system calls and such? I learned about assembly, pipelining, caches, and memory allocation during a systems architecture course, so I'm really interested in the low level of the machine. The problem is I don't really have an idea for a project that would teach me the level right above it. Of course I learned a bit during the course, but I really want to get a solid grasp.

I'd recommend writing something you might have used or be familiar with, in bare-metal C. Maybe try writing a simple web server, which is something most people will be familiar with. Try doing it a couple different ways: single-threaded, prefork (processes), event-based with select/kqueue/epoll. Write your own data structures: hash tables, linked lists, etc.

This sounds simple but will give you a good tour of a lot of different areas: BSD sockets, named pipes/unix domain sockets for cross-thread communication, signals, memory management, threading, basic scheduling, and parallel programming (locks/concurrency), to name a few.
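To make the starting point concrete, here's a minimal sketch of the single-threaded flavor (BSD sockets, error handling elided; the port-0 trick lets the kernel pick a free port):

```c
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Bind a TCP listener on loopback.  Port 0 asks the kernel for a free
 * port, which getsockname() then reports back to us. */
static int make_listener(unsigned short *port_out) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    struct sockaddr_in addr = {0};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
    addr.sin_port = htons(0);
    bind(fd, (struct sockaddr *)&addr, sizeof addr);
    listen(fd, 8);
    socklen_t len = sizeof addr;
    getsockname(fd, (struct sockaddr *)&addr, &len);
    *port_out = ntohs(addr.sin_port);
    return fd;
}

/* Accept a single client and write a canned HTTP-ish reply. */
static void serve_one(int listener) {
    int client = accept(listener, NULL, NULL);
    const char reply[] = "HTTP/1.0 200 OK\r\n\r\nhello\n";
    write(client, reply, sizeof reply - 1);
    close(client);
}
```

From there you can grow it in the directions above: fork per connection, then a select/epoll event loop, then your own data structures for connection state.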


If you really want to understand modern x64 or other ASM, write a simplified compiler for C. I would recommend starting with a tiny subset of the language focused on whatever aspects you want to understand. You can then compare your output with a modern compiler to get a better understanding of the intricacies involved.

You might find my other posts in this thread a good starting point for your study.

Your comment about compilers is most likely true, the exception being if he needs to vectorize, or if he has the opportunity to use those special-purpose instructions that are seemingly built to accelerate particular algorithms.

On x86 pretty much all of the special purpose and vector instructions are accessible from C or C++ through intrinsics. No need to drop down to assembly for that except perhaps for a very, very specialized use case.
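For instance, a vertical four-float add via SSE intrinsics, a sketch assuming an SSE2-capable target (which every x86-64 compiler enables by default):

```c
#if defined(__SSE2__)
#include <emmintrin.h>

/* Add four floats at once using SSE intrinsics: the compiler emits
 * addps for us, no hand-written assembly needed. */
static void add4(const float *a, const float *b, float *out) {
    __m128 va = _mm_loadu_ps(a);            /* unaligned 4-float load */
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* vertical add, store */
}
#endif
```

The intrinsic names map almost one-to-one onto the instruction mnemonics, so reading the disassembly of intrinsic-heavy code is a gentle bridge into SIMD assembly proper.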

For practice and user-mode stuffs https://godbolt.org/ is a valueless tool to help you view the output of any C program. Simply write any C program on the left, and add "-O1" in flag inputs. Assembly with color representing line-mappings would appear on the right.

> valueless

You mean "invaluable" here (cannot be given a value, as it's too important). "valueless" is the opposite (worthless)! https://en.wiktionary.org/wiki/valueless


Thanks. I had to read the original sentence multiple times; for a moment I thought I was having a stroke.

It's worth noting that the same thing can be had by doing objdump or stepping through code in gdb. Godbolt is very convenient though.


This is hands-down among the best out there (and it's free). It is known under two names[1], which might be a bit confusing, but it _IS_ about assembly.

see this HN thread (450+ points) discussing the book at the time: https://news.ycombinator.com/item?id=21640669 It is also constantly updated.

This book is to assembly what W. Richard Stevens is to TCP/IP and UNIX programming. It also IS a beginner's book, because he makes no assumptions about the reader's previous experience. It is very thorough, so it's probably the only book you'll ever need on assembly :)

___

[1] explanation from the author:

> What is with two titles? The book was named “Reverse Engineering for Beginners” in 2014-2018, but I always suspected this makes readership too narrow. Infosec people know about “reverse engineering”, but I’ve rarely heard the “assembler” word from them. Likewise, the “reverse engineering” term is somewhat cryptic to a general audience of programmers, but they know about “assembler”. In July 2018, as an experiment, I changed the title to “Assembly Language for Beginners” and posted the link to the Hacker News website, and the book was received generally well. So let it be, the book now has two titles. However, I’ve changed the second title to “Understanding Assembly Language”, because someone had already written an “Assembly Language for Beginners” book. Also, people say “for Beginners” sounds a bit sarcastic for a book of ~1000 pages. The two books differ only by title, filename (UAL-XX.pdf versus RE4B-XX.pdf), URL and a couple of the first pages.


> It also IS a beginner's book, because he makes no assumptions about the reader's previous experience

I've just started going through it, and it's definitely made some assumptions about prior knowledge; I doubt I would have been able to get through it if I didn't already have some assembly knowledge.


^^ this is the one

This is a great resource. Why is it getting downvoted?

Especially if you need x64 assembly for performance optimization or profiling, a combination of Godbolt (https://godbolt.org/) and Agner Fog's incredible PDFs at https://agner.org/optimize/ (especially parts 2, 3 and 4) is helpful.

The other useful step has been to write C functions and compile them with a good compiler (I've had good experiences with Intel, gcc and llvm), first at -O0 and later at -O2 or -O3, to understand how high-level concepts get translated to assembly.

This requires at least a basic understanding of registers and basic assembly concepts, but it has helped me apply the concepts to real code.


I wrote a guide a while back that may help. It includes a docker container setup to enable debugging via VS code: https://tonycodes.com/assembly

Your sources indicate an interest in reverse engineering, but another motivator for low level machine programming is performance.

For me, a great way of learning the latter is to pick a task with some instant gratification (audio DSP, software texture mapping, or shaders?) and start writing some of it in inline assembly.

I suggest looking at your C compiler's output, learning about SIMD instructions, and using a pipeline simulator to see why and how your code performs the way it does. I used VTune and AMD CodeAnalyst back in the day; I don't know what the current state of the art is.


It looks intimidating, but there's not actually much to learn. You need to memorize a couple dozen common opcodes. Keep the Intel manual open and flip to the page of the opcode. Also the first few sections go into detail about paging, data types, CPUs, etc. if you don't already know about it.

https://software.intel.com/en-us/articles/intel-sdm

Try to reverse engineer something you don't have the debug symbols for. One of the big pain points for me was figuring out the calling convention, i.e. which registers correspond to which function arguments on different platforms:

https://en.wikipedia.org/wiki/X86_calling_conventions
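A tiny way to see the convention in action: define a function directly in file-scope assembly and call it from C. This sketch assumes the System V ABI on x86-64 Linux, where the first two integer arguments arrive in rdi and rsi and the result leaves in rax (Win64 differs: rcx/rdx and shadow space):

```c
#if defined(__x86_64__) && defined(__linux__)
/* A leaf function written straight in AT&T-syntax assembly.
 * SysV: arg0 = rdi, arg1 = rsi, return value = rax. */
__asm__(".text\n"
        ".globl asm_add\n"
        "asm_add:\n"
        "    leaq (%rdi,%rsi,1), %rax\n"   /* rax = rdi + rsi */
        "    ret\n");

extern long asm_add(long a, long b);
#endif
```

Stepping into `asm_add` in gdb and printing $rdi/$rsi right after the call makes the register-passing rule visible immediately.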

For reverse engineering in particular, this was a good resource:

https://beginners.re/


As far as I know, x86_64 added instructions, modes and memory-addressing changes, but the assembly is otherwise the same?

https://software.intel.com/en-us/articles/introduction-to-x6...

I mean, you have different modes, and in 64-bit mode EAX becomes RAX for example.


I agree with this, if you are competent at 32-bit those skills are going to translate. It is mostly the same.

Some exceptions that come to mind:

* The old floating point stuff is gone. EDIT: guess I am wrong on that, was thinking of what compilers typically generate for the architecture.

* The C ABI now tends to pass the first few arguments in registers.

* If you are working in kernel mode, some stuff is going to necessarily look different.


> The old floating point stuff is gone.

Do you mean x87? I’m not sure it’s true that it’s gone is it? The instructions are still documented? Do they not work?


I guess I stand corrected. I was under the impression that C compilers don't tend to use it anymore when compiling for amd64. (Because you can assume SSE support on amd64, unlike a binary that might run on an old Pentium.) Then I misread that situation as it not working.

It does take effort for an OS kernel to save and restore x87 state, so I could imagine somebody dropping support.


> It does take effort for an OS kernel to save and restore x87 state, so I could imagine somebody dropping support.

FXSAVE saves the x87 state with the SSE state, and XSAVE generalizes that to add several optional components, depending on what your processor supports (as of right now: AVX state, MPX state, three for AVX-512, PT state, and SGX state, IIRC).


> It does take effort for an OS kernel to save and restore x87 state, so I could imagine somebody dropping support.

x86 makes it pretty easy on the kernel devs with XSAVE.


> * The old floating point stuff is gone.

It actually can still be used, if you write it by hand, at least on Windows.


There are some books suited to beginners:

1. https://www.amazon.com/Low-Level-Programming-Assembly-Execut...

2. http://www.egr.unlv.edu/~ed/assembly64.pdf

3. https://www.amazon.com/Introduction-Bit-Assembly-Programming...

The first one is written in somewhat "weird English" (Russian English?), but it is still readable. It really helped me with x64 assembly as a C programmer. I have used 2 and 3 as references most of the time, and the first was basically my "main x64 assembly book". I would also recommend getting more proficient in C, either by studying books such as "Expert C" and "The C Programming Language" or by reading "advanced" stuff somewhere on the Internet, e.g.: http://www.pvv.org/~oma/DeepC_slides_oct2011.pdf

The main takeaway in all this for me was learning about the call stack and the different calling conventions which gives you a clue on how recursion works under the hood.

Also when you are done learning about "practical computer architecture", i.e. assembly language programming, learn stuff about operating systems as well:

http://pages.cs.wisc.edu/~remzi/OSTEP/

Fun fact: this is not really related to assembly programming, but functions such as setjmp() and longjmp() are used for implementing exception handling.
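That fun fact can be sketched in a few lines: setjmp marks the "try", longjmp is the "throw", and the nonzero return value is the caught exception code (a minimal illustration, not production exception handling):

```c
#include <setjmp.h>

static jmp_buf handler;

/* "throw": unwind back to the setjmp point with a nonzero code. */
static void do_throw(int code) { longjmp(handler, code); }

static int risky(int x) {
    if (x < 0) do_throw(42);   /* "raise" on bad input */
    return x * 2;
}

/* "try/catch": setjmp returns 0 on the first pass, and the thrown
 * code when longjmp unwinds the stack back to it. */
static int guarded(int x, int *err) {
    int code = setjmp(handler);
    if (code != 0) { *err = code; return -1; }
    *err = 0;
    return risky(x);
}
```

Under the hood this is exactly call-stack manipulation: setjmp snapshots the registers (including rsp and rip), and longjmp restores them, discarding the intervening frames.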


"Assembly Language Step-by-Step Third Edition" 3rd Edition by Jeff Duntemann is a great book.

It starts with the very foundation and builds up, up to interfacing with C and implementing data structures, still in assembly.

Sadly, this book still references 32-bit ISA. My guess is that the author got old enough not to bother updating the book anymore (the first editions of the book were about assembly programming in DOS, then DOS and Linux, then Linux only).

Still a very valid book, as x86-64 is backwards-compatible with 32-bit x86. Also, as somebody else has written, once you understand the basics, you can mostly swap things like register names and/or look to the reference pages of your cpu/linker/assembler.


I wrote a general-purpose x64 assembler library for Linux: https://2ton.com.au/HeavyThing/ and in recent months I decided to start doing video tutorials about the "how and why" of it. Learning, I think, is easiest on Linux: learn how to navigate the syscall interface, and "man 2 write" becomes an important skill. Edit: videos link as well: https://2ton.com.au/videos/

I would recommend 'Computer Systems: A Programmer's Perspective' by Randal E. Bryant and David O'Hallaron as an introduction before proceeding to the official manuals and documentation.

The 3rd edition has much info on modern x86-64.


For a soft intro to assembler, you can play the video game Human Resource Machine.

Or TIS-100

As others point out, NAND to Tetris is a really good entry point for learning assembly. It begins with you designing the hardware from NAND gates: you make all the hardware components, like the memory, instruction decoder, ALU, and CPU, and put them together into a computer. The next step is to build a virtual machine, and then a compiler for a high-level language called Jack. The capstone project is to build a game in Jack and have it run on the computer that you built :)

https://www.nand2tetris.org/

I have since ventured into Motorola 68000 assembly coding, by reading an old Amiga machine-code course from the late '80s. It's fascinating to learn on an old platform like the Amiga 500.

To keep myself motivated, I have written a lot of posts that chronicle my progress through the course.

https://www.markwrobel.dk/project/amigamachinecode/

EDIT: I can also recommend this game, inspired by NAND to Tetris, where you build the hardware in your browser :) http://www.nandgame.com/


I found compiler output to be a good resource.

$ gcc foo.c -S


A little bit more convenient, at least for me:

https://godbolt.org/


Maybe people think you're kidding with this response?

By far the best way to learn IMO, write C and see how it translates.


In order to get original source interspersed in the assembly, run

  gcc -c -g foo.c
  objdump -S foo.o

Others already have given plenty of useful information, I just have a basic one.

Stick with Intel syntax; never mind AT&T syntax, beyond learning how to read it.

The PC world is all about Intel syntax. AT&T syntax is more verbose about how instructions get expressed, and its assemblers' macro capabilities just suck compared with what TASM and MASM were already capable of back in the MS-DOS days.
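For anyone who hasn't seen the two side by side, here are the same two instructions in both syntaxes (operands chosen purely for illustration):

```nasm
; Intel syntax, as NASM/MASM/TASM use it:
mov rax, 5              ; destination first, no register sigils
mov qword [rbx], rax    ; memory operand in brackets, size as a keyword

; The AT&T equivalents, as GNU as accepts by default:
;   movq $5, %rax       ; source first, $ for immediates, % for registers
;   movq %rax, (%rbx)   ; size suffix (q) on the mnemonic, parens for memory
```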


I'm learning Z80 for programming the RC2014 kit (it's a really nice kit if you're also into electronics). There is a surprising lack of information online about the techniques of programming; for me that's the biggest struggle. You can know the instructions, but juggling registers in an optimal way is hard.

I'm currently learning that I can use the stack inside a routine to keep more values in the air at the same time, but then I read some example code and discover that a particular instruction can be used in a clever way to avoid all my stack juggling. I find it can take hours to write a simple routine. I really get it now when people write that it's a non-trivial problem to map C to assembly (especially Z80). Hell, it's fun though! Screw all these high-level languages, let's do everything in assembly and damn the horses!!

I learned x86 from Xeno Kovah's course and I imagine his x64 will be just as comprehensive.

It looks like his x64 is just slides instead of videos which may be better, sitting through 16hrs of video when I was learning x86 was ... an experience, but a very useful one.

Anyway, whenever someone asks this question I don't hesitate to bring up his work and trainings; I think they're really useful. I know this one is for reverse engineering, but if you can reverse engineer x64 you can certainly write it.

http://opensecuritytraining.info/IntroX86-64.html


This book is a good starting point for x86-64 Assembly with Linux: http://www.egr.unlv.edu/%7Eed/x86.html

x86-64 Assembly Language Programming with Ubuntu By Ed Jorgensen

http://www.egr.unlv.edu/~ed/x86.html


I know this isn't quite what you're asking, but I really recommend "Assembly Language Step By Step, Third Edition" by Jeff Duntemann

One of the best 32bit x86 assembly books out there.


> One of the best 32bit x86 assembly books out there.

I think it is one of the best books about fundamental computer programming overall. Even if you don't program in assembly afterwards, you will take away a much better understanding of how computers operate. Sure, it is still a beginner book and it doesn't go into much detail, but the parts it explains, it explains very well.


I plan on making some videos on this once I get through web vulns. I've written x64 on Windows and Linux. On Windows you need ml64.exe, which gets installed with Visual Studio; you can invoke it from the command line. On Linux I usually use NASM, because it supports Intel syntax.

The only really annoying thing is the stack alignment for calls. I usually just make a macro to check rsp and align it if needed. Other than that, I think it's similar enough to x86, just with different registers.
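A hedged sketch of the kind of macro the parent describes (my own hypothetical version, not theirs): both the System V and Windows x64 ABIs expect rsp to be 16-byte aligned at the point of a call.

```nasm
%macro aligned_call 1
    mov  r15, rsp       ; save rsp (assumes r15 is free to clobber here)
    and  rsp, -16       ; round rsp down to a 16-byte boundary
    call %1
    mov  rsp, r15       ; restore the original stack pointer
%endmacro
```

In a real function you'd pick a register (or stack slot) you know isn't live, or just maintain alignment in the prologue so no per-call fixup is needed.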


I thought this book was pretty good when I took computer architecture course. https://www.google.com/url?sa=t&source=web&rct=j&url=https:/...

Kip Irvine's textbook on the subject is good. It's oriented towards Visual Studio (good, easy debugging) and Windows, but a fellow student had no problem using Linux tools to follow along and complete the course, which is not a bad way to learn.

It's useful to at least look at such a textbook to make sure you don't skip any of the easy, sort-of prerequisite material.


I wish there were an equivalent of "core war" [0]. I "played" with it ~30 years ago and I still remember a good chunk of the language.

[0]: https://en.wikipedia.org/wiki/Core_War



I recommend [Paul Carter's PC Assembly](http://pacman128.github.io/pcasm/) which helped me getting started back in my pentesting days.


Start with this one: "Reverse Engineering for Beginners" https://torus.company/writings/RE4B-EN.pdf

Not x86_64 but Xeno Kovah's courses were really great to teach me x86:

http://opensecuritytraining.info/IntroX86.html


I used this book in university, and it was very informative: http://rayseyfarth.com/asm/

We also got plenty of extra general-purpose registers in x64 (r8 through r15), apart from the rax stuff.

Plus, you now have the ymm and possibly zmm stuff going on :)


Maybe not modern, but "Assembly Language Step-by-Step" is a good introduction. Good for self-study.

Stupid question... can you use assembly for GPU coding? I mean, like, in a semi-practical way?

I enjoyed the Pentester Academy course, and have recommended it several times.

So when do the patents expire on x86? And can we do a clean room version?

Have you looked at the official docs? Back in 386 days that's all we had.

https://software.intel.com/en-us/articles/intel-sdm


In 1993, as a broke high-school student pre-cell-phone and pre-search-engine, we didn't even have that (although we did have BBS phone-number/speeds/protocols/login lists). I had to take the nascent VTA Lightrail (when it was even more uncool) to Computer Literacy bookstore (when programming was also less cool) to find decent technical references... and I promptly dropped a few hundred bucks on dead tree carcasses when I was making minimum wage. Haha. X)

- Programmer's PC Sourcebook

- PC Intern

- Undocumented PC

- Undocumented DOS

- Programmer's Guide to EGA, VGA ...

- Michael Abrash books on assembly including self-modifying code and graphics programming

- Turbo Assembler Quick Reference Guide

- Code Complete

Later, the MindShare series were really good.


I remember those books. They were all helpful in teaching assembly, bios, and general pc programming like serial ports, etc.

None of them really helped with 386 protected mode programming. GDT, LDT, selectors and descriptors, paging, etc. For that you really needed the official docs. I printed the 300+ page programmer's guide in the school lab which they were not too happy about.


In 1993 I was trying to learn c64 assembly still



