
This gets me every time. I expect to see something interesting and it turns out to be the other one. One is a fantastic thing and the other is mediocre, pick which way round at your discretion!



Pretty simple to spot LoRa vs LoRA.


Memory mnemonic: Capital A for "AI"


What exactly is the confusion? Does “parameter efficient fine-tuning” mean anything in the context of the other LoRa? If not, then it’s probably obvious which one this is about.


Actually it does: LoRa the radio protocol has parameters to tune. Usually both sender and receiver need to match these, so I read this as a method for automatically tuning them based on distance and radio environment.
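
A minimal sketch of what matching those radio parameters could look like, assuming a hypothetical config type and a toy distance heuristic (the names and thresholds are illustrative, not tied to any real LoRa library):

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LoRaRadioConfig:
        spreading_factor: int   # SF7..SF12: higher = longer range, lower data rate
        bandwidth_khz: int      # e.g. 125, 250, 500
        coding_rate: str        # e.g. "4/5" .. "4/8": more redundancy = more robust

    def pick_config(distance_km: float) -> LoRaRadioConfig:
        # Toy heuristic: trade data rate for range as distance grows.
        if distance_km < 1:
            return LoRaRadioConfig(7, 250, "4/5")
        if distance_km < 5:
            return LoRaRadioConfig(9, 125, "4/6")
        return LoRaRadioConfig(12, 125, "4/8")

    sender = pick_config(3.2)
    receiver = pick_config(3.2)
    assert sender == receiver  # both ends must agree to communicate
    print(sender)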



