Tinkering with electronics (thanassis.space)
192 points by ttsiodras on Feb 25, 2018 | 79 comments



I design interfaces for a living, but tinkering with hardware is one of the most satisfying journeys I know. Every now and then, like OP, I play around with different hardware projects. It's amazing how showing a half-baked hardware project to a friend gets them a lot more excited than showing them a nicely designed software project. It feels like when you're tinkering with hardware, a lot more parts of the brain are challenged, and that generates a more fulfilling experience. I highly recommend that everyone who's into software or design but has never done anything with hardware spend some time this week building a simple blinking LED project with an Arduino. You'll know exactly what I'm talking about.
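
For anyone who wants to try it: the whole "hello world" of hardware is a dozen lines. A minimal sketch, assuming an Uno-style board where the onboard LED is on pin 13:

  // Classic Arduino "blink" - the hardware hello world.
  // Assumes an Uno-style board with the onboard LED on pin 13.
  const int LED_PIN = 13;

  void setup() {
    pinMode(LED_PIN, OUTPUT);      // configure the LED pin as an output
  }

  void loop() {
    digitalWrite(LED_PIN, HIGH);   // LED on
    delay(500);                    // wait half a second
    digitalWrite(LED_PIN, LOW);    // LED off
    delay(500);
  }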


For the last 16 years I have designed and manufactured "real hardware" for a living (FPGAs, up-to-18-layer PCBs, fast interfaces, RF, analog -- I'm running an engineering house now).

Yet I still tinker with electronics outside of the "real hardware" work. It's just fun. Especially if you can teach it to others, or give them the everlasting tinkering bug!


Interesting. I quit electronics once I figured out that software gives me 'infinite parts' at a zero budget. My ability to play around with electronics was always constrained by my parts budget. What kept me going was the ability to scrounge old stuff from the garbage and take it apart for parts, but then it was the luck of the draw and once things got really integrated that route was closed off.

So now anything that I do that has to have an effect in the real world is put together with the absolute minimum in interface hardware and the remainder is software.


>>I quit electronics once I figured out that software gives me 'infinite parts' at a zero budget.

Same here. You can build virtually anything in software.


Virtually anything of course just means 'not everything'. As a side project, I'm building a keyboard - try that in software. Besides, software is sometimes just too slow. There's a reason you have a GPU (yes, it can be programmed, but my point is that it's separate hardware). Certain algorithms are just much better suited to hardware implementations, since everything runs in parallel at a very fine-grained scale (cryptocurrency mining, for example - the main reason people are mining with GPUs is that the ASICs are so popular that it's hard to get them).


Anything, so long as it interacts with the outside world exclusively through a screen, speakers, or API :)


Right? Why didn't someone tell me that these newfangled SDRs worked purely through the magic of software.


Missing a /s ? If not, what did you think the 'S' stood for?


Well Software, duh ;).

The point I was sarcastically making was that even in SDRs you still have a downconversion to baseband, via either direct conversion or a superhet, so your signals are at the right frequency reference and you don't completely saturate your DSP/FPGA/CPU.

I love software as much as anyone else but just saying that you can do everything in SW is a bit disingenuous.


But nobody said that.

I wrote "So now anything that I do that has to have an effect in the real world is put together with the absolute minimum in interface hardware and the remainder is software." and the other person wrote "virtually".


I was replying to the second; just adding a "virtually" doesn't absolve you of real-world hardware concerns. If your SDR is getting overloaded because of bad pass-band filtering, then no amount of software is going to save you.


There's plenty of hardware tinkering around SDRs though.


A way of rephrasing the previous comments would be: “I only use hardware devices when I need special-purpose I/O (including direct physical measurements and physical effects). All signal processing and logic can be more cheaply, quickly, and effectively done in software.”


> cheaply, quickly, and effectively done in software

Hardly.

Power usage in SW (or an FPGA) vs. an ASIC is easily an order of magnitude different. For most battery-based solutions, software-based signal processing is right out.

Latency is another: your CPU is going to introduce both jitter and long latency compared to a discrete hardware solution.

Software also can't handle the raw speeds involved in a lot of things. There's not a single CPU out there that can natively handle 5GHz WiFi off of the wire. That's why even direct conversion SDRs use a frequency mixer to lower the data rate down to something a DSP can handle.


By cheap, quick, and effective, I’m talking about human labor, not cost of silicon doodads for a commercial-scale product.

If you imagine we’re talking about someone’s hobby prototype projects, designing all of those things you mentioned takes skill, experimentation, time, parts, assembly, likely a team collaboration, etc. The combinations of parts get conceptually complicated and the feedback loops during the development process get slower and slower.

Obviously there are reasons to use custom-designed hardware if you are a corporation and you are building a million identical devices, or if you have extreme bandwidth or latency needs that a modern CPU literally can’t keep up with, etc.

But that’s a very different context than what the people several posts up are talking about.

P.S. If you pile your posts with abbreviations, your readers have a harder time figuring out what you are talking about.


> Obviously there are reasons to use custom-designed hardware if you are a corporation and you are building a million identical devices, or if you have extreme bandwidth or latency needs that a modern CPU literally can’t keep up with, etc.

Or power - there are plenty of battery-powered things in the hobby space where using discrete hardware optimized for low power usage is necessary. Discrete hardware doesn't have to be an ASIC; it's still possible to string together op-amps and other SMD parts pretty easily these days.


Loads of people do FPGA development for hobby projects, which isn't "software." FPGAs can dramatically improve on software implementations in latency, jitter, and power, yet are still quite accessible.


Isn't it software? What is it, then?


I think it is more commonly referred to as hardware design (or simply 'digital design'). I think it's also reasonable to call it software - but then even the 'real' hardware development (of ASICs) starts with a VHDL design.


It's more like a million jumper switches (called LUTs in FPGA parlance) that you can configure.

It doesn't share much, if anything, with serial software, and you use the same tools (VHDL/Verilog/etc.) as ASIC design.


Compiled hamburgers aren't that tasty :)


Yeah; nowadays parts can be cheaper, but to take advantage of that you often have to order from another continent, waiting weeks to get them. The alternative is to build a local stash, but that requires a large upfront investment, and you'll still be missing that crucial part for the current project.

I still like electronics, but software has spoiled me.


I guess half of that satisfaction can be attributed to the fact that hardware - especially analogue - is not as lenient about errors as software.

If you burn a transistor you burn a transistor - it's now useless for its original purpose, so you have to triple check everything.

Also there are 2nd and 3rd order effects that require experience to catch.

Example: I was really happy that my two 555 timers driving some LEDs were synchronized so well, until I realized that they were plugged into the same weak power source, so the voltage drop from power surges triggered them both.


I really got into building simple hardware while in college. For me it was about the physical aspect of it.

Building software is amazing as the barrier is very low and you can iterate and experiment so fast. But 99% of software I've built exists only as 1s and 0s on a hard drive somewhere.

There's something great about physically holding something you've built. I mainly built MIDI controllers for friends and other weird I/O devices, but the fact that (maybe) they are sitting somewhere collecting dust brings me joy.


> It's amazing how showing a half-baked hardware project to a friend gets them a lot more excited than showing them a nicely designed software project.

This might be something to think about when teaching programming to kids as well. Making an Arduino "do" something still involves programming, but with more tactile results and lower expectations for how much it has to resemble a professionally written app.


I feel the same with any project with "physical" results, from electronics to woodwork to fixing a leaking pipe. Don't know why but I get a fuzzy satisfaction that I never do with software.

My previous job involved lots of lab work: soldering, debugging PCB issues, working with optical fibres, etc. My current job pays better and is complex enough, but it's just me sitting behind a screen all day, remotely controlling a piece of hardware we just bought from someone else.

> I highly recommend that everyone who's into software or design but has never done anything with hardware spend some time this week building a simple blinking LED project with an Arduino

I second that. These days it's very easy to get some hobby projects going in no time and on a very low budget. Arduinos, FPGAs, simple PCB design - it will give you satisfaction, and you'll definitely learn lots of useful stuff in the process that's not necessarily related to software.


> It feels like when you're tinkering with hardware, a lot more parts of the brain are challenged, and that generates a more fulfilling experience.

I totally agree with this sentiment, but I would add: I know a guy who designs hardware for a living, and he gets a huge kick out of writing a little Python script. The thrill is mostly in developing fledgling skills far from your main wheelhouse.


I couldn't agree more, but these days Arduinos are both limiting and expensive.

Consider checking out an ARM Cortex-M board - you can get an STM32F103 'blue pill'[1] with 20KB RAM, 64KB flash, external oscillators, and a 72MHz clock speed for $5. Plus, it has lots of extra peripherals like a realtime clock which can run off of backup power and periodically wake the chip up from sleep.

The future of embedded development is so bright, you gotta walk around with sunglasses on.

[1]: http://wiki.stm32duino.com/index.php?title=Blue_Pill
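
For reference, the blink sketch carries over almost unchanged with the stm32duino core - a minimal sketch, assuming the onboard LED is on PC13 (wired active-low on most Blue Pills):

  // Blink on an STM32F103 "Blue Pill" with the stm32duino core.
  // The onboard LED is on PC13 and active-low on most boards.
  void setup() {
    pinMode(PC13, OUTPUT);
  }

  void loop() {
    digitalWrite(PC13, LOW);    // active-low: LOW turns the LED on
    delay(500);
    digitalWrite(PC13, HIGH);   // HIGH turns it off
    delay(500);
  }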


Agree re: official Arduinos.

If you still want to use Arduino, I'd highly recommend going for Arduino clones from China. You should be aware that the USB chip is different so you may have driver issues, but I haven't had any issues on Linux. You can get an Uno clone for ~$3.

It's also worth considering moving to the ATmega by itself. You can get the chip used in the Uno for $2, set it up on a breadboard, program it with an Arduino as the ISP, still use Arduino libs if you want, and eventually move to a custom PCB.


I've considered it, but the cost/benefit calculation makes ARM boards way more attractive.

They are way more capable, and they actually cost less than ~$3 each for a simple QFP48 STM32F103 or STM32F051 board, and the smaller chips start around $0.75 for my PCBs.


Definitely. The STM32Fx line is awesome.

I was mostly thinking about alternatives for people who are currently using Arduino and want to keep the same toolchain, libraries, etc. but at a lower cost. If someone is comfortable moving outside the Atmega ecosystem, I would recommend ARM chips as well.


There is an official port of Arduino for some Nucleo boards. I've chosen one for making a commercial product. I can use Arduino libraries for the prototype, and if it doesn't fit my needs I can always go with C.

The Arduino port is just a wrapper around the stdperiph HAL...


I switched from using Arduinos to using ESP8266-based devices, partly for cost reasons, but also to gain the on-board WiFi support.

Having a small device with WiFi makes it a lot easier to do interesting things - such as pulling data via Web-based APIs, submitting sensor readings to MQ, etc.

Having a "real computer", like a PI, or a blue-pill is nice, but suddenly there is a lot more to think about.


I really like the ESP8266 for small one-offs, but I usually code for them with tons of abstraction, in something like NodeMCU or MicroPython.

Because I can't justify learning the nitty-gritty of some proprietary 'Tensilica' core. Lately I've been meaning to look into using NRF24L01 transceivers for wireless communications in projects, since breakouts for them are less than $1 each and have the same footprint as an ESP-01 module.


My current favorite ARM Cortex parts are Cypress's PSoCs. They're basically a Cortex-M0/M3 plus an embedded CPLD that has a direct interconnect to CPU registers/interrupts.


How's the open toolchain support with those chips?

I've also heard good things about NXP's Cortex-M chips - apparently they let you map peripherals to whatever pins you want, which sounds neat. And TI's MSP432 chips apparently have fantastically low power consumption for their performance. Atmel's SAM chips look less promising, with minimal peripherals besides a weird common communication bus that seems to handle things like I2C/SPI/etc, if I'm reading the datasheets right.

Does that sound right? It'd be nice to learn about all of them, but there are only so many hours in a day.


I used one of their Cortex-M3 parts on a project a while ago that needed a large number of SPI ports to drive strings of LEDs. It was very easy to get the initial proof of concept working, and their IDE software worked surprisingly well.


"It feels like when you're tinkering with hardware a lot more parts of the brain are challenged and thus generate a more fulfilling experience."

Probably a hardware object strikes more senses than software, being something you can touch - but you also feel the loss of it if it breaks, because you can't copy it like lines of code.


I think it's really just that with hardware, you're building something "tangible" that feels like it actually exists in the real world. Meanwhile, with software, all your hard work exists entirely within a virtual space that you can't really touch.


> ... gets them a lot more excited than showing them a nicely designed software project

Maybe because of the association that "software = free".

Or because hardware is 3d while software is always 2d. Would you get the same appreciation if you made a piece of hardware that connects to a monitor and shows a blinking dot?


That's basically what I did by creating a small video console and making games work on that bare hardware. Having it generate sprites on screen from bare hardware was much more fun than using Unity on a PC...


What would be a good project after a simple blinking LED project with the Arduino?


Learn about getting input - buttons, switches, potentiometers, etc. (see the sketch after this list for a first step).

Then go for motor control: Servos, steppers, DC motors, relays, solenoids, h-bridges, PWM, etc.

Add in some other sensors - perhaps ultrasonic distance sensing, Sharp IR sensors...

...then move into hacking on a Neato Lidar sensor...

At this point - build a robot?
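
As promised above, a minimal sketch combining the first two steps - reading a potentiometer and driving a hobby servo with the standard Servo library (the pot wiper on A0 and servo signal on pin 9 are just example choices):

  // "Knob" exercise: a potentiometer on A0 positions a hobby servo.
  // Pin choices (A0, 9) are examples; any PWM-capable pin works.
  #include <Servo.h>

  Servo servo;

  void setup() {
    servo.attach(9);                         // servo signal on pin 9
  }

  void loop() {
    int raw = analogRead(A0);                // 0..1023 from the pot wiper
    int angle = map(raw, 0, 1023, 0, 180);   // rescale to servo degrees
    servo.write(angle);
    delay(15);                               // give the servo time to move
  }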


Definitely a star-tracking camera mount for astrophotography.

https://partofthething.com/thoughts/making-a-cheap-and-simpl...


Welcome to my weekend!

https://imgur.com/a/XxMUY

This is an AVR-based Pong video game (Grant Searle's design), modified to work with BBC micro:bits as digital paddles - also operable from phones via Bluetooth!

This is for a STEM/Code Club event at a local school, for a bit of retro-meets-contemporary fun and a paddle-programming challenge - let's try using the micro:bit accelerometers.

The breadboarded circuit will be made up on stripboard, but I'll take along the prototype.

https://github.com/linker3000/Microbit-TVPong


If you want to get into electronics tinkering, the Kris Cochrane YouTube channel is a good place to start. He only uses hobbyist kit (so really cheap from AliExpress), and he'll teach simple soldering and surface-mount soldering. He also walks through building really cheap Chinese electronics kits.

https://www.youtube.com/channel/UCh8JiW2G9yR2v7TwUm04m_g


Great projects!

I got an Arduino recently and it’s the best £6 I’ve spent in ages. My pet project is building a synth driven from the midi output of my keyboard.

As a software person, “real” electronics felt hard to grok. The Arduino has been a dream. You have to learn loads along the way. How do I get audio out? Oh right, what’s PWM? Oh ok, how do these timer things work? But it’s soooo satisfying.

Highly recommended!
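
For anyone curious, a first audio-out experiment can be tiny. A rough sketch, assuming a piezo or small speaker (with a series resistor) on pin 8 - tone() does the square wave, and real MIDI input parsing is left out:

  // Play MIDI note numbers as square waves with tone().
  // Assumes a piezo/speaker (plus series resistor) on pin 8.
  const int SPEAKER_PIN = 8;

  // Standard MIDI-to-frequency mapping: A4 (note 69) = 440 Hz.
  float midiToHz(int note) {
    return 440.0 * pow(2.0, (note - 69) / 12.0);
  }

  void setup() {
    int arpeggio[] = {60, 64, 67};          // C4, E4, G4
    for (int i = 0; i < 3; i++) {
      tone(SPEAKER_PIN, (unsigned int)midiToHz(arpeggio[i]));
      delay(300);                           // let each note ring
    }
    noTone(SPEAKER_PIN);                    // silence after the arpeggio
  }

  void loop() {}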


Would love to read more about that synth project. I'm just embarking on a similar project myself, using a little Cypress PSoC that a friend gave me. I'd like to make something glitchy and lo-fi - halfway between a Monotron Duo and an Atari Punk Console.


Check out https://github.com/BleepLabs/Nebulophone

It's not too hard to build and not a bad place to start. The code could be better - some variables are poorly named and it has some minor bugs (off-by-one on the wave table lookup?). I did start to clean it up but never got around to submitting a pull request before moving on to other projects.


Just found this YouTube playlist on electronics basics: https://www.youtube.com/playlist?list=PLAA9B0175C3E15B47 - the videos from the USAF training are particularly good. The person who made this playlist has a lot of other good ones too.


After building some electronics [1], I kind of think it's a pain. It took me seven tries to get the switching power supply working right. You can't do that on a prototyping board; layout matters, and all the parts are surface mount. LTSpice simulations get you close, but not quite there. Then I had to learn surface-mount soldering, which means working with tweezers under a microscope. I'm not that good at it.

People like that design. Others have built more of them. Someone in Australia plans to manufacture a hundred or so. There's enough interest for hobbyists to build it, but not enough for a real production run.

[1] https://github.com/John-Nagle/ttyloopdriver (USB interface for old Teletype machines, models from 1910 to 1960).


I love tinkering with hardware. I just hate the debugging, because now you have physical debugging on top of code debugging to deal with in order to find out where things go wrong.


Typically on a new hardware design I'll write throwaway code that exercises all outputs. You use this to verify the schematic.

Maybe half the time that code evolves into an API/HAL that the final application will use. Many times it also becomes code for manufacturing test (burn-in with the test code, then reflash with the final production firmware right before boxing).

Starting out with application code on top of a brand new design will lead to madness. Take it in smaller steps.
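
For the Arduino/AVR crowd, that throwaway exercise code can be as dumb as this - a sketch with an invented pin list standing in for whatever the schematic says, walking each output slowly enough to follow along with a meter or scope:

  // Bring-up code: assert each output in turn so you can probe it
  // and check it against the schematic. Pin list is a made-up example.
  const int OUTPUT_PINS[] = {2, 3, 4, 5, 6, 7};
  const int NUM_PINS = sizeof(OUTPUT_PINS) / sizeof(OUTPUT_PINS[0]);

  void setup() {
    for (int i = 0; i < NUM_PINS; i++) {
      pinMode(OUTPUT_PINS[i], OUTPUT);
      digitalWrite(OUTPUT_PINS[i], LOW);    // everything off to start
    }
  }

  void loop() {
    for (int i = 0; i < NUM_PINS; i++) {
      digitalWrite(OUTPUT_PINS[i], HIGH);   // assert one output...
      delay(1000);                          // ...long enough to probe it
      digitalWrite(OUTPUT_PINS[i], LOW);
    }
  }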


This takes me back to the days when I was an embedded software engineer. One day I had to write a Linux driver for a custom piece of hardware the company had developed.

I just couldn't get it to work. While trying to understand what was going on, I literally went down the levels of abstraction until I was measuring voltages with an oscilloscope. Turns out that one of the resistors on the board wasn't properly soldered.

It was a good journey from software all the way down to hardware, but frustrating. Nowadays I develop web applications, and I'm happy that I don't have to deal with this kind of problem anymore.


But you get to play with lots of cool fancy equipment to do that debugging!

(Okay, that equipment is also kinda expensive, and retains its value far longer than old computers... But even new stuff has gotten a lot cheaper than it used to be.)


I would love to see what tinkering could be done with low-power, minimal or "self-sufficient" placement of sensors to gather and report data using LoRa, CDMA, or HF for long range. Why? Because electronics are small, and there should be interesting ways to exploit the difference between this smallness and full-blown CPUs operating on the grid.

But I have a hard time generating good ideas. Any thoughts?


Making extremely low power consumption nodes is very satisfying. I made a very simple remote light-sensor node that could power itself from 50 mg of vibration acceleration a few years back, as part of my PhD in energy harvesting. The most challenging part was getting the PIC to draw the absolute minimal current and manage the power draw of all the other systems. The current draw was so low by the end that we had to use an electrometer with a 30 gigaohm (if I remember correctly) input impedance to accurately measure the current draw. Programming in assembly was a pain, but I had a lot of help on that, so not awful.


A high meter input impedance minimizes the meter's effect on the circuit under test when measuring voltage, not current. For accurate current measurements, you want to measure the voltage drop across an accurate, low-value "shunt" resistor that carries all the current for your circuit.
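
To put illustrative numbers on it (invented for the example):

  V_shunt = I x R_shunt
  10 uA through a 100 ohm shunt -> (10e-6 A) x (100 ohm) = 1 mV

At the sub-microamp draws described above, even a generous shunt leaves you measuring microvolts - which is why ordinary meters give up and you end up reaching for an electrometer.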


Correct - so we used a high-impedance voltage meter across a calibrated resistor in series with the circuit under test. It was 7 years ago that I did the work, and I now rarely work hands-on with electronics, so I am hazy on the exact details, but it's all in the published journal paper which, as always with science papers, is now behind a paywall.

UPDATE: I misremembered - we used the high-impedance electrometer to buffer the input of a National Instruments interface board so we could record the voltage build-up across the storage capacitors. If you use a normal voltmeter, the caps just discharge into the voltmeter and the charge never builds up enough to allow the microcontroller to turn on, run the code it needs to, and then go back to sleep. You end up with some really nice sawtooth waveforms with this method, and, as the acceleration on the vibrational energy harvester increases, the frequency of the sawtooth changes proportionally, up to a point where the system is transmitting all the time.


I'm with you. Using the Things Network https://www.thethingsnetwork.org/ I am currently setting up a gateway and connecting a few remote sensors to it. Right now I am just sending simple sensor data (button presses, temperature, light, etc.), but I would like to expand on this and send other data.

For example, I want to be able to deploy my own speed traps, measuring vehicle speeds on any road I want. I don't need to connect the speeds to reg plates, so I'm thinking about just measuring when the car passes through an IR beam, or measuring a magnetic pulse, or similar... Also, air quality monitoring. And of course, the robot lawn mower is getting an upgrade for the spring. Water temperature in the lake might be a good addition for summer.

I forgot to say, but since it is https://www.thethingsnetwork.org/ everything I do is already connected to Alexa and IFTTT... imagine waking up in the morning and having Alexa tell you that it is a good day for a morning swim, since the water is X degrees. Yum.


Those are fantastic ideas - love it!


I am working on a LoRa/Arduino mesh network for reporting back the state of a large number of traps for invasive predators (mustelids, possums, rats).

The traps are spread around the countryside in dense bush/jungle.

When one goes off, someone needs to tramp in and clear and reset it. Currently, there is no monitoring of any kind, so checking traps is extremely labour intensive.

The trickiest aspect of this project is that each node needs to spend virtually all of its time asleep to minimise battery drain, yet for two nodes in the mesh to communicate, they both need to be awake at the same time. Drifting clocks etc.

The easiest aspect is that the acceptable data rate is 1 bit every 8 hours or so!
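
For anyone wanting the shape of the sleep side, here's a rough sketch of the duty-cycling pattern on an ATmega328P-class Arduino - the watchdog wakes the MCU every ~8 s, and radioListenWindow() is a hypothetical placeholder for the actual LoRa rendezvous logic:

  // Duty-cycled node: power-down sleep in ~8 s watchdog chunks, with
  // an occasional radio window. ATmega328P-specific registers.
  #include <avr/sleep.h>
  #include <avr/wdt.h>
  #include <avr/interrupt.h>

  ISR(WDT_vect) {
    // Nothing to do; waking the MCU is the whole point.
  }

  void sleep8s() {
    cli();
    wdt_reset();
    WDTCSR = (1 << WDCE) | (1 << WDE);                 // unlock WDT config
    WDTCSR = (1 << WDIE) | (1 << WDP3) | (1 << WDP0);  // interrupt mode, ~8 s
    sei();
    set_sleep_mode(SLEEP_MODE_PWR_DOWN);
    sleep_mode();                  // sleeps here until the WDT fires
  }

  void setup() {}

  void loop() {
    for (int i = 0; i < 8; i++) {  // ~64 s of sleep per cycle
      sleep8s();
    }
    // radioListenWindow();        // hypothetical: wake radio, rendezvous
  }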


What a cool and useful-sounding project! Does LoRa handle any of the meshing aspect, or do you have to manage that at the application level? Also, what kind of node-to-node range are you getting from the radios, and when the users install them, how do they know they'll be within range of the next node?


Yeah, it is a lot of fun; I wish I had more free time to devote to it.

LoRa doesn't handle any of the mesh behaviour; I'm doing all of that myself, using LoRa just at the send-and-receive level. The devices I am using are Moteinos - a cool Arduino/LoRa mashup: https://lowpowerlab.com/guide/moteino/

I see various ranges from LoRa; the environment is steep valleys and dense bush, which severely dials back the range, sometimes down to a few hundred meters or even less. At this stage I haven't played around with optimising the LoRa settings, but there are bound to be some wins there, since the required data rate is so low.

Good question about installs - my current plan is that, before setting out, the user places the part of the network where they are heading into constant beacon mode. The node can then be installed, knowing there's another node within reach.

In practice this should be fine anyway, as the recommended placement for traps for e.g. stoats is every 100 meters or so, so usually there will be other traps within range.


I'm totally ignorant about radio hardware, but I'm curious, why not use a chip with wake-on-radio?


Good idea, except that wake-on-radio requires the radio to be powered on all the time, whereas I need to sleep everything except the Arduino watchdog.


Take a look at the Feather M0 from Adafruit - I have a few deployed where WiFi is quite a ways away, gathering data for me and sending it to another one acting as a base station.

I haven't added solar, as these are inside, near power, but I have some solar panels and hardware waiting for me to deploy out in the field (literally). So far, it's been my lack of good waterproof enclosures that has slowed me down. I wonder if there'd be a good opportunity for someone in that space.


Put it in one of those sealable jars with hinges - stuff a few packets of desiccant inside to stop condensation and you're done. If you need to pass through wires, score a V-shape into the seal and make sure they get squeezed tight when you close the jar. Something like http://img4.foodservicewarehouse.com/Prd/1900SQ/AmericanMeta... Make sure to get some extra sealing rings if you do passthroughs, as it's hard to get the scoring right.


What opportunity do you see? There's a huge variety of water-resistant enclosures on the market, from things that can only take light spray to underwater-rated enclosures. Google IP66 or NEMA 4 enclosures. Also, I have at least one project that was housed in clear 3" PVC pipe, since it had to go down a well.

One of my favorite companies to work with in this area is Polycase.


But what did you use it for?


Are you referring to using something like the particle.io platform?


This blog is excellent, and has a good "what to buy" list and some similar projects, e.g. http://www.technoblogy.com/show?WNM

The ATtiny85 costs around $1 - it's amazing what you can do with it. I've made/adapted quite a few projects from there and other blogs.


Does anyone know what kind of circuit would be able to convert 12 VDC (e.g. from a small lead-acid gel cell battery) to 20 VDC (which is my Lenovo E470's DC charging voltage)? The E470 is my work computer and does not have an easily swappable battery, so I wanted to experiment with some kind of external charging system I can use when working remotely at sites without accessible outlets.


If you want a near turn-key solution, there is this: http://www.mini-box.com/DCDC-USB?sc=8&category=1264

Mini-box has a number of dc-dc converters and battery charging systems for gel-cell and other batteries. I do not have personal experience with their products, but I'd try them first if I wanted something other than a hack.

For hacks there are tons of boost converter boards on eBay/AliExpress, but these typically don't have chargers, short protection, or other features you might want if you need more robustness.


Thanks @vibrolax, that does look ideal. I was going to muck around and design the circuit myself (hopefully based on a solid reference design), but the COTS setup you listed above looks just like what I need, so I think I'll go ahead and get it once funds allow :) Thanks again for the tip.


You're welcome. Another possible solution for you is a dc-dc laptop brick like these: https://www.powerstream.com/ADC-p006.htm.

Fit your gel cell with an auto accessory socket, then plug in the power supply.


It's called a boost converter (or step-up converter), a specific kind of DC-DC converter. When searching, first look at rated input and output voltages. If the output voltage of a DC-DC converter is a range, verify in the data sheet that it can indeed _boost_ the voltage to whatever you set it to (unless it's specifically called a boost converter or step-up converter).
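
As a sanity check for the 12 V to 20 V case, the ideal (lossless, continuous-conduction) boost relationship gives the ballpark duty cycle:

  Vout = Vin / (1 - D)
  D = 1 - Vin/Vout = 1 - 12/20 = 0.4

so the converter switches at roughly 40% duty, a bit more in practice to cover losses.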


Thank you sir! I'll do some part searching this week.


Armbian sounds like a version of Ambien.



