I've been ~hoping~ to use Go to access the GPIO pins, but haven't had much success - any suggestions?
Blog post: http://blog.tafkas.net/2012/10/03/gathering-and-charting-tem...
Here's the program I wrote to read the sensor over I2C: https://gist.github.com/tsyd/f48e933a21fb40b9f9eea28118f54e0...
This has the added benefit of being able to be powered by a solar panel.
1. Why not just run InfluxDB and Chronograf on one of the CHIPs?
2. How are you sending data to the Pi?
I started off with a single C.H.I.P. and ran everything on it, but when I added a second C.H.I.P. I wanted all the data in a central location. I also found Chronograf lagged a lot when trying to browse more than a couple of days' worth of data. The Raspberry Pi has much faster storage and CPU.
> How are you sending data to the Pi?
The C.H.I.P. has a Python script that runs on a cron that calls a C program to read the sensor and then sends it to the Pi using InfluxDB's HTTP API.
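For anyone wanting to replicate the reporting step, it can be sketched in a few lines of Python. This is not the author's actual script — just a minimal sketch assuming InfluxDB 1.x and its line protocol over the `/write` endpoint; the sensor binary path, database name, tag, and field names are all made up:

```python
import subprocess
import time
import urllib.request

# Hypothetical names: point these at your own sensor binary and InfluxDB host.
SENSOR_BIN = "/usr/local/bin/read_sensor"  # C program printing "temp humidity"
INFLUX_URL = "http://raspberrypi.local:8086/write?db=climate"

def to_line_protocol(measurement, tags, fields, ts_ns):
    """Build one InfluxDB 1.x line-protocol record."""
    tag_str = ",".join(f"{k}={v}" for k, v in sorted(tags.items()))
    field_str = ",".join(f"{k}={v}" for k, v in sorted(fields.items()))
    return f"{measurement},{tag_str} {field_str} {ts_ns}"

def report():
    """Read the sensor via the C helper and POST the values to InfluxDB."""
    temp, humidity = map(float, subprocess.check_output([SENSOR_BIN]).split())
    line = to_line_protocol("climate", {"room": "bathroom"},
                            {"temperature": temp, "humidity": humidity},
                            time.time_ns())
    req = urllib.request.Request(INFLUX_URL, data=line.encode(), method="POST")
    urllib.request.urlopen(req).close()

# report() would be the entry point invoked by the cron job.
```

Cron then just runs this at whatever interval you like; no InfluxDB client library is needed since the write endpoint accepts a plain HTTP POST.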
While I don't have any particular reason for collecting the data it can be fun to look at. For example you can easily spot when somebody has a shower because the humidity spikes in the bathroom:
Naively I expected to track the weather, because I figured on a hot day the temperature of my living room etc. would spike. Turns out this house is pretty well insulated, so the internal temperature has essentially no relationship to the external one. I guess that makes sense in a country where you might have -25°C in the winter.
Another challenge was that I couldn't find a good application to display the Google Photos album. Nothing I found would display new photos added to the album after the slideshow had begun while also displaying everything in a continuous loop. I ended up writing a second small Python app, also using picasawebsync, to periodically sync the photos to a second Raspberry Pi hooked up to a projector, which displays them in a continuous loop in random order.
Here's a list of cameras that gphoto2 supports:
I made sure to buy a DSLR that supported live-preview so that our guests could frame themselves before the photos were taken.
The keys have pressure-sensitive film underneath that causes a voltage drop when you press on them. There's a wire from under each key that goes to an input on an MCP3008 ADC. There are 20 of those, each with 8 inputs, all connected on a SPI bus running at 2 MHz. Effectively, this acts as a 160-channel digital voltmeter. The Pi can scan all the ADCs about 90 times per second, and it converts key pressure into MIDI commands that can be sent to an external synth, or I can run a software synth locally on the Pi.
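For reference, the per-sample transaction on the MCP3008 is tiny. Below is a sketch using the common `spidev` library for a single chip (the fan-out to 20 chips via separate chip selects is left out, and the 2 MHz clock matches the comment above):

```python
def decode_mcp3008(resp):
    """Extract the 10-bit sample from the MCP3008's 3-byte response."""
    return ((resp[1] & 0x03) << 8) | resp[2]

def read_channel(spi, ch):
    """Single-ended read of channel ch (0-7)."""
    # Byte 1: start bit; byte 2: single-ended mode + channel in the top
    # nibble; byte 3: padding while the sample clocks out.
    resp = spi.xfer2([0x01, (0x08 | ch) << 4, 0x00])
    return decode_mcp3008(resp)

def open_adc(bus=0, device=0, hz=2_000_000):
    import spidev  # deferred: only available on the Pi
    spi = spidev.SpiDev()
    spi.open(bus, device)
    spi.max_speed_hz = hz
    return spi

# Scanning one chip's 8 inputs is then:
#   spi = open_adc()
#   values = [read_channel(spi, ch) for ch in range(8)]
```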
For both systems I'm using CraftBeerPi, a Python project with a pretty active community around it.
I wrote about it on opensource.com and it was pretty popular, which surprised me a little bit. But I guess a lot of people in our community like beer :). I'm always trying to figure out how to make brewing my career without the related massive cut in income (please share any great ideas on that!)
I wrote my own small weather service, http://www.brilcast.com/, and repurposed the waterproof sensor to help track and record the process from lautering/sparging all the way to transfer to primary fermentation.
The setup is simple. The sensor code is written in Python and sends readings to a SQLite database behind a Flask webserver. The frontend uses D3.js to visualize the data. The entire service is hosted on PythonAnywhere. It's quite spartan, but it works well. https://github.com/williamBartos/brilCast
Locomotion is largely based upon the designs documented by Cynthia Breazeal (née Ferrell; MIT mobile robots lab, under Prof. Rod Brooks) in her PhD thesis for the hexapod robot Attila/Hannibal.
The first attempt was using Python, which presented two insurmountable problems: 1) a Raspbian OS boot time of 1.5 minutes, which is unacceptable for an embedded device, and 2) Python threading is not sufficient for realtime. I was attempting to make series elastic actuators, but the imprecision of the threading (jitter) was leading to wild oscillations... I finally had to accept it was a dead end.
I have started over in Elixir + Nerves which is designed at its core for embedded work. I will admit it is very slow going. Not because of any deficiencies in the language or environment. Quite the contrary -- I get a 10-second cold boot time and superb stability! But rather my mind is the limiting factor here. After three decades of imperative programming, the shift to functional programming is a challenge!
I also remember reading a while back about the huge market for maintaining these things once Sony abandoned them. Perhaps you're onto a business!
In the mid 2000s, I had written some code for the Aibo to let it read books aloud.
Flash forward to a couple of years ago. My kids saw some video of me with the Aibo Reader project and wanted an Aibo of their own. Unfortunately, the Aibo as a product is dead. The batteries are now dying and irreplaceable (thanks to Sony and their idiotic insistence upon DRM - it's not enough to merely provide electrons, the battery must also know the secret handshake in order for the Aibo to accept it. Maddening!!!) And those few used Aibos that do have working batteries are dying from other issues related to mechanical failure (mostly clutches in the head/neck assembly), yet command a premium price.
So now, my kids still want an Aibo. I got to thinking about how far technology had come in the past 20 years, and started wondering to myself if I could build a facsimile with the RPi. Thus the birth of this little side-project.
A flexible servo horn allows some degree of deformation when torque is applied. Embed a 1/32" neodymium magnet in the horn and a magnetic rotary position sensor in the driven part. As resistance to movement ("torque") increases, the flexible servo horn twists/deflects, the angle between the two parts changes, and the magnetic sensor detects this.
Picture a torque strain gauge, the way the needle deflects from the centerline of the gauge. This is similar, but we're doing so in a radial and not linear sense.
I went on to 3D print a flexible servo horn somewhat similar to this:
but with the magnet embedded in the outer rim of the wheel and the sensor in the inner section.
Different amounts of sensitivity are obtained by printing different spoke stiffness on the horn.
Result is a compact, modular torque sensor for about $2 in parts.
Feed the results to a PID loop and you have a nice controller that can tell when a robot leg is bearing a load, is jammed, etc. This is essential to a proper gait. Otherwise your robot is simply an electronic marionette.
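The torque estimate plus the PID loop described here can be sketched as below. The spring constant is a hypothetical number — you'd calibrate it per printed horn, since the spoke stiffness sets the deflection per unit of torque:

```python
# Hypothetical stiffness: N*m of torque per degree of horn deflection.
SPRING_K = 0.05

def torque(angle_motor_deg, angle_sensed_deg):
    """Estimate torque from the deflection of the flexible horn."""
    return SPRING_K * (angle_sensed_deg - angle_motor_deg)

class PID:
    """Textbook PID controller; feed it the torque estimate as 'measured'."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

A jammed or load-bearing leg shows up as torque that stays high while the commanded angle keeps moving, which is exactly the condition the loop can react to.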
One of the big challenges was getting 16 additional 16-bit voltage reads back to my RPi over the I2C bus (one value for each servo).
Solution: I used one of these multiplexer boards:
and I used AdaFruit's excellent servo controller board: https://www.adafruit.com/product/815
and some neopixel rings for eyes:
(wonderful for expressing mood)
Use our Hass.io OS build to set up Google Assistant easily on your Pi. You'll need a USB microphone and speakers connected to the Pi, and you'll get the full Google Assistant experience.
(disclaimer: I'm the founder)
We would love to see what you can do with what we are building, and to feature you on our website!
Two things I'm curious about...
1) What are you using for hardware with the Pi? It seems a high-quality microphone is important for this application, and the only array microphone I've been able to find is the MATRIX Creator, which seems steep in price when I could just buy an Amazon Dot.
2) Your numbers indicate significantly better performance than Google. How are you able to achieve that? Where does your training data come from if nothing is supposedly leaving my device?
I really strongly desire a system that wouldn't require relying on the cloud but I just don't know how you can get enough training data to be anywhere near as accurate as a cloud provider. That led me to thinking the next best thing would be a setup with Snowboy hotword detection where I know nothing is leaving my device until my own programmed hotword is spoken.
Any plans to release source code? I can't trust any privacy claims without seeing it.
As others said, I'd feel more comfortable with an open codebase.
But basically the steps you would have
1. Get a Raspberry Pi (obviously) and load Linux with Python support.
2. Use this shield https://www.amazon.com/Pimoroni-Unicorn-Hat-Shield-Raspberry...
3. Create a Python script that blinks at the frequency (Hz) given in the Nature paper.
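Step 3 could look roughly like this, assuming the paper in question is the well-known 40 Hz gamma-flicker one (substitute the actual frequency if not) and using Pimoroni's `unicornhat` library:

```python
import time

FREQ_HZ = 40  # assumption: the 40 Hz flicker rate; change to match the paper

def half_period(hz):
    """Each on/off phase lasts half of the full blink period."""
    return 1.0 / (2 * hz)

def blink(duration_s=60):
    import unicornhat as unicorn  # deferred: only available on the Pi
    unicorn.brightness(0.5)
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        unicorn.set_all(255, 255, 255)   # all LEDs white
        unicorn.show()
        time.sleep(half_period(FREQ_HZ))
        unicorn.clear()                  # all LEDs off
        unicorn.show()
        time.sleep(half_period(FREQ_HZ))
    unicorn.off()
```

Note that `sleep()`-based timing drifts slightly, so the real rate lands a bit under 40 Hz; if the exact frequency matters you'd sleep until absolute deadlines instead.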
The Website: http://solarpi.tafkas.net
Github Repository: https://github.com/Tafkas/solarpi
Blog Post: http://blog.tafkas.net/2014/07/03/a-raspberry-pi-photovoltai...
Feedback is very welcome.
Some pictures at http://imgur.com/a/r834D
Measure how many ticks a revolution gives you and adjust the delta needed to trigger the next station so it feels natural when turning the dial. I bent a simple bracket from some scrap sheet metal to hold the mouse in place just above the dial. The black and white threads of the cord wrapped around the dial help the optical sensor, so try to position the optics above them.
Similar project: https://2dom.github.io/the-radio/ I did not bother to remove parts of the mechanics to make the dial endless, however. Dialling through so many stations that you need that is tedious anyway, so I felt I didn't need it.
Oh, and by the way, I made the Raspi's filesystem read-only so I could switch it off safely with the radio. See e.g. https://hallard.me/raspberry-pi-read-only/
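The tick counting can be sketched as below: `/dev/input/mice` delivers 3-byte PS/2-style packets where byte 1 is the signed 8-bit X delta. The ticks-per-station threshold is a made-up figure — measure your own dial first, as described above:

```python
# Hypothetical threshold: ticks of mouse movement per station change.
TICKS_PER_STATION = 120

def parse_dx(pkt):
    """Signed X delta from a 3-byte PS/2 mouse packet."""
    dx = pkt[1]
    return dx - 256 if dx > 127 else dx

class DialTracker:
    """Accumulates deltas and emits -1/0/+1 station steps."""
    def __init__(self, ticks_per_station):
        self.ticks = ticks_per_station
        self.acc = 0

    def feed(self, dx):
        self.acc += dx
        if self.acc >= self.ticks:
            self.acc -= self.ticks
            return 1
        if self.acc <= -self.ticks:
            self.acc += self.ticks
            return -1
        return 0

def run():
    tracker = DialTracker(TICKS_PER_STATION)
    with open("/dev/input/mice", "rb") as mice:
        while True:
            step = tracker.feed(parse_dx(mice.read(3)))
            if step:
                print("next station" if step > 0 else "previous station")
```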
I also have a newer Pi 3 running Stratux for receiving ADS-B traffic and weather on my iPad while flying.
Aside from Stratux there are definitely cheaper/easier solutions for what I've set up but nothing beats the 'free' hardware collecting dust in the bin.
I have an old Nexus7 tablet mounted on the wall (3D printed wall mount which incorporates a wireless charging coil) which hits the web UI of NodeRED running on a Pi to provide a 'home control panel'.
There's a pi with a heap of relays which controls things like my garden lights, sprinklers, motorised curtains (3D printed adapters to convert standard curtain tracks into motorised ones), etc, and then I have a number of Orvibo S20 WiFi power sockets to switch things like lamps on and off.
I have a pi mounted behind the front entrance panel acting as a doorbell, also connected to a camera module which triggers push notifications etc when motion is detected. This also ties back into the main 'automation hub' via NodeRED.
I also built an aquarium monitoring/control system on another Pi, with ambient and water temperature probes, a bunch of relays turning filters, lights, heaters, pumps, etc on and off, and an IR-emitter to send the relevant commands to my lighting fixtures to control the colour temperature and intensity (providing a 'sunrise'/'sunset' effect). Again, this all ties back to the main system using NodeRED.
Unrelated to the home automation, I also use a Pi to manage and run my 3D printer using OctoPrint, and to run my media centre using LibreELEC/Kodi/Emby (with my media stored on my main workstation and served via Emby).
I know it's Kickstarter but it will be shipping very soon and you can already download the files to cut your own if you're into that.
(Disclaimer: it's my project!)
Unless you're doing computationally intensive tasks, I find a Raspberry Pi 3 is overkill. If you go with cheaper models like one of the low-end Orange Pis, you don't mind dedicating them to projects, even if the project is pretty useless.
I'll be honest: it's a lot of fun, but if I lived 100 lifetimes, it would never save me time on balance. ;)
I also use one to run stratux as another poster mentions. That one saved ~$650 vs buying the COTS solution.
http://bemasher.net/rtlamr/signal.html has information on how it works, but you don't need to do/know anything about the protocol to get it to work.
With an LED strip, some carpentry, an Arduino, and an RPi, I've brightened up my deck a little bit. The RPi is there to program the Arduino while embedded and to host a web interface to control the lights. Still to do: get Homebridge (which is already working on the RPi) to turn the lights on and off using Siri.
Potato quality photo of the very advanced system I came up with for keeping all the components together: https://tootcatapril2017.s3-us-west-2.amazonaws.com/media_at...
Hope this helps!
If you're using BlueZ and Pulseaudio, it comes down to a couple of simple config entries (google for Linux Bluetooth A2DP Sink).
I used a RPi 2 with a USB Dongle for the Bluetooth though, so I can't vouch for the internal Bluetooth of the RPi 3.
Personally, I plan to use it as a traffic camera mounted on the window of our office.
Here's a grainy video: https://youtu.be/sXVZhv_Xi0I
Here's the code: https://github.com/nick264/music-processor-master
Actual code didn't rely on rPi (Elixir/OTP on Linux). But we shipped them on Pis. Other options considered had been Galileo and also an SoC called (I think) Quark (also from Intel).
It started as a for-fun project and I'm now working full time on it. So I guess it qualifies :-)
If you want to display any kind of information on your Pi, you might take a look. The code that "runs" the display is written in Lua and the system is pretty programmer friendly. You can even 'git push' and deploy directly on any number of screens. Questions welcome!
Once they issued a ticket for 80 in a 35 to a driver at 5:00am, and then the next day the driver did the same thing! He didn't expect them to be there two days in a row.
Hardware-wise, it's just a Pi and UnicornHat. I wanted to use off-the-shelf components since it's in an office environment with rather strict rules about what can be plugged into the wall.
It's not strictly a hardware project, but it's a crucial building block for any network-enabled Raspberry Pi project, and we'd love your feedback.
1. I have a Python script that sends Bluetooth LE commands to my ceiling fan & lights. The ceiling fan has a Bluetooth LE remote you can get from Lowes. Then I used ha-bridge to simulate a Philips Hue, so now I can control the lights & fan using Alexa.
Built it to turn the lights on/off with my newborn daughter in hand.
And since about 5 months of age, every time anyone says the word Alexa she looks around to see the change in environment.
2. Run this useful open source project called LittleSleeper to detect my baby daughter crying at night.
3. Configured an IP camera to upload pictures on activity, and use BerryNet to detect what's in the images.
The repository below contains code and instructions on how to setup the Raspberry Pi device to report temperature/humidity data along with manual alerts to the server: https://github.com/ankurp/thermostat-sensor
The server code where data is received and saved, notifications are sent, and the entire system configured via the admin portal is here: https://github.com/ankurp/thermostat
Another is a quadrupedal robot (more like a puppet to start with; autonomy would come after I've got the gait control code working). Control would be through a bluetooth game controller. I've got a laser-cut acrylic body for the thing and a servo control hat to deal with timing jitter.
Third, I've got a Pi-Zero and a broken PSP. 4.3" Backup camera screens are the right size, shape, and resolution to fit in the PSP case, they can be modified to take 5V instead of 12, and the Pi-Zero has 2 contact pins for the analog video out. I'd need to experiment with audio out; I've got a couple ideas.
So I am going to try to monitor the water levels in my house.
What I bought to do that: an ESP8266 board (it's like a WiFi Arduino) and an ultrasonic distance sensor.
My plan is to point the sensor vertically towards the water. It emits sound and then measures the time it takes for that sound to bounce back. I think I can use that to measure the level of the water.
Then the results will be sent (using MQTT) to my Raspberry Pi that is running Home Assistant.
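The level calculation is just the echo time and the speed of sound. A sketch in Python, with the publish side using paho-mqtt (the mounting height and topic name are my assumptions, and on the ESP8266 itself this would be MicroPython with umqtt instead):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20°C

# Hypothetical mounting height: sensor face to tank bottom, in metres.
SENSOR_HEIGHT_M = 1.5

def echo_to_level(echo_s, sensor_height_m=SENSOR_HEIGHT_M):
    """Water level in metres from the round-trip echo time in seconds."""
    distance_to_surface = SPEED_OF_SOUND * echo_s / 2  # halve the round trip
    return sensor_height_m - distance_to_surface

def publish_level(level_m, host="raspberrypi.local"):
    """Send one reading to the Home Assistant broker over MQTT."""
    import paho.mqtt.publish as publish  # deferred: pip install paho-mqtt
    publish.single("home/water/level", payload=f"{level_m:.3f}", hostname=host)
```

Home Assistant can then pick the value up with an MQTT sensor pointed at the same topic.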
1) Internet radio w/ an amplifier in a cigar box. It was a gift for my gf and only plays the station she listened to in college. https://github.com/rocktronica/curpi
2) Timed camera and GIF maker for my cat feeder. https://github.com/rocktronica/feedergif
3) OctoPrint server for my 3D printer http://octoprint.org/
I built a "Kitchen Dashboard" last year: https://gavinr.com/2016/01/10/raspberry-pi-kitchen-dashboard...
And of course you have to build a RetroPie: https://retropie.org.uk/
Other things on Pi in my house: OpenVPN server (http://www.pivpn.io/) and Node-RED (https://nodered.org/) for collecting temperature data and pushing to Power BI.
I use Octopi to control my 3d printer: https://octopi.octoprint.org/
And I've started experimenting with different sensors using Golang and embd: https://github.com/kidoman/embd
Leaderboard - http://www.teabot.co.uk/index.html
Some pictures: https://github.com/Hylian/PiHUD
Making an SNES emulator in an HDMI-dongle form factor with wireless controllers.
SNES on your main TV system, switch TV inputs, play Super Mario Kart. No hookups, no wires.
It can also stream the video back to my computer so people on the stream can see where the car is going.
Fish code is python: https://github.com/djmips/trout
I'm also in the process of building a "magic mirror" which will have some home automation and Google assistant built in.