It's slow haha. Faster on the Pi Zero 2 with bookworm. I think it's 20-30 seconds.
I use systemd to run main.py; let me time it real quick.
Edit: I was off, it's 40 seconds when the intro animation starts playing.
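For anyone wanting to replicate the setup, a minimal systemd unit for launching a script like this at boot looks roughly like the following; the paths, unit name, and user are my assumptions, not the actual config:

```ini
# Hypothetical /etc/systemd/system/camera.service; paths are made up.
[Unit]
Description=Camera UI (main.py)
After=multi-user.target

[Service]
ExecStart=/usr/bin/python3 /home/pi/camera/main.py
Restart=on-failure
User=pi

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now camera.service`; `systemd-analyze` can then break down where those 40 seconds of boot time go.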
This is why the camera spends most of its time in the home-screen state: the live pass-through only plays when you're ready to take a photo, or while recording a video (so you can change focus/aperture while filming). That also conserves power, since the camera has its highest current draw while recording or showing a live preview.
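That power-saving design can be sketched as a small state machine; the state names and the rule of only enabling the sensor pipeline in preview/recording are my assumptions, not the actual main.py:

```python
# Hypothetical sketch of the camera's power-saving state machine.
# State names are assumptions; the idea is that the sensor preview
# (the biggest current draw) only runs in PREVIEW or RECORDING.
from enum import Enum, auto

class CamState(Enum):
    HOME = auto()       # idle menu screen, sensor preview off
    PREVIEW = auto()    # live pass-through before taking a photo
    RECORDING = auto()  # live view stays on so focus/aperture can be adjusted

def preview_enabled(state: CamState) -> bool:
    """The sensor pipeline only runs in the two live states."""
    return state in (CamState.PREVIEW, CamState.RECORDING)
```

Keeping the default state in HOME means the battery only pays for the image pipeline while you're actually framing a shot.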
I would advise against running Alpine on a Raspberry Pi (Zero).
Alpine is great for servers or Docker containers, but once you start installing it on barely supported hardware you'll notice the pain.
I have seen some people work on a Kodak sensor, and Cine Pi's latest work with the IMX585 sensor is really amazing (4K/8K upscaled). The IMX585 is expensive, though, compared to the HQ Cam's IMX477. I'm using the Pi Zero 2 here, so I can't do 4K, I believe (data-lane limit or something, plus writing to the SD card).
Thanks for the link to the Kodak sensor - that's awesome, because those sensors share a fairly common interface with the other Kodak sensors, like the 48x36mm medium-format 22MP ones... It would be pretty easy to adapt this project to make your own CCD medium-format camera!
Breaking Taps had a video not long ago covering his efforts to design and fab his own image sensor, though it didn't work and it wouldn't have been anywhere near 1MP even if it had.
Learning FPGAs was an odd experience for me. I started out thinking they were hard to program. Then, once I got halfway decent at simple things, I realized just how powerful they can be. So many things run faster, if not instantaneously, if you optimize for an FPGA. It's incredible. And with modern cheap low-end boards, I hope they get more popular in projects.
FPGAs are particularly amazing for this sort of project where you would otherwise need a custom ASIC that could never be even remotely economical to build (see my 16-core Z80 laptop as another excellent example: http://www.chrisfenton.com/the-zedripper-part-1/). It lets you play 'fantasy computer architect.'
I get that. I think it took me a full semester in uni, and a summer internship, and maybe a few more months after that. It's hard to really break through. But once I did, it got a lot easier.
It's on my list of things to learn/own (Orange Crab). I just haven't had a specific use yet. I know they're used for video out/cameras, for example. One day... so many things to learn, never enough time.
The Orange Crab looks nifty. My recommendation is to start very, very simple. I'm not sure how this board integrates with the tools, with its DFU mode and all that, but hopefully it allows the same quick iteration as JTAG. Next time you feel like using a microcontroller, try using an FPGA instead.
Thanks. I listen to EmbeddedFM and I always hear about JTAG but have never used it. Interesting thought about substituting an FPGA for a microcontroller. I guess you'd have to know how to use one in order to do something trivial like blinking an LED or moving a servo.
I don't have a hardware background, but I'd like to work in hardware eventually (probably won't, though, since I have no degree, so I do it on my own time).
These are my hardware/robotics/vision/navigation-related projects. They're pretty noobish but fun to work on.
Unfortunately lately I haven't had as much time as I'd like to work on this stuff.
If anyone is interested in taking a look and offering architecture/code-review type thoughts, I'd be open to them.
[1] autofocus camera with speech intent, intended later for tracking
[2] visual depth probe SLAM from crude panorama/"color segmentation"
[3] single-point SLAM insect-style quad
Mid-level full-stack developer, 4-5 years of experience.
I've primarily worked in web-related tech, and I'm trying to get into hardware over time (currently RPi/Arduino). I have various projects on my GitHub/Hackaday.
I made this [1] 6 years ago; it reads the HN top 10 articles and top comment every morning... this thing has been sitting on my shelf since then, playing every day lol. Which is amazing, because it's a janky breadboard build that I never moved to a solderable board.
Anyway, regarding summarization... I looked at RapidAPI before; they have a summarizer on there that you can plug into Polly. It seemed decent, but I wanted my own summary based on my own reading process... never got around to it.
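For anyone curious, the fetch-and-read loop can be sketched against the official HN Firebase API, with Polly as the text-to-speech step; `build_briefing` and the voice choice are hypothetical names of mine, not the actual code behind [1]:

```python
# Rough sketch of an HN "morning briefing" reader: pull the top stories
# from the public Firebase API, compose spoken text, then hand it to a
# TTS service. build_briefing() is a hypothetical helper.
import json
import urllib.request

HN = "https://hacker-news.firebaseio.com/v0"

def fetch_json(url: str):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)

def top_stories(n: int = 10):
    """Top n story items (network call)."""
    ids = fetch_json(f"{HN}/topstories.json")[:n]
    return [fetch_json(f"{HN}/item/{i}.json") for i in ids]

def build_briefing(stories) -> str:
    """Compose the text that gets sent to text-to-speech."""
    lines = [f"Story {i}: {s['title']}." for i, s in enumerate(stories, 1)]
    return "Good morning. " + " ".join(lines)

# To actually speak it with Polly, something like:
#   import boto3
#   polly = boto3.client("polly")
#   audio = polly.synthesize_speech(Text=build_briefing(top_stories()),
#                                   OutputFormat="mp3",
#                                   VoiceId="Joanna")["AudioStream"]
```

Swapping the title-only briefing for a summarizer would just change what `build_briefing` emits; the Polly side stays the same.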
I use NoteSelf for this, which is an extension of TiddlyWiki using an in-browser PouchDB back-end. This can then be synced to a CouchDB instance, which can be self-hosted, which is what I do.
It's perfect for my needs; I still need to work out setting up separate databases for separate note purposes (keeping personal and work separate). I find TiddlyWiki / NoteSelf's structure perfect for the various streams required to manage projects.
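Since CouchDB replications are per-database, keeping personal and work notes apart mostly means one database (and one replication) per stream. A sketch of the JSON you'd POST to CouchDB's `/_replicate` endpoint for one such stream; the hosts and database names here are made up:

```json
{
  "source": "http://localhost:5984/notes-work",
  "target": "https://couch.example.com/notes-work",
  "continuous": true
}
```

On the browser side, PouchDB can sync each local database to its matching remote one, so the personal and work wikis never share a replication path.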