jcun4128's comments

amazing, would be funny, liter? no nuclear submarine

Here is a write-up on the topics covered by these camera iterations:

https://medium.com/@jdc-cunningham/making-a-user-interface-f...

A lot of pictures and a full menu map

And MS Paint wiring diagrams:

(1st camera) https://github.com/jdc-cunningham/pi-zero-hq-cam/tree/master...

(2nd camera orange) https://github.com/jdc-cunningham/modular-pi-cam/blob/master...


Nice work! What's the boot time like? I didn't see it in your Medium page.

It's slow, haha. It's faster on the Pi Zero 2 with Bookworm; I think it's 20-30 seconds.

I use systemd to run main.py; let me time it real quick.

Edit: I was off; it's 40 seconds until the intro animation starts playing.

This is why the camera spends most of its time in the home-screen state; the live pass-through only runs when you're ready to take a photo or while recording a video (so you can change focus/aperture while filming). That also conserves power, since current draw is highest while recording or showing a live preview.
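
For reference, the systemd setup mentioned above boils down to a small unit file; the paths and names below are assumptions for illustration, not the exact unit from the repo:

    [Unit]
    Description=Pi camera UI (main.py)
    After=multi-user.target

    [Service]
    # assumed install path; point this at wherever main.py actually lives
    ExecStart=/usr/bin/python3 /home/pi/camera/main.py
    WorkingDirectory=/home/pi/camera
    Restart=on-failure
    User=pi

    [Install]
    WantedBy=multi-user.target

Enable it once with systemctl enable and it starts on every boot once the system reaches multi-user.target, which is why most of that 40 seconds is presumably the OS boot itself rather than the script.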


Try alpine to slim it down?

I would advise against running Alpine on a Raspberry Pi (Zero). Alpine is great for servers or Docker containers, but once you start installing it on barely supported hardware you'll notice the pain.

I will check it out; I'm using headless Bookworm atm.

I have seen some people work on a Kodak sensor, and CinePi's latest work with the IMX585 sensor is really amazing (4K/8K upscaled). The IMX585 sensor is expensive, though, compared to the HQ Cam's IMX477. I'm using the Pi Zero 2 here, so I don't believe I can do 4K (a data-lane limit or something, plus writing to the SD card); there's a rough capture sketch at the end of this comment.

Kodak sensor https://www.youtube.com/watch?v=Ma9FrN5COIo

Recent CinePi work https://youtu.be/tI7hIKG1v40?si=BUvOOGutQJDnv09q&t=177

I'm not affiliated with CinePi; I'm just amazed at what you can do when you know what you're doing, ha (e.g. color grading).
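
Going back to the resolution limit mentioned above, here's a rough picamera2 sketch of what 1080p capture on the HQ Cam (IMX477) looks like. This isn't the project's actual code; the encoder choice, bitrate, and filename are assumptions:

    # minimal 1080p H.264 recording with picamera2 on the HQ Cam (IMX477)
    import time

    from picamera2 import Picamera2
    from picamera2.encoders import H264Encoder

    picam2 = Picamera2()
    # 1920x1080 main stream; 4K is out of reach on the Zero 2 setup described above
    video_config = picam2.create_video_configuration(main={"size": (1920, 1080)})
    picam2.configure(video_config)

    encoder = H264Encoder(bitrate=10_000_000)  # ~10 Mbps keeps SD card writes manageable
    picam2.start_recording(encoder, "clip.h264")
    time.sleep(5)  # record for five seconds
    picam2.stop_recording()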


CinePi didn't develop the hardware, that's Will Whang's (https://www.willwhang.dev/OneInchEye/) work.

Ah I did not know that, thanks

Thanks for the link to the Kodak sensor - that's awesome, because those sensors share a fairly common interface with the other Kodak sensors, like the 48x36mm medium-format 22MP ones... It would be pretty easy to adapt this project to make your own CCD medium-format camera!

Another good link

https://www.youtube.com/watch?v=6QQx0G5MR3k

I'm trying to find the one where a guy made a sensor stack from scratch; it wasn't great, like 1 MP, but still amazing.

This one is an open-source USB 3 camera, damn!

https://www.circuitvalley.com/2022/06/pensource-usb-c-indust...


Breaking Taps had a video not long ago covering his efforts to design and fab his own image sensor, though it didn't work and it wouldn't have been anywhere near 1MP even if it had.

I watched that, it was amazing; those kinds of creators, like Applied Science, are on another level.

The round displays are so cool

Uses an FPGA lol damn that's hardcore


FPGAs were an odd learning experience for me. I started out thinking they were hard to program. Then once I got halfway decent at simple things, I realized just how powerful they can be. So many things run faster, if not instantaneously, if you optimize for an FPGA. It's incredible. And with modern cheap low-end boards, I hope they get more popular in projects.

FPGAs are particularly amazing for this sort of project where you would otherwise need a custom ASIC that could never be even remotely economical to build (see my 16-core Z80 laptop as another excellent example: http://www.chrisfenton.com/the-zedripper-part-1/). It lets you play 'fantasy computer architect.'

FPGAs are the only thing where I haven't seen any progress in my understanding, even after hours and hours of trying to get anywhere.

I get that. I think it took me a full semester in uni, plus a summer internship, and maybe a few more months after that. It's hard to really break through. But once I did, it got a lot easier.

It is on my list of things to learn, and to get one (Orange Crab). I just haven't had a specific use yet. I know they're used for video out/cameras, for example. One day... so many things to learn, so little time.

The Orange Crab looks nifty. My recommendation is to start very, very simple. I'm not sure how this board integrates with the tools, with its DFU mode and all that, but hopefully it allows the same quick iteration as JTAG. The next time you feel like using a microcontroller, try using an FPGA instead.

Thanks. I listen to EmbeddedFM and I always hear about JTAG but have not used it. Interesting thought about substituting an FPGA for a microcontroller. I guess you'd have to know how to use one in order to do something trivial like blinking an LED or moving a servo.

Location: KS, USA

Remote: preferred

Willing to relocate: no (can't afford to)

Technologies: (web) React/Node/Flavors of JS/MySQL/Postgres/Linux/AWS/PHP/Python/Rails

Résumé/CV: https://github.com/jdc-cunningham/jdc-cunningham/blob/master...

Email: my username at big G

I have about 5 years of experience, mostly in the web application space. I tinker with hardware too and would like to go that route. I have no degrees.


I don't have a hardware background but would like to work in hardware eventually (probably won't happen, though, with no degree, so I do it on my own time).

These are my hardware/robotics/vision/navigation-related projects. They are pretty noobish but fun to work on.

Unfortunately lately I haven't had as much time as I'd like to work on this stuff.

If anyone is interested in taking a look, as far as providing architecture/code-review-type thoughts, I'd be open to them.

[1] Autofocus camera with speech intent, intended later for tracking
[2] Visual depth-probe SLAM from a crude panorama / "color segmentation"
[3] Single-point SLAM, insect-style quad

[1] https://github.com/jdc-cunningham/ml-hat-cam
[2] https://github.com/jdc-cunningham/pi-zero-2-robot-navigation...
[3] https://github.com/jdc-cunningham/twerk-lidar-robot/tree/dev


Location: KS, USA

Remote: preferred

Willing to relocate: no (can't afford to)

Technologies: (web) React/Node/Flavors of JS/MySQL/Postgres/Linux/AWS/PHP/Python

Résumé/CV: https://cunninghamwebdd.com/resumes/Jacob-David-C-Cunningham...

Email: my username at big G

Mid-level full-stack developer, 4-5 years of experience

I've primarily worked in web-related tech; I'm trying to get into hardware over time (currently RPi/Arduino). I have various projects on my GitHub/Hackaday.

Note: my website above was built 7 years ago


I made this [1] 6 years ago; it reads the HN top 10 articles and the top comment every morning... this thing has been sitting on my shelf since then, playing every day, lol. Which is amazing, because it's a janky breadboard build that I never moved to a solderable board.

Anyway, regarding summaries... I looked at RapidAPI before; they have a summarizer on there that you can plug into Polly. It seemed decent, but I wanted my own summary based on my own reading process... I never got around to it.

[1] https://github.com/jdc-cunningham/python_aws_polly_hacker_ne...
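
For anyone curious, the general shape of that pipeline is: pull the top stories from the HN Firebase API, then hand the text to Polly via boto3. A rough sketch; the voice, region, and output path are assumptions, not what the repo actually uses:

    # sketch: read the HN top 10 story titles aloud via AWS Polly
    import json
    import urllib.request

    import boto3  # assumes AWS credentials are already configured

    HN_API = "https://hacker-news.firebaseio.com/v0"

    def fetch_json(url):
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read())

    top_ids = fetch_json(f"{HN_API}/topstories.json")[:10]
    titles = [fetch_json(f"{HN_API}/item/{item_id}.json")["title"] for item_id in top_ids]

    polly = boto3.client("polly", region_name="us-east-1")
    speech = polly.synthesize_speech(
        Text="Top Hacker News stories: " + ". ".join(titles),
        OutputFormat="mp3",
        VoiceId="Joanna",  # assumed voice
    )

    with open("hn_top10.mp3", "wb") as out_file:
        out_file.write(speech["AudioStream"].read())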


Do you use this each morning? What’s it like to do that? What other features or changes have you considered for this project but not added?


It's just running on a cron job at 8 AM... it's like an alarm.

The droning voice of AWS Polly is sure to wake anyone up.

A feature I thought about is putting the generated files somewhere else, like the cloud, where you could play them on your wireless earbuds.
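
The schedule itself would just be a one-line crontab entry along these lines (the script path is an assumption):

    # m h dom mon dow  command
    0 8 * * * /usr/bin/python3 /home/pi/hn-reader/main.py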


"webpad paste" it's a rest-url giant text pad I use to keep notes across devices

I have an actual app too that's also a note-taking app (I use it every day); it's a basic React/Electron thing with an Express backend.

my goal is centralizing my own info
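
The pad idea is small enough to sketch. Something like the following, shown in Python/Flask purely for illustration (not the author's actual implementation), with a flat file standing in for real storage:

    # sketch of a "giant text pad behind a URL" endpoint
    from pathlib import Path

    from flask import Flask, request

    app = Flask(__name__)
    PAD = Path("pad.txt")  # single shared note file

    @app.get("/pad")
    def read_pad():
        # return whatever was last saved (empty string if nothing yet)
        return PAD.read_text() if PAD.exists() else ""

    @app.post("/pad")
    def write_pad():
        # overwrite the pad with the raw request body
        PAD.write_text(request.get_data(as_text=True))
        return "saved"

    if __name__ == "__main__":
        app.run(port=5000)

GET returns the current text and POST overwrites it, which is the whole "notes across devices" trick as long as every device can reach the URL.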


I use noteself for this, which is an extension of tiddlywiki using an in-browser pouchdb back-end. This can then be synced to a couchdb instance, which can be self-hosted; that's what I do.

It's perfect for my needs; I just need to work out setting up separate databases for separate note purposes (keeping personal and work separate). I find tiddlywiki/noteself's structure well suited to the various streams required for managing projects.


Idk, just doing a single run while watching htop, it barely notices the requests... I just want to have concrete numbers.

Anyway, I'll look around and write up what I end up doing, if I can still respond to this thread by then.

