> It then set the name of a channel to the results (either 1-person-in-upl or X-people-in-upl), which others could check.
I'm not suggesting you don't do this, but you /could/ set up a speaker to play the classic remix of Steve Ballmer's "developers! developers!" whenever there are >=2 people in the room. On April 1st, of course.
I swear to God that all of these CS labs at different unis look the same. I'm getting flashbacks to labs in Toronto that looked exactly like the pictures in the post.
The physics computer lab in Chamberlin Hall at UW in the 90's was a secret treasure trove of idle NeXTstation Turbo machines in an almost always empty room cooled to near refrigeration temperatures. I used to light up at least half of that room to run distributed simulations. There's probably still a 30 year old key to that lab in a junk drawer somewhere.
Eventually I realized that it just made sense to suck it up and get my own hardware, as it was either going to be esoteric "workstation" hardware with a fifth of the horsepower of a Pentium 75 or it was going to be in a room like the UPL jammed with CRTs and the smell of warm Josta.
How do students operate these days? Unless one is interacting with hardware, I'd be very tempted to stay in "fits on a laptop" space or slide to "screw it, cloud instances" scale. Anyone with contact in the last 5 years have a sense of how labs are being used now?
It's been nearly a decade now, but we shared a machine with 128 newish physical cores, a terabyte of RAM, and a lot of fast disk. Anyone with a big job just coordinated with the 1-2 other people who might need it at that level and left 10% of the RAM and disk for everyone else (OS scheduling handled the CPU sharing, though we rarely had real conflicts).
It's firmly in "not a laptop" scale, and for anything that fit it was much faster than all the modern cloud garbage.
The other lab I was in around that time just collected machines indefinitely and allocated subsets of them for a few months at a time (the usual amount of time a heavily optimized program would take to finish in that field) to any Ph.D. with a reasonable project. They all used the same in-house software for job management and whatnot, with nice abstractions (as nice as you can get in C) for distributed half-sparse half-dense half-whatever linear algebra. You again only had to share between a few people, and a few hundred decent machines per person was solidly better than whatever you could do in the cloud for the same grant money.
> Unless one is interacting with hardware, I'd be very tempted to stay in "fits on a laptop" space or slide to "screw it, cloud instances" scale. Anyone with contact in the last 5 years have a sense of how labs are being used now?
In my recent physics experience, this is basically what it was, unless you had to rely on some proprietary software only available on the lab machines, like *shudders* LabVIEW.
At my university you could technically use any computer, but you had to make sure your code would work/compile on the lab PCs, because that's where the TAs would check it. As a result, most people just used the computers there during labs (too much hassle otherwise).
I went through community college about 6 years ago. And they still had bona fide computer labs with in-person tech support.
Computers were also ubiquitous in places like the coffeehouse, the library, practically every classroom, etc. And, of course, there were ubiquitous WiFi and USB charging ports, so that students with BYOD could get by (although WiFi was often overloaded and contentious.)
Within the main computer lab I was using, there was also a networking hardware lab, with genuine Cisco equipment such as routers and switches. The Cisco certification prep classes would go in there and do experiments on the hardware, so that students could get accustomed to seeing it in action, however outdated it may be.
The lab itself was chock-a-block with both Apples and Windows PCs, as well as scanners and printers available, and even headphones you could borrow from the desk attendant. You'd need to sign in and sign out. There were strict rules about silence and not leaving your station unattended. There was always space for more users and a generally relaxed atmosphere, where people could feel comfortable studying or doing homework.
I believe that there was also an A/V lab where students could get access to cameras and recording equipment, as well as software for that kind of thing.
The library, in addition to allocating lots of space for Windows PCs and Apples, would also loan out Chromebooks to any student, and I believe they had other things for loan, such as WiFi hotspots, for kids who couldn't afford to carry around their own Internet.
There were also Tutoring Centers, such as the Math one, where most of the desks featured a computer where you could log in to your collegiate account, and access your online course materials.
And the Testing Center was essentially a big computer lab, with cameras and in-person proctors monitoring it. It was partnered with Pearson and CompTIA, so I took more than one certification exam in there.
There is a fully-staffed IT Help Center on campus, so during office hours, you could count on a 1:1 in-person interaction to help you get logged in, debug your device's WiFi, or whatever.
Despite having a great computer setup in the comfort of my own home, and plenty of online courses on my schedule, I still appreciated the immersion of collegiate computer labs, and especially the relaxed coffeehouse access, where I could use Apple systems to work on my English homework and essays.
During the COVID-19 pandemic, all this went topsy-turvy, and a lot of these labs closed down, or took extreme health precautions, and of course, a lot more classes went online-only. But I was done with classes by that time.
I can only speak for the UPL, but, yeah, it was a hallmark of labs at the time that one of the benefits you were getting was the equipment. Nowadays, most people just come in with their laptops -- we have a kubernetes cluster for projects, but most of the actual computing equipment is brought in by students when they want to hang
I had a script running on login to `finger stacy`. I was at SDSU on a very large SunOS system, and she was at a private school; I assume that computer was a bit more limited.
`finger stacy` would run every minute and typically be running for 15 minutes max... that is, until I moved into the dorms and my machine was online all the time.
A few weeks go by and I get an email from the SDSU admin requesting that I stop fingering stacy, as it was bothering the other sysadmin. I remarked with a grin that all I was trying to do was, in fact, `name of the command`, and they promptly deleted the script from the account. It still makes me smile as I write this.
Back in the man.ac.uk of the early 90s, there were no cameras or YOLO models but we still wanted to know when machines (especially the colour ones! LUXURY!) were available.
We just had "`rlogin` to every machine in the lab, run `who`, and collate the output". IIRC there was an early version written by 'flup that I extended with a tidier output (including an X11 window), auto-refreshing, and easier machine selection (eg. you could select rooms by name with regex filtering.)
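For a modern take on that collation idea, here's a minimal Python sketch: shell out to each machine, run `who`, and merge the results. The hostnames and the use of `ssh` (rather than the original `rlogin`) are assumptions for illustration.

```python
import subprocess


def parse_who(output: str) -> list[str]:
    """Return the unique usernames from one machine's `who` output."""
    users = set()
    for line in output.splitlines():
        fields = line.split()
        if fields:
            users.add(fields[0])  # first field of `who` is the username
    return sorted(users)


def collate(outputs: dict[str, str]) -> dict[str, list[str]]:
    """Map each machine name to the users logged in on it."""
    return {host: parse_who(text) for host, text in outputs.items()}


def poll(hosts: list[str]) -> dict[str, list[str]]:
    """Run `who` on each host over ssh and collate the results."""
    outputs = {}
    for host in hosts:
        result = subprocess.run(
            ["ssh", host, "who"], capture_output=True, text=True, timeout=10
        )
        outputs[host] = result.stdout
    return collate(outputs)
```

An empty result for a host means the machine is free, which is all the original script really needed to surface.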
Another solution in the age of portable electronic devices: detect presence based on active DHCP leases. You can then let people register themselves by MAC on some web portal, so they can be visible by name if they're comfortable with that.
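A rough sketch of that lease-counting idea in Python, assuming a dnsmasq-style lease file (`<expiry> <mac> <ip> <hostname> <client-id>`); other DHCP servers use different formats, and the registration table here is a made-up example of the opt-in portal data.

```python
import time

# Hypothetical opt-in registry from the web portal: MAC -> display name.
REGISTERED = {"aa:bb:cc:dd:ee:ff": "alice"}


def parse_leases(text: str, now: float) -> list[dict]:
    """Parse dnsmasq-style lease lines, keeping only unexpired leases."""
    leases = []
    for line in text.splitlines():
        fields = line.split()
        if len(fields) < 4:
            continue
        expiry, mac, ip = fields[0], fields[1], fields[2]
        if float(expiry) > now:  # lease is still active
            leases.append({"mac": mac, "ip": ip, "name": REGISTERED.get(mac)})
    return leases


def presence_summary(text: str, now: float) -> dict:
    """Count active devices and list the names of opted-in people."""
    leases = parse_leases(text, now)
    return {
        "count": len(leases),
        "names": sorted(l["name"] for l in leases if l["name"]),
    }
```

In practice you'd call `presence_summary(open("/var/lib/misc/dnsmasq.leases").read(), time.time())` on a timer and publish the result.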
Or Bluetooth. There are some problems with doing real tracking via BLE (like MAC randomization), but for a rough estimate it's a good technique.
And you can do it with just an ESP32.
The idea is to scan every minute for BLE MAC addresses and send them to a backend.
On the backend you can identify the manufacturer of each MAC address and do some correlation.
A very simple idea is to filter down to just the phone MAC addresses and use them as a presence-count estimator; it's unlikely that a university student will have more than one phone.
A better method would correlate the non-random MAC addresses (headphones, BLE trackers, wearables, etc.) that are frequently seen together and mark them as a single person.
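The backend side of both ideas can be sketched in a few lines of Python. The OUI table below is a tiny hypothetical sample (a real backend would load the IEEE registry), and the co-occurrence threshold is an arbitrary illustration.

```python
from collections import Counter
from itertools import combinations

# Illustrative only: OUI (first three MAC bytes) -> manufacturer.
PHONE_OUIS = {"a4:5e:60": "Apple", "f8:8f:ca": "Google"}


def oui(mac: str) -> str:
    """The manufacturer prefix: first three bytes of the MAC, lowercased."""
    return mac.lower()[:8]


def count_phones(scanned_macs: list[str]) -> int:
    """Estimate presence by counting distinct phone MACs: one phone ~ one person."""
    phones = {m.lower() for m in scanned_macs if oui(m) in PHONE_OUIS}
    return len(phones)


def co_occurring(scans: list[set[str]], threshold: int = 3) -> list[tuple[str, str]]:
    """Pairs of MACs seen together in at least `threshold` scans: a crude way
    to merge one person's phone, headphones, and wearable into one count."""
    counts = Counter()
    for scan in scans:
        for pair in combinations(sorted(scan), 2):
            counts[pair] += 1
    return [pair for pair, n in counts.items() if n >= threshold]
```

The ESP32 would only need to ship raw `(timestamp, mac, rssi)` tuples; everything above runs server-side.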
Love the motivation! Reminds me of the first stuff I built back in my university days.
One thing this old cranky dev notices: the yml you post seems redundant. Either have 4 endpoints with no payload, or 4 different payloads with one endpoint (the endpoint itself can tell you what you need to do). However, from the Express script it looks like you arrived at this in the final solution anyway. Not sure if I missed something, though: is there a reason the API needs such a shape? Cheers!
Hi! Author here. The yml is from a configuration in Home Assistant. Whenever it received an event from the sensors, it would fire the corresponding POST request (which you have to specify ahead of time, hence the configuration being a little redundant). The main reason for switching to the Express setup is that I was having a few random people look at the code & send their own requests to fudge what the status appeared as. I could have added some security/authentication to the endpoints, but I liked the idea of having the frontends simply querying the site (or my proxy) instead of having to cache a POST request.
You can read and write HA entities via HTTP API so I think you could have done this with nothing but HA, but either way it's a fun project :)
One of my more fun HA projects was scraping my snow removal service's tractor tracking API and passing the result for the closest tractor into HA, so I can be notified (by voice with Alexa and/or push notification) when there's one nearby in case I need to move the car.
Do you mean outside of the API that I talk about in the article? If I exposed the HA API via the frontend, users would be able to just take the bearer token and do whatever they wanted with it (even deleting the entities used for the doors). That's why the express server is sitting in the middle to only expose the relevant information (proxying w/ the bearer token, which the end users have no access to)
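The pattern described here (a thin proxy that holds the bearer token and exposes only sanitized state) can be sketched in Python; the original is an Express server, and the URL, entity id, and token below are placeholders, not the author's actual setup.

```python
import json
import urllib.request

HA_URL = "http://homeassistant.local:8123"  # placeholder HA address
HA_TOKEN = "secret-long-lived-token"        # stays server-side, never sent to clients
DOOR_ENTITIES = ["binary_sensor.lab_door"]  # placeholder entity id


def sanitize(entity: dict) -> dict:
    """Strip the HA state object down to only what the frontend needs,
    so attributes, context, and anything sensitive never leave the proxy."""
    return {"entity_id": entity["entity_id"], "state": entity["state"]}


def fetch_door_states() -> list[dict]:
    """Query HA's REST API with the private token and return sanitized states."""
    states = []
    for entity_id in DOOR_ENTITIES:
        req = urllib.request.Request(
            f"{HA_URL}/api/states/{entity_id}",
            headers={"Authorization": f"Bearer {HA_TOKEN}"},
        )
        with urllib.request.urlopen(req) as resp:
            states.append(sanitize(json.load(resp)))
    return states
```

Because clients only ever see `sanitize`'s output, there's no token to steal and no way to write back to the entities.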
I have a stealth startup that designs privacy first solutions to share coarse data about what is inside of rooms without disclosing secrets. This is useful for many use cases, from detecting the number of people in a room to other motion sensing devices which are critical for health and human safety. Patents are pending.
That's great! It does feel like cameras generally collect more data than necessary for some purposes. I have a friend who works at Butlr, which uses thermal sensors to detect body heat (and avoid the privacy risks of cameras) - sounds like your startup is in a similar space.
I know you mentioned patents pending, but is there anything you can share about your approach?
"Room occupancy sensors" are a common product. Butlr apparently uses a low-rez IR camera, although they avoid calling it that. Passive infrared is common. Millimeter wave is available.[1]
Another state of the art people counting system would be ultra wideband. It can literally count the number of heartbeats in a room.
See https://m.youtube.com/watch?v=adiUegDxZEs
My first thought was to put some motion sensors inside the room, along with a sensor for the door (open/closed), as well as some sensor for counting the people walking through the door. Maybe even some co2 sensor to detect changes.
I built a similar system for my school’s CS club. I considered using a door sensor, but the eventual solution I settled on was a light sensor, because it’s almost always true for us that the lights are on iff the door is open.
This way, we don’t need to mount anything on the door, we just have a microcontroller plugged into one of the machines.
Our previous solution was a webcam that pointed to the lights that did a similar thing (implemented by someone before my time) but then it stopped working due to some driver issues, and I didn’t want to spend time investigating them.
This is neat. I worked on something similar in college, where we built an app to track how long the line was at a popular cafe. We set up a camera + Raspberry Pi in the cafe and tried to count the number of people. There was a lot of noise from overlapping people, random people walking in and out of line, etc. Cool seeing all the techniques and approaches you tried!
Very interesting and fun read! Thanks for the post. When I saw the public RESTful API endpoint, I immediately tried to make a request just to see the "NICE TRY!" message, haha. Then I kept on reading to see how you managed this issue.
When I was in the CS department at NMSU, the Computer Science building was open to the public 24 hours a day. I suppose that's not commonly the case nowadays.
That's a clever solution that would use less processing power. A camera pointed at the door provides the benefit of being useful for other purposes e.g. person detection / recognition and standard security camera purposes.
Room presence detection has been a long-standing challenge in the smart home / Home Assistant community. It's cool to see Home Assistant used and adapted for this use case.
What part of this would be illegal? It's just a zigbee door sensor. The only issue I could see is the college getting upset but if anything they'd just say 'take it down'
Object detection of people from images is probably what they're referring to. There may be some state laws that could be stretched to include that, but I'd say it's presumptively legal.
Generally, in the United States, you can, in fact, just record people. Legally speaking, that is, which doesn't make it a polite or cool thing to do. Necessarily.
If you're on someone else's property, they can of course set any number of rules, and trespass those who break those rules. But even there, recording people, if against the rules, is still not a crime. The crime is trespass, if this journalist we're speaking of sticks around after being trespassed off the property.
Public university labs are generally public as they're state property (in this particular case, UW Madison is a public state University). Further, recording video or pictures of people in public places is broadly legal in the US. There are only "presumption of privacy" restrictions which apply to places such as bathrooms and private property that is not visible from a public location (ex. a sidewalk).
This reasoning makes sense. Roads and parks aren’t public, as they are city property. “Public” is only when something has no legal owner, like the moon and stars, or love.