
Airbnb is mostly commercial listings that are basically unlicensed hotel rooms / apartments. I don't think people renting out their personal living spaces account for a big part of their revenue. It's just a story they use to get around regulations.


That's true now, but not when they first took off.


None of that existed when they first started. The pitch was to rent out your house while you're out of town.


But they only got big when the commercial "hotels" came on board and saw they could "work around" laws like that.


Yes... later. They couldn't have predicted it when they started. It sounded like a stupid idea at the time.


That's the point: their utility wasn't obvious until a new market showed up.


Salaries and rent, probably.


YC and an average seed round only give you enough money for a small team for 18 months.

Anyone who is half-decent is going to be expensive.


Is it common for landlords to invest in YC companies?


Hookers and blow.


1980 called and said they're now selling OnlyFans subscriptions and ketamine.


The nice thing about VAT is that everyone has to pay it, whether they pay income tax or not. Lots of people get their money income-tax-free (undeclared jobs, income from abroad, inheritance, etc.). With VAT, everyone has to pay.


ChatGPT does a surprisingly good job of estimating calories from a picture of your plate. Especially if you add details that are hard to tell from the picture.


I've done that for weight loss, so I focussed on calories only. That was pretty easy:

- While cooking, weigh every ingredient. I either take photos of the scale with my phone or write the weights on a sheet of paper.

- When cooking is done, weigh the finished food (easiest if you know the weight of your pots).

- When eating, weigh your portion.

After some time, you realise that you need to be precise for some things (oil, butter) but can just guess or ignore others (e.g. onions and miso have so few calories that you really don't need to weigh them).

If it's a dish like lasagna, you don't even need to weigh it at the end: just estimate what fraction of the dish your serving is.


Exactly this. You just weigh every ingredient. It doesn't matter if it's a sauce or what. If it's something premade (like tomato sauce) you use the calories on the packaging. If it's a raw ingredient you look it up.

I never bothered weighing the final result or portions; instead I always divvied up the finished dish into equal individual portions and divided the total calories by the number of portions. That works well if you freeze them.

Of course, all that calculation is a tremendous amount of work. I did it when I needed to lose weight and only kept it up for a couple of months. But it definitely "calibrated" my understanding of calories -- e.g. non-starchy veggies have barely any at all, while cheese, butter, and oil can easily double the calories in a dish.
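
To make the per-portion arithmetic concrete, here's a minimal sketch of the calculation described above (the ingredients, weights, and kcal/100 g figures are made-up examples, not a real recipe):

    # Sum label calories for each weighed ingredient, then divide by portions.
    ingredients = {                      # grams used, kcal per 100 g (from the label)
        "pasta (dry)":  (500, 360),
        "tomato sauce": (400, 45),
        "ground beef":  (300, 250),
        "cheese":       (150, 400),
        "olive oil":    (30, 884),
    }
    total_kcal = sum(g * kcal100 / 100 for g, kcal100 in ingredients.values())
    portions = 6                         # split the finished dish into equal portions
    print(f"total: {total_kcal:.0f} kcal, per portion: {total_kcal / portions:.0f} kcal")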


How do you calculate calories?


Keep in mind that I calculate just enough to achieve a caloric deficit, not to hit an exact number.

As for nutrients, I just rely on eating a varied diet with lots of whole foods.

I personally use MyFitnessPal, weigh the calorie-significant foods (e.g. protein, starches, fat-rich vegetables, and fatty sauces), and establish a rough estimate of the calories.

I try to keep the error an order of magnitude lower than my estimate. That's why I don't bother weighing leafy and "watery" vegetables (e.g. spinach, lettuce, or cucurbits). Also, I try to keep an eye on sauces like mayonnaise, but I usually relax about mustard (I dunno where you live, but mustard here tends to be low-fat by default).

That much error is easily burned off by the incidental movement we do during the day.


Some foods I know by heart, e.g. oil at 9 kcal/g, but mostly I just check the label. Every food in the EU has the calories per 100 g or per 100 ml on the label. If it's not packaged, I look it up on FDDB [1].

[1]: https://fddb.info/db/de/produktgruppen/produkt_verzeichnis/i...


https://cronometer.com is what nutritionists use.

It tracks not only calories, but also macros and micros.


I was so happy when HDMI caught on that the troubles with VGA ports in meeting rooms were finally a thing of the past.

But now I randomly get "HDCP not supported" messages when trying to make a presentation because... I have no idea why. It's just a giant fuck you from the recording industry.

I could download a torrent of any movie I want, so the tech is obviously not preventing piracy.

It's just making random things in life harder than they should be.


HDMI licensing is a pain in the ass. There are per-device charges simply for providing connectors, and the HDMI Forum refuses to let open source GPU drivers implement HDMI 2.0 or above.


> HDMI forum refuse to let open source GPU drivers implement

What? How can an entity "refuse" to let others implement something?

It seems to me that the HDMI forum does not have any say in what someone decides to implement.



The HDMI Forum most likely does have a say about a corporation's implementation, or about allowing the protected hardware to be used.

But that likely leaves space for specs and keys to be leaked, read out, reverse engineered or worked around at some point. Not by AMD themselves.


> I could download a torrent of any movie I want, so the tech is obviously not preventing piracy.

But you couldn't manufacture your own monitor/projector/media player without permission from and tribute to the HDMI lobby. Well, you could, but it would fail commercially due to incompatibility. In other words, DRM is an anti-competitive cartel.


> the troubles with VGA ports in meeting rooms

please elaborate

fwiw VGA is plug and play, but multi-monitor support in operating systems was indeed a PITA


In my experience, the cables and dongles were prone to loose connections. You had to fiddle with the plugs to make sure they had a proper connection.

Selecting the right resolution was also problematic. Sometimes the native resolution of the projector didn't work for some reason, leading to blurry images.

I remember one time there was a weird issue where only half the image was shown. Another time, the image showed up with wrong colors (not sure how that happened).

HDMI isn't all rosy either; poor cables also cause connection issues. I had one cable that only worked in one direction. That was very odd. But in my experience HDMI connections are way more reliable than VGA connections.

(Maybe projectors and laptops also became more reliable, can't say for sure)


HDMI is a pain in different ways, and these are just examples from my house. Keeping track of versions 1 through 2.2b has become a small chore. Perhaps it is time I burn it all down to claim insurance and start over.

As soon as you go past 1080p@60Hz, as you pointed out, you can't just grab any cable. I suffered a great deal from this moving to 4K screens. Sparkles, drops, and black screens are usually a connection problem. Some smarter device/driver combos will work around a bad connection by dropping colour information to fit into the available bandwidth, some won't.

I have one 4K display where HDMI 1 is, well, HDMI version 1. HDMI 2 (as in the second port) is HDMI version 2 and will actually display 4K@60Hz.

I have TVs that need fiddling to get the proper native resolution and framerate. Some need game or PC mode to disable overscan and show the whole image.

Currently on my desktop connected to a 4K TV, if I try to set a game to 1920x1080, the driver seems to pick something strange and I get no image at all. I'm not sure who to blame here.

I still have devices that won't do 4K@60Hz; they're limited to 30Hz. It's a device limitation, fine. A Raspberry Pi 4 will output 4K@60Hz, but not by default: you have to enable it in the firmware config.
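
For reference, if I remember right it's a single line in the boot config on the Pi 4 (the exact file path varies between Raspberry Pi OS releases, so check the current docs):

    # /boot/config.txt (or /boot/firmware/config.txt on newer Raspberry Pi OS)
    # 4Kp60 over HDMI0 is off by default because it increases power draw and heat
    hdmi_enable_4kp60=1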


I've resorted to putting tags on all my HDMI/DisplayPort cables with the version and anything special, because I was sick of trying to figure out whether the problem was the device or the cable being old.


VGA can show wrong colors when one of the pins isn't completely connected, which can happen if you're used to needing force to get it connected and you just jam it on and bend a pin. Some pins carry specific color channels; one time I bent a pin and blue completely disappeared from my monitor. Thought it was an electron gun going bad until I noticed the pin.


Check with the AV supplier for the venue and you will find that most conference projectors are intentionally lower native resolution than entertainment projectors. They are different types of hardware for different markets.

I like to prepare the presentation from the beginning on a laptop or monitor with the exact same native resolution the projector will have.


VGA/DB15 is not a hot-plug connection by default.

That part started with DVI.


For what it's worth, the second letter of the d-sub naming convention indicates the width of the shell. A DB15 would be excessively wide for the number of pins. The correct name for the classic three row VGA port is DE-15 and it uses the same width shell as the DE-9 often used for serial ports.

Note: old Macs used a wider, two-row DA-15 at one point.

The DE-15 is occasionally called an HD-15 and the correctness of that is widely debated on internet forums.


Interesting, thanks!


No, it started with DDC and has been in use since Win95, the first PnP OS.

https://en.wikipedia.org/wiki/Display_Data_Channel


Being PnP doesn't imply any hot-plugging capability.

DDC provides a digital means of exchanging data and letting the OS know what the monitor can do. It doesn't allow/enable hot-plugging.

Since the interface doesn't support hot-plugging by design, there's no standard way to detect a new VGA peripheral. However, manufacturers flexed the standard to try to enable hot-plug, but it doesn't work reliably, as we've seen for years.

Similarly, PS/2, SATA, and PCI are not hot-plug by default, even if they're PnP. PS/2 required standards bending, SATA had to wait for AHCI, and PCI had to wait for PCIe to gain hot-plugging support. To add to the list, IDE drives required special hardware, and RAM requires chipset and board support to be hot-pluggable. RAM has a myriad of ways of identifying itself, making it truly PnP out of the box.

So, being PnP doesn't mean anything, from a hot-plug perspective. They're very different things.


You have it all wrong.

VGA D-SUB actually is hot-plug. You can connect or disconnect a monitor or projector at any time with no risk of damage. SATA is also hot-plug for connect, but it requires firmware support for disconnect (safe eject, more precisely, because it will detect a forced disconnect). It won't support hot-plug if used in IDE compatibility mode, because IDE was not hot-plug.

PCI is also hot-plug, but not the desktop connector.

PS/2 never was hot-plug. It's a serial port with an interrupt assigned at boot if there's a device connected there. It's not possible to assign the resources after the system is booted.

I can't remember what Win95 could do, but I'm sure that Win98 had support for dual monitors - I used that a lot. I could turn on my second monitor at any time. That's because of PnP. Win 3.11 was not PnP and required a restart to make changes to the display configuration.

I'm not sure what you believe "hot-plug" means. Possibly you wanted it to auto-change the default output configuration when something was connected/disconnected? I was very happy it didn't do that! But it was short-lived. The auto-bullshit stuff was introduced by Radeon and Nvidia drivers, independent of the OS, and I absolutely hated it when the driver auto-reverted to 60Hz on my 120Hz Trinitron! Many 3rd party tools were written to fix that. I remember using RefreshLock.


You're mixing up two related concepts.

Software support for hotplug can be added or removed as desired. That's an OS feature. You could absolutely reconfigure the interrupts without rebooting, if you felt like it. But hot-plug support starts in hardware, as an attribute of the connector. Being able to safely make and break connections while the circuits are electrically "hot", without damaging the circuitry on either side.

Generally, this can be done two ways:

The first is by having circuitry that moves so little power, or moves it in such a way, that it can't be damaged by the connections being made or broken in random order. For example, plugging a light into an outlet. It doesn't matter if the line or neutral conductor makes contact first, since the light either receives power or it doesn't, and neither state is unsafe. (Don't touch the blades of the connector. That's another matter entirely.)

The second is by having a connector design where some circuits are guaranteed to be connected before others. This is typically called a "make first / break last" scheme. At its simplest, the metal shell of a D-sub connector is really really likely to make contact before any of the pins, and in practice is effectively a make-first. But all the other pins make contact in random order. Compare to something like the SATA power connector, where the grounds are longest, power pre-charge after that, and main power at the very end. This is unconditionally safe to plug and unplug while hot.

VGA is hotplug-safe in practice because while the connector isn't really designed for it, as long as ground makes first, the analog video signals aren't picky at all (they're capacitively coupled and have no DC component), and the DDC data lines have enough short-circuit protection to tolerate whatever. (Because the D-sub connector also isn't "scoop-proof" -- it's possible to touch the male pins with the shell of another connector during clumsy mating, all circuits have to tolerate shorts to ground.)

RS-232 by the way, which was designed for D-sub connectors, contains language in the spec requiring that all circuits be tolerant of indefinite shorts to any other pin or to ground. It doesn't have to function in that state, but it's not allowed to sustain damage.

PS/2 isn't hot-plug safe even if you preassigned the interrupt (or booted the machine with the keyboard connected and then unplugged and replugged it later), because the pins aren't sequenced, and the circuits aren't designed to tolerate random mating order. If the power and data lines connect before the ground, you can get a CMOS latchup situation in the controller silicon that can only be cleared by total power removal. In practice this was fairly rare because the ground usually made first, and before I understood about this, I only smoked 2 motherboards' PS/2 ports despite hundreds of hot-plugs of keyboards and mice.

The canonical example of a terrifyingly-hotplug-unsafe connector is the TRS phone plug and jack. They change order during the mating process. Some old guitar effects pedals used this connector for power, and you were virtually guaranteed to smoke a transistor if you hotplugged it. These connectors were meant for telephone signals (which can tolerate polarity reversal and indefinite shorts to ground, by design), and some idiot decided to put power over them.

Note that there are no drivers or interrupts being assigned to a guitar pedal. Software support is entirely unrelated to the electromechanical phenomenon of hot plugging.


Hotplug needs support at all levels to work. The connector is just one of them.

In the case of PS/2, it needs IRQ12 specifically, and it doesn't support shared IRQs like PCI does. If PS/2 is not plugged in at startup, IRQ12 is reassigned by the BIOS to PCI or ISA PnP cards, so no matter what the OS does, PS/2 can't work without a reboot.

> PS/2 isn't hot-plug safe even if you preassigned the interrupt (or booted the machine with the keyboard connected and then unplugged and replugged it later), because the pins aren't sequenced, and the circuits aren't designed to tolerate random mating order. If the power and data lines connect before the ground, you can get a CMOS latchup situation in the controller silicon that can only be cleared by total power removal.

It can also be a firmware bug or a momentary brown-out during the connector insertion that glitched the controller, which could happen even if the pins were properly sequenced.


Especially silly because the HDCP master key got leaked back in 2010.


I've read that there are HDMI splitters and other devices like that that incidentally also happen to strip HDCP. Maybe you can scrounge up one of these to carry?


Well, good thing that we are slowly moving everything into DP.

But it's a bad thing that it's so slow.


We're...not, though?

Sure, computer-based displays are supporting various DisplayPort standards more broadly all the time, but TV-based displays are still all-in on HDMI, and the #1 reason (well, OK, the #1 reason is "because that's how it's been", but the #2 reason) is because the big TV/movie companies demand HDCP—DRM on the cable.

I'd love to see a big dumb TV and a set-top box or game console with a DisplayPort cable connecting them, but I don't actually expect that to happen any time soon.


> I'd love to see a big dumb TV and a set-top box or game console with a DisplayPort cable connecting them, but I don't actually expect that to happen any time soon.

It's a shame the Alienware 55" OLED gaming monitor (with DisplayPort) seems to have been a one-off.


> I could download a torrent of any movie I want, so the tech is obviously not preventing piracy.

Could? Why don't you? Stop feeding this terrible industry doing everything it can to put the personal computing genie back in the bottle.


Rights holders are pretty good these days about notifying your ISP so they can send nastygrams threatening to terminate service. Usually there's something like a three-strikes policy.

So, safe torrenting involves either paying for a seedbox, or tunneling your client through a VPN.

I'm sure you know all this already, just putting this as a warning to passers-by.


Correct, but $0-5 a month is still cheaper and less effort than 10 different streaming services at $10+ each, with the added benefit of preventing ISP surveillance.


> It's just a giant fuck you from the recording industry.

I eagerly await the moment when AI folks will just buy a bill to abolish copyright and send the content industry packing to do something more useful than sitting on swaths of human culture and clipping coupons.


Nah, that won't happen. AI bros will just pay them peanuts for the material.

Just like this: https://mathstodon.xyz/@johncarlosbaez/113221679747517432

Spoiler: Academic publisher Taylor & Francis recently sold many of its authors’ works to Microsoft for $10 million, without asking or paying the authors — to train Microsoft’s large language models!


Yeah, theoretically, this battle should already have happened, the moment Disney realized there was mouse IP in the training sets of DALL-E, Stable Diffusion, etc., and people were using it to create unauthorized content.

In practice, they seemed too interested in using the technology themselves to care.

I predict IP law will just become fully hypocritical, with your protection as a creator and consumer depending on your status and connections.


> Yeah, theoretically, this battle should already have happened, the moment Disney realized [...]

The fact that it has not yet happened makes me very hopeful about the outcome. Basically, the content industry knows it's gonna lose and is just sitting really still to feed for as long as possible before the inevitable end.

> I predict IP law will just become fully hypocritical, with your protection as a creator and consumer depending on your status and connections.

That's exactly how it always worked, at least for as long as I'm alive.


IP law isn't hypocritical. It's doing what it was built to do[0]: centralize control of publishing in the hands of capital so that the state can then regulate speech through regulating those publishers.

You see, in England, publishing used to be a state monopoly, but it was extremely unpopular with authors, so Parliament dropped the law that established the monopoly. But they still wanted the control over speech that such a monopoly would provide. Publishers had a long habit of ripping off[1] authors, so this new censorship regime was sold as a way to bind publishers to authors. In other words, cede to the state control[2] over your speech and we'll mint you memberships to the new and upcoming capitalist class.

Copyright is often framed as a bargain, or social contract[3] between the public and authors: we agree to not copy this work for X years and you agree to make works without expectation of prepayment. The real social contract is between authors, publishers, and the state: you deliver our propaganda, and we treat authors' labor as a special kind of capital, which publishers are allowed to trade like stocks.

Like all social contracts, this deal has changed before and it is currently changing now. Publishers still have an interest in cutting authors out of the deal, and generative AI gives them cover to do so in the name of innovation.

[0] https://en.wikipedia.org/wiki/Statute_of_Anne

[1] Politically correct: "capturing the value stream of"

[2] The American version of this dropped the state censorship regime, but we still occasionally see attempts to wield copyright as a censorship tool. Most recently, someone tried to sell returning to 14-year copyright terms as a way to punish Disney for being too "woke".

[3] A gentleman's agreement, informally bargained for through the actions of many people, that has been codified as law and enforced through the power of the state.


In case you are considering Nikon as an alternative, their Webcam Utility might be free, but it doesn't work on the latest version of macOS.

There are 3rd party utilities (paid), but I had trouble with autofocus when I tried them.

I wish camera manufacturers put half as much effort into usability as smartphone companies do. Why does a camera need drivers to be recognized as a webcam at all? Why doesn't my 2000€ camera come with GPS and LTE built in? Why is the software still as crappy as in the '90s?


> Why doesn't my 2000€ camera come with GPS and LTE built in?

2 seconds later on HN: why does my 2000€ camera spy on me? If you want a smartphone, use one, and leave us be with our sane tools.


The problem is that right now, you need to install a Nikon Spyware app on your smartphone if you want geotagged photos.

If the camera had GPS built-in, you could have geo-tagged photos without needing spyware on your phone.

Geotagged photos are extremely useful; there's a reason they sell GPS dongles for cameras. Cameras really should have that built in (and I think the top-of-the-line models do).


Modern Sony cameras can be used as a webcam without any software. Just plug in, select USB streaming, and done.


Sonys are some of the better cameras for software, but that's a low bar. I love my Nikons for picture taking though.


Nikon has been killing it lately. Z8, Z9, Z6III, lots of cool lenses! If you're in the market for a full-frame body and "the holy trinity" (16-35, 24-70, 70-200 f/2.8), you can't go wrong with any of the major three. They're all very competitive.

(but I would go with the Sony because I like their designs the most, Nikon would be my 2nd choice)

Nikon users are missing out on a 16-35 2.8 though, but I'm sure Nikon is working on it.


I've only ever used Nikon starting with a D40 ~20 years ago. I regularly still use my D7100, but primarily use my Z5. The only time I'm ever envious of another camera system is when connection options like this come up. But then I go out and take pictures and I'm reminded why I'm probably Nikon for life :)


I'm running my full-frame Nikon DSLR as a webcam using a $15 HDMI-to-USB capture dongle - works great.


Same, I also got a "remote clicker" cable and modified the button to always stay pressed, so it does not switch off after MAX_TIME (camera model D3300).


Wow, you made me go check and Nikon still hasn't fixed the software. Supported OS:

macOS Ventura (version 13), macOS Monterey (version 12), macOS Big Sur (version 11)


It does work; there's just a bit more effort involved in setting it up.


Nikon's HDMI output works just fine on macOS.


Yes, but that requires extra hardware.

I wish Nikon would just include useful features like USB webcam mode out of the box.


Requiring "precise location" is probably necessary because it wants to use Bluetooth to discover speakers.

Any app that uses direct Bluetooth could theoretically get your precise location from a Bluetooth geotag, so Apple requires apps to get "precise location" permission before being allowed to use Bluetooth.


Not correct, not even close.


I checked the docs, and you are right, I mixed something up. Bluetooth does not need "precise location". Bluetooth has had its own permission dialog since iOS 13.

I confused it with the "Access Wi-Fi Information" entitlement. If an app wants to scan for nearby Wi-Fi networks, it needs the "precise location" permission [1], because the names of Wi-Fi networks can be used to determine precise location.

Maybe the Sonos app wants to scan for Wi-Fi networks?

[1]: https://developer.apple.com/documentation/networkextension/n...


This answers one of the arguments about why Sonos needs precise location: it needs to scan for Wi-Fi networks because the speakers connect to your Wi-Fi network, and Apple requires precise location permission to do that.

But saying that won't matter, because some people refuse to believe that some companies genuinely need to use it.


> need to use that

But they don't need to [use location services]. I only wanted the 3.5 mm line-in on the Sonos Play5, but was denied unless I jumped through their app "onboarding" routine.

It's essentially a loudspeaker with 3.5mm input. But you must complete the mandatory Sonos induction dance before the input signal is permitted the path to output. Like I said, never again.

It's a shame because I quite liked the sound of the Play5. It's bright, but the wide dispersion is good for certain types of listening at low to medium volume.


Then I think you may have misread the product line for Sonos and perhaps the core product just isn't for you. Sonos is about wireless speakers that can connect with other speakers in your home and can be controlled over your WiFi network, that is the heart of Sonos. Heck the Play 5 is called "Our most powerful wireless Hi-Fi speaker".

Line-in is an additional feature of the speaker; however, the entire ethos is that anyone in the home can control the speaker over Wi-Fi.


Sonos speakers set up their own Wi-Fi network that's used to configure them before they join the final network; doing that probably requires the "Access Wi-Fi Information" entitlement.


I guess that's why they ask you to move the speaker "close to the router" when setting up.

Honestly, if they had a checkbox saying "do not share my location with Sonos" it would have eased my anxiety. But they don't, they default everything to "we collect data".

Relying on an unread "privacy policy" as the pinky swear for protecting the data they force users to give up is bad privacy. You can't even delete your own Sonos account; the "delete account" button doesn't actually perform that function.


They seem to imitate the MrBeast format, but they are missing the storytelling, pacing, execution, and scale of MrBeast videos (and also the banter that makes MrBeast videos feel a bit like a sitcom episode).


Luke used to work there, so I think some of that style may bleed in, especially during the editing process. Appreciate you taking the time to read the article. I'd love to hear more about how you think we could improve in any of the areas you mentioned.

