This has been such a frustrating limitation of all the big AR platforms. For years, my company has wanted to build an AR app for a certain industrial use case that involves scanning QR codes. Neither Meta nor Apple allows it! We had to give up and do AR on an iPhone instead. Think about that - the iPhone has more powerful AR than the Apple Vision Pro for every developer except Apple.
It's important to note that this is only really relevant advice for a specific type of startup that is still trying to rapidly iterate to find product market fit.
Personally I find it incredibly annoying to work on, and with, products that were developed like this. There are so many half-baked features that technically "work" but are slow, buggy, or difficult to integrate with.
It's relevant if you need to rapidly iterate, period. The test is not whether you work at a start-up, but how well you understand your problem. I am currently doing a lot of automated design work with optimization over highly non-convex constraints. Good luck writing that without rapid iteration.
Fair point, but I have seen a lot of shitty software developed with this mantra, usually because the "iterate" part is forgotten in favor of the next "rapid" development. I agree that the quickest way to learn whether your solution is valid is to ship & experiment. But once you know the solution (which is sometimes not even that complicated), you should really take the time to produce a solid piece of software before moving on.
But I agree, my original comment is probably a little too critical. There are valid times to rapidly iterate and ship. When it turns into the _only_ way you ship software, I think it becomes a problem.
One of the very frequent challenges I have with optimizing code, is that unless the problem is very highly contained, first I have to look at this code that’s all over the place, both physically and emotionally, and just try to figure out wtf it’s actually trying to do. I can’t replace it unless I know the requirements. The intent.
Give me a piece of code that’s wrong but clear about it, and we’re good. I can get in and out and move on to the problem with the next-best tricky-to-value ratio.
Refactoring it to that point will always pay dividends. Even if it’s just for the next person trying to add more functionality in this area.
I don't think you necessarily need to optimize before you make the code "clean", but you should at least understand what it will take to make it performant. This is a stronger requirement than it sounds, since most programmers are quite bad at predicting what will perform well without profiling first.
So knock together a prototype and profile that to understand what the bottlenecks are in your program, then you have an informed baseline of how the program needs to fit together to function well.
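A minimal sketch of that first pass (toy hot loop, standard-library timers only; a real profiler like perf or cargo-flamegraph gives the fuller picture):

    use std::time::Instant;

    fn main() {
        // toy stand-in for the suspected hot path
        let data: Vec<f64> = (0..1_000_000).map(|i| i as f64).collect();

        let start = Instant::now();
        let sum: f64 = data.iter().map(|x| x * x).sum();
        println!("sum = {sum}, hot loop took {:?}", start.elapsed());
    }

Even a crude timing like this establishes the informed baseline before you commit to a structure.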
Maybe if you hand-roll the struct layout, but if you use something like flatbuffers I doubt you would see many more bugs - and flatbuffers will take care of endian swaps as necessary without you needing to think about it.
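For contrast, a minimal sketch of the hand-rolled path that flatbuffers spares you, with a hypothetical two-field header:

    fn read_header(buf: &[u8]) -> (u32, u16) {
        // every field must commit to a byte order explicitly; picking
        // the wrong one is a silent bug on the other-endian platform
        let magic = u32::from_le_bytes(buf[0..4].try_into().unwrap());
        let version = u16::from_le_bytes(buf[4..6].try_into().unwrap());
        (magic, version)
    }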
Would you accept $34k to move to Syria? I have no idea who this incentive is designed for. The (financial) opportunity cost alone totally swamps the incentive before we even consider the risk in living in a volatile dictatorship.
On a macro level, moving people from functional, high-productivity countries to failed states makes even less sense. Totally hapless policy from Sweden here.
> Those who fail to benefit the country in which they are guests
That's hardly more specific. In an attempt to steel-man - before you edited your comment (or left another one? hard to tell in the HN UI), you referred to criminals. Well, this incentive is pointed at all migrants, not just criminals. I struggle to understand how law-abiding immigrants don't benefit their host country, yet the Swedish government would like to see them leave as well.
Yes, I edited the comment as I decided that "criminals" was too narrow. It's really a simple evaluation of whether you extract more value than you produce - which is usually true of criminals but could be expanded to include any migrant who is receiving state benefits while not making efforts to contribute going forward. Now, obviously there are exceptions and caveats, but I posit there's a larger group of non-criminal migrants who are still a net drain on the society that's supporting them.
For reasons given in the comments, both players probably choose a mixed strategy at equilibrium. If someone actually managed to find / prove a mixed strategy equilibrium for this game right there in the interview, you probably couldn't go wrong hiring them on the spot.
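To make that concrete with a toy game - matching pennies, not the game from the interview - each player randomizes just enough to make the other indifferent:

    \text{Row wins on a match, column on a mismatch. Row plays } H \text{ with probability } p.
    \mathbb{E}[\text{col} \mid H] = (1-p) - p = 1 - 2p, \qquad
    \mathbb{E}[\text{col} \mid T] = p - (1-p) = 2p - 1.
    \text{Indifference: } 1 - 2p = 2p - 1 \;\Rightarrow\; p = \tfrac{1}{2},
    \text{ and by symmetry } q = \tfrac{1}{2}.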
This always makes me uncomfortable though. How would we tell the difference between rampant scamming and fudging numbers and an economy where we all pass around Monopoly money to do services for one another? I pay you to mow my lawn and you pay me to mow your lawn. Are we creating GDP?
Very true, and you quickly get into a very command-economy-style argument about what should be produced. Ultimately we have the system we have, and wasted or scammy products generally die eventually. Look at things like NFTs: it turns out they were an extremely brief blip, because people quickly saturated the ability of crypto early adopters to inflate values with their funny money. Some scams last longer, like Thomas Kinkade 'paintings', but trying to sort through the economic data to throw those out is just not possible.
> How would we tell the difference between rampant scamming and fudging numbers and an economy where we all pass around Monopoly money to do services for one another?
It will show up as a decrease in exports, because other countries (or societies or tribes or whatever you want to call them) will want less of what your country is selling.
Which then shows up as decreasing purchasing power for things that you do want from other countries (i.e. you getting poorer).
Luckily for the US, that does not seem to be the case given the resilience of the purchasing power of the USD.
HN does not have extensive server costs. Last I saw (~10 years ago) it was running off a single machine in a closet. Apparently they've since moved to two machines at a hosting company.
IMO the pendulum swung too far with Rust. The experience is better than C++, but the template generics system is not very powerful. They ostensibly made up for this with macros, but (a) they're annoying to write and (b) they're really annoying to read. For these reasons Rust is much safer than C++ but has difficulty providing fluent interfaces like Eigen. There are libraries that try, but AFAICT none match Eigen's ability to eliminate unnecessary allocation for subexpressions and equivalently performing Rust code looks much more imperative.
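To illustrate the "more imperative" claim with a hypothetical axpy kernel: where Eigen lets you write z = a*x + y and still compile down to a single fused pass, the allocation-free Rust version is the loop spelled out by hand:

    fn axpy(a: f64, x: &[f64], y: &[f64], out: &mut [f64]) {
        assert!(x.len() == y.len() && y.len() == out.len());
        // one pass, no intermediate buffer allocated for the `a*x` subexpression
        for i in 0..out.len() {
            out[i] = a * x[i] + y[i];
        }
    }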
Rust doesn't have a template system per se; C++'s templates are closer to C's macros. Rust has a typed generics system, which does impose additional limits but also means everything is compile-time checked in ways C++'s templates aren't.
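A toy illustration (not from any library): the bound below is checked where the generic is defined, while a C++ template body is only checked at each instantiation.

    use std::ops::Add;

    // only operations promised by the `Add` bound are allowed in the body;
    // writing, say, `a * b` here fails to compile no matter how it's instantiated
    fn sum_pair<T: Add<Output = T>>(a: T, b: T) -> T {
        a + b
    }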
I agree that Rust's macros are annoying. I think it was a mistake to invent an entirely different language and syntax for it. Of course Rust also has procedural macros, which are macros written in Rust. IMHO that's how they should all work. Secondary languages explode cognitive load.
I'm not attached to the word "template", I just wanted to clarify that they're not Java-style generics with type erasure. If you'd like me to use "monomorphizing generics" instead I'm game :)
Even procedural macros are annoying, though. You need to make a separate crate for them. You still need to write fully-qualified everything, even standard functions, for hygiene reasons. Proc macros that actually do, erm, macro operations and produce a lot of code cause Rust's language server to grind to a halt in my experience. You're effectively writing another language that happens to share a lexer with Rust (what's the problem with that? Well, if I'd known that I'd need another language to solve my problem I might not have chosen Rust...).
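A minimal sketch of the hygiene point, as the contents of a hypothetical separate proc-macro crate:

    use proc_macro::TokenStream;

    // must live in its own crate with `proc-macro = true` in Cargo.toml,
    // and emitted code spells out `::std::...` so it resolves no matter
    // what the call site has imported or shadowed
    #[proc_macro]
    pub fn say_hello(_input: TokenStream) -> TokenStream {
        "::std::println!(\"hello\");".parse().unwrap()
    }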
For all its warts, C++ with constexpr if and concepts makes macro-like programming much easier than dealing with Rust's two worlds of macros and special syntax.
If static reflection does indeed land in C++26, this experience will be even better.
If we account for the arbitrary-precision operations necessary for large N, I believe the memoized insane recursion is quadratic in space and time: Fib(N) has O(N) bits, so the N big-integer additions cost O(N) bit operations apiece, and the memo table stores N values of O(N) bits each. This is actually not as bad of a pessimization as I thought it would be over just using Binet's formula, but it's still a hit.
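A sketch of that accounting, assuming the num-bigint crate (the article's code may differ):

    use num_bigint::BigUint;
    use std::collections::HashMap;

    // fib(n) has O(n) bits, so the memo table holds n values of O(n) bits
    // each (O(n^2) space), and the n big-integer additions cost O(n)
    // bit operations apiece (O(n^2) time)
    fn fib(n: u64, memo: &mut HashMap<u64, BigUint>) -> BigUint {
        if n < 2 {
            return BigUint::from(n);
        }
        if let Some(v) = memo.get(&n) {
            return v.clone();
        }
        let v = fib(n - 1, memo) + fib(n - 2, memo);
        memo.insert(n, v.clone());
        v
    }

    fn main() {
        let mut memo = HashMap::new();
        println!("{}", fib(500, &mut memo));
    }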