I don't know the PDF.js library. Writing both the client- and server-side for a PDF annotation editor would have taken 60 hours, maybe more. Instead, a combination of Copilot, DeepSeek, Claude, and Gemini yielded a working prototype in under 6 hours:
> Writing both the client- and server-side for a PDF annotation editor would have taken 60 hours, maybe more.
How do you know? Seems to me you're making the exact same estimation mistake as the people in the study.
> Instead, a combination Copilot, DeepSeek, Claude, and Gemini yielded a working prototype in under 6 hours
Six hours for a prototype using four LLMs? That is not impressive; it sounds insane, like a tremendous mess that will take so long to dig out of the prototype stage it'll effectively require a rewrite.
And why are you comparing an LLM prototype to a finished product “by hand” (I surely hope you’re not suggesting such a prototype would take sixty hours)? That is disingenuous and skewing the numbers.
With most projects where innovation is a key requirement, the goal isn't to write textbook quality code, it's to prove your ideas work and quickly evolve the project.
Once you have an idea of how it's going to work, you can then choose to start over from scratch or continue on and clean up all the bits you skipped over.
Right now I'm in the innovation cycle, and having an AI that can pick up whole API path strategies and pivot them is incredibly amazing.
How many times have you used large APIs and seen the clear hands of different developers and competing URI strategies? With an AI, you just pivot.
Code quality and pen tests are critical, but they can come later.
> Code quality and pen tests are critical, but they can come later.
In my experience, no.
These kinds of shortcuts taken at the beginning of a project are why velocity drops sharply after a while. Because you're either spending time undoing all of it (unlikely to be allowed) or you're fighting through the code jungle trying to get some feature out.
It'd be great if some of these open source security initiatives could dial up the quality of reports. I've seen so many reports for totally unreachable code that get a CVE for causing a crash. Maintainers will argue that user input is filtered elsewhere and the "vuln" isn't real, but MITRE doesn't care.
> I've seen so many reports for totally unreachable code that get a CVE for causing a crash.
There have been a lot of cases where something once deemed "unreachable" eventually was reachable, sometimes years later, after a refactoring and now there was an issue.
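A toy sketch of how that happens (all names hypothetical): the only original caller filters input, so the guarded branch looks dead and a report about it gets dismissed; a later refactor adds a second entry point that forgets the filter, and the "unreachable" branch is suddenly live.

```python
def parse_size(n: int) -> int:
    # "Unreachable" guard: the only caller validates n >= 0,
    # so a report about this branch gets waved off as not a real vuln.
    if n < 0:
        raise ValueError("negative size")
    return n * 2

def handle_request(raw: str) -> int:
    n = int(raw)
    if n < 0:          # user input is "filtered elsewhere" -- here
        n = 0
    return parse_size(n)

# Years later, a refactor adds a second entry point that skips
# the filter, and the once-dead branch is now reachable:
def handle_batch(raws: list[str]) -> list[int]:
    return [parse_size(int(r)) for r in raws]  # no filtering
```

`handle_request("-5")` still returns `0`, but `handle_batch(["-1"])` now hits the previously "unreachable" error path.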
At what rate though? Is it worth burning out devs we as a community rely upon because maybe someday 0.000001% of these bugs might have real impact? I think we need to ask more of these "security researchers". Either provide a real world attack vector or start patching these bugs along with the reports.
I wouldn't bother to write a PoC because it's a waste of time; it's faster to fix the potential bug than to figure out what conditions are necessary to turn it into a vulnerability. I think we should all stop writing PoCs for bugs and spend that time on something more useful.
That's not easy though, especially not for large and old code bases. As an outsider doing occasional bugfixes when I spot issues in an open-source project, I don't have the time to dig into how exactly I need to set up my computer to even have a minimum viable build setup, adhere to each project's different code standards, deal with the bullshit called "Contributor License Agreement" and associated paperwork, or wrap my head around how this specific project does testing and pipelines.
What I can and will do, however, is write a bug ticket that says what I think the issue is and where my best guess at the cause lies, and provide either a reproduction or a bugfix patch. Dealing with the rest of the bureaucracy is not something I see as my responsibility.
IMHO, at least the foundations of what makes the Internet tick - the Linux kernel, but also stuff like SSL libraries, format parsers, virtualization tooling and the standard libraries and tools that come installed by default on Linux systems - should be funded by taxpayers. The EU budget for farm subsidies is about 40 billion euros a year - cut 1% off of it, so 400 million euros, and invest it into the core of open source software, and we'd get an untold amount of progress in return.
They should be funded by the companies using them. Do you believe any of the Fortune top 100 would be greatly impacted by funding libxml2? They probably all rely on it, one way or another.
The foundation of the internet is something that gets bigger and bigger every year. I understand the sentiment and the reasoning of declaring software a "public good", but it won't scale.
> They should be funded by the companies using them. Do you believe any of the Fortune top 100 would be greatly impacted by funding libxml2? They probably all rely on it, one way or another.
I agree in theory but it's impractical to achieve due to the coordination effort involved, hence using taxes as a proxy.
> The foundation of the internet is something that gets bigger and bigger every year. I understand the sentiment and the reasoning of declaring software a "public good", but it won't scale.
For a long time, a lot of foundational development was funded by the government. Of course it can scale - the problem is most people don't believe in capable government any more after 30-40 years of neoliberal tax cuts and utter incompetence (California HSR comes to mind). We used to be able to do great things funded purely by the government, usually via military funding: lasers, radar, microwaves, and a lot of RF technology generally; even the Internet itself originated from the military ARPANET. Or the federal highways. And that was just what the Americans did.
It shouldn't be, but it is to a huge degree. Oil companies, corn production, milk subsidies, road network growth, etc. are all bad business subsidies in the US for example.
Governments used to fund basic research all the time for decades to provide a common good. Governments fund education, universities, road infrastructure and other foundational stuff so that companies can work.
I mean, for context, in those countries Meta paid to set up these networks. They're not a government-enforced monopoly; you're more than welcome to start a competing network.
Countries like Mexico or Spain have adopted it as the default form of messaging. Just today I used it to chat with our lawn maintenance guy, our car washer, and someone who's repairing our espresso machine.
I could maybe try to convince friends and family to use another app but I won't be convincing an entire country.
Reminds me of the story the other day, "Meta found 'covertly tracking' Android users through Instagram and Facebook" with the STUN requests being sent from web pixels back to localhost Meta apps (FB/IG).
I just don't think anyone can be using Facebook/IG, especially the persistent mobile apps, while having any real concern about tracking.
Does FB still run their Tor onion service? That seemed to be the only possible way to use these products in the past without being subject to extreme tracking.
Network effects have most people stuck on at least one of them. If all your friends use instagram/fb/whatsapp to keep in touch / make plans, leaving the platform is akin to cutting ties with your community.
Which is why there is a role for gov in regulating privacy and mandating interop between platforms. Asking people to “just stop using them” isn’t a realistic ask.
I want to push back on this narrative - I got off facebook and now my friends just text me instead. A few of my friends also got off facebook. Sometimes I can't see a facebook event so I text a friend asking for details. It's fine.
In some countries it has become difficult to live without a WhatsApp account. I'm doing it, but it's a pain, since WhatsApp is used for everything that phone calls were once used for: scheduling appointments, keeping in contact with your kids' teachers, buying and selling goods, etc. The same numbers often won't pick up a call, or the phone will simply be turned off (since it's used just for WhatsApp).
Imagine living without a phone, or whatever is equally important in your area. Sure, it is possible, if you're at the right level of masochism.
Of the people who accumulated in my Facebook friends list over the years, the only ones I know who actively use Facebook still are almost entirely using it to have stupid political arguments with each other. It really has snowballed and bred derangement.
Facebook isn't the worst of it. WhatsApp is, in those areas where it is the de facto standard app for texting. This is not the case for Americans so they are mostly blissfully unaware of it, but just imagine literally not being able to text anyone.
I dumped Meta probably a decade ago, and anyone who wants to get in touch with me does so through e-mail.
But I still have two relatives stuck on FB Messenger. Even if I contact them via SMS, they still respond to my dormant account in FB Messenger, because Messenger is where all of their friends are. To them, it's the only messaging app, and they have no idea why sending messages to me doesn't work.
Besides that, pretty much everything “after school” is being arranged over Facebook, as well as community “blogs”, newsletters etc.
Facebook solves this problem extremely well. I still remember the “good old days” of poorly managed Wordpress sites, shared Google calendars, mailing lists, and texts, and I’m not particularly keen on going back to that.
The sad truth is that there is nothing on the market today that solves this problem in a combined package, and you can add discoverability to the mix. If you’re interested in X you can search for it on Facebook and 9/10 times you’ll find what you’re looking for, from menus for restaurants to opening hours. Yes, Google does this as well but somehow people (here) are more aware of the feature on Facebook.
I would prefer the good old days with wonky WordPress sites and mailing lists. It's true that most business owners moved to Facebook at some point, but the price is having all that content undiscoverable and inaccessible unless the user has a Facebook account.
Yeah, it's a tradeoff. I don't mean to be glib, but on one side we have a loneliness epidemic, mass misinformation campaigns, and centralized control, and on the other side we have better information about restaurants, easier after-school arrangements, and community blogs. I really don't mean to say that the benefits are not real benefits - they are! I just think their price is way too high.
I finally ripped the bandaid off with Instagram early this year. I can't say it's done wonders for my social life. Mental health has been a lot better though.
Yep, I've tried, but if I say, e.g., "let's use Matrix!", it ends up being the app they have just to talk to me, and most of what they say is "why can't you use the app everyone else uses". Most people's second choice isn't much better than a Meta app anyway (or is also Meta).
Don't get off Meta; leech off of it. Don't contribute any posts, comments, or behavioral signals. Use the web apps, in separate, private browsing containers (if able). Uninstall and eradicate all Meta apps from your devices.