aiiotnoodle's comments | Hacker News

The Postgres developer had disabled a bunch of optimizations in order to create a low noise floor on his test machine, which also increased the delay and made it easier to notice.

Normally it would have been much quicker, but it could potentially have been picked up downstream by other developers once the code became more mainstream.

He did an interview for Risky Business #743.
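
As an aside, here is a rough sketch of the kind of crude latency check that makes such a regression stand out on a quiet machine (my own illustration, not something from the interview; the target host and run count are placeholder assumptions):

    # Time a batch of failed ssh login attempts; on a low-noise machine a
    # consistent jump of a few hundred milliseconds in these numbers is
    # hard to miss.
    # Assumptions: an `ssh` client in PATH and an sshd listening on HOST.
    import subprocess
    import time

    HOST = "localhost"   # placeholder target
    RUNS = 20

    samples = []
    for _ in range(RUNS):
        start = time.perf_counter()
        # BatchMode avoids password prompts; the bogus user means we only
        # measure the connection and (failed) authentication path.
        subprocess.run(
            ["ssh", "-o", "BatchMode=yes", "-o", "ConnectTimeout=5",
             f"nonexistent-user@{HOST}", "exit"],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL,
        )
        samples.append(time.perf_counter() - start)

    print(f"min {min(samples)*1000:.1f} ms, "
          f"max {max(samples)*1000:.1f} ms, "
          f"mean {sum(samples)/len(samples)*1000:.1f} ms")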


Did AirHelp talk to Aviation ADR, do you know? We've just essentially lost based on what we provided, but still think we're owed compensation really. Just absolutely exhausted doing all the admin. Originally went with Resolver but they did nothing. I think Aviation ADR is our last course of action unless we actually sue them; do you know if you did that?


I don't know what ADR is. All I did was put all the relevant details/docs on AirHelp. They sent another email when the lawyer stepped in, and I got an email from the lawyer too, but that's about it.


A lot of this sounds like they were under-resourced and the business increasingly adopted new technology with no ongoing support for their IT infrastructure.

> These legacy systems will in many cases need to be migrated to new versions, substantially modified, or even rebuilt from the ground up, either because they are unsupported and therefore cannot be repurchased or restored, or because they simply will not operate on modern servers or with modern security controls.

> There is a clear lesson in ensuring the attack vector is reduced as much as possible by keeping infrastructure and applications current, with increased levels of lifecycle investment in technology infrastructure and security.

> Our reliance on legacy infrastructure is the primary contributor to the length of time that the Library will require to recover from the attack.

A lot of lines like the following also indicate to me that IT was increasingly involved in fighting fires and maintaining operational systems ("keeping the lights on") rather than deploying new infrastructure and automation, updating software, etc.

> Some of our older applications rely substantially on manual extract (...) which in a modern data management and reporting infrastructure would be encapsulated in secure, automated end-to end workflows.

Modern business is IT. I know that I am preaching to the choir, but this sounds a lot like their IT was seen as a cost.


> However, the first detected unauthorised access to our network was identified at the Terminal Services server. This terminal server had been installed in February 2020 to facilitate efficient access for trusted external partners and internal IT administrators, as a replacement for the previous remote access system, which had been assessed as being insufficiently secure. Remote usage expanded during the subsequent Covid-19 pandemic because of the greatly increased requirement for remote working and the range of IT projects being undertaken with third party support.

While I'm certain they are underfunded and overworked, this sounds like they had an internet-accessible terminal server. I'd like to imagine IT screaming that this is a bad idea, but a suit somewhere saying they needed easy access for partners. I can only imagine how insecure the solution they replaced with this one was.


The British Library is closer to academia than business. Their IT provider is a state-adjacent entity: https://en.wikipedia.org/wiki/Jisc .


I think it's part of a general trend where UK govt institutions have notoriously poor IT, usually consisting of semi-obsolete infrastructure, multiple legacy systems, sticking-plaster upgrades, one or two new state-of-the-art bits where budget is available, etc. Consider the NHS, the MOD, DVLA, etc.


I would be fully supportive of the GDS (https://www.gov.uk/government/organisations/government-digit...) taking on additional responsibilities and providing support and assistance to other government agencies. gov.uk is almost universally praised by the general public and tech people.


Agree, but they can't really do very much about the massive number of legacy systems in departments that can't or won't spend money to modernise. My favourite example to hate is the Driver and Vehicle Licensing Agency, which tracks different things in multiple systems and still requires snail-mail interactions (!!!) for some services, such as reclaiming a licence after a medical suspension (personal experience). To the DVLA, people like me are a pure cost, as are the systems that record my data.


Having experienced both the DVLA and (California) DMV, the DVLA feels miles ahead, like it's living in the future.

Things like finding out the status of a renewal involved finding a fax machine; everything but the most trivial renewal (say, renewing if you're on a work visa) seems to be done in person with handwritten paperwork; and the amount of busywork done by hand by the DMV agent is easy to blame for the impressive wait times: multiple hours even if you have an appointment.

My DVLA renewal was comparatively trivial; they could even use my passport for an updated ID photo. But maybe if you're not a UK citizen they also make you jump through weird hoops?

I'm not saying that the DVLA is good, just that it could be even worse.


> I'm not saying that the DVLA is good, just that it could be even worse

Some things they do reasonably well, yes. But edge cases like mine are the pits. Getting a licence back after a medical suspension involves the DVLA and the NHS posting physical letters to each other! It took 5 months for this purely administrative process to complete after I was medically fit. Grrrr. That is a long time to be denied the right to drive.


I've known people who have worked in IT in national museum settings, and from what I heard it sounded like a mix of traditional IT support: ensuring the lights stayed on, printers could print, emails and phones worked, and a very simple website stayed online.

Some aspects sounded quite interesting, but these weren't places pushing the envelope in any aspect of technology. I'm sure they were running outdated software and configurations on everything, but IT was closing their tickets and meeting their SLAs. And with no disrespect, these people weren't necessarily disruptors looking to shake up and modernize the museums' infrastructure and take it into the future either; they just did their job to the best of their ability and went home at the end of the day.

To generalize, I find that this usually holds true in a lot of non-tech industries, where IT is generally seen as a burdensome cost rather than an enabler of the business.


The British Library has pretty complex systems because of the vast size of the collection. Some pretty interesting stuff:

https://www.youtube.com/watch?v=ZNVuIU6UUiM


If you are referring to NCR self-checkout cameras, this is generally NCR SmartAssist and NCR ScanItAll, which are for shrink and employee theft rather than mass identification. The only store near me I know of with this implementation is ASDA.


The Overton window is moving. If they are not doing mass identification now, they will be in a couple of years.


> We each have a choice.

You really don't, though. You can either pay more or go to another supermarket with a similar scheme. Doesn't really seem cricket; am I really loyal for buying bread or cheese because I needed them and this supermarket is in my neighbourhood?


I don't know where you are in the world, but in the UK it is really common to have multiple supermarkets in spitting distance of each other. Most of them deliver if you're not in the neighbourhood. And not all of them offer loyalty schemes.


And users unable to correct the data themselves


Google lets you edit (or more accurately, suggest edits to) maps. It just appears to get overwritten by some process after a few days. I have fixed so many things only to have them snap back to what they were before. It's annoying and leads to situations where DoorDashers end up coming down an irrigation road, nearly crashing into the canal, because it's "faster" than using the road that goes around the field. I am shocked Google doesn't have MORE lawsuits on its hands.


Google's defense to a lawsuit would be simple: it makes no guarantee that Maps is accurate, and you use it at your own risk. If that defense ever proves insufficient, their obvious next step is to no longer make Maps available for free. Then we will see how many people are actually willing to pay for accuracy.

In the case of the irrigation road, since from another post of yours upthread it appears to be private property, it should be signed accordingly to make it clear to people that they are trespassing if they use it.


In some jurisdictions you can't just say "whoops" when it's public-safety related.


Unless the jurisdiction has some sort of contract with Google regarding the use of maps for public safety, I don't see how Google could be held responsible. Google does not make any guarantees about the accuracy of maps, and you use it at your own risk. That means the safety risk is on you, not Google.

Yes, it would be nice if Google would take steps to ensure the accuracy of maps and make guarantees accordingly, but it doesn't (and I don't see how it ever will unless and until maps becomes a paid application instead of a free one).


> That means the safety risk is on you, not Google.

There could be jurisdictions that disagree with this. The US delegates quite a lot to contract law, but say Germany (I’m not sure if they do, but it sounds like it could be a German thing) could definitely decide that the act of providing a map with navigational assistance is sufficient to make some legal guarantees about accuracy.


Then Google Maps becomes unavailable in your country, and I want to see how long it takes for it to be reinstated with a "sorry, oh Google".


If their competitors are brave, they'll stick around and take over.


It's illegal to block an irrigation road without prior approval, and people straight up ignore signs, especially when the authoritative source says "turn there NOW!" We know this because FedEx trucks have traveled down this "road" multiple times despite it being marked as private in several places along the way.

Also, Google is currently being sued for a situation exactly like this so we will know soon enough if that defense works: https://arstechnica.com/tech-policy/2023/09/lawsuit-says-man...


In their defense: I think FedEx drivers will often have to make deliveries on private grounds.


That's driving on what is known as the "curtilage": the space on a private property that is effectively "partially public". FedEx can drive up a driveway and use your walkway to deliver a package. That is okay. FedEx cannot jump your fence and enter your backyard to leave your package somewhere safe. The backyard is off limits. As such, an irrigation access road is NOT within the curtilage of any property and thus is more of a "backyard" to those not permitted. There are signs and gates which state that, yet FedEx directed its drivers down that road several times and they complied.


I imagine the 'curtilage' of many properties has quite a few 'private property' signs. If GPS directs a driver onto a road marked as such, the driver may easily assume that there is a delivery address on that private property, which would warrant ignoring the signs.


I have a strong suspicion that they're cross-referencing between multiple different data sources, including OSM, meaning that if OSM already has the correct changes, reporting issues tends to have more of an effect.

Basically, the last resort is to create a thread on the Google Community Guide forums; then actual staff will eventually address the issue.

But it is tedious and ridiculous how little recourse there is against Google spreading incorrect info.
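
As an illustration of that kind of cross-check (my own sketch, not something described in the thread), you can ask the public Overpass API what OSM currently records for roads in an area before deciding whether a report to Google is even worth filing. The bounding box and tag values below are placeholder assumptions:

    # Query the public Overpass API for highway ways in a small bounding box
    # and print their access-related tags, to confirm whether a fix (e.g.
    # access=private on an irrigation track) has already landed in OSM.
    import requests

    OVERPASS_URL = "https://overpass-api.de/api/interpreter"

    # Placeholder bounding box: (south, west, north, east).
    south, west, north, east = 37.50, -121.00, 37.51, -120.99

    query = f"""
    [out:json][timeout:25];
    way["highway"]({south},{west},{north},{east});
    out tags center;
    """

    response = requests.post(OVERPASS_URL, data={"data": query}, timeout=30)
    response.raise_for_status()

    for way in response.json().get("elements", []):
        tags = way.get("tags", {})
        print(way["id"], tags.get("highway"), tags.get("access"), tags.get("name"))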


Google knows best. You must take the most efficient route.


It is not the user's responsibility. Google Maps is now a marketing platform (try typing a hotel name and see how that goes). At this point, we should have an open-source, open map that lets people contribute and is actually free.


Like, say, OpenStreetMap?

https://www.openstreetmap.org/


The moment such a map has high enough usage, assholes will put in incorrect data maliciously. One of the major reasons for Google Maps errors is malicious editing.


OpenStreetMap has high enough usage that vandals enter incorrect data [0]. But such vandalism is generally corrected very quickly.

[0]: https://wiki.openstreetmap.org/wiki/Vandalism


I sent a correction once, and they did indeed fix it after a couple of weeks. I only knew it was wrong because I had studied the area in non-interactive maps ahead of time.


This could be solved by letting users pay a $20 deposit or something with their edit.


That might work for users who are affected (e.g. pizza deliveries never get to your house) but will kill off users providing fixes just to be helpful.


It was temporarily halted in 2020 but is now back in production.

According to this, anyway:

https://europe.autonews.com/automakers/vw-plans-restart-sale...

