It didn't use any fancy frameworks, just plain old CRUD in Java.
AFAIK it was one of the top five biggest library systems in the world at the time.
I was asked to add some features that would have been too difficult in the old distributed system. Things like reading competitions, recommended reading lists by age, etc…
I watched the effect of these changes — which took me mere days of effort to implement — and the combined result was that students read about a million additional books they would not have otherwise.
I've had an effect on the literacy of our state orders of magnitude greater than any individual educator, and hardly anyone in the department of education even knows my name!
This was the project that made me realise how huge the effort-to-effect ratio can be when computers are involved…
The part I added was built with ASP.NET 2.0 on top of Microsoft SQL Server 2005, and was eventually upgraded to 4.0 and 2008 respectively.
The only magic sauce was the use of SQLCLR to embed a few small snippets of C# code into the SQL Server database engine. This allowed the full-text indexing to be specialised for the high level data partitioning. Without this, searches would have taken up to ten seconds. With this custom search the p90 response time was about 15 milliseconds! I believe PostgreSQL is the only other popular database engine out there that allows this level of fine-tuned custom indexing.
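For anyone who hasn't seen SQLCLR: you compile a .NET assembly and register its methods so they can be called from T-SQL, letting hot-path logic run inside the database engine. A minimal sketch of the shape of such a function - the name and matching logic here are hypothetical, not the actual code:

    using System;
    using System.Data.SqlTypes;
    using Microsoft.SqlServer.Server;

    public partial class SearchFunctions
    {
        // Hypothetical SQLCLR scalar function. The real code specialised
        // full-text matching for rows already narrowed to a single library
        // partition; this only shows the mechanics of the technique.
        [SqlFunction(IsDeterministic = true, IsPrecise = true)]
        public static SqlBoolean ContainsTerm(SqlString title, SqlString term)
        {
            if (title.IsNull || term.IsNull) return SqlBoolean.False;
            return title.Value.IndexOf(term.Value,
                StringComparison.OrdinalIgnoreCase) >= 0;
        }
    }

The compiled assembly is registered with CREATE ASSEMBLY and exposed with CREATE FUNCTION ... EXTERNAL NAME, after which it can sit in a WHERE clause right next to the partitioning key.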
I know you can tune the hell out of search performance, but that seems a bit too insane for what looks like a relatively unspecialized setup (Standard DB).
One hiccup was that when the query cardinality estimator got confused, it would occasionally ignore the partition prefix and do a full scan somewhere, inflating the work by a factor of 2000! This would cause dramatic slowdowns at random, and then the DB engine would often cache the inefficient query plan, keeping things slow until it was rebooted.
This is a very deep rabbit hole to go down. For example, many large cloud vendors have an Iron Rule that relational databases must never be used, because they're concerned precisely about this issue occurring, except at a vastly greater scale.
I could have used actual database partitioning, but I discovered it had undesirable side-effects for some cross-library queries. However, for typical queries this would have "enforced" the use of the partitioning key, side-stepping the problem the cloud vendors have.
Modern versions of SQL Server have all sorts of features to correct or avoid inefficient query plans. E.g.: Query Store can track the "good" and "bad" version of each plan for a query and then after sufficient samples start enforcing the good one. That would have been useful back in 2007 but wasn't available, so I spent about a month doing the same thing but by hand.
it's okay sir, we now know you as jiggawatts
Nice work, but check your ego mate. Seems your growth hacking would have had zero result if those kids couldn't read to start with, so you could share some credit ;-)
I love Steve Jobs' metaphor: computers as a bicycle for the mind. Unfortunately, a lot of effort is concentrated on problems that scale to billions of people. There's a lack of attention to problems that would have a big effect for a relatively small number of people. It's a shame, because they're a blast to work on.
Designed and deployed credit card readers used in gas pumps back in 1979. (Sold to Gasboy)
Wrote a fine tuner to allow communication between satellites (precursor to TDRSS days). Still used to this day.
Failover of IP in ATM switches (VRRP, PXE, secondary DHCP, secondary DNS, secondary LDAP, secondary NFS). While not invented here, it's still a common setup to this day.
Printer drivers for big, big high-speed Xerox printers on BSD. Still used to this day by big, big high-speed printers.
Also, early IDS products (pre-Snort) at line-speed. Sold to Netscreen.
Easy zero-setup of DSL modems before BellCore decided to complicate things (thus exploding their field deployment budgets; Southwestern Bell/Qwest enjoyed our profitable zero-setup). Sold to Siemens.
1 Gbps IDS/IPS before selling it to 3Com/Hewlett-Packard.
Impact? It is more about personal pride but its impacts are still being felt today.
Hack the Planet
Have you made more than a typical SWE?
It is one of those traits where a mind clicks and says "this is it, and here's how", and surprisingly gets into the most elusive hyperfocus/high-energy mode (without using any drug).
Slow-path network processing (arguably mine) was first commercialized at Ascom Timeplex in 1982, and someone else leaked it to Cisco (or Cisco ripped off Ascom Timeplex's patent). I got the idea from observing how different river bends (re)connect year after year on trout fishing trips.
Money is not my thing but it does help greatly in the pursuit of my ideals (so much hardware, so much test equipment).
And a typical enterprise NIDS would not be able to see beyond those encrypted packets containing JS over 2-way-signed TLS/SSL, or HTTP/3 (QUIC) (or a few other E2E protocols).
Care to develop more on the potential attacks here?
But which side should assume responsibility for this effort to defang JS into something text-based? Client or server? Postel said "be liberal in what you accept and conservative in what you send". So, being conservative (in this respect), the server has to be minimalistic (including denial of programmability).
The real problem remains: too much programmability is being made available, and the client side takes it in ... in a gullible way.
And no amount of Sideshow Barker (not a dig on HN's Sideshow Barker) can fix this, until one of the MAANG decides "enough".
Meanwhile, the wild Wild West shall continue.
Personal non-code project: The first adult LEGO fan conference in 2000. While I got out of that business years ago, it has been replicated by dozens of other annual cons around the world. Back then the LEGO group didn't really understand and was very wary of adult fans. Now there's a whole reality TV show about them with LEGO designers as the judges, and LEGO actively supports cons and clubs.
Open source project: A project I released anonymously ~2010. Several github repos (unrelated to me) keep this project alive (the main one has ~600 stars and ~200 forks) and it's apparently used in several commercial products too.
My mom got into adult lego when she took apart my childhood lego and reassembled it to resell.
Now we mail each other sets that the other is done with, and it gives us a great opportunity to connect. We're both anxious people and there's something relaxing about just assembling something where everything has a place.
When she found out there's a lego con in my town, she made plans to come visit me so we can go together and I can show her around the city I just moved to.
My 3 and 6 year olds love lego kits. Historically I found myself sitting with them and helping when they got stuck or directing them when I saw they made a mistake. More recently I decided to pick up my own kit and build alongside them. I'm currently working on the Saturn V rocket. It's been a lot more fun for me and a way to bond with my kids.
>At work: the CDN for Megaupload. I was also the guy who had to shut it down when the FBI seized it.
>adult LEGO fan conference
Wow, what a small world. That's what I love about HN. The people that make things you use are on it :)
I wish I had something nearly as impressive. I just have open source stuff that people use. Nothing recognizable though.
Also, I have a project in production at work where a device needs to grab its public IP address. My code has a list of sites that provide that info and I have ip4.me as a fallback in that list, so thank you for building it!
Maybe worth reaching out to Mozilla. That's the only actual non-profit I can think of who I think would have both the ability and the incentive to keep it online.
Ability? 5M/day for "what's my ip" is not much, and I'd wager most of us on this site would be able to keep it up and alive just fine. As for incentive... in addition to the Mozilla Foundation, orgs like Calyx, NLNet, Quad9 come to mind.
I'm not getting any younger so it's really about survivability. Transferring to another individual HN'er probably wouldn't solve that.
If it doesn’t already exist anyway.
Datasette, Django, and Lanyrd.
> Locating elements by their class name is a widespread technique popularized by Simon Willison (http://simon.incutio.com) in 2003 and originally written by Andrew Hayward (http://www.mooncalf.me.uk)
The most impactful thing I've done outside of paid work is a website running on Django. I could live without querySelector or its descendants, but not without Django.
Thank you, Simon.
/* That revolting regular expression explained
/^(\w+)\[(\w+)([=~\|\^\$\*]?)=?"?([^\]"]*)"?\]$/
  \---/  \---/\-------------/    \-------/
    |      |        |               |
    |      |        |           The value
    |      |    ~,|,^,$,* or =
    |   Attribute
   Tag
*/
querySelectorAll would never have appeared without jQuery, which in turn got the idea from Simon's work.
And even then querySelectorAll was so poorly implemented that it didn't even have any useful helper methods.
Then thinking, I suppose you could do it by (exactly the method you used), but never actually doing it because if it were that simple, someone would have already done it.
Actually, seeing the date, I realize this predates me even leaving high-school, which makes it even more atrocious that I never knew of it!
I was a fairly fresh college-hire SDE1 at Amazon. And I was annoyed, because I'm lazy. Every time I was oncall, I had to manage the deployment pipeline for my team's software - the UI for the tool used by Pickers inside Amazon Warehouses. On Monday, deploy the latest changes to the China stack (small). On Tuesday, check if anything bad happened, and then deploy to the Japan stack (small-ish). On Wednesday, Europe (big). Thursday, North America (biggest). Repeat each week.
And I thought "why am I doing this? There are APIs for all of this stuff!". So I made an automated workflow that hooked into the pipeline system. You gave a metric to look for, a count of how many times the thing should have happened, and an alarm to monitor. If everything looks good, it approves. I hooked it up for my pipeline, and then it usually finished the entire weekly push before Tuesday afternoon. I made it in about 2 weekends on my own time.
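The internal tooling isn't public, so the following is only a rough sketch of that workflow's shape, with hypothetical interfaces standing in for the real pipeline/metrics/alarm APIs:

    using System;
    using System.Threading;

    // Hypothetical stand-ins for internal deployment tooling.
    interface IMetrics  { long Count(string metric, TimeSpan window); }
    interface IAlarms   { bool InAlarm(string alarm); }
    interface IPipeline { void Approve(string stage); void Reject(string stage, string reason); }

    class BakeTimeApprover
    {
        // Approve the stage only once `metric` has fired at least
        // `requiredCount` times while `alarm` stayed quiet the whole time.
        public static void Run(IMetrics metrics, IAlarms alarms, IPipeline pipeline,
            string stage, string metric, long requiredCount, string alarm)
        {
            while (true)
            {
                if (alarms.InAlarm(alarm))
                {
                    pipeline.Reject(stage, $"alarm {alarm} fired during bake");
                    return;
                }
                if (metrics.Count(metric, TimeSpan.FromHours(1)) >= requiredCount)
                {
                    pipeline.Approve(stage); // everything looks good: ship it
                    return;
                }
                Thread.Sleep(TimeSpan.FromMinutes(5)); // check again later
            }
        }
    }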
And I left it open for anyone in the company to configure for their own pipelines. A few weeks later I was checking if it was still operating normally and realized there were something like 50 teams using it. Then 100. Then a lot more.
The last I heard, it's considered a best practice for all teams within the company to use it on their pipelines. Before I left in 2021, it was running something like 10,000 approval workflows per day.
I named it after the BBQ/grilling meat thermometer in my kitchen drawer- "RediFork". Given the overlap of "people who read HN" and "devs who worked at Amazon", I probably saved someone reading this an aggregate hour or two of work.
Thank you for creating it!
E.g.: Stick a fork in it and see if it's done yet
From one "engineer whose irritation at inefficiency spawned a whole tool" to another (I got sick of staying up overnight to run load tests, so wrote myself an automation and monitoring tool - which got picked up, spun off to its own team, and now is used by >300 teams) - thank you!
> I made it in about 2 weekends on my own time.
Initially, ignoring the wisdom of the time that said OFDM was no good for indoor channels. The research project was eventually shut down due to lack of commercial interest, but the research leaders had enough faith to immediately start their own company (Radiata). Later, commercial success for Radiata came from being in the right place at the right time.
> Were there any close competitors?
In the research phase, not that I was aware of. In the commercial phase, Atheros. The story I was told after the event was that Cisco had decided to buy whichever company came to market first. Radiata came to market 2 weeks before Atheros and so Radiata was acquired.
> Was infrared close to being the winner?
It could have been, but specular reflection in IR channels causes inter-symbol interference, which limits the data rate. If someone could have solved that problem then IR might have happened instead of WiFi.
> I'm also surprised big enough FPGA was already around.
At the start of the project FPGAs were not big enough, so we had to partition across multiple 3000 series Xilinx parts. Bigger FPGAs had been released by the end of the project, so the transmitter fitted on a single XC4025 FPGA, using manual placement. The 4025s were brand new and Xilinx (as always) were difficult to deal with, so we had to beg for devices and they magnanimously granted us 3 or 4 chips.
At the time there wasn't much sense of occasion, as we were busy doing the work and none of us knew how big it would get.
At the time the collaboration with CSIRO worked quite well, as there were no business development types involved. In 1995 CSIRO was more concerned about science than IP. Since then they have become more money/IP focused. Maybe they got gold fever from the $1 billion in royalties they made from their WiFi patent?
There was a lot of talk about this in the news, and although the software I was working on didn't entirely fix the problem, it allowed the agencies to communicate better. Their data wasn't siloed, and families got separated for only a few days rather than (sometimes) permanently.
I really miss that job. The pay was atrocious and zero WLB, but everyone agreed it was an important problem to solve, and I think the tool we had built really was helping.
(Including Peter Eckersley https://en.wikipedia.org/wiki/Peter_Eckersley_(computer_scie... who passed away earlier this fall at just 43.)
During Hurricane Maria most of Puerto Rico was offline. Slowly but surely, some people started having access to some online services. To this day, I don't know how, but I saw frequent posts on social media (Facebook and others) from people saying they could access spotty internet, but SMS and phone calls weren't working, and asking people to let their family outside of Puerto Rico know that they were okay.
So I set up a site on glitch.com with a real simple two-field form: one for a phone number and another for a message to send. It was dead simple, no framework, no CSS, just little bits of vanilla HTML and JS, and a bit of backend code connected to Twilio. Some text at the top with instructions too. I intentionally made it small so that a spotty connection wouldn't have a problem using it.
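The real thing was a handful of lines of vanilla JS; as a rough illustration of the backend bit, here's the Twilio call sketched in C# with the official helper library (form handling omitted, credentials from the environment, numbers are placeholders):

    using System;
    using Twilio;
    using Twilio.Rest.Api.V2010.Account;
    using Twilio.Types;

    class SmsRelay
    {
        // Handles the two form fields: a destination number and a message.
        public static void Send(string toNumber, string text)
        {
            TwilioClient.Init(
                Environment.GetEnvironmentVariable("TWILIO_ACCOUNT_SID"),
                Environment.GetEnvironmentVariable("TWILIO_AUTH_TOKEN"));

            MessageResource.Create(
                to: new PhoneNumber(toNumber),
                from: new PhoneNumber("+15005550006"), // placeholder Twilio number
                body: text);
        }
    }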
Any time I saw someone posting on social media asking for someone to reach out to their family, I posted a link. I also shared it in a Slack where many from the PR diaspora were trying to contribute ways to help. Before I knew it, thousands of people were using it. I monitored it continuously to make sure nobody was abusing it and that it was being used as intended. It would have been EXTREMELY easy for someone to abuse it if they wanted to.
No one abused it. Thousands used it as it was intended. Left it up for weeks, and I kept monitoring it to make sure it wasn't being abused. I eventually saw it had stopped being used entirely for two weeks and spun it down.
I saw some people posting about it afterwards, thankful they were able to receive messages from their family, and I'm happy I rushed to write very sloppy, high-impact code.
Ivy sends you a text message introducing herself as a virtual concierge when you check in. She answers FAQs in 1 second using NLP and routes anything more complex to the front desk team for resolution in 2-3 minutes. All in one simple text thread, no apps or UI needed.
Guests often come to the front desk trying to tip Ivy, rave about her in reviews, ask her out on dates, and even drop off handwritten thank-you notes for her.
One woman texted Ivy in a panic asking about the nearest drug store to buy Benadryl because her son was having a severe allergic reaction. A guest service agent brought Benadryl to her door in 3 minutes at a large Las Vegas property. She called Ivy a life saver.
It wasn't a planned thing. I had recently gotten injured playing football, so I was stuck at home, unable to walk or drive. I started checking the #mono IRC channel (it was 2003, and internet was something you did over a 48k modem when your home phone line was not needed). Some guys, led by Miguel de Icaza, the founder of GNOME, were implementing a C# compiler and a .NET IL bytecode interpreter, and I was very curious about it. I kept downloading, compiling and trying things out.
Then one day Miguel wrote in the channel that it would be nice to have some graphical editor and that somebody could perhaps port SharpDevelop over to Linux by replacing Windows.Forms with calls to GTK. I said that I'd give it a shot and... well, 10 days later we had a working editor and half a dozen contributors.
Also, as a new parent, my immediate thought is of course "WHO wasn't watching the kid??"
> Various degrees of hypothermia may be deliberately induced in medicine for purposes of treatment of brain injury, or lowering metabolism so that total brain ischemia can be tolerated for a short time. Deep hypothermic circulatory arrest is a medical technique in which the brain is cooled as low as 10 °C, which allows the heart to be stopped and blood pressure to be lowered to zero, for the treatment of aneurysms and other circulatory problems that do not tolerate arterial pressure or blood flow. The time limit for this technique, as also for accidental arrest in ice water (which internal temperatures may drop to as low as 15 °C), is about one hour.
Also, you can't just warm the body back to 38 degrees; it has to be brought back up carefully, AFAIK.
Before post-play, you had to open the episode menu and click on the next episode to play it. We didn't want to do autoplay for a long time because we were afraid people would fall asleep with Netflix playing and it would break the internet. So we included the now infamous "Are you still there?" popup, shown a few minutes into episode 3 if there had been no interaction with the player.
Now it is everywhere - YouTube, Hulu, HBO, etc. And people watch way more TV than they should.
I guess when something just works, your users will assume the cases where it works properly are just the way things are, and that the cases where it does something they don't like are your fault.
So well done!
It prevents me from being able to see the credits! Sometimes I want to know who played what part!
I'm okay with an optional prompt that lets me skip the credits if I want to, but that should NEVER be the default!!!
Note: I work at YouTube :P
Hope you are doing well!
Why not just have a next episode button without auto playing the next episode? Make the autoplay optional and not the default.
https://www.flickr.com/photos/joshu/sets/72157600740166824 (credit to joshu)
The wordpress suggestion in the response is just A+.
I have always wondered if it could be scaled to a Google alternative. Ranking pages by how many people have bookmarked them seems like a good alternative to PageRank.
1. Was the intern that coded the mechanism to open/close the LIDAR cover on the Mars Phoenix Lander, so it runs on another planet. I also did circuit work, and other tasks for the CSA’s contribution to that mission. That was also the internship where I (re)met my wife.
2. Was on the Android team that brought video to Instagram back in 2013. We brought gyro stabilization to the iPhone, couldn’t quite get it running reliably on Android via the NDK, but I damned well tried.
3. Wrote the first Android app for Instacart.
4. Currently rolling out our new software platform to handle $15B/year revenue for Anheuser-Busch’s supply chain. We have 1000+ companies relying on us to ensure they can order and fulfill products.
Unsure what’s next, but it’ll likely be high impact and fun too.
Seriously, fun cv :)
That probably had more impact than the Binary Lambda Calculus language I designed or the logical rules of Go I co-formulated.
Computing the number of Go positions or approximating the number of Chess positions had little impact beyond satisfying my intellectual curiosity.
Scrypt is from 2009, per Wikipedia. That's memory hard, and using hashes with some zeroed out bits is a thing done for a long time (Bitcoin 2009; some old meaning of "cryptographic pepper" (fallen out of use) that iirc dates back to the 90s). Am I misunderstanding what you built?
The reason it makes a very poor PoW (as the choice of hash function in the Hashcash proof-of-work) is that the PoW verifier needs as much memory as the PoW prover, whereas a good PoW should be instantly verifiable.
This is why blockchains using scrypt as hash function severely limit the amount of memory used (usually to 128KB). So that verification, while slow, is not horribly slow.
Cuckoo Cycle also requires a configurable amount of memory to solve (subject to certain tradeoffs), but crucially, can be instantly verified with no memory use at all. And thus makes for a good PoW.
In the form of the Cuckatoo32 variant that most mining takes place with, it requires 0.5 GB of SRAM and 0.5 GB of DRAM to solve most efficiently.
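To make the verification asymmetry concrete, here's a toy Hashcash-style sketch of my own (not anyone's production code): solving grinds nonces, while verifying is a single hash call. Swap SHA-256 for scrypt and that one verification suddenly needs the prover's entire memory budget, which is exactly the problem described above.

    using System;
    using System.Security.Cryptography;
    using System.Text;

    class Hashcash
    {
        static readonly SHA256 Sha = SHA256.Create();

        static byte[] Hash(string data, long nonce) =>
            Sha.ComputeHash(Encoding.UTF8.GetBytes(data + ":" + nonce));

        // True if the first `bits` bits of the hash are zero (the difficulty target).
        static bool LeadingZeroBits(byte[] h, int bits)
        {
            for (int i = 0; i < bits; i++)
                if ((h[i / 8] & (0x80 >> (i % 8))) != 0) return false;
            return true;
        }

        // Prover: grind nonces until the target is met (expensive, ~2^bits hashes).
        public static long Solve(string data, int bits)
        {
            for (long nonce = 0; ; nonce++)
                if (LeadingZeroBits(Hash(data, nonce), bits)) return nonce;
        }

        // Verifier: exactly one hash, no matter how hard the solve was.
        public static bool Verify(string data, long nonce, int bits) =>
            LeadingZeroBits(Hash(data, nonce), bits);
    }

With a memory-hard function in place of SHA-256, Verify pays the same memory cost as each step of Solve, which is why scrypt-based chains cap the parameters so aggressively.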
Ooh, yes I see, that is a big difference. Cool work!
People I respected told me I was wasting my time because Internet Explorer was the de-facto standard and the idea of a new browser engine becoming prominent was fantasy.
Then Apple decided they wanted to do a browser and looked around at what open source engines were available that they could use as a starting point. Thus was born WebKit.
I consistently ignore anyone who tells me I shouldn't try something because it's "too hard" or "nobody will use it". Most of the time they turn out to be right. But not always.
Edit: Here's an interesting presentation by Lars Knoll and George Staikos on the history of the project: https://www.youtube.com/watch?v=Tldf1rT0Rn0
Inspiring. Thanks for your contribution!
Tiled rendering seems to be what all the major renderers use, but the layers of abstraction they use to get there are so dense they're unreadable without investing extensive amounts of time.
My goal was to help volunteers that were in the field in Nepal communicate in English -> Nepali and back. Even though this was somewhat effective, there was still a communication gap because most people in Nepal in remote parts could not even read in Nepali.
I looked around for solutions but couldn't find any Nepali Text To Speech solutions. The builder brain in me fired up and I decided to build a Nepali Text To Speech engine using some of the groundwork that was laid by Madan Puraskar Pustakalaya (Big Library in Nepal) which they had abandoned halfway.
I spent all night hacking away to build a web app that let the volunteers paste translated text and have it spoken. The result was https://nepalispeech.com/ and the first iteration of this was built in just 13-ish hours.
I hope the people that got affected by the earthquake are in a better situation now.
Nothing. I haven’t built anything with a significant impact. I’ve made things that made a significant impact on businesses, but in the scheme of things, nothing exciting.
The thing I made which generated the most revenue was easily the most harmful, and likely the most impactful. Unfortunately. It was an ad exchange that did extremely well. The owners went from random guys with a gross idea to multimillionaires in a couple years. They both spend their days buying up startups.
I should have done better by now. I feel like I need to make up for building that exchange. I was young and had no idea what I was getting into until it was too late.
I have a humongous list of failed stuff though, so much that when I look back I wonder why I couldn’t just stick with any given thing.
It still brings in some revenue but I have been intentionally neglecting it for years now, as I personally hate those things with a vengeance. But still, I don’t pull the plug on it.
It was among the first text-to-image models created independently. And it was fully open source.
It was also covered by The New York Times in Cade Metz's article about DALL-E 2.
- GitHub: https://github.com/borisdayma/dalle-mini
- Hugging Face Demo: https://huggingface.co/spaces/flax-community/dalle-mini
- NYT article: https://www.nytimes.com/2022/04/06/technology/openai-images-...
(I know this is not as impactful as others in this thread. But I did this less than 2 years after transitioning to tech from Physics, and at the age of 22.)
I'm also apparently the original inventor of the tracking cookie, which had the implication that no one was able to patent it. It was presented in a patent of mine that was about a collaborative filtering technique for recommending ads; I'd come up with the tracking cookie mechanism to support that technique. So, I didn't attempt to patent the tracking cookie separately; but because it was the first publication describing the method, no one else could patent it either. In 2021 a joint legal brief filed by Google and Twitter together, defending themselves against a patent troll, called it "Robinson's Cookie". My patent is owned by Google now. It contained a lot of details for giving users control of the data derived from tracking; that part was pretty much ignored by people implementing it.
While I deserve no credit for its current success, it's been used by millions to:
* catalogue millions of plants and animals around the world
* produce tagged image data that has become critical for training computer vision models
* map species ranges and the impact of various natural changes on biodiversity, with data cited in scientific journals
* discover new species through the app
previous HN thread - https://news.ycombinator.com/item?id=22442479
I've spent the last 6-7 years making autonomous aircraft that deliver medical supplies in various African countries. Probably a hundred thousand deliveries or so have been for emergency blood transfusions, typically for women who hemorrhaged during labor. So that's got to be quite a few lives/families saved!
It's not a perfect application, by any means. But the bar was _so_ low, that I can't help but think of how much we've helped users just over the last few years.
I hope it ships with Windows by default one day.
It was such a surreal moment to finally leave the office after months of crunch time, walk out into the sunshine for lunch for the first time and see almost every person on the street playing the game.
To be extra-clear, all code in the game was touched by more than one person, every one of them better engineers than I am.
I do want to say how amazing OSM was. There are SO MANY weird laws in different countries and OSM was a fantastic source of data in many of them. One example is South Korea - there were laws from decades ago that made it very difficult legally to have detailed maps of many parts of South Korea - the OSM maps there were far superior to anything else available.
That seems a million miles away from everyday agile and crud stuff...
Also, Ingress is all about controlling areas of the map, while PGO was mostly based on points of interest, so the architecture needed to be quite different. I'll go into more detail when I post the writeup.
I've since gotten a degree and written software for a handful of companies.
When I think of how many people are actually _using_ my software, though? Fourteen years later, the mug club software is still live in a production environment, used every day by wait staff who turn over every few months. No doubt hundreds - potentially thousands (it got deployed at a few different bars) - of people have interacted directly with it. That code embarrasses me nowadays, but as far as impact goes: that's probably it.
I've traveled North America photographing native bats. This was born from an obsession with documenting creatures that are not easily observed (this goes far beyond bats).
To accomplish the bat project, I built my own high-speed photo systems, designed specialty gear, and developed a process for capturing extremely detailed images of bats in flight. Others had done it before me, but never shared the technical process. So I had to build it myself. Then I collaborated with biologists and institutions around the country to learn about behavior and more. It was a hell of a journey.
I'm so proud of the project. This work is very hard to recreate these days because of a pandemic amongst bats (WNS) and humans (Covid). I think bats are among the most interesting creatures on the planet.
Working with all of these bat biologists, I learned of the holy grail of bats. It's a species that was once considered one of the rarest in North America. Up until the 90's, only a few specimens had ever been observed or documented.
But if you want to see images of the most spectacular bat in North America - the spotted bat - I am in a rare group who have ever seen one, much less photographed one.
Some day I'll have to tell the story of Kentucky cave shrimp and how I traveled to the deepest bowels of Mammoth Cave with a crew of 20 - A combined group from the National Park Service and US Fish and Wildlife Service to photograph these tiny and rare shrimp.
Don't get me started on my journey to photograph red tree voles (that only live at the top of mature Douglas fir trees).
I'm bragging - yes. I never imagined I could make a six figure income from this work. I expected to be poor. I genuinely hope this work has lasting impact.
Coming from a family rooted in poverty, addiction, and early death - this path has been a surprise beyond description.
I love bats. As you may know, two species of bat are the only mammals native to New Zealand, where I live. I hope to see one one day!
- In 1996 built and deployed a system to keep track of the removal of landmines in Bosnia. In 2015 I met someone who knew my work as a child in Sarajevo, producing the maps they’d give out to schoolchildren.
- I managed a project with over 30 team members to build a system to help former Soviet Union countries manage their import/export control policies.
- I helped create a system for generating some annual reports for Poland that was a requirement for them to join NATO.
We were working on a new product, electronic access to textbooks. I'd built the entire system that took the textbook XML we got from the content side, created indexes used by our search engine, and made it possible to efficiently display in the web application any text fragment from a full chapter down to a single sentence containing a search result.
The CEO called an emergency meeting: many of our library customers were government funded, and their funding required the library to receive a physical object in exchange for the licensing fee. They didn't want to have to store the physical textbooks and we didn't want the overhead of sending them textbooks. So the team started talking about creating an entire new subdivision dedicated to the production, management, warehousing, and shipping of CD versions of the books, just so the customers could be given something physical.
I interjected: "If a CD is good enough, I can generate that using everything I've built already. I'm already converting the content to HTML for display in the app, so I can render the textbook out to a folder, one HTML page per chapter, with a table of contents and all of the images, and create an ISO image that the librarians can download using a link in the web application. Let them burn it themselves if they want a physical copy. They could also store the ISO locally so they still have that version if they let their license expire." That was a funding requirement as well.
So that's what we did. It took me a couple of days extra to implement that feature, and I saved the company a fortune compared to what they were considering doing.
I believe I got a $25 Starbucks card as a reward.
For example, the guy who invented the process to create artificial diamonds for GE got a nice plaque and $1.
Next up is probably scrypt; it would rank higher if cryptocurrencies used it, but instead they use nerfed scrypt, which defeats the entire point of scrypt.
Third is probably FreeBSD/EC2. Of course I didn't do all the work for that, but I can certainly claim the status of technical project manager.
My day job, Tarsnap, comes in fourth.
I'm currently procrastinating my master's thesis on transient execution attacks, and just re-read it a few weeks ago while drafting my background section. So, thanks a ton for writing one of the most helpful introductory texts on timing side channels!
It parsed a text file containing Jeep parts that needed to be sequenced and printed barcode labels to Zebra printers. One day a construction crew dug up all of our data lines and we lost all comms to Chrysler and our data center.
We had to have a rotation of floor supervisors driving to Chrysler to copy/paste orders onto a floppy disk and bring it back to be processed. We kept the line running for about 30 hours, which basically saved our company because our contract with Chrysler stipulated that we would be charged $10,000 per minute if we stopped the line.
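For flavour, here's roughly what that script did, sketched in C# rather than the original Perl (the parts-file format and label layout are guesses, not the real ones): parse each line and push a ZPL label with a Code 128 barcode straight to the printer's raw port.

    using System;
    using System.IO;
    using System.Net.Sockets;
    using System.Text;

    class LabelPrinter
    {
        // Hypothetical layout: each line of the parts file is "sequence|partNumber".
        public static void PrintLabels(string partsFile, string printerHost)
        {
            using var client = new TcpClient(printerHost, 9100); // standard Zebra raw port
            using var stream = client.GetStream();
            foreach (var line in File.ReadLines(partsFile))
            {
                var fields = line.Split('|');
                // Minimal ZPL: a human-readable line plus a Code 128 barcode.
                var zpl = "^XA" +
                          $"^FO50,40^A0N,40,40^FDSeq {fields[0]} Part {fields[1]}^FS" +
                          $"^FO50,100^BCN,120,Y,N,N^FD{fields[1]}^FS" +
                          "^XZ";
                var bytes = Encoding.ASCII.GetBytes(zpl);
                stream.Write(bytes, 0, bytes.Length);
            }
        }
    }

Port 9100 is the standard raw-printing port on networked Zebra printers, and ^BC is ZPL's Code 128 command; a serial-attached printer would take the same bytes over the COM port instead.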
So the process was:
* supervisor drives to Chrysler, pastes part orders into a text file, saves it to a floppy
* floppy returns to your company, you open it up and run the perl script, which prints barcode labels
* ... then what?
I was a programmer working on Fortnite, and I ended up working on the on-site fortnite events, doing everything from the custom cameras and broadcast specific UI, to hooking up the events in-game to the lights in the stadium. It was pretty cool!
https://youtube.com/watch?v=EWANLy9TjRc - I worked on this game (and the demo in this video) for a few years. I wrote much of the code for the asset pipeline for the destruction, lots of the gameplay code for how it interacted with the game and a good chunk of optimisation on the cloud physics side.
Designed/Built/Deployed Meta's backend operating system for the last 7 years
Sounds interesting, I had no idea Facebook had their own OS - presumably a Debian derivative?
Or is it just the kernel with a totally new/different userland from a normal “Linux” box?
We have faster iteration than upstream distros.
I won't give exact details but 1% CPU gives extremely significant monetary savings, and there are at least 15% savings from static linking, PGO, LTO, more appropriate `-march`, more appropriate CPU security sharing considerations, ...
Billions of dollars per year, in essentially electricity and required systems savings
(considering the scale of serving 3 billion users a day).
Also devs get access to the latest compilers, language levels, and libs, completely independent of distros, who have a more general compat issue to contend with. Considering there are about 30k tech staff at Meta, this value also multiplies up.