simondotau's comments | Hacker News

Of course you can. If you invented it, and nobody else has patented it, you can patent it. Opening the source doesn’t invalidate your rights as an inventor or copyright holder, though it can add confusion and/or complexity to the enforcement of both the patent and the open source license.

Yes, but my understanding is that publicly detailing how your invention works _before_ trying to patent it means the invention becomes public knowledge/prior art. That is, so long as you submit a patent application before releasing the open-source code, it should be OK, but there's not much you can do once the cat is out of the bag.

I imagine the rules and best practices vary between jurisdictions, but basically yeah. Once you've filed for the patent, though, you can release the source and enjoy the confusion.

(Based on 30 seconds of googling, it seems that the USA and Australia give inventors a one-year grace period after publishing, but the granted patent might not be valid in other countries.)


The USA also used not to consider publications or patents published outside the USA as prior art, to the point of granting patents that were rewritten from someone else's patent in another country.

Not sure if it got better or worse with WTO patent rules.


Considering that the history of NAND flash amounts to Toshiba applying to Japan's patent office and getting laughed out of the room, after which SanDisk saw it and applied for (and received) a patent on NAND flash from the US Patent Office, it's probably still the case.

This isn't an issue with Firefox; it's a consequence of the NoScript extension blocking unsafe features. The behaviour is likely the same or similar with NoScript on Chrome.

Chrome is the new MSIE. In both cases, a dominant position was used to dictate web standards in an unhealthy way. Microsoft did it through strategic neglect. Google is doing it by strategic smothering. Firefox and Safari are the web's last stand against an impending Chrome browser monoculture, against Google endlessly ramming new features down our throats and declaring them "standards".


> against Google endlessly ramming new features down our throats and declaring them "standards".

New features that will be used for their intended purpose maybe 1% of the time and fingerprinting by AdTech the rest of the time. What could possibly go wrong handing the Web over to an advertising company?


George Hotz went down the AMD rabbit hole for a while and concluded that the driver software — more precisely the firmware which runs on the cards themselves — is so badly written that there's no hope of them becoming serious contenders in AI without some major changes in AMD's priorities.

I'm not defending their software. It does honestly have a ton of issues.

George Hotz tried to get a consumer card to work. He also refused my public invitations to have free time on my enterprise cards, calling me an AMD shill.

AMD listened and responded to him and gave him even the difficult things that he was demanding. He has the tools to make it work now and if he needs more, AMD already seems willing to give it. That is progress.

To simply throw out George as the be-all and end-all of a $245B company... frankly absurd.


The fact that consumer and "pro"(?) GPUs don't use (mostly) the same software is not confidence inspiring. It means that AMD's already apparently limited capacity for software development is stretched thinner than it otherwise would be.

Also, if the consumer GPUs are hopelessly broken but the enterprise GPUs are fine, that greatly limits the number of people that can contribute to making the AMD AI software ecosystem better. How much of the utility of the NVIDIA software ecosystem comes from gaming GPU owners tinkering in their free time? Or grad students doing small scale research?

I think these kinds of things are a big part of why NVIDIA's software is so much better than AMD's right now.


> that greatly limits the number of people that can contribute to making the AMD AI software ecosystem better

I’d say it simply dials it down to zero. No one’s gonna buy an enterprise AMD card for playing with AI, so no one’s gonna contribute to that either. As a local AI enthusiast, this “but he used consumer card” complaint makes no sense to me.


> No one’s gonna buy an enterprise AMD card for playing with AI

My hypothesis is that the buying mentality stems from the inability to rent. Hence, me opening up a rental business.

Today, you can buy 7900s and they work with ROCm. As George pointed out, there are some low-level issues with them, which AMD is working with him to resolve. That doesn't mean they absolutely don't work.

https://rocm.docs.amd.com/projects/install-on-linux/en/lates...
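If you want to sanity-check that claim on your own machine, a minimal sketch looks something like this (assuming a ROCm build of PyTorch is installed; on ROCm, AMD GPUs are exposed through the usual torch.cuda API):

    # Minimal ROCm sanity check (assumes a ROCm build of PyTorch is installed).
    # On ROCm, AMD GPUs show up through the regular torch.cuda API.
    import torch

    if torch.cuda.is_available():
        # e.g. "AMD Radeon RX 7900 XTX" on a supported consumer card
        print("GPU visible:", torch.cuda.get_device_name(0))
        x = torch.randn(1024, 1024, device="cuda")
        print("Matmul OK, result norm:", (x @ x).norm().item())
    else:
        print("No ROCm-capable GPU detected by this PyTorch build.")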


Agreed that AMD needs to work on the developer flywheel. Again, not defending their software.

One way to improve the flywheel and make the ecosystem better is to make their hardware available for rent, something that previously was not available outside of hyperscalers and HPC.


Indeed, AMD being willing to open its firmware is something Nvidia has never done.

> To simply throw out George as the be-all and end-all of a $245B company... frankly absurd.

I didn't do that, and I don't appreciate this misreading of my post. Please don't drag me into whatever drama is/was going on between you two.

The only point I was making was that George's experience with AMD products reflected poorly on AMD software engineering circa 2023. Whether George is ultimately successful in convincing AMD to publicly release what he needs is beside the point. Whether he is ultimately successful in convincing their GPUs to perform to his expectations is beside the point.


> The only point I was making was that George's experience with AMD products reflected poorly on AMD software engineering circa 2023.

Except that isn't what you said...

"there's no hope of them becoming serious contenders in AI without some major changes in AMD's priorities"

My point in showing you (not dragging you into) the drama is to tell you that George is not a credible witness for your beliefs.


Clearly you've experienced some kind of personality clash and/or a battle of egos. I can't fault you for holding a low opinion of him as a result, but I'm unimpressed with personal beefs being used as evidence to impeach credibility.

My point is as I wrote in both posts. George was able to demonstrate evidence of poor engineering which "reflected poorly on AMD". From this I could form my own conclusion that AMD aren't in an engineering position to become "serious contenders in AI".

The poor software engineering evident on consumer cards is an indictment of AMD engineers, and the theoretical possibility that their enterprise products have well-engineered firmware wouldn't alleviate this indictment. If anything, it makes AMD look insidious or incompetent.


I really don't give AF about George.

Egohotz is brilliant in many ways, but taking him at his word when it comes to working with others has been a mistake since at least around 2010. This is well documented.

Who said anything about taking him at his word? Everything he has done regarding AMD GPUs has been in public. I'm sure there are plenty of valid criticisms one can make of his skills/strategy/attitude/approach, but accusing him of being generally untrustworthy in this endeavour is utterly nonsensical.

I can reliably crash my system using kobold.cpp with Vulkan running an AMD GPU. All it takes is a slightly too high batch size.
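For what it's worth, a rough sketch of how one might sweep batch sizes to find the crash point is below; the koboldcpp flag names (--usevulkan, --gpulayers, --blasbatchsize) are assumptions from memory and may differ between versions:

    # Hypothetical sketch: launch koboldcpp at increasing batch sizes and see
    # which one brings it down. Flag names are assumptions and may not match
    # your koboldcpp version; the model path is a placeholder.
    import subprocess

    MODEL = "model.gguf"  # placeholder path

    for batch in (128, 256, 512, 1024, 2048):
        cmd = ["python", "koboldcpp.py", "--model", MODEL,
               "--usevulkan", "--gpulayers", "99",
               "--blasbatchsize", str(batch)]
        try:
            # koboldcpp normally keeps running as a server, so exiting early
            # within the timeout window suggests a crash at this batch size.
            proc = subprocess.run(cmd, timeout=120)
            print(f"batch {batch}: exited early with code {proc.returncode}")
        except subprocess.TimeoutExpired:
            print(f"batch {batch}: still running after 120s, looks stable")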

What is slightly too high of a batch size? If max size is 100 and you're at 99, of course 100 will crash it.

Yeah, more of a pinch and twist.


Turnbull also claimed that domestic 1Gb connections are absurd because consumers wouldn’t pay the cost of a guaranteed non-contended 1Gb connection.


That’s an unrealistic assumption. It wasn’t even the same in your own part of the world fifty years ago.


Perhaps it was EU regulation, or perhaps it was Apple wanting to make good on a ten-year-old promise of connector continuity. When Apple introduced the Lightning connector in 2012, they described it as their iPhone connector "for the next decade".[0] Their switch from Lightning to USB-C on the iPhone came just over ten years after that.

[0] https://www.youtube.com/watch?v=82dwZYw2M00&t=1571s


Boy did they ever get hell when they left the 30-pin connector. I could easily see them wanting to avoid that whole mess again.

Honestly I am still blown away that the switch last year to USB-C was met with some positivity (often from tech people) and a bunch of ’meh’. I was expecting tons of screaming and “Apple’s making you buy all your cables again to juice their books!”


Counter-counterpoint: When Apple introduced the Lightning connector in 2012, they described it as their connector "for the next decade".[0] Their switch from Lightning to USB-C on the iPhone came just over ten years after that announcement. Perhaps it was EU regulation, or perhaps it was Apple wanting to make good on a ten-year-old promise of connector continuity.

[0] https://www.youtube.com/watch?v=82dwZYw2M00&t=1571s


20 years ago I introduced a Japanese person to Vegemite and, I think because I described it to him as like miso, he really liked it.

I wonder if the shock value is derived from an expectation of chocolatey sweetness. If you’re expecting Nutella, Vegemite is guaranteed to disappoint.


It’s like Pepsi vs Coke, but without the easy appeal of 12% sugar content. Behind the imposing wall of salt, they have subtle yet significant differences in flavour. An alternative probably won’t trigger the same comfort memories.

