
I became "pro-net neutrality" back in the 2010s when Verizon was trying to charge an extra $20/mo for hotspot functionality on my provider-locked Android phone.

After some rooting and sideloading I was gleefully working around that until the FCC came down on them for it [1]. Net neutrality was passed after that and seemed like a purely logical response as a means of consumer protection.

It has always been a user-facing issue; it's just not one that many people seem willing to expend the energy to think about in terms of how it impacts them. Netflix isn't using that bandwidth, the users are. Without users, Netflix would use little or no bandwidth, just as it did when it was renting DVDs. The users are paying for their own access and speeds to be able to watch Netflix over the internet instead, and in turn Netflix is paying its ISP to be able to provide that data.

Punishing either the users or the web hosts for finding a more effective use case for the internet than just serving static pages is the ISPs either trying to find a way to blame someone else for having overprovisioned their network, or trying to strong-arm web hosts into paying more because they have regional monopolies and can get away with it. As a consumer, if I had a choice between two ISPs and one of them was throttling Netflix to try and extort more money from them, even for self-centered reasons I would pick the other just to have better service. But there are a lot of areas where that isn't the case and a single major broadband provider has free rein.

[1] https://www.cnet.com/tech/mobile/what-verizons-fcc-tethering...


Not OP, but to provide some historical perspective, RTX hardware raytracing is very firmly a gimmick and it isn't AI nonsense that's going to be the end of it. It's going to go the way of PhysX, 3D Vision, and EAX audio. Cool, but complicated and not worth the effort to game devs. Game designers have to make all the lighting twice to fully implement RT, and it's just not worth the effort to them.

Nvidia's own site [1] lists a total of 8 Full RT compatible games, half of which they themselves helped port. There are far more games that "use" it, but only in addition to traditional lighting at the same time, to minimize dev costs. Based on that and past trends, I would personally predict it to be dropped after a generation or two unless they can reuse the RT cores for something else and keep it around as a vestigial feature.

[1] https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-e...


"Full RT" means the game uses 100% raytracing for rendering (in some mode), which currently needs still far too much power to be a mainstream thing and is only added in a few games to show the prowess of the engine (IIRC a review of the Cyberpunk 2077 Full RT mode only a 4090 is really able to provide the power needed). The important entry is "yes", which shows far more entries and means there's Raytracing enhancements in addition to rasterization.

So, no, it's quite the opposite of what you stated: RT gets more important all the time, is not a gimmick and there's zero reason to assume it will be dropped in the future.


It is a gimmick in that you have to sacrifice far too much performance. An RTX 4080 needs to run at 1080p plus upscaling plus frame generation to get above 60 (!) FPS with ray tracing.

No thank you, I’ll take buttery smooth real 120 FPS at 4K. Especially because games have gotten so good at faking good lighting.

Maybe with the RTX 6xxx series it’ll be viable.


It does look fabulous though. I have a 4090 and absolutely turn RT on for cyberpunk. Even with a 4090 I use upscaling for a good frame rate. But the resulting image quality is just spectacular. That game is really beautiful.


No contest there, it looks really great! But for me, not enough to go back to “choppy” gameplay.


Personally, I'd rather vote 120fps and 4k as gimmick. If I had to choose between Raytracing or 120fps? Always Raytracing.


You could argue 4K as a gimmick if you’re sitting at TV distances, but the difference between 60 and 120 FPS is extremely jarring. Try playing at 120 and then mid-session capping it at 60.


I would hardly say it's a gimmick now that frameworks like Epic's Unreal Engine and others have implemented it for the developer. I don't see these technologies going away. One can hope that Nvidia's dominance lessens over time.

I believe the next big thing is generative AI for NPCs, as soon as the models are optimized for the hardware in the average GPU. Let's see what the next generation of Intel, AMD, and Arm hardware produces. Windows' branding around AI is going to help make this possible. It's going to take years, though, for the market to be saturated with capable hardware before developers pay attention.


You do realize that RT greatly simplifies the job artists and engineers have to do to make a scene look well lit? The only reason it's done twice currently is because GPUs aren't powerful enough yet. RT will simplify game production.


Apple doesn't need to support it, they need to not block it and let the user decide if they want to participate in a beta.


I am deeply steeped in the history of computers and the biggest three things I can point to as the reason (MS-)DOS won are:

- Licensing: Most computers either had custom operating systems that were not shared with other hardware vendors or, as was frequently the case with BASIC, were themselves licensed.

- IBM letting the genie out: The BIOS on the IBM PC 5150 was cloned, quickly and legally, and other companies started making compatibles. This caused an explosion of computer variety in a few short years for a single platform.

- Microsoft: DOS usually means "Microsoft DOS"; Microsoft was also responsible for many of the BASIC environments of early systems. The ability to buy your OS from someone else lowered the pressure on hardware makers. IBM also favoured Microsoft's DOS over CP/M-86 and stopped supporting the latter quickly.

All this meant the PC-compatible ecosystem with Microsoft DOS became easy to build from a hardware side, and lacked a single point of failure like Apple, Radio Shack, Commodore, Atari, etc. There were other MS-DOS-compatible DOSes out there, but MS-DOS was usually the one shipped with computers to be as "IBM compatible" as they possibly could be, and it gained dominance through that.

EDIT: To those who may not be aware, BASIC did become more OS like before going away. HP BASIC was extremely feature packed before HP-UX replaced it and was more capable than MS-DOS in many ways. It evolved far beyond just a programming language.


> This caused an explosion of computer variety in a few short years for a single platform.

The impact of this point cannot be overstated. 99% of businesses make a much larger investment in software (and people!) than hardware. The existence of compatible hardware systems was a great hedge on their investment in software and training. For most businesses, this would be a no-brainer!

Over a short time, other proprietary/non-compatible systems were relegated to home use, education, and gaming.


(I am the video/page creator)

This is the eternal struggle of trying to write about something like this on the modern internet. The video is the flashy thing that gets attention (and revenue which allows me to do this as my job) but the written part is just talking into a void and hoping someone notices. I agree this type of information is best presented in text which is why I made an effort to produce a written component as well. But there's no way that article would have ended up linked somewhere like here.


I have been a big fan of your channel for quite some time. Thanks for your content! I wish I had the practical talent and your patience when it comes to repairs and restorations.


A lot of people these days want to flip through YouTube or listen to podcasts rather than reading even though the latter is often more efficient. So now you're asking someone to create additional content that won't make them any money (assuming they care).

People still create content for free--and it's obviously hard to monetize content generally--but, as apparently in your case, it's possible and if that were my goal I'd probably optimize for that.


Always enjoyed your channel! I was literally taking notes on this video yesterday, so I really appreciate you writing the whole thing up. I recently cleaned, lubricated, and re-aligned a few 5.25 disk drives using IMD (and by some miracle it worked despite my initial frustrations) and your videos were seriously helpful. Much thanks!


You're certainly not the one to be scolded for low information density in your videos.

(One semi-serious metric is the need to jump backwards to check something interesting when watching on 2×. This means the author/editor worked well on concise packing. Obviously, not applicable to live stream recordings.)

An article like that certainly can be linked here. Many similar ones are. There is no guarantee it will swoosh to the top, but it will be noticed by some people, and appear in someone's searches. Sometimes links are resubmitted years later, and finally get the deserved attention.


Well, you're losing the minority that didn't click on the video because life's too short and went straight to the comments.

Comments on HN usually contain related links and possibly more info than the item the story links to so it's a valid strategy.

Edit: but then you wouldn't have made any money off me anyway because I run ad blockers.


I run ad blockers and pay for a subscription to YouTube Premium. That's the best of both worlds as the creators get paid more and I see no ads.


As a subscriber and Patreon member, thank you for making videos with so much substance. You strike a good balance between entertaining and informative, and I’m glad you’re also able to create written records for work like this.


Take any advice you see on HN with a grain of salt.


> but the written part is just talking into a void and hoping someone notices.

The Web crawlers for LLM trainers definitely notice.


I love the channel and have been watching for a few years. I agree that videos are the flashy things, but in some ways videos have ruined the internet for me a bit (not your channel though). What I mean is, if I'm trying to learn how to do something and search for a solution, I find a gazillion videos and will be lucky to find a page that has succinct, written instructions. It's very infuriating. So I often (almost always) have to watch a 10- or 15-minute video for 2 minutes of information instead of having written info that gets to the point. Keep up the good work on Tech Tangents, it's awesome.


> One would rather expect this sort of functionality implemented in a high level operating system function

Almost counterintuitively, floppy drives were actually very fast compared to the CPUs early on. The DMA transfers were more to bypass the CPU than anything. For CHS addressing, some formats would implement an interleave of the sectors (i.e. 1,6,2,7,3,8,4,9,5). This purposely spaced sequentially numbered sectors apart so the CPU would have time to process one while out-of-sequence sectors passed under the head, before encountering the next one in sequence. Putting more load on the CPU compounds this problem, which is why dedicated FDC chips never went away.
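To make the interleave idea concrete, here is a small Python sketch (purely illustrative, not tied to any specific drive or disk format) that computes the physical ordering for a given interleave factor; with 9 sectors per track and an interleave of 2 it reproduces the 1,6,2,7,3,8,4,9,5 layout above:

    # Illustrative only: place logical sectors 1..N around a track,
    # stepping `interleave` physical slots each time and skipping
    # slots that are already occupied.
    def interleave_order(sectors_per_track, interleave):
        order = [0] * sectors_per_track
        pos = 0
        for logical in range(1, sectors_per_track + 1):
            while order[pos] != 0:          # slot taken, move to the next one
                pos = (pos + 1) % sectors_per_track
            order[pos] = logical
            pos = (pos + interleave) % sectors_per_track
        return order

    print(interleave_order(9, 2))  # -> [1, 6, 2, 7, 3, 8, 4, 9, 5]

Reading the track in physical order then gives the CPU roughly a sector's worth of rotation time between consecutive logical sectors.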

Also, fun fact: the usage of the ISA DMA interface, which no longer exists, is why you can't have full-featured native floppy controllers on modern motherboards.


Do you really need DMA if your controller is fast enough and has enough memory? Couldn't you just emulate that behavior with modern hardware?


On the PC platform, the floppy is hardwired to DMA channel 2. DMA "channels" were originally provided by an Intel 8237A DMA controller or compatible, and this is an ISA device.

DMA and FDC controller functionality got taken over by chipsets. ISA became the LPC bus, which still exists - and I believe will keep existing, because TPMs use it. But not sure if modern PCHs include the ISA DMA functionality any more. The floppy was really the only thing that used it.

PC floppies don't have to use DMA, they can use PIO.

So ...

- you need to create a PIO floppy driver for your OS

- you need something that takes floppy controller signals and converts them to LPC (signaling is not the same) and something like that doesn't exist (FPGA hobby project to the rescue)

- you need to wire that into a modern PC's LPC bus, meaning you need to physically connect those LPC pins to an LPC debug header on your motherboard (if it has one, and if it does, you probably have to add header pins)

- then you need to do something to convert SATA power from the PSU to the floppy's four-pin mini-Molex. I think there are big-Molex (IDE hard drive) to double-mini-Molex cables, so with a SATA-to-Molex adapter you can "Frankenstein" that.


LPC does DMA just fine; there are off-the-shelf LPC SuperIO chips with FDD support. There is also this project trying to bring ISA back from LPC: https://www.vogons.org/viewtopic.php?t=93291

>ISA became the LPC bus, which still exists - and I believe will keep existing, because TPMs

It has already been superseded by eSPI, which is almost 10 years old. eSPI dropped DMA support.


One problem is that Blender's multithreaded rendering doesn't scale well to VSE work, because it focuses on breaking up each frame and as a result doesn't utilize multiple cores well. I've experimented with making a plugin [1] in the past to start multiple render jobs at different points in the timeline in separate processes, and was able to massively speed up renders.
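The general idea, roughly sketched below in Python (standalone and illustrative, not the plugin's actual code; the file name, output path, and chunk count are made up), is to split the frame range into chunks and launch one background Blender process per chunk:

    # Illustrative sketch: render a timeline in parallel chunks using
    # separate background Blender processes (`blender -b`).
    import subprocess

    BLEND_FILE = "project.blend"        # hypothetical project file
    OUTPUT_PREFIX = "//render/chunk_"   # Blender-relative output path
    FRAME_START, FRAME_END = 1, 2400
    JOBS = 4

    chunk = (FRAME_END - FRAME_START + 1) // JOBS
    procs = []
    for i in range(JOBS):
        start = FRAME_START + i * chunk
        end = FRAME_END if i == JOBS - 1 else start + chunk - 1
        procs.append(subprocess.Popen([
            "blender", "-b", BLEND_FILE,
            "-o", f"{OUTPUT_PREFIX}{i}_####",  # per-chunk output naming
            "-s", str(start), "-e", str(end),
            "-a",                              # render the chunk's frame range
        ]))

    for p in procs:
        p.wait()

The rendered chunks or image sequences still need to be joined back together afterwards, but each process gets its own core(s) instead of all of them fighting over a single frame.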

I have since switched to Resolve on Linux as well, but that's due to using Blackmagic cameras that work better with it.

[1] https://github.com/AkBKukU/blenderSubprocessRender


This somehow missed all of the PC VR headsets from the mid-to-late 90s, which was where all of the interesting innovation was happening. PC VR and 3D was a huge market compared to things like the relatively terrible Virtual Boy.

If you want to get your fix of probably the most influential era of VR, http://www.mindflux.com.au/index.html has been around forever as a time capsule of what was happening on the PC back then.


That's a you issue if a show you like gets canceled for lack of profitability due to pirating. It is your problem.


Open discussions like this should give them a hint what to do. Supporting DRM is just less ethical than pirating.


I honestly don’t care enough about TV.


I also make my living from Youtube and I would strongly disagree that it is fair.

At the most basic level, false claims can be damaging to the creator, but the same false claims are completely risk-free for the issuers.

The problem shown in the OP is an additional layer of complexity where another company is contacting the creator on behalf of Disney which muddies the waters on who exactly is filing the claim and whether they have the right to.

The only thing in the Content ID system that is built in favor of the creator is if the claim issuer doesn't progress the counter claim and it is automatically dismissed.

This doesn't even get into the likely diminished recommendations the video will get after being flagged, the time wasted by the creator manually fighting claims that can be spammed via the API (0), and the unfair revenue splits that can result if the creator did make an honest mistake.

(0) https://developers.google.com/youtube/partner/identify_conte...


>I also make my living from Youtube and I would strongly disagree that it is fair

>The only thing in the Content ID system that is built in favor of the creator is if the claim issuer doesn't progress the counter claim and it is automatically dismissed.

Fairness doesn’t mean that the system should be stacked in favor of anyone who uploads a video. I get that copyright is a controversial subject, but both Content ID and the DMCA have mechanisms that are intended to balance the rights of copyright holders against the rights of people who create content using others’ works.

>This doesn't even get into the likely diminished recommendations the video will get after being flagged

My personal experience is that Content ID claims have no impact on video performance. Do you have any evidence that a claim negatively impacts search and discovery?

>the unfair revenue splits that can result if the creator did make an honest mistake

If the result of unintentional copyright infringement is a revenue split, that sounds to me like a very pro-creator outcome. They could take the video down. Or even sue you.


> Fairness doesn’t mean that the system should be stacked in favor of anyone who uploads a video.

Agreed; however, if a creator repeatedly violates copyright and gets the three copyright strikes (which I recognize are distinct from, though related to, claims), their channel is deleted. For the issuer, though, there is no penalty for invalid copyright removal requests. This is the type of unfairness that is an issue. Additionally, the claim issuer needs zero proof that they even have the right to file a claim. The DMCA is mostly unfair to the companies that host the content, forcing them to act against the uploader while having zero ability to push back against bad-faith actors. So the system Google has implemented can only legally pass the problem on to the content creators.

> Do you have any evidence that a claim negatively impacts search and discovery?

No. Can anyone truly have a confident stance that X == Y when it comes to how YouTube presents videos to potential viewers through its black-box "algorithm"? I've seen plenty of inexplicable things happen with video recommendations, as both a creator and a viewer, that make me question what can and can't influence it, and I never make an absolute statement about it, hence the "likely".

> that sounds to me like a very pro-creator outcome

You glossed over the unfair part there. Having 10 seconds of music audio in a 10-minute video because you walked past a restaurant while filming a conversation can cause drastically disproportionate amounts of revenue to go to the claimant. This part is Google's fault and is an overreaction, erring on the side of caution to appease the claim issuers.


