That's what Whisper's for

To my knowledge, the massive size isn't from the drivers, but because transcoded video files are also slipped in with the shaders. Proton struggles with things like Media Foundation, so Valve transcodes videos on their end.

Because, of all the things humanity has learned about the universe, none of them has turned out to be aliens or evidence of aliens. You can theorize all you want, but so far, every time, the universe has just turned out to be a big, strange place.

Aren’t we the evidence for aliens?

We're a pretty big statistical outlier as far as any of us can tell.

We exist, therefore it is proven.

We are not evidence that aliens capable of colonizing gas giants exist.

I beg to differ. Out of a trillion-plus rocks floating in a vacuum, we colonized this one; I don't see why candy floss planets controlled by beings are so far fetched.

We have Bluetooth toasters!


You said "proven" and now you're saying "[not] so far fetched." I still think it's far fetched, but I'm satisfied that you aren't claiming proof anymore.

One in a trillion plus rocks sounds like a pretty big statistical outlier.

It's not a Firefox issue, it's a NoScript issue. Chrome, with its market share and propensity to implement its own standards, or Safari, with its market share, quirks, and propensity to implement its own standards, would each make a much better candidate IE.

> Safari with its market share

Safari's market share is completely dwarfed by Chrome's. Safari is nowhere close to IE's monopoly.

Forced WebKit on iPhones and iPads is the only thing standing between Chrome and complete IE monopoly.


Sorta, not really. Neural networks are deterministic in the wrong ways. If you feed them the same input, you'll get the same output. Any variation comes from varying the input or randomly varying your choice from the output. And if you're randomly picking from a list of even probabilities, you're just doing all the heavy lifting of picking a random number yourself, with a bunch of kinda pointless singing and dancing beforehand.
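
A rough sketch of that point in Python (the toy forward function below is purely hypothetical, standing in for a real network): the forward pass is a pure function of its input, and any variation has to be supplied by the caller at the sampling step.

    import numpy as np

    def forward(prompt: str) -> np.ndarray:
        """Toy stand-in for a neural network: same input -> same output distribution."""
        seed = sum(map(ord, prompt)) % 2**32        # derived only from the input
        rng = np.random.default_rng(seed)
        logits = rng.normal(size=4)
        return np.exp(logits) / np.exp(logits).sum()  # softmax

    probs = forward("same input")
    assert np.allclose(probs, forward("same input"))  # deterministic: no variation here

    # Variation only appears when *you* sample from the output...
    choice = np.random.default_rng().choice(len(probs), p=probs)

    # ...and if the distribution is uniform, the network contributed nothing:
    uniform_choice = np.random.default_rng().integers(len(probs))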


PowerShell exists on other platforms


why in god's heavenly image would you use powershell anywhere but windows


I use it on other platforms so I can use the same scripts no matter where I'm developing. Also GitHub CI.


For the same reason one would use Python, Ruby, or Perl on Windows, outside UNIX.

Portable scripts.


Do they typically not? My only contact with passkeys has been the 2FA service (Duo) at my place of work, and I've got a passkey on my phone and laptop, as well as OTP push notifications, OTP over SMS, or a recovery code from IT. It's particularly handy with the Chromeboxes hooked into the big presentation displays, since I can scan a QR code with my phone to use the passkey stashed inside it.


Slightly poor wording from me, maybe. There have been cases where, for example, only one hardware key could be set up but other methods were available at the same time.

I remember AWS having some weird choices at some point too, not sure how they are currently.

But yeah, typically I think most services have had multiple choices available at the same time.


If it processes graphics, I think it counts, even if it has no output. There's still use for GPUs even if they're not outputting anything. My place of work has around 75 workstations with mid-tier Quadros, but they only have mini-DisplayPort and my employer only springs for HDMI cables, so they're all hooked into the onboard graphics. The cards still accelerate our software, they still process graphics, they just don't output them.


It's the shader core of a GPU. There are no graphics-specific pipelines, e.g. vertex processing, culling, rasterizer, color buffer, depth buffer, etc. That's like saying a CPU is also a GPU if it runs graphics in software.


> If it processes graphics, I think it counts, even if it has no output.

That's not a good definition, since a CPU or a DSP would count as a GPU. Both have been used for that purpose in the past.

> There's still use for GPUs even if they're not outputting anything.

The issue is not their existence, it's about calling them GPUs when they have no graphics functionality.


Graphics functionality != display output. What about laptop GPUs, which don't necessarily output to the screen at all times? Sometimes they don't even have the capability to do so. If it's a coprocessor working alongside the general processor for the primary purpose of accelerating graphics workloads, it seems appropriate to call it a GPU.

Edit: perhaps your point is that it doesn't make sense to call a device a GPU when it's designed primarily to accelerate ML workloads or just general-purpose vector calculations. In that case I'd agree that GPU isn't the right name.


>> Graphics functionality != display output

Exactly. Graphics functionality also includes graphics-specific hardware like vertex and fragment processing, which this does not have. It has no graphics-specific hardware, ergo not a GPU.


If it looks like a duck and it walks like a duck, why is it not a duck? If you are using a DSP to process graphics, then at least in the context of your system it has become your graphics processor.

Plenty of GPUs don't have (or aren't used for their) display output. It's a GPU because of what it does: graphics processing. Not because of what connectivity it has.


But it doesn't do graphics, so it shouldn't be called a GPU. That's the whole point of this thread.


But it does - it just needs an application to retrieve the buffer and do something with it. For example, pushing it to storage.


It does do graphics. Calculating graphics is different from handling display output. You can separate the two.

Like someone else mentioned, laptops often have discrete graphics cards that are not wired to display hardware at all, needing to shuffle framebuffers through the onboard graphics when something needs to make its way to a screen.
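
A toy illustration of that separation (an assumed example, with plain NumPy standing in for real graphics hardware): rasterize a triangle into an in-memory framebuffer and push it to storage, with no display anywhere in the loop.

    import numpy as np

    # Rasterize a filled triangle into an in-memory framebuffer --
    # pure computation, no display hardware involved.
    W, H = 64, 64
    framebuffer = np.zeros((H, W, 3), dtype=np.uint8)

    # Triangle vertices in pixel coordinates (hypothetical example data).
    v0, v1, v2 = (8.0, 8.0), (56.0, 16.0), (24.0, 56.0)

    def edge(a, b, px, py):
        """Signed area test: which side of edge a->b the point (px, py) lies on."""
        return (b[0] - a[0]) * (py - a[1]) - (b[1] - a[1]) * (px - a[0])

    ys, xs = np.mgrid[0:H, 0:W]
    inside = (
        (edge(v0, v1, xs, ys) >= 0)
        & (edge(v1, v2, xs, ys) >= 0)
        & (edge(v2, v0, xs, ys) >= 0)
    )
    framebuffer[inside] = (255, 128, 0)   # fill the covered pixels

    # "Display output" is a separate concern: here the buffer just goes
    # to storage instead of a screen.
    np.save("frame.npy", framebuffer)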


> Like someone else mentioned, laptops often have discrete graphics cards that are not wired to display hardware at all, needing to shuffle framebuffers through the onboard graphics when something needs to make its way to a screen.

Those are GPUs even if they aren't connected to a display because they still have graphics components like ROPs, TMUs and whatnot.


You're free to define it that way, but that's substantially different from GP's "if it's not a display adapter, it's not a GPU" that I was pushing against. It does seem pretty fragile to define a GPU in terms of the particular architecture of the day, though. There's plenty of things called GPUs that don't/didn't have TMUs, for example.


CPUs and DSPs are not primarily designed for graphics work, so they don't count as GPUs. CPUs are general-purpose; DSPs might be abused for graphics work.

The "G" in GPU doesn't imply that they have to render directly to a screen. In fact, professional graphics cards are commonly used for bulk rendering of animated video.

Datacenter GPUs are mostly used for AI these days, but they can nevertheless do graphics work very well, and if they are used for generative AI or if their built-in super sampling capability is used, the distinction becomes rather blurry.


But this particular one isn't designed for graphics work either, so it shouldn't be called a GPU.


It's in the very name: "Tiny-GPU". Since it's a demonstration project by a hobbyist, the author probably didn't want to implement the whole optimized rendering stack yet.

On the other hand, they also left out some features that you'd expect to find on a general-purpose compute accelerator.

For example, they focus on tensor math. No support for bit wrangling and other integer math. No exotic floating point formats. Minimal branching capabilities.


The name is what I'm contesting. It's called Tiny-GPU but there's no mention of graphics functionality anywhere in the project.


Graphics pretty much boils down to matrix multiplication, and that's exactly what this thing accelerates. If it were a generalized accelerator, it would have to support other kinds of arithmetic as well.
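
For what it's worth, a minimal sketch of that claim (hypothetical toy data, with NumPy standing in for the accelerator): a basic vertex transform, the bread and butter of a graphics pipeline, is just a matrix multiply over a batch of vertices.

    import numpy as np

    # Three triangle vertices in homogeneous coordinates (x, y, z, w).
    vertices = np.array([[0.0, 0.0, 0.0, 1.0],
                         [1.0, 0.0, 0.0, 1.0],
                         [0.0, 1.0, 0.0, 1.0]])

    # A 4x4 model transform: translate by (2, 3, 4).
    model = np.array([[1.0, 0.0, 0.0, 2.0],
                      [0.0, 1.0, 0.0, 3.0],
                      [0.0, 0.0, 1.0, 4.0],
                      [0.0, 0.0, 0.0, 1.0]])

    # Transforming every vertex is one matrix multiplication --
    # exactly the kind of work a matmul accelerator speeds up.
    transformed = vertices @ model.T
    print(transformed[:, :3])   # -> [[2. 3. 4.], [3. 3. 4.], [2. 4. 4.]]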


Agree to disagree. I'll stop here because we're just wasting time running in circles.


Blur is completely natural to human vision, to any kind of vision. Look far away while bringing your hand close. Blur. Now look at your hand. Everything else goes blurry. You can even unfocus your eyes while looking at something. Now everything's blurry.


Funny thing is, it's not even a feature fyndesk can claim. It's flat design, but it's not Material Design. Not even good design. Fyne's Achilles' heel is its lacking aesthetics and its pervasively non-native feel.

