cglong's comments

One-liner for previewing a file with Quick Look. I aliased this to `ql` :)

    qlmanage -p $argv >/dev/null 2>&1

I'm curious if there's a way to do this with standard input instead of having to supply a filename?

I do this with man pages, but it opens in a full Preview window, not Quick Look.


You can use process substitution[0] to have a command output act as a file:

  qlmanage -p <(man man) >/dev/null 2>&1
[0] https://www.gnu.org/software/bash/manual/html_node/Process-S... (it works in zsh too, but the bash manual explains this feature more clearly)
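Under the hood, `<(cmd)` works because the shell attaches the producer's output to a pipe and hands the consumer a `/dev/fd/N` path. Here's a rough sketch of that plumbing in Python (a toy illustration, not something you'd normally need to do by hand; Linux/macOS only):

```python
import os
import subprocess

# Mimic <(echo hello): wire a producer's stdout to a pipe, then hand
# the consumer a /dev/fd/N path so a tool that insists on a filename
# can read streamed output.
r, w = os.pipe()
os.set_inheritable(r, True)            # let the consumer inherit the read end
producer = subprocess.Popen(["echo", "hello"], stdout=w)
os.close(w)                            # parent's copy; producer keeps its own
consumer = subprocess.run(
    ["cat", f"/dev/fd/{r}"],           # consumer sees a "file", reads the pipe
    close_fds=False, capture_output=True, text=True,
)
producer.wait()
os.close(r)
result = consumer.stdout.strip()
print(result)                          # hello
```

The consumer (`cat` here) never knows it isn't reading a regular file, which is exactly why `qlmanage` is happy with `<(man man)`.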

You can ask Gemini to summarize a YouTube video for you! Also if you have YouTube Premium on Android, you can ask questions about the current video.

Here's Gemini's summary of GP's video: https://g.co/gemini/share/8c0417024a3f


Alibaba just released 100 large models. One takes a 20-second video and summarizes it.

Now I wonder if it supports audio. If so, I want the relevant browser plugin so I can read YouTube on my machine!


Congratulations on the release! Since the submitter is the primary author, this should be marked as a Show HN: https://news.ycombinator.com/showhn.html




Some anecdata: 14 of my 32 extensions are listed as "may soon no longer be supported" :(


Once your macOS version reaches n-3, you start seeing this message constantly https://news.ycombinator.com/item?id=37665120


https://news.ycombinator.com/item?id=37672354

The entitlement of users who call open source maintainers "user hostile" for trying to limit the surface area of support tickets to the systems that provably 95% of users are on is always sad to see.


Equally sad: the sociopathy of people who pretend that "it's user-hostile to be told you can never even mention running this on an OS version more than 2 years old; all attempts will be deleted" is just "expecting endless support for free products that I'm not paying for or contributing to."

I reject your name-calling via a strawman, though I support the message.


Why must the makers of this platform for keeping software up-to-date be so obsessed with people running up-to-date software?


Is that it? That just reads to me like a project that’s tired of getting bug reports from people on old versions of software.


Not quite what you're asking for, but Microsoft (my employer) has a free tool for checking web and Windows apps for accessibility best practices: https://accessibilityinsights.io/


Kivy's marketing seems to target LOB (line-of-business) apps. If I were going to develop one of those, I'd optimize for something standardized and easy to maintain (HTML/JS) over the performance benefits of a native UX or cross-platform framework.


Kivy is old, though it has improved recently. But animation in HTML/JS is not power efficient.

Everything is GL accelerated, so the UI is snappy and fast.


The landing page is weird; it talks more about the funding for the framework than the framework itself. There's only one image showing UI, and the way it's styled (cropped, tilted) makes me think it's a stock photo, not a screenshot. The stock photo of a train right underneath isn't helping this perception for me.

If you got as lost as me, the Gallery is accessible via a link at the top: https://kivy.org/gallery.html


Note that those are not stock widgets.

And that's one of the main showstoppers for me with Kivy: it comes with very few built-in UI controls, so you have to code a lot of things yourself.

I much prefer Python to JS, but things like React Native win because the community libs you can install save you tons of time and produce a better result.


I prefer Python as a language, but with JS it's much easier to package dependencies and ship a finished product.

Python is a big fat conda-docker-shitshow because it doesn't provide a way to do

    import tornado==5.1.2
    import torch==2.1.0
etc. while coexisting in the same shell environment as something else that wants different versions.


This is especially true when you use a lot of tooling. I love Jupyter, but installing it in a venv means pulling in a lot of deps, which heavily constrains what else I can install.

Fortunately, the Python community is much more serious about making deps that work together than the JS community, and the fact it works at all, given the Cartesian product of all the Python modules, is kind of a miracle and a testament to that.

Unfortunately, that's a problem that is unlikely to be solved in the next decade, so we all live with it.

The reverse problem is true for JS, and I see many projects shipping very heavy frontend code because, despite all the tree shaking, they embed the same module 5 times with different versions in their bundle. That's one of the reasons for the bloated-page epidemic.

I guess it's a trade-off for all scripting languages: choosing between bloat or compat problems. Rust and Go don't care as much, and on top of that they can import code from 10 years ago and it still works.

However, while I do know how hard it is to ship Python code to the end user (at least if you're not shipping a web app), I don't think the version problem is the reason. We have zipapp, and it works fine.

No, the main reason is that compiled extensions are very useful and popular, which means Python packaging is solving more than packaging Python: it's packaging a ton of compiled languages at once. Take SciPy: they have C, Fortran, and assembly in there.

This can and will be improved though. In fact, thanks to wheels and indygreg/python-build-standalone, I think we will see a solution to this in the coming years.

I'm even betting on Astral to provide it.


My ideal situation is that the system should maintain authoritative versions of every package and version that is ever requested, and they should not need to be shipped. Multiple versions of a package should coexist.

    /usr/lib/python3.12/torch/2.1.0/
    /usr/lib/python3.12/torch/2.1.1/
    /usr/lib/python3.12/torch/2.1.2/
When a package requests 2.1.1, it fetches it right out of there, installing from PyPI first if it isn't already present.

The same should be true of JS and even C++. When a C++ app's deb package wants libusb==1.0.1, it should NOT overwrite the libusb-1.0.0 that is on the system; it should coexist with it and link to the correct one, so that another app that wants libusb-1.0.0 can still use it.
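As a toy sketch of that versioned layout (the `require` helper and the `mypkg` package are my inventions, not a real API), version selection could be as simple as putting the requested version's directory first on `sys.path`:

```python
import importlib
import pathlib
import sys
import tempfile

def require(libroot, name, version):
    """Import `name` from its version-specific directory under libroot."""
    sys.path.insert(0, str(pathlib.Path(libroot) / name / version))
    sys.modules.pop(name, None)      # drop any cached copy of the module
    importlib.invalidate_caches()    # the directories were created at runtime
    return importlib.import_module(name)

# Fake two installed versions of "mypkg" sitting side by side.
root = pathlib.Path(tempfile.mkdtemp())
for v in ("1.0", "2.0"):
    pkg = root / "mypkg" / v / "mypkg"
    pkg.mkdir(parents=True)
    (pkg / "__init__.py").write_text(f"__version__ = {v!r}\n")

old = require(root, "mypkg", "1.0")
new = require(root, "mypkg", "2.0")
print(old.__version__, new.__version__)  # 1.0 2.0
```

A real implementation would also have to resolve transitive dependencies per version, which is where the hard part lives; this only shows that coexistence itself is cheap.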

> Fortunately the Python community is much more serious about making deps that work together

This is very not true at least in ML. I have to create a new conda environment for almost every ML paper that comes out. There are so many papers and code repos I test every week that refuse to work with the latest PyTorch, and some that require torch<2.0 or some bull. Also, xformers, apex, pytorch3d, and a number of other popular packages require that the cuda version that is included with the "torch" Python package matches the cuda version in /usr/local/cuda AND that your "CC" and "CXX" variables point to gcc-11 (NOT gcc-12), or else the pip install will fail. It's a fucking mess. Why can't gcc-12 compile gcc-11 code without complaining? Why does a Python package not ship binaries of all C/C++ parts for all common architectures compiled on a build farm?


I'm assuming by system you mean OS, which is a terrible, terrible idea. Dev stack and system libs should not coexist, especially because system libs should be vetted by the OS vendor, but you can't ask them to do that for dev libs.

> I have to create a new conda environment for almost every ML paper that comes out

That's how it's supposed to work: one env per project.

As for the rest, it says more about the C/C++ community building the things below the Python wrappers.


> one env per project

That causes 50 copies of the exact same version of a 1GB library to exist on my system that are all obtained from the same authority (PyPI). I have literally 50 copies of the entire set of CUDA libraries because every conda environment installs PyTorch and PyTorch includes its own CUDA.

I'm not asking the OS to maintain this, but rather the package manager ("npm" or "pip" or similar) should do so on a system-wide basis. "python" and "pip" should allow for 1 copy per officially-released version of each package to live on the system, and multiple officially-released version numbers to coexist in /usr/lib. If a dev version is being used or any version that deviates from what is on PyPI, then that should live within the project.


Actually conda creates hardlinks for the packages that it manages. Found this out a few weeks ago when I tried migrating my envs to another system with an identical hierarchy and ended up with a broken mess.


> but rather the package manager ("npm" or "pip" or similar) should do so on a system-wide basis.

I basically agree with this. With the caveat that programs should not use any system search paths and packages should be hardlinked into the project directory structure from a centralized cache. This also means that a dev version looks identical to a centralized version - both are just directories within the project.
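A toy model of that scheme (the `link_package` helper is my own sketch, not pip/npm/conda internals): keep one copy of each package version in a central cache and hardlink it into every project that wants it, so N projects cost one copy of disk.

```python
import os
import pathlib
import tempfile

def link_package(cache, project, pkg, version):
    """Hardlink one cached package version into a project's deps dir."""
    src = pathlib.Path(cache) / pkg / version
    dst = pathlib.Path(project) / "deps" / pkg
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.iterdir():
        os.link(f, dst / f.name)     # same inode: no data is copied

# Demo: one cached file, linked into two projects.
root = pathlib.Path(tempfile.mkdtemp())
cached = root / "cache" / "leftpad" / "1.0"
cached.mkdir(parents=True)
(cached / "lib.py").write_text("PAD = ' '\n")

for proj in ("proj_a", "proj_b"):
    link_package(root / "cache", root / proj, "leftpad", "1.0")

a = (root / "proj_a" / "deps" / "leftpad" / "lib.py").stat()
b = (root / "proj_b" / "deps" / "leftpad" / "lib.py").stat()
print(a.st_ino == b.st_ino, a.st_nlink)  # True 3
```

The catch, as noted elsewhere in the thread, is that hardlinks only work within one filesystem and silently turn into a shared mutable copy if any tool edits files in place rather than replacing them.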


Are you just describing something close to Nix?? In any case, Nix solves a lot of these problems.


Kind of, but not really. Nix is extremely complicated. Programs and projects bundling their own dependencies is exceedingly simple.

Also, Windows is my primary dev environment. Any solution must work cross-platform and cross-distro. Telling everyone to use a specific distro is not a solution.


It is complicated... but honestly I have found Claude 3.5 to just 'fix it', so you hardly have to spend much time spelunking. You just give it all your dependencies and tell it what you want, and it'll whip up a working flake in a few iterations. Kinda magic. So yeah, when you can abstract out the complexity, it moves the needle enough to make it worth it.


Nix != NixOS. It runs on WSL: https://github.com/nix-community/NixOS-WSL


Less than zero interest in WSL.

Nix fans are becoming as obnoxious as Rust fans. And I say that as an at-times-annoying Rust fan.


Ah, sorry I misunderstood.

Yes, it would be nice to have that by default.

In fact, it's what uv (https://github.com/astral-sh/uv) does, and one of the reasons it's so fast and became so popular so quickly.

Astral for the win.


I don't think that's true for the exact same version: https://stackoverflow.com/a/57718049 (ie. it's deduplicated)


ML researchers might be thinking that their paper will be obsolete next month so why bother taking time to make their coding environment reproducible.


It’s not the researcher’s fault if the libraries they use make breaking changes after a month; proof-of-concept code published with a paper is supposed to be static, and there’s often no incentive for the researcher to maintain it after publication.

At this point, venvs are the best workaround, but we can still wish for something better. As someone commented further up, being able to “import pytorch==2.0” and have multiple library versions coexist would go a long way.


I install most tooling, including Jupyter, using pipx. The only thing I then need to install in the project venvs is ipykernel (which I add as a dev dep), and then create a kernel config that allows Jupyter to be run using that venv.


I'm hopeful that uv will bring us closer to tooling on par with other language ecosystems. But it's very early on in the process.


Given their track record, I'm confident they will.

But what I really hope is that they'll tackle the user app shipping problem eventually.


The problem I see a lot of JS developers having when they start using Python is they try to do the "import the entire world" strategy of development that's common in JS, and there isn't good tooling for that because Python just doesn't have that culture. And that's because it's a bad idea--it's not a better idea in JS, it's just more part of the culture.

Pick one package source. Stick with it. And don't import every 0.0.x package from that package source either.

There are obviously reasons to use more than one package source, but those reasons are far rarer than a lot of inexperienced devs think they are. A major version number difference in one package isn't a good reason to complicate your build system unless there are features you genuinely need (not "would be nice to have", need).


And it doesn't provide any way to point at another package repository if you want to stick to vanilla pyproject.toml + build (the official build tool). So if you want the CUDA or ROCm version of torch, for example, you have to add a direct link to the package. That means hard-coding a link to a platform-specific version of the package. There's no other way to make a package look at a non-PyPI repository to get the version you want.

So say you want to add PyTorch, with GPU acceleration where the platform supports it, and you want it to be multiplatform to some extent. You can't add another index if you want to use vanilla build, as that's not allowed. You can add a direct link (that's allowed, just not an index), but that's going to be specific to a platform + Python version. PyTorch doesn't even provide CUDA packages on PyPI anymore (due to issues with PyPI), so you need to be able to use another index! You'd need to manually create a requirements.txt for each platform, create a script that packages your app with the right requirements.txt, and then do it all again whenever you update. Otherwise, I think the most recent advice I've seen was to just make... the user download the right version. Mhmmmm.

The other option is to use Poetry or something like that, but I just want to use "python -m build"...


Poetry at least helps with that for Python. It's all still a mess though.


But you can do that, just obviously not with this syntax. It's non-standard, but I have built programs that install all their dependencies as a first step. It's pretty trivial.


KivyMD has a good selection of Material Design compliant widgets for Kivy. It does for Kivy what MUI does for React.


Yeah, the stock photography feels really off to me as well and doesn't really help show off what the project is. Strange vibes.

