Hacker News new | past | comments | ask | show | jobs | submit | sinab's comments login

On the topic of low-noise bikes, why do some riders insist on having a loud freewheel? The clicking noise is unbearable in my opinion.

Source: https://bicycles.stackexchange.com/questions/4606/what-cause....


It's not so much an insistence, more just where the industry is at. Those freewheel hubs are loud because that's generally the lightest, most efficient design - lots of teeth for immediate drive engagement, no extra grease that would lessen said engagement, no extra material for sound dampening.

Clutch mechanisms are available, but these are slightly less efficient and generally more expensive.

Basically... People love spending lots of money on the absolute highest quality components. How will other riders know how fancy your hubs are unless they broadcast that fact to the world?


In group rides with expensive bikes, it's often associated with higher-quality wheels.

Friend of mine had Zipp wheels which had a distinctly louder freehub.


It is just the fashion of our time. See also: gravel bike tyres with tan sidewalls, or if you’re old enough, bar ends.


I don't, but my Zipps are loud and I like them otherwise.


Where are health providers hosting these data? Could you give an example? Thank you.


This is exciting news. Currently, a free online “Photoshop” alternative is https://www.photopea.com/


Photopea is fantastic, fast, looks and works exactly like Photoshop, and maintains some features removed from Photoshop, such as the ability to generate normal maps.

Don't buy into Adobe's bullshit.


The data availability statement in this article undermines its entire premise.

> We did not publish our raw data along with the manuscript because it could be understood that we are publicly shaming authors who did not want to share their data. As for the raw data that were received during this study, we informed our study participants that those raw data will be deleted after being examined and that all data and communication will be treated with strict confidence.


There's a big difference between explicitly declining to share data for justified ethical reasons, and saying you will share the data and then failing to follow through.

There is absolutely no contradiction here.


The concern of “shaming” is not a justified ethical reason.


Super cool, thanks for sharing. Would it be possible to hash the content in such a way that it is addressable by a memorable phrase?
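For what it's worth, one common approach (not from this project, just a sketch) is to hash the content and use the digest bytes to index into a fixed wordlist, similar in spirit to PGP word lists or BIP-39 mnemonics. The tiny wordlist below is a toy placeholder; a real scheme would use a large standardized list so phrases stay short and collisions stay rare:

```python
import hashlib

# Toy wordlist for illustration; a real scheme would use a large fixed
# list (e.g. the 2048-word BIP-39 list) shared by all participants.
WORDS = ["apple", "brave", "cloud", "delta", "ember", "frost", "grove", "haze"]

def memorable_phrase(content: bytes, n_words: int = 4) -> str:
    """Map content to a deterministic phrase by slicing its SHA-256 digest."""
    digest = hashlib.sha256(content).digest()
    # One digest byte per word, reduced modulo the wordlist size.
    picked = [WORDS[digest[i] % len(WORDS)] for i in range(n_words)]
    return "-".join(picked)
```

The phrase is deterministic (same content, same phrase), but note it's only an address hint, not a full content hash: with few words it can collide, so you'd still verify the full digest after lookup.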


The "Get a demo" button doesn't seem to work.

Google Chrome Version 98.0.4758.80 (Official Build) (x86_64)


Ah duh, fixed. Thanks!


It occurred to me, after reading the comments, how strange it is to then write out "huh" in text-based conversation.


For which kinds of problems are matlab and python unworkable?


All of these have some exceptions, but are for the most part true:

high energy physics

lots of computational bio

fluid dynamics

rendering

In general, python and matlab really struggle in problems where you want maximum performance, but the most efficient algorithms aren't vectorized. In some cases, this is solvable by writing python libraries in C/C++, but especially in scientific fields, the end users are often the same people writing the algorithms, so they don't gain much from a python library if they have to do all the hard work in C++ anyway. Julia gives them a way more productive dev experience while still having a good user experience.
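A toy illustration of the "not vectorizable" point: an iterative recurrence where each step depends on the previous one, so the loop itself is the algorithm and pure Python pays interpreter overhead on every iteration (the function name here is made up for the example):

```python
def logistic_trajectory(x0: float, r: float, n: int) -> list[float]:
    """Iterate the logistic map x -> r*x*(1-x) for n steps."""
    # Step i depends on step i-1, so there is no single NumPy
    # expression that computes the whole trajectory at once.
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs
```

In NumPy/Matlab you can't collapse this time loop into an array operation; in Julia (or C), the same straightforward loop compiles to fast machine code.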


high energy physics (HEP) shameless plug: https://github.com/tamasgal/UnROOT.jl

indeed, the grants (in the millions of dollars) given to rewrite C++ or Python just to handle arrays (because for loops are slow) that end up producing monorepo blobs are jaw-dropping -- while it's almost free in Julia: with 1/100th of the lines of code and more flexibility, we can match and even beat C++ code...


Optimization type problems. Also anything with loops that isn't easily vectorizable.


Statistical bootstrapping techniques are the ones I ran into.
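Bootstrapping is a good example of the pattern described above: the resampling loop doesn't vectorize cleanly, so pure Python is slow at exactly the part that matters. A minimal stdlib-only sketch of a percentile bootstrap CI for the mean:

```python
import random
import statistics

def bootstrap_mean_ci(data, n_resamples=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = []
    for _ in range(n_resamples):
        # Resample with replacement; this inner loop is the hot path
        # that an interpreted language executes slowly.
        sample = [rng.choice(data) for _ in data]
        means.append(statistics.fmean(sample))
    means.sort()
    lo = means[int((alpha / 2) * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples)]
    return lo, hi
```

With thousands of resamples over nontrivial statistics, that doubly nested loop is where Julia's compiled loops pay off.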


Interesting article on the history of Perl for bioinformatics.

As a current bioinformatics PhD student, I'd say Perl has become the vinyl of programming languages. While some older folks still script in Perl, most packages I've seen published recently use R, Python, C++, and more recently, Rust.


I’ve been working with biology data. A lot of our older scripts are Perl. It’s really good at this stuff. The great thing is the language is super stable so we don’t have to rewrite them usually.

Our new stuff tends to be in R or python though.


The Perl scripts I wrote 20 years ago are still running on modern systems today. It's almost scary how stable the language and interpreter have been.

Turns out it's not an accident, though! A lot of good decisions kept the language stable, and test frameworks and code coverage have been a critical part of Perl and its modules since the 1980s. In fact, pretty much all of CPAN gets tested on a matrix of Perl versions and operating systems (ex: http://www.cpantesters.org/distro/N/Net-Amazon-EC2.html).


In many ways Perl's testing culture was the forerunner of TDD. CPAN Testers earned a lot of respect for making the Perl ecosystem stable.


this.

i remember silly java consultants rabbiting on about TDD and agile while dismissing oss.

meanwhile, the curation of both testing and documentation as well as overall code quality on CPAN was light years beyond the best corporate code i've ever seen.

i'd argue that perl (with CPAN) was the first internet native programming environment.


Yes, I was also there in those years. Perl snippets were shared by everyone, but almost nobody knew how they worked, much less how to tweak them for new scenarios or how to contribute to a library. I remember a snippet that read a FASTA file and counted the GC%, but it had an error somewhere that made it fail when it reached the 1001st sequence. Nobody could find the error; they just split the FASTAs into smaller files. Until someone wrote a less esoteric Python script, and the Perl snippet died.


Might be unrelated, but Perl does have a hard limit on recursion within the same function; after 100 recursions it'll die with an error and you have to unroll your loops.


False.

    $ perl -Mwarnings -E'sub f { $c++; say $c if 0 == $c % 1_000_000; f() } f'
    Deep recursion on subroutine "main::f" at -e line 1.
    1000000
    2000000
    3000000
    4000000
    5000000
    6000000
    7000000
    8000000
    9000000
    10000000
    11000000
    12000000
    13000000
    14000000
    15000000
    16000000
    17000000
    18000000
    19000000
    20000000
    Terminated
You get a warning after 100 calls, which almost always indicates a bug. If it's genuine deep recursion, the warning can easily be suppressed with `no warnings "recursion"`.

On my computer, the program continues to run for about 10 seconds, consuming 9 GB virt./res. after which it is killed off by `earlyoom`.


It might have behaved slightly differently 20 years ago when I ran into it :) or maybe I misremember!


Awesome project, thank you for sharing. What protocol / method is the controller using to communicate with the browser?


Thanks. It uses WebSockets.

