
Sleigh guidance.



> But those are also the most expensive line items in Federal and State budgets,

https://media.nationalpriorities.org/uploads/discretionary_s...


Two things missing:

1. Non-discretionary spending.

2. 50 additional links to breakdowns of the public budgets for the 50 States.


I hope you are kidding. Tell you what, why don’t you show yourself and search for common racist language or homophobia?

https://counterhate.com/blog/fact-check-musks-claim-about-a-...


This is demonstrably false, extremely biased reporting, and a cherry-picked source with a clear agenda. Hate speech is at an all-time low.

https://blog.twitter.com/en_us/topics/product/2023/freedom-o...


Elon has stated repeatedly that X adheres to freedom of speech, not freedom of reach. Up until recently, X was a left-wing echo chamber. This is a classic example of MSM falsely labeling anything non-moderate and non-left-wing as "hate speech."

And this basically proves my point--you're not able to show, concretely, a non-cherry-picked example of this purported "hate speech."


“Brighter-than-white”…

Gilchrist enters the chat.

http://wexler.free.fr/library/files/gilchrist%20(1999)%20an%...


False.


Epistemology 101. Statistical pattern matching has no epistemological truth beyond a correspondence of linguistic statistics.

It is merely statistical probability, which hardly can be classified as an epistemological basis. If we pretend for a moment it is, the ontologies formed are most certainly peculiar, and we can expect ideologies that emerge out of the ontologies are problematic.


Bayesians accept an epistemological foundation of statistical priors updated by experience.

Predictive processing is well established in the neuroscience community.

Capital T Truth is really of very little interest to anyone outside of faith-based epistemology like religion.
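What "statistical priors updated by experience" looks like in practice can be made concrete with a toy coin-bias problem. This is a minimal illustrative sketch, not anything from the thread; all names and numbers are mine:

```python
# Toy Bayesian update: a discrete prior over a coin's bias toward heads,
# updated flip by flip. Each update multiplies prior by likelihood and
# renormalizes -- belief shifts toward hypotheses that fit the evidence.

def update(prior: dict, heads: bool) -> dict:
    """Multiply each hypothesis's prior by the likelihood of this flip, renormalize."""
    posterior = {bias: p * (bias if heads else 1 - bias) for bias, p in prior.items()}
    total = sum(posterior.values())
    return {bias: p / total for bias, p in posterior.items()}

# Three hypotheses about the coin, initially equally credible.
belief = {0.2: 1 / 3, 0.5: 1 / 3, 0.8: 1 / 3}

for flip in [True, True, False, True]:  # the "experience": mostly heads
    belief = update(belief, flip)

# After mostly-heads evidence, the 0.8-bias hypothesis dominates.
```

Note the updates never reach certainty: only priors pinned to 0 or 1 stay at 0 or 1, which is the point made downthread about Bayesian inference reducing to classical logic.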


Physicists are very interested in capital T Truth. They work with explanations not just statistical priors.


This talk by Feynman is slightly related:

https://youtu.be/obCjODeoLVw


The issue is that the definition of “statistics” is anchored in a magnitude of frequency of glyphs. The “information” is fabricated in this regard, pulled up out of the ether, and by decree christened as “meaning”.

Numbers carry no meaning, nor do the magnitudes arbitrarily assigned to meaning. The map is not the territory.


> The “information” is fabricated in this regard, pulled up out of the ether, and by decree christened as “meaning”.

No, not fabricated, but inferred from a structured corpus of information generated by other semantic processes (humans).

> Numbers carry no meaning, nor do the magnitudes arbitrarily assigned to meaning.

Prove it.

> The map is not the territory.

Except if the territory is information, in which case the map is literally the territory. Knowledge is information, is it not?


Have you heard of information theory?

Numbers can mean anything. A multitude of numbers, as voltage potentials and ion gradients, sufficiently describes your brain.

Biology manifests this arrangement as a brain, without which this arrangement would also be similarly meaningless.

Your argument against deriving meaning from statistics completely ignores that the brain also works this way.


> Have you heard of information theory?

> Your argument against deriving meaning from statistics completely ignores that the brain also works this way.

The brain is not predicting; it's compressing.


> The brain is not predicting; it's compressing

Compression requires prediction, therefore your brain requires prediction.
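The "compression requires prediction" claim can be made concrete with a toy delta coder: whatever a predictor anticipates need not be stored, only the residual errors. A minimal sketch (the code and names are mine, purely illustrative):

```python
# Minimal predictive (delta) coder. The predictor guesses each sample equals
# the previous one; only the prediction errors (residuals) are kept. The
# better the predictor, the smaller and more compressible the residuals.

def encode(signal):
    """Store only what the predictor failed to anticipate."""
    residuals, prev = [], 0
    for x in signal:
        residuals.append(x - prev)  # prediction error
        prev = x
    return residuals

def decode(residuals):
    """Invert: add each residual back onto the running prediction."""
    signal, prev = [], 0
    for r in residuals:
        prev += r
        signal.append(prev)
    return signal

signal = [10, 11, 12, 12, 13, 14]
residuals = encode(signal)          # [10, 1, 1, 0, 1, 1] -- small, low-entropy
assert decode(residuals) == signal  # lossless round trip
```

A smooth input yields near-zero residuals, which an entropy coder can pack far more tightly than the raw samples; with a useless predictor the residuals are as big as the signal and nothing is gained.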


Some form of prediction being used by the higher-level neurons doesn't make the brain a prediction engine.


I'm not sure who claimed the brain was a "predictive engine" or what that means exactly. The OP specifically referenced predictive coding which describes precisely what is meant, and has empirical support.

If you meant this as a comparison to machine learning, then a predictive coding model closely matches.


The current prevailing theory in neuroscience is in fact the brain is a prediction engine.

https://en.m.wikipedia.org/wiki/Predictive_coding




My former claim is that everything in the natural world can be reduced to statistics, so saying meaning cannot be derived "because statistics" is a very poor argument.

The second is a theory for the underlying mechanisms of the brain.

I'm sorry you don't understand.


> Epistemology 101. Statistical pattern matching has no epistemological truth beyond a correspondence of linguistic statistics. It is merely statistical

Your belief in epistemic truth is a statistical inference from your perception of apparently reliable causality. How do you ground this inductive inference?

Your attempt to appeal to epistemology 101 with a casual dismissal as "mere statistical probability" covers this deep, gaping maw. Bayesian inference reduces to classical logic when all probabilities are pinned to 0 and 1, but in what circumstances can we actually demonstrably infer absolute certainty? None that I can think of, except one's own existence.


I did not lay claim to epistemic truth. I referenced epistemology. That is, there are a plurality of epistemologies, all of which have their own epistemic truth mechanics.

So I would agree with you, entirely!

I was merely pointing out that statistical correlation alone affords no epistemological basis.


Statistical pattern matching is how human brains work.


This is not technically correct as best as I can ascertain.

When we think about AA, it is a representation of subpixel occlusion. As such, using uniform tristimulus to sample the "in between" value is incorrect.

There is simply no “correct” approach because there are no known models that properly model visual cognition.

What we can say though, is that the intermediate subpixel should not sample RGB tristimulus, but a loose nonuniform representation that approximates the lower order visual signal representation.

When discarding signal, RGB tristimulus is more “correct”. When interpolating the signal, approximate lightness is more “correct”. Some solid analysis is available here: https://hhoppe.com/filtering.pdf
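One concrete instance of the interpolation point: averaging gamma-encoded sRGB code values directly gives a different result than averaging in linear light. A rough sketch, assuming a simple gamma-2.2 approximation of sRGB (not the exact piecewise transfer function, and not the linked paper's analysis):

```python
# Average a black and a white pixel two ways: naively in sRGB code values,
# and in linear light. The gamma-2.2 curve below is an approximation.

def srgb_to_linear(v):  # v in [0, 1]
    return v ** 2.2

def linear_to_srgb(v):
    return v ** (1 / 2.2)

black, white = 0.0, 1.0

naive = (black + white) / 2  # averaging gamma-encoded values: 0.5
linear = linear_to_srgb((srgb_to_linear(black) + srgb_to_linear(white)) / 2)

# The linear-light average (~0.73 in sRGB code values) is noticeably
# brighter than the naive 0.5 -- averaging gamma-encoded values darkens
# antialiased edges.
```

This is why the working space matters when an AA filter blends coverage fractions, even before any question of perceptual lightness models.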


You don't have to do subpixel AA. It's arguably worse on modern high DPI displays because of the color fringing that wouldn't be present with grayscale AA.


Subpixel AA is always bad - it just might be a necessary but disgusting hack on screens with too low resolution to render fonts comfortably...

Screenshots and screen recordings of subpixel AA content are broken without magic to get clients to disable such rendering during capture...


How about subpixel hinting? A white 1-pixel vertical line has the same amount of fringing whether it's RGB or GBR or BRG.


> in the sense that atoms are a direct cause of our perceptions.

I do not believe this is correct in any manner, and reduces the role of cognition.

Jastrow’s work focuses on this departure between one context and higher order cognition.


> the concepts we develop through sensory-motor interaction with our environment structure our perception so-as-to-present a certain "level of abstraction" over the environment

But just as `sum([2, 4, 6])` presents `12`, it does so via directly summating `2`, `4`, `6`.

That our perception is aggregating and abstracting does not mean that it isn't caused directly by those things which it aggregates and abstracts.

Here, "seeing" is the `sum`, `[2, 4, 6]` are the atoms, and `12` is the perception.

A case in point: a frog can detect a single photon. Is there any sense here in which a frog's qualitative sensation of a "flicker" is not directly caused by a photon?

And likewise, the redness of the apple is just an abstract presentation of photons of light scattering off the atoms of its surface. The causal chain here is direct.

Light does not "go via" purgatory first, we see, directly, the objects of the world.


Resisting the structure requires an epistemic shift; resist applying a license.

