
Loneliness is a symptom of the loss of the third place. You can't solve loneliness, but we can as a society look at why many have lost their place for socializing.

Hyperfocus on productivity, the one-dimensional man who knows only rest and work, and the rise of narcissism and hyper-individuality are all causes of the loss of the third place.

Everyone needs to take on the quest to find where they belong, but society needs to give people time to invest in this quest.

So I think it's as simple as working less and spending more time with people.


I designed a fairly complex test matrix, offloading a lot of the logic to the control mechanisms GitLab offers. You create job templates, or base jobs, that control the overall logic and extend them for each particular use case. I had varying degrees of success, and it's not a job for a dev's side quest: I think you need someone dedicated to exploring, building and debugging these pipelines. But for a CI tool it's very good.

Because you can extend and override jobs, you can create seams so that each piece of the pipeline is isolated and testable. This way there is very little that can go wrong in production that's the CI's fault. And I think that's only possible because of the way GitLab models its jobs and stages.
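
To give a flavor of the pattern, here's a minimal sketch of extend-and-override; the job names, image and matrix values are illustrative, not from my actual pipeline:

    # .gitlab-ci.yml (illustrative)
    .base-test:                 # hidden base job holding the shared logic
      stage: test
      image: python:3.12
      before_script:
        - pip install -r requirements.txt
      script:
        - pytest tests/
      rules:
        - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'

    test-matrix:
      extends: .base-test       # inherits everything from the base job
      parallel:
        matrix:                 # fans out into one job per combination
          - TARGET_OS: [linux, windows]
            PYTHON_VERSION: ["3.11", "3.12"]

    test-smoke:
      extends: .base-test
      script:                   # overrides only the seam under test
        - pytest tests/smoke -x

Because each extending job overrides only one seam, you can stub a script out and exercise the surrounding pipeline logic in isolation.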


Do you have a reference for the Krita+ComfyUI setup? I have a drawing tablet and have always wanted to augment drawings using AI, but I never got around to deploying a stack for that. I have a 3090 that should be enough for it; I just need a reference for the setup.

Plugin for Krita I use: https://github.com/Acly/krita-ai-diffusion

App I recommend to download models and manage UIs: https://github.com/LykosAI/StabilityMatrix

(1) Download Krita, (2) download and install the Krita AI diffusion plugin, (3) run ComfyUI using StabilityMatrix.

Docs for using the Krita AI plugin: https://docs.interstice.cloud/basics/ It's a really fun plugin to use!


> we are given random connections and asked to do the mental work of inventing meaning in them

How is that different from having an insight yourself and later doing the work to see if it holds on closer inspection?


Don't ask me to elaborate on this, because it's kinda nebulous in my mind. I think there's a difference between arriving at an insight and interrogating it on your own initiative, and simply being given the same insight.

I don't doubt there is a difference in the mechanism of arriving at a given connection. What I think is not possible is to distinguish the connection that someone made intuitively after reading many sources from the one that the AI makes, because both will have to undergo scrutiny before being accepted as relevant. We can argue there could be a difference in quality, depth and search space, maybe, but I don't think there is an ontological difference.

The one that you thought of in the shower has a much greater chance of being right, and also of being relevant to you.

Has it? Why?

Because humans aren't morons tasked with coming up with 100 connections.

Doesn't explain why a connection made in the shower has in essence more merit than a connection an LLM was instructed to come up with.

Not sure how to make it clearer. Look at the quality of this post, and compare it to your shower thoughts. I imagine you're not as stupid as the machine was.

The article is, however, about a different kind of stack.

OP has an obsession with how every web/JavaScript developer is terrible besides him

How do you know when you don’t suck?

I don't think you can. Success is circumstantial, failure is personal. Sucking is the only way to know your limits.

The management answer is that you compare yourself against your peers using qualified metrics. You stop sucking when your numbers are high enough on your organization's bell curve. Most developers can't measure things, and most organizations won't train them, which limits them to forever sucking at what they do.
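
As a toy illustration (the metric and scores here are invented), "high enough on the bell curve" just means a percentile against a distribution fitted to your peers' numbers:

    from statistics import NormalDist

    # Hypothetical peer scores on some qualified metric.
    peer_scores = [62, 71, 55, 80, 68, 74, 59, 66]
    mine = 77

    bell = NormalDist.from_samples(peer_scores)  # fit the "bell curve"
    print(f"ahead of {bell.cdf(mine):.0%} of the fitted peer curve")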

But by then it becomes a numbers game, and it stops being about quality and starts being about optimizing the given metrics. If you can, you should always strive to suck less. If you can't, then it's maybe time to seek some other working environment that will enable you to do so.

I have had the opportunity to try your optimized JS code and it sucks tho. It is almost comedic seeing you bragging about such a "blazing fast" bad app.

Sucks how and do you have an example of a React SPA that executes faster?

Sucks as in not being interesting enough for the fast load to be worth anything. Why compare it to React, if React is shit?

So I take it the application works well for you, that you aren’t interested in home server administration, and that you have no examples of any SPA that is faster.

It would have been more helpful had you said that, because otherwise your comment just sounded like empty whining.


Well, you can take it the way you want to feel better, but that was not the case. And the code was even worse than the results: lots of boilerplate. I don't want to be mean, but it is funny to see you brag so much about bad code that you think is good.

Or "nice project, I've done exactly the same 10 years before..." Or "cool project, but why haven't you tried [insert obscure software]".

The list goes on.


the "why not xxxx?" comments really are the height of disrespect, ignoring someone else's effort to instead show how smart they are, while being lazy about it. I bet 9/10 people who make such comments never even look at the original project in any depth, let alone try it out in anger.

That might be the case sometimes but it is incredibly uncharitable to assume so by default. This is a discussion forum for the technically inclined. "Why not X" is an entirely reasonable, even valuable, question. It's not ignoring the other party's efforts but rather attempting to learn from them. Why _didn't_ you use this framework that at first glance appears to be the obvious and easy thing? There must be a good reason and I'd like to learn about it.

The topic has been discussed at length here before. https://news.ycombinator.com/item?id=21675717


Very often there is no reasoning given.

There could instead be warm fellow-feeling where everyone maintains respectful silence about alternatives, everybody with a new project gets a lovely ego boost, and I remain uninformed about what else exists.

If that doesn't come in the form of a discussion grounded in the original post, I could just as well have asked chatgpt and wouldn't have known the difference.

all about you, all the time, right?

I mean, there's room for both things. It would be bad if nobody at all was willing to fawn over a new project and call it exciting instead of shooting it down. (After all, it might be my project ...)

Shouldn't be sensitive about people asking "why didn't you do this?" though, or "that reminds me of my own thing that I made back in the stone age". Those are useful and reasonable points, if not made unkindly. Reality is unkind.


It's hard to explain, but easy to spot when someone is genuinely contributing and when it's just ego-boosting.

We can at least agree we are talking about the latter.


Thinking about it, I really don't know, couldn't say for sure what people do. I'm gonna let them get on with it. Here endeth the meta-comment. :)

I'll know what to think if I see "what about... ?" though.

That's no middle ground, that's the bare minimum.

Bare minimum for what? Bare minimum to establish a political common ground or whatever?

Bare minimum for basic decency. You can't force someone to live with the result of rape forever on the grounds that a bunch of cells got the same rights as a fully grown human being.

It's more like going into a video game and tuning the difficulty all the way down so you are virtually invincible. It's taking the fun out of the game for some, but for others that's the only way to play it.

And you know what? I’ve got a medically complicated kid with a million doctor’s appointments and a full time job. I often switch the games down to the easiest mode. Then sometimes a new Dark Souls comes out and I relish every moment, if I’ve got the time.

I’ve been having Suno make random instrumental chiptunes, too, and it’s got me interested in buying a MIDI keyboard to play around with. Which, 40 years ago, people were saying wasn’t real music either.


>a MIDI keyboard to play around with. Which, 40 years ago, people were saying wasn’t real music either.

Citation needed


Cozy games where you basically can't lose are a booming industry in the last decade, so that outlook is certainly bullish for AI creative tools!

On my LG OLED I think it looks bad. Whites are off and I feel like the colours are squashed. Might be more accurate, but it's bad for me. I prefer to use standard, disable everything and put the white balance on neutral, neither cold nor warm.


I had just recently factory reset my Samsung S90C QD-OLED and then had to work through the annoying process of dialing the settings back to something sane and tasteful. Filmmaker mode only got it part of the way there. The white balance was still set to warm, and inexplicably HDR was static (ignoring the content 'hints'), and even then the contrast seemed off, and I had to set the dynamic contrast to 'low' (whatever that means) to keep everything from looking overly dark.

It makes me wish that there was something like an industry-standard 'calibrated' mode that everyone could target, and let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tacky DSP.


"Warm" or "Warm 2" or "Warm 50" is the correct white point on most TVs. Yes, it would make sense if some "Neutral" setting was where they put the standards-compliant setting, but in practice nobody ever wants it to be warmer than D6500, and lots of people want it some degree of cooler, so they anchor the proper setting to the warm side of their adjustment.

When you say that "HDR is static" you probably mean that "Dynamic tone-mapping" was turned off. This is also correct behavior. Dynamic tone-mapping isn't about using content settings to do per-scene tone-mapping (that's HDR10+ or Dolby Vision, though Samsung doesn't support the latter), it's about just yoloing the image to be brighter and more vivid than it should be rather than sticking to the accurate rendering.
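
A toy sketch of the difference, in Python; the numbers are invented and no real TV implements exactly this, but it shows the idea: static tone mapping applies one fixed curve, while "dynamic" modes rescale each scene to make it pop:

    DISPLAY_PEAK = 800.0  # nits this hypothetical panel can reach

    def static_tonemap(nits):
        """One fixed Reinhard-style rolloff, the same for every scene."""
        l = nits / DISPLAY_PEAK
        return DISPLAY_PEAK * l / (1.0 + l)

    def dynamic_tonemap(scene_nits):
        """Rescales each scene by its own peak, so dim scenes get
        brightened well past what the colorist mastered."""
        peak = max(scene_nits)
        return [DISPLAY_PEAK * n / peak for n in scene_nits]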

What you're discovering here is that the reason TV makers put these "garbage features" in is that a lot of people like a TV picture that's too vivid, too blue, too bright. If you set it to the true standard settings, people's first impression is that it looks bad, as yours was. (But if you live with it for a while, it'll quickly start to look good, and then when you look at a blown-out picture, it'll look gross.)


This is all correct.

“Filmmaker Mode” on LG OLED was horrible. Yes, all of the “extra” features were off, but it was overly warm and unbalanced as hell. I either don’t understand “Filmmakers” or that mode is intended to be so bad that you will need to fix it yourself.


Filmmaker is warm because it follows the standardized D6500 whitepoint. But that's the monitor whitepoint it is mastered against, and how it's intended to be seen.
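
For reference, D6500 isn't an arbitrary choice; it's a point on the CIE daylight locus. A minimal Python sketch using the published daylight-locus coefficients (6504 K is the usual modern figure for D65):

    # Chromaticity (x, y) of a CIE D-series illuminant from its
    # correlated color temperature; valid for 4000 K <= T <= 7000 K.
    def daylight_xy(T):
        x = (0.244063 + 0.09911e3 / T
             + 2.9678e6 / T**2 - 4.6070e9 / T**3)
        y = -3.000 * x**2 + 2.870 * x - 0.275
        return x, y

    print(daylight_xy(6504))  # ~(0.3127, 0.3290): the D65 white point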

TV makers always set their sets to a much higher color temperature by default, because blue tones show off colors better.

As a result of both that familiarity and the better saturation, most people don't like filmmaker when they try to use it at first. After a few weeks, though, you'll be wondering why you ever liked the oversaturated neons and severely off brightness curve of other modes.

Or not, do whatever you want, it's your TV!


The whites in Filmmaker Mode are not off. They'll look warm to you if you're used to the too-blue settings, but they're completely and measurably correct.

I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.


The problem is that, compared to all the monitors I have, specifically the one in my Lenovo Yoga OLED that is supposed to be very accurate, whites are very warm in filmmaker mode. What's that about?


Your monitor is probably set to the wrong settings for film content. Almost all monitors are set to a cool white point out of the box. If you're not producing film or color calibrated photography on your monitor, there is no standard white temperature for PC displays.


The Lenovo ships with an official ICC profile, so I think that's unlikely.


Still looks like yellow piss.


Disclaimer: I prefer movies to look like reality, but apparently this is far away from "artistic purpose".


What does “like reality” mean?


It means that the colors should be correct. The sky on TV should look like the sky. The grass on TV should look like grass. If I look at the screen and then I look outside, it should look the same. HDR screens and sensors are getting pretty close, but almost everyone is using color grading, so the advantage is gone. And after colors, don't get me started on motion and the 24fps abomination.


> It means that the colors should be correct. The sky on TV should look like the sky. The grass on TV should look like grass.

It is not as clear cut as you think and is very much a gradient. I could send 10 different color gradings of the sky and grass to 10 different people and they could all say it looks “natural” to them, or a few would say it looks “off,” because our expectations of “natural” looks are not informed by any sort of objective rubric. Naturally if everyone says it’s off the common denominator is likely the colorist, but aside from that, the above generally holds. It’s why color grading with proper scopes and such is so important. You’re doing your best to meet the expectation for as many people as possible knowing that they will be looking on different devices, have different ideas of what a proper color is, are in different environments, etc. and ultimately you will still disappoint some folks. There are so many hardware factors at play stacked on top of an individual’s own expectations.

Even the color of the room you’re in or the color/intensity of the light in your peripheral vision will heavily influence how you perceive a color that is directly in front of you. Even if you walk around with a proper color reference chart checking everything it’s just always going to have a subjective element because you have your own opinion of what constitutes green grass.


In a way, this actually touches on a real issue. Instead of trying to please random people and make heuristics that work in arbitrary conditions, maybe start from objective reality? I mean, for a start: take a picture, then immediately compare it with the subject. If it looks identical, that's a good start. I haven't seen any device capable of doing this. Of course, you would need the entire sensor-processing-screen chain to be calibrated for this.


Everything I talked about above applies even more so now that you’re trying to say “we’ll make a camera capture objective colors/reality.” That’s been a debate about cameras ever since the first images were taken. “The truth of the image.”

There is no such thing as the “correct” or “most natural” image. There is essentially no “true” image.


I completely agree. Theoretically you could capture and reproduce the entire spectrum for each pixel, but even that is not "true", because it is not the entire light field. But I still think we can look at the picture on the phone in our hand and at the subject right in front of us, and try to make them as similar as possible to our senses. That looks to me like a big improvement over the current state of affairs. Then you can always say to a critic: I checked just as I took the picture/movie, and this is exactly how the sky/grass/subject looked.


White walls in my kitchen look different depending on the time of day and weather, and that’s before I turn on the lights.

What is the correct colour?


Well, I know what you mean, color is complicated. BUT, I can look at a hundred skies and they all look like sky. I will look at the sky on the TV, and it looks like sky on the TV, not like the real sky. And sky is probably easy to replicate; if you take grass or leaves, or human skin, the TV becomes funny most of the time.


> I will look at the sky on the TV, and it looks like sky on the TV, not like the real sky.

Well for starters you’re viewing the real sky in 3D and your TV is a 2D medium. Truly that immediately changes your perception and drastically. TV looks like TV no matter what.


In a world full of deception, the spherical cow is a cup of fresh milk.


Is the milk spherical too?


Yes, if it's floating in space in a pressurized spaceship.


Cylindrical straw not included. Limited time offer. Warranty may be void if spaceship uses any reaction wheel or propulsion system. Other exclusions and limitations apply, see ...

