Everyone should experience the x220 keyboard. It's probably the best laptop keyboard I've experienced.

The x230 onward is also extremely good.

The trick for me is key travel. I bought a new Thinkpad that had 1.3mm of travel and instantly hated it. 1.5mm feels so much better (not to mention 1.8mm). I don't believe any new Thinkpads have more than 1.3mm (in the pursuit of thinness).


I posted something along the lines of "games are too easy to make" on reddit and got expectedly lambasted. My fault - don't tell people with newfound ability that the only reason they have it is that it's 100 times easier than it used to be.

A long time ago, I got interested in computers to make games but immediately veered into other kinds of software. No worries - I always planned that once I was "done" in the application/startup space, I'd head back and make those games.

Sadly - I waited too long. Like music, books, or photography - the supply-side is so inundated with content that the market is more about marketing than creation or merit. Mind you, never did I expect or even care if I made money. That was never the goal. But now I realize just to get some people to play my game would be a huge undertaking requiring tons of luck - just to rise above the noise. That was the deal breaker - I don't care about making money - but I do care about eventual players, at least if something will take months or years* to create. I wanted to make games, not do marketing.

The bright side is there's no shortage of fun games to play. I'll stay on the player side of the equation!

*Wouldn't be surprised if something like ChatGPT allows games to be made in days in the semi-near future. If so, I just might make games anyway - still not for money, and now not for players - but just to finally let those ideas out of my head.


You say making games is too easy when you mean creating the programs you play them on is easy. You entirely disregard the large number of other people and skills required. Most developers do this in most areas of expertise.


It depends on what the OC was envisioning as their game. Not every game requires a "large number of other people".


The statement that making games is easy implies games in general, not "my first game" type games. Without significant qualification, the statement is no more true than saying "coding is easy." You have to engage in some serious logical gymnastics to justify it.


Likewise, it seems fruitless to argue over a general qualifier when the poster already clearly said they have a different definition. Better to spend more time aligning than talking past each other.


I'm not sure what exact action you're suggesting I should have taken. If you're objecting to my tone, well, it's too late for that. I've spent a lot of my life hearing developers glibly dismiss the importance, complexity, and difficulty of most non-coding jobs. I'm not interested in holding a neverending stream of hands while developers embark on their journey to discover that other people's contributions and expertise are consequential. Sometimes people who don't know what they're talking about just need to hear the right answer.


>people who don't know what they're talking about just need to hear the right answer.

If that's your response to my statement, all I can say is: you're not owed an audience and your tone is a great way to have people ignore what may or may not be "the right answer".

But no, ironically enough you're proving my point. I'm simply saying to clarify what's on the user's mind before going on a rant about what you assume they mean. There's no point in arguing about an aquatic fowl's eating when the user was talking about DuckDuckGo. And then doubling down by saying "but it's true, ducks eat too much bread". At this point it's clear your conversation is a tangent no one is interested in taking.


> Mind you, never did I expect or even care if I made money. That was never the goal. But now I realize just to get some people to play my game would be a huge undertaking requiring tons of luck - just to rise above the noise. That was the deal breaker - I don't care about making money - but I do care about eventual players, at least if something will take months or years* to create.

I broadly agree but I think you're being too pessimistic about not having players. If your game is not novel (or maybe even if it is), I'm sure you can find a community that would enjoy it. Maybe participating in that community counts as marketing to you but I feel like sharing what you have created and marketing what you have created are separate.

Reddit is a good place for this type of discovery. Discords of other niche games in similar/adjacent genres would be another one. I feel like it's not as hopeless as you make it seem!


It is true that Unity and Unreal lowered the technical difficulty of making games. And the market is saturated with acceptably good games. You get posts by people who say, my game is as good as game X or game Y and has new features. Why isn't mine popular as well? It's usually a complicated question that people who enjoy making games don't like to think about: what makes people spend money on games.

But there is still plenty of room for ambitious, beautiful, complicated or thought provoking games. That's what's hard now. It used to be hard to make good pathing AI or a performant 3D renderer or real time physics. Now the challenges are different but still very hard.


> You get posts by people who say, my game is as good as game X or game Y and has new features.

A majority of the time, this just isn't true. I'd like to see a single example of a game that wasn't successful but was legitimately a good game.

And yes, I think making a completely derivative and uninspired game that works well is not the mark of making a good game, even if it technically works.


Being in the industry, I think I could give you a hundred examples of games I consider very good but not successful. Typically the reason why people think that is because the good games that are unpopular are obscure. I used to think the exact same thing before I did it for a living. You would counter that they aren't "good". (What does that even mean?) It becomes a circular argument. Good = successful, unpopular = not good.


> Wouldn't be surprised if something like ChatGPT allows games to be made in days in the semi-near future. If so, I just might make games anyway - still not for money, and now not for players - but just to finally let those ideas out of my head.

I'm currently doing exactly this. There are some games I never finished or released back in the 2009-2011 period when I tried self-employment, so I'm using ChatGPT to build me a javascript game engine and Stable Diffusion to make some art.

The world doesn't need another vertical scrolling shooter, but I'd like games that don't waste time on loading screens and have nothing to do with the evil that is 'analytics'.


If your goal is only to have a few players rather than to make money, then find and advertise to the niche communities in genres closest to your game: if people like your game, the marketing is going to take care of itself!


Mailinator of course still works this way too. It has private domains, but it still fully supports public, free, disposable addresses @mailinator.com.

Just enter any inbox you want at the top of the homepage.


People loved Bob; it was truly just a magic he had.

In 2010, Zuck gave a lot of money to NJ schools. For reasons completely unknown to me, Bob, who was at Square at the time, was given the task of writing the code to accept the donations that the public would give alongside Zuck's gift. For reasons also completely unknown to me, Bob called me and asked me to help write the code. We had something like 3 days. I was working at my own startup at the time but was excited to go work on this one-off with Bob.

We had worked together at Google and spent a lot of time in the Java community thereafter. We were both Java geeks at the time, but I was confused why he couldn't find anyone at Square.

In any case - I went to Square each day for 3(?) days. I remember that we didn't need to process the transactions (thank goodness); we just needed to validate and log them, which made the problem much, much easier. Bob insisted we write each transaction to 3 different machines for redundancy. He also insisted we use fsync, which ensured the disk writes actually happened and didn't just get left sitting in an output buffer. He was absolutely right, but I remember being saddened by how much slower it made our system.
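For anyone unfamiliar with the fsync part, here's a minimal sketch of the idea in Java NIO - not the actual Square code, just an illustration with made-up names:

    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;

    // Illustrative only: append a transaction record and force it to disk.
    public class DurableLog {
        private final FileChannel channel;

        public DurableLog(Path logFile) throws Exception {
            channel = FileChannel.open(logFile,
                    StandardOpenOption.CREATE,
                    StandardOpenOption.WRITE,
                    StandardOpenOption.APPEND);
        }

        public void append(String record) throws Exception {
            ByteBuffer buf = ByteBuffer.wrap((record + "\n").getBytes(StandardCharsets.UTF_8));
            while (buf.hasRemaining()) {
                channel.write(buf);
            }
            // The fsync: block until the bytes (and file metadata) actually reach
            // the disk instead of sitting in the OS page cache. This is the slow part.
            channel.force(true);
        }
    }

Forcing on every record is exactly what costs you throughput, which is why it felt painful even though it was the right call.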

We finished the system with time to spare. Unbeknownst to me (and maybe Bob I guess) another team at Square also implemented the system in Ruby in competition with us. I recall them being rather "anti-Java" at the time.

In any case, we then benchmarked both systems and unsurprisingly, the Java version was many times faster than the Ruby version (in transactions stored per second). Of course, apart from fsync, this was also not just Java code - it was CrazyBob's Java code which wasted nothing. I really had a blast working on that project with him.

I now realize I don't really know the finer points of why many of these decisions were made. If any early Square folks were there for this, I'd be interested in what you remember.


Oh yes - Our system ended up being good for several hundred or several thousand (don't recall) transactions per second.

Bob called me after Zuck's announcement: the system hit a peak of (wait for it) 5 transactions per second. And that for only a very short time. The system was mostly only loaded with a transaction every few seconds (or maybe even minutes).

The Ruby version would have been fine. Heck, at that speed we probably could have just printed the transactions to some screen and had someone write them down.

We did have a good laugh though.


Last year I upgraded to a Thinkpad Carbon. Almost immediately, I couldn't stand the keyboard. This sent me down a rabbit-hole of what was wrong with the keyboard (as it looked almost identical to my previous Thinkpad).

The answer is that the Thinkpad keyboards have slowly evolved to be slimmer, reducing the key travel with each step.

Some of the new ones are down to 1.3mm of travel. I realize this is a personal preference, but I noticeably make more errors under anything less than 1.8mm of travel.

Hence, I got rid of the Carbon and bought a Thinkpad T14 Gen 2 which is pretty much the latest model with 1.8mm. The Gen 3 was available, but they trimmed it down to 1.5mm.

I realize this is a pretty picky hill to die on, but I can't express how significant it is for reliable use of the keyboard for me. It seems I'll be buying T14 Gen 2s until I die. Everything else about the laptop is just fine: it runs any Linux I throw at it, has a 4k screen if desired, and has all the upgradeability I might want.

(If you get lucky, you can even find one with a proper Ethernet port. Some come without one because they apparently removed that feature during the chip shortage.)


Where did you get the info about the key travel? I have an X1-Carbon Gen 6 and I feel that the key travel is good, but I wouldn't mind a better "click" feeling when I hit the keys, like in a Macbook or a Surface.


Is the Gen3 keyboard compatible with Gen2? If so you could upgrade to a Gen3 and swap out the keyboards.


The human is starting to look like the weakest link.

For such a closed track (known route to the centimeter, guaranteed no obstacles, etc.), a self-driving version of this could do it a good deal faster.


>The human is starting to look like the weakest link.

Humans have been the weakest link since the '80s. That's the entire reason Group B was cancelled and F1 regulated into oblivion.


It took the WRC all of two years to be faster than Group B. It turned out that focusing on suspension and handling improved performance more than raw power did. That being said, Group B was just awesome and crazy. And not just because of the cars or the drivers - the spectators were just as crazy, as were the officials with their utter disregard for safety.


AIs play better chess, but no one watches them in tournaments.


There was a driverless series in development https://en.wikipedia.org/wiki/Roborace

Unfortunately it looks like it might have floundered.


There is no point without a human; what's the difference between that and a bullet/missile?


You could still have human operators. Competitive drone racing is pretty neat.


Meh.


You would think that to be true, but I don't know that any serious attempts (of which several have been made, see RoboRace et al) have yet bested a human driver.


Yeah, I'd like to see that one day on regular tracks - a true mechanical and AI engineering arms race. Though I think it would be too expensive.


A Ph.D. really is about the journey. I'd be surprised if anyone ever thought it represented more money. It's unlikely to matter for that.

The time I spent getting a Ph.D. was one of the best times of my life. Not just the work, but the environment, the other people in the same situation, and the personal growth. I pursued it because 1) I really did love learning about Computers, and 2) I just loved the University environment. Every time I "graduated" I just signed up again for the next degree.

I never thought much about "using it" after I got it. I think it got me a higher starting title at a company or two and impressed a few (probably easily impressed) people along the way. I think the most measurable* impact however was listing it on my dating profile.

Again - it was a journey worth taking without much thought of the destination.

*Note, I said "measurable", not "good" or "bad"


What did it cost? Seems like it's not worth it unless it uniquely qualifies you for something.


>I don’t think many of these people will have trouble finding a job. After all - they most likely grinded the fuck out of leetcode and system design to get in. That type of persona to do hundreds of problems doesn’t easily waver in the face of minor adversity.

Not sure you meant it, but this is an inadvertent endorsement of "leetcode". As in, it's not about the leetcode itself; it's about the type of person who masters the leetcode.


I think everyone agrees on that. It’s an arbitrary hoop at this point. We’re just trying to find those who are willing to go the distance on the hoops.


In the context of "History doesn't repeat, but it rhymes", this is just what happened when Java appeared. Everyone noted when something was written in Java (as, believe it or not, it was the cool new language).

It might have actually been worse, because there was a strong penchant for naming products starting with a "J" to indicate the fact (e.g. JNotepad, JDatabase, etc.).

It might actually be a good marketing technique to get other Rust aficionados to try your product. But otherwise, there isn't any real value now that "write once, run anywhere" doesn't just belong to Java anymore.


Even JavaScript rode the Java hype/marketing despite having no real connection to it.


No real connection, but amusingly I think javascript ended up delivering on java's "run anywhere" pitch at least as well as java did in the end.

For instance, I can apparently run javascript in a query to this DB; I can't do that with java.


Indeed. Shut off garbage collection completely and it can work. (And make sure your Java code creates no garbage, which is a new type of programming in and of itself.)
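As an illustration (my own sketch, not anything from the article): on recent JDKs you can literally disable collection with the experimental Epsilon "no-op" collector, at which point the only viable style is preallocating everything before the hot path. The class name below is made up.

    // Hypothetical demo; run with something like:
    //   java -XX:+UnlockExperimentalVMOptions -XX:+UseEpsilonGC NoGcDemo
    // Epsilon never reclaims memory, so any allocation on the hot path
    // eventually ends the process with an OutOfMemoryError.
    public class NoGcDemo {
        private static final int SLOTS = 1_000_000;
        // Preallocate everything up front, before the hot path starts.
        private static final long[] values = new long[SLOTS];

        public static void main(String[] args) {
            long sum = 0;
            for (int i = 0; i < SLOTS; i++) {
                values[i] = i;     // reuse the preallocated array
                sum += values[i];  // primitives only: no boxing, no new objects
            }
            System.out.println(sum);
        }
    }

The hard part isn't the flag; it's keeping the hot path genuinely allocation-free (no boxing, no iterators, no hidden String building).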


Maybe I'm taking your comment wrong, but why is this a bad thing? What other GC'd language just lets you turn it off?


I don't think their comment is intended to be negative really -- looks more like appreciative of the option, while cognizant of the fact that using it introduces a new challenge.


It's not about turning it off. You just don't allocate on the hot path.



> This is because Zing uses a unique collector called C4 (Continuously Concurrent Compacting Collector) that allows pauseless garbage collection regardless of the Java heap size.

This is incorrect. Firstly, C4 triggers two types of safepoints: thread-local and JVM-wide. The latter can easily go into the region of ~200 microseconds for heaps of ~32GB, even when running on fast, overclocked servers. Secondly, the design of C4 incurs a performance penalty for accessing objects due to memory barriers. This impacts median latency noticeably.

You might not believe me, but ask Gil and he'll openly admit it. This article was written by someone who:

1) doesn't know how C4 works

2) doesn't analyze relevant metrics from their JVMs


Edit: My bad - unfortunate wording. I didn't mean this as negative at all. It's a cool (if niche) ability.

