I've ordered 2 sets off of AliExpress (the Stargate BC303 and BC304 MOCs) and was quite impressed. No box, digital instructions, and a few minor color-swapped pieces; but complete, and everything went together very well.
I'll throw a third (or fourth and fifth, since I know a couple of people who'd play this on Mac but have no access to Linux or Windows) request for a Mac version on the pile.
On the contrary, making the tool that makes the tool is what I live for! My personal tech stack has benefited incredibly from this practice and fuels my startup, though it did take me 20 years of slow iteration to get here.
Well I’m not anti ;) … I just mean if your goal is to make the thing and you’re sure you need a tool to do it, watch out for the temptation to make the tool that makes the tool, which is the LONG way around, as OP was saying
that's really dope, but i'm not sure it'd work out the same way nowadays. i think we're in a weird stage where momentum REALLY matters in a way it didn't 10 years ago and probably won't 5 years from now
It covers more than that, but it's not strictly mandatory.
> Content Filters: Discord users will need to be age-assured as adults in order to unblur sensitive content or turn off the setting.
> Age-gated Spaces – Only users who are age-assured as adults will be able to access age-restricted channels, servers, and app commands.
> Message Request Inbox: Direct messages from people a user may not know are routed to a separate inbox by default, and access to modify this setting is limited to age-assured adult users.
> Friend Request Alerts: People will receive warning prompts for friend requests from users they may not know.
> Stage Restrictions: Only age-assured adults may speak on stage in servers.
> > Stage Restrictions: Only age-assured adults may speak on stage in servers.
Does this mean that in panel-like settings where hundreds of users are listening to a speaker, you need to be verified in order to ask a question or contribute in voice?
What gets deemed “adult” is incredibly random as far as I can tell; some of our servers/messages have triggered it, but no porn or anything like that is shared in them.
Yet at the time X was OPENLY posting and later selling (a feature hidden behind a paid subscription) CSAM(1) and non-consensual nudity, payment processors were still okay with it.
> "Put another way, Grok generated an estimated 190 sexualized images per minute during that 11-day period. Among those, it made a sexualized image of children once every 41 seconds."
Getting everyone to switch away from Discord has been hard: asking people to spontaneously switch with no clear benefit hasn't worked. They just want to keep using the app and get back into a game with their friends.
It's different when you lock the door and task users with getting a key to come back in. This is more like an MMORPG that kills its own audience: the change causes the core group to stop playing, then every other player's experience gets worse, and the downward trend avalanches.
> getting everyone to spontaneously switch with no clear benefit hasn't worked
Somehow Discord pulled it off. It really didn't have much of an edge over the other chat apps at launch; it was just slightly easier to use because it was simpler. A new site launching now could easily have the same edge over Discord.
You're ignoring the massive edge it had over TeamSpeak and Mumble. Back when Discord was launched, it was significantly better than its competitors and the cherry on top was that you didn't have to install anything or host your own server, just make an account.
These days I would recommend PGlite for testing purposes when you use Postgres in production. That way you don't need any SQLite-vs-Postgres behavior switches.