I'd love to see some of your notes behind wiring up the pseudo-physics of the cards. Or was that provided by a library?
EDIT: I need to read more carefully - looks like it's part of http://www.phaser.io, and now I've just wrecked my productivity for the day! Great job on the solitaire game.
I think what strikes me most about a program, and what tends to speak to many people, is the little touches that are a delight to discover - especially when you're not expecting them. So many comments have said "check out this thing I made", and the thing proves to be well done, but somewhat utilitarian. Now, I like utilitarian as much as the next gold-tunic-wearing Terran, but these little touches, born of a developer's personal choices, make a random human-computer interaction seem, even if by proxy, a little more human-human.
Over the last few years - probably most clearly after I was introduced to and started working with Bootstrap to create half-decent looking webpages (it's an HTML/JS/CSS UI toolkit that lets web devs without a whole lot of design chops create nice-looking pages - https://getbootstrap.com/) - I have been coming round to the idea that making something aesthetically pleasing and pleasant to visually/tactilely experience for the first time is not just a small-but-helpful part of designing a well-liked webapp, but a key part of the experience. I really feel that if I have a good-looking site and an ugly one where the clickity-click part of the UX is identical, the good-looking one will just be more inherently comfortable, appealing, and embraceable for people. And not just other people - but me, the dev, who has seen behind the curtain and knows where all the ugly parts of the code or UX are. It's quite possible this is obvious to loads of people, but it has been a slow realization for me, someone who used to get haircuts ridiculously infrequently, not shave very often, etc., in some cases thinking "well, I eat well and exercise - what's inside is good stuff! Anybody who doesn't appreciate that is <adjective-indicating-poor-appraisal-skills>."
I guess what I'm saying is, spending time on quality presentation is a key part of appreciating your user IMO, and in terms of spurring project motivation I think it can be really useful and inspiring to get a static webpage with dummy UI that just looks and feels good.
I really do feel a little silly saying this because in hindsight it seems like it is one of those things that might be obvious to everyone else, but it was not to me, and maybe there are others out there like me :)
Our games don't work as well as yours does on mobile, but I think they're still very nice on computers.
I thought I'd give it a whirl on the old ThinkPad T43 I'm using at the moment. I never really expect much from its tiny Mobility Radeon X300, but this was especially hilarious, and a most unexpected, refreshing change from the usual "oh, the disk LED is solid... again..." that I usually get. :P
FWIW, the http://aframe.io/ examples all work perfectly - I was actually slightly gobsmacked when I saw them, and I can say with a fair amount of confidence that the simpler demos run on my system as fast as they do on yours. So this GPU isn't a total lost cause with WebGL, and if it's moderately trivial to fix this, you'll net a small extra niche in your userbase (along with mobile users in third-world countries on really entry-level smartphones and tablets).
If the fault lies with Phaser.io, and you feel like poking their ticket thingy, I'm fine with you passing the screenshots along. I'm also fine with testing potential fixes, if there's any room to explore there.
If your build environment can split out 32-bit Linux binaries without too much effort, I'd be happy to test that too.
The A-Frame examples you linked seem to be simple geometric shapes with no textures, so you may just be getting lucky there.
Also, Phaser, the library GP is using, uses Pixi.js for the WebGL drawing code if I remember correctly, and Pixi is supposed to fall back to plain canvas rendering if WebGL isn't supported - so disabling WebGL for the site may make the solitaire site work.
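For reference, forcing the canvas path in Phaser 2 (the Pixi-backed line) looks roughly like this - just a sketch under that assumption, not the game's actual init code, and the container id is made up:

    // Minimal sketch assuming Phaser 2.x, which renders through Pixi.
    // Passing Phaser.CANVAS instead of Phaser.AUTO skips WebGL entirely
    // and uses Pixi's plain canvas renderer.
    declare const Phaser: any;           // provided globally by the phaser script tag

    function preload(): void {}          // stub: load card assets here
    function create(): void {}           // stub: lay out the tableau here
    function update(): void {}           // stub: per-frame logic

    const game = new Phaser.Game(
      window.innerWidth, window.innerHeight,
      Phaser.CANVAS,                     // Phaser.AUTO picks WebGL when it thinks it's available
      'game-container',                  // hypothetical parent element id
      { preload, create, update }
    );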
I tried killing WebGL, which technically worked - I see a card deck! - but it didn't practically work: said card deck interacted at about 0.75-0.9 fps. XD
It looks like I exceeded your texture memory. I use pretty high resolutions (targeting 4K displays), and dynamically build the card textures at runtime based on your screen size. Let me know if the native app improves the situation.
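One thing I could try (a sketch only, assuming a WebGL context is available to query - this is not what the current code does) is to clamp the generated texture size to whatever the driver reports. MAX_TEXTURE_SIZE is only a rough proxy, since WebGL doesn't expose total VRAM, but it would at least catch the worst cases:

    // Sketch: cap dynamically built card textures at the GPU's reported limit.
    function maxCardTextureSize(): number {
      const gl = document.createElement('canvas').getContext('webgl');
      // Ancient chips report 2048 or less here; modern ones 8192-16384.
      const maxTex: number = gl ? gl.getParameter(gl.MAX_TEXTURE_SIZE) : 2048;
      // Target the physical screen height, but never exceed the driver limit.
      const target = Math.ceil(screen.height * window.devicePixelRatio);
      return Math.min(target, maxTex);
    }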
The native build sadly didn't fix much; Electron === Chrome at this level of the stack pretty much. Thanks for taking the time to build it though :)
Incidentally, I tested the 32-bit build on my T43, and also tried the x64 build on my ever so slightly newer T60; both seem to have exactly the same issue: I'm getting pages and pages of
r300: CS space validation failed. (not enough memory?) Skipping rendering.
Behold what 2005-2007 had to offer...! xD
The T43 says:
radeon 0000:01:00.0: VRAM: 128M 0x00000000C0000000 - 0x00000000C7FFFFFF (64M used)
radeon 0000:01:00.0: GTT: 512M 0x00000000A0000000 - 0x00000000BFFFFFFF
[drm] Detected VRAM RAM=128M, BAR=128M
[drm] RAM width 64bits DDR
[TTM] Zone kernel: Available graphics memory: 440758 kiB
[TTM] Zone highmem: Available graphics memory: 509818 kiB
[drm] radeon: 64M of VRAM memory ready
[drm] radeon: 512M of GTT memory ready.
[drm] PCIE GART of 512M enabled (table at 0x00000000C0040000).
And the T60 says:
radeon 0000:01:00.0: VRAM: 128M 0x0000000000000000 - 0x0000000007FFFFFF (128M used)
radeon 0000:01:00.0: GTT: 512M 0x0000000008000000 - 0x0000000027FFFFFF
[drm] Detected VRAM RAM=128M, BAR=128M
[drm] RAM width 128bits DDR
[TTM] Zone kernel: Available graphics memory: 1534580 kiB
[drm] radeon: 128M of VRAM memory ready
[drm] radeon: 512M of GTT memory ready.
[drm] PCIE GART of 512M enabled (table at 0x0000000000040000).
I'll try and check back here to see if you've thought of anything else; my email's in my profile.
For a few years now Google has had a little canvas benchmark in search results pages to figure out whether to enable endless scrolling. The game might springboard off of a technique like that to decide whether or not to use lower-resolution textures. (IMO, I'd target the detected screen resolution myself, I think that'd work...?)
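Something in that spirit could be as small as timing a burst of 2D draws and falling back to low-res textures when the machine is clearly struggling. A hypothetical sketch (not Google's code, and the threshold is arbitrary):

    // Hypothetical micro-benchmark: draw a few hundred rects and time it.
    function prefersLowResTextures(): boolean {
      const canvas = document.createElement('canvas');
      canvas.width = 512;
      canvas.height = 512;
      const ctx = canvas.getContext('2d');
      if (!ctx) return true;                       // no 2D context: assume the worst
      const start = performance.now();
      for (let i = 0; i < 200; i++) {
        ctx.fillStyle = i % 2 ? '#c00' : '#060';
        ctx.fillRect((i * 7) % 448, (i * 13) % 448, 64, 64);
      }
      return performance.now() - start > 50;       // arbitrary threshold, tune empirically
    }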
Actually, it (https://solitaire.gg/play/freecell on Chrome) locked up my (Intel HD Graphics 3000) 2011 Mac, so it's not just ancient machines.
edit: also, when I play solitaire via an iPad app there is always a 'top score' by someone for the hand I was dealt. I don't understand how this is probabilistically possible.
That is, is there enough entropy in your typical PRNG to actually generate all of those shuffles with equal probability?
But in reality you don't care. You could seed the PRNG with the current system time in nanoseconds or something, and every game you'll ever play is different. Even if the PRNG has limited state, you'll never notice it as a human player.
So it might be possible that your secure PRNG still cannot generate all permutations.
The key difference is that your PRNG could be cryptographically secure even though X < Y (X being the generator's seed/state size in bits, Y the number of bits needed to index every possible deal). It turns out that Y ≈ 226 bits, and so it's plausible that your cryptographically secure RNG would have X = 128 bits, for example, even though I'd typically expect a larger number.
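For reference, that 226-bit figure is just the base-2 logarithm of the number of distinct orderings of a 52-card deck:

    \log_2(52!) \;=\; \sum_{k=2}^{52} \log_2 k \;\approx\; 225.58 \text{ bits}

so any generator whose seed/state carries fewer bits than that cannot, even in principle, produce every possible deal.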
People often conflate entropy with security. A random number source could have high amounts of entropy and be insecure, while a random number source with less entropy could be very secure. Or let me put it this way: entropy is necessary but not sufficient for security.
It is not the case that using /dev/random somehow gets you an information-theoretic security level that /dev/urandom doesn't.
I was thinking, "Oh, the mixing is good enough for practical use." But that's not what information-theoretic means.
I was thinking, "Oh, assuming the entropy estimation works." But we know it doesn't.
It was a stupid argument. But then again, /dev/random is stupid. Nobody gives a damn about information-theoretic security, and we don't have it anyway.
Something like a properly seeded 256-bit xorshift might satisfy this as a random number generator. But even that setup may not be sufficient.
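To make the shape of it concrete, here's a purely illustrative sketch of a deal driven by the browser's CSPRNG (Web Crypto), with rejection sampling so no permutation is favored by modulo bias. Whether the underlying generator's state can actually reach all 52! orderings is the separate question being discussed above:

    // Fisher-Yates shuffle driven by crypto.getRandomValues instead of Math.random.
    function randomInt(n: number): number {
      const limit = Math.floor(0x100000000 / n) * n;  // largest multiple of n below 2^32
      const buf = new Uint32Array(1);
      let value: number;
      do {
        crypto.getRandomValues(buf);                  // rejection sampling avoids modulo bias
        value = buf[0];
      } while (value >= limit);
      return value % n;
    }

    function shuffledDeck(): number[] {
      const deck = Array.from({ length: 52 }, (_, i) => i);  // card indices 0..51
      for (let i = deck.length - 1; i > 0; i--) {
        const j = randomInt(i + 1);
        [deck[i], deck[j]] = [deck[j], deck[i]];
      }
      return deck;
    }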
Just one thought though... The starter screen is a bit complicated. (Talking only about the web version, haven't tried the phone apps)
I was just thinking my parents would love this game as Klondike and FreeCell are pretty much bare requirements when they buy a new computer or when I set one up for them. And your game works great, looks good and has all these variants. Except... They wouldn't have a clue which one to pick!
I don't know what would be the best way to showcase the games for selecting them. A carousel showing an animation for each of them? Or a split-screen preview that plays that animation when you hover over one? Or small animations directly on each button?
No clue, but if you can find a way to make it easier (and to then pin your "favorite" games alongside the "popular" picks) you've got yourself a super winner.
Looks super fun, hope you open source it soon!
(Yes, Unity's WebGL exporter is also in the mix, but browser rendering and support for the exported game is still wildly different across browsers.)
If you download any of the native apps (iOS, Android, Linux, Win, OS X), you can play offline. This is the magic of Scala.js. The same code that my server uses gets compiled down to JS, and the client calls it directly, forgoing the socket connection. You can also see offline mode in the browser by replacing the word 'play' with 'offline'.
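Roughly the shape of it, if it helps - a TypeScript-flavored sketch of the idea only, since the real thing is Scala compiled by Scala.js and these names are illustrative, not from the game:

    // rules.ts - one shared module; the server imports it, and in offline mode
    // the client calls it directly instead of sending the move over the socket.
    export interface Move { fromPile: number; toPile: number; }

    export function applyMove(piles: number[][], move: Move): number[][] {
      // toy rule: move the top card of one pile onto another (real rules would validate)
      const next = piles.map(p => p.slice());
      const card = next[move.fromPile].pop();
      if (card !== undefined) next[move.toPile].push(card);
      return next;
    }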
Yep, that was it. Just tried offline mode and it's super snappy.
For a solely single-player game, being able to host everything off a static fileserver instead of a Scala server seems like a scalability win for you and a bandwidth win for users. I don't doubt your choice and your tech stack (you know way more about your project and its needs than I do!), but I'd be super curious to hear more about the reasoning and process.
After the feedback from this thread, I'm going to move to a "hybrid" model where the model logic all happens locally, but still fires websocket events to the server. Best of both worlds.
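Conceptually something like this - again just an illustrative TypeScript sketch with made-up names and endpoint, not the actual Scala code:

    // Hybrid model sketch: resolve the move locally for instant feedback,
    // then tell the server about it fire-and-forget.
    declare function applyMoveLocally(move: { fromPile: number; toPile: number }): void;

    const socket = new WebSocket('wss://example.invalid/events');  // hypothetical endpoint

    function playMove(move: { fromPile: number; toPile: number }): void {
      applyMoveLocally(move);                      // game state updates immediately
      if (socket.readyState === WebSocket.OPEN) {  // best-effort event to the server
        socket.send(JSON.stringify({ type: 'move', move }));
      }
    }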
Are you joking? Wes Cherry said he's made $.08!! in Solitaire royalties alone!
Maybe this will help?
Screenshot captured with phantomjs/rasterize.js and shrunk with pngquant.
I found the dark screens and themes fashionable in Linuxland very hard to look at for any length of time. Weirdly, I now code with the Monokai theme on OS X and it is very enjoyable.
Perhaps the white background was easier when you weren't typing on two 24" screens and only had a 12" screen to peep into.
But this klond.c is not the original source, judging by the inclusion of comments referencing Win 3.1.
EDIT: re-reading I see that he later on says it was eventually bundled in 3.0. So he wrote it for 2.1 and it didn't get picked up until 3.0. In either case, source referencing 3.1 in a comment is clearly not the original intern-developed source from 1988, although it's likely pretty close.
What if Wes, the intern, wrote Solitaire for Win2.1, then it got redone for Win3.1 et al. by someone else? That makes all of this somewhat morally underhanded.
I thought I'd ask here. https://www.reddit.com/comments/3zfadv//cyntkus
Of course, I'm incredibly late, so it might never be seen, but who knows.
It is a pretty interesting read if you're interested in C programming - in particular, Microsoft's internal style and error handling.
I vaguely recall hearing something about getting the Win2K kernel to build, but I don't remember exactly what I read (I think it was on BetaArchive).
Then there's the Mac OS 7.x leak, which is also curious.
I need to dig my old archives up and poke at this stuff... heh
It's similar to the "worse is better" philosophy: low quality wins because it can be churned out quickly, and it stays in place because the value proposition of replacing it is unclear or insufficient.
When it comes to critical things, it's very important that changing them has a definite, demonstrable purpose in terms of outward effects, not just itch scratching. Aesthetic concerns matter little, because the potential for disruptive effects is a near certainty while the upside is typically much more nebulous (the devil you know, etc.).
I'm curious: what was the other time waster? :P
"Two of us tried to debug the program to figure out what was going on, but given that this was code written several years earlier by an outside company, and that nobody at Microsoft ever understood how the code worked (much less still understood it), and that most of the code was completely uncommented, we simply couldn’t figure out why the collision detector was not working. Heck, we couldn’t even find the collision detector!"
(ditto for the stuff that's languishing in Google's Usenet archive.)
There's this for Usenet: https://archive.org/details/usenethistorical and https://archive.org/details/usenet and https://archive.org/details/giganews
I'm saving up for a little 10TB disk array... at this rate it will be full shortly after it arrives.
This is gonna be so awesome to go through... thanks for this!!
More info at http://gizmodo.com/heres-how-a-macintosh-tutorial-taught-peo...
Anyone know why he would receive a "royalty" (esp. as an intern)? I've seen big bonuses/stock options but never a royalty.
> A little clarification on Solitaire history. I wrote it for Windows 2.1 in my own time while an intern at Microsoft during the summer of 1988...

> ...A program manager on the Windows team saw it and decided to include it in Windows 3.0. It was made clear that they wouldn't pay me other than supplying me with an IBM XT to fix some bugs during the school year - I was perfectly fine with it and I am to this day.
I mean, they do everything they can to screw employees out of them using Hollywood-style accounting, but it is incorrect to say "no employee at a company ever received a royalty for code they wrote for the company".
> I eventually negotiated the final agreement with Guy Kawasaki, where, in addition to the $100,000, I managed to get a 10% royalty of the wholesale price if Apple sold Switcher separately, which Steve swore they would never do, but eventually the royalty delivered another $50,000.
This was written on a leave of absence, though.
I learned this when my parents tried to blame video games for why I wasn't doing homework. They didn't want to blame books when I substituted reading instead.
Does anybody know if that happened only once MS decided to publish the game?
Coding for Windows was fun back in the day ;)