Hacker News
20 years ago Far Cry was released (wikipedia.org)
70 points by doener 35 days ago | 63 comments



My memory of Far Cry was much earlier and less kind - there were a few people around the net at the time working for Crytek on several titles (I worked on Silent Space https://www.ign.com/articles/2001/02/10/crytek-to-bring-sile...)

The brothers went to E3 with a demo they were showing us all, but only some of us had the hardware to fully run it (I had a GeForce 256 I got for free from NVidia because of work I had done with the recently opened-up Quake source code). Then things went crazy: they tried to get a bunch of us to move to Germany and put us up in hostels on very restrictive contracts (basically they'd move us, then we'd be completely liable for our own costs)

Now, the X-Isle demo really did help push NVidia cards, but the company then had to ramp up quickly.

I heard that 'mafia' money was involved in making this happen (and they came calling for their interest a few years back, which is why the company hasn't done so well) - it was odd that Cevat's first purchase was a Ferrari...

Anyway, every other project got canned after this - a few of us went on to try to set up our own studio, but that didn't pan out after a year.


Hey you should turn this tale into a thriller - would definitely read it


Bahh, this was 23/24 years ago and my memory of everything is hazy (and the mafia thing I only heard about afterwards - so I can't verify it - but from what I know, I believe it).


I remember being a kid and wanting to play Far Cry in high quality when it came out. I did not have the money to buy the best graphics card to play it smoothly on high settings. So, I could either play it in low quality with something like 60 FPS or in high quality with something like 15 FPS. Of course, 15 FPS is not enough to play the game properly, but to capture all the beautiful details, I just went with the highest settings and very slowly explored the beach, astounded by all the details. Good memories.


Same. And 20 years later I'm an aging man but do the exact same thing in Cyberpunk 2077. Despite spending a small fortune on a graphics card, it's still either smooth framerate or the gorgeous path traced environments, because I haven't splurged on the 3090/4080 yet. Nothing has changed.


Out of genuine curiosity, what GPU are you running? Friends have recommended Cyberpunk 2077, but I'm only running a 3080. Wondering how many compromises I'll need to make.


I have an (equally aging) 2070 Super. It runs the game just fine at 1440p and perhaps not maxed out. Your 3080 will of course run it even better. No need to upgrade unless you want 60fps path traced or 4K.

Can highly recommend the game.


That was Splinter Cell for me, 2 years before Far Cry. Always saw Far Cry as more... toonish whereas SC was all about realism


light/shadows in Splinter Cell were not very realistic though


nah, but the physics were. I remember a level in a hospital with one of those privacy curtains between two beds that you could walk through, and that thing would move so beautifully


I miss the days when the Internet just didn't have a long standing archive of reminders of how old I'm getting, every day.

And somehow I don't feel like anything, but the absolute obvious, is any better.


> And somehow I don't feel like anything, but the absolute obvious, is any better.

I don't think it is either, speaking as someone who also started using the internet when hardly anyone was. I mean, if you think about it, human needs (even up to self-actualization) require just a modest amount of innovation. After that, it's just the trick that was invented at the turn of the 20th century with psychological manipulation and advertising: they found a way to make us think things are getting better, but not really in the sense of absolute human enjoyment. For example:

* Yes, I do love my Macbook Air M1, which has undreamt-of power compared to my old 486 computer that could barely run "Tomato Blaster", but the power of my current Mac is only valuable relative to the rest of the technology that exists... but if computer development had stopped at the 486, would I really be less happy? Not really, I would still enjoy using it and developing basic junk on it.

* Yes, I love my 24MP full-frame camera with my 500mm prime lens. Looking back at a photo guide to animals from the 80s, my equipment destroys the shots from back then. But then again, if NO ONE had this advanced stuff, I doubt I would be less happy. In fact, I recently put an old mirror lens introduced in 1983 on my camera. The shots are way worse than with the modern prime, and yet... I can get some interesting images, and honestly I'd be happy "perfecting my craft" on that lens if no modern lenses were available

The truth is, the very knowledge of the existence of newer technology makes us want it, but do we really need it?

In most cases no, and because creating new tech requires more resources, I think that endless technological development (especially with regard to new computing) is an enormous mistake.


On your second point, the book Camera Work contains lots of really good photography from the late 1800s and very early 1900s, before they had even really figured out what photography is.

Looking at the beautiful pictures they were able to make with potatoes helped me stop worrying so much about gear and technical perfection. (I mean, art is subjective, but this I would judge to be beautiful, and yet I can almost count the pixe^wsilver grains: https://kurtgippert.cdn.bibliopolis.com/pictures/018668_4.jp...)

(Playing really good old video games has the same effect for me except in that area.)


Indeed! Modern photographic technology (which I will admit is quite impressive) allows you to get certain kinds of shots more often, but it doesn't necessarily improve the "health" of the creative sphere of humanity. Sadly, it's the obsession with the new that sells, and I think to some extent it also harms the photographic endeavours of people because they get a bit too obsessed about gear.


This reminds me of a quote by Zygmunt Bauman: https://untested.sonnet.io/Proteus+-+Uncertainty+is+the+only...


Indeed, a relevant quotation!


I take the opposite perspective: I'm so glad I got the opportunity to see all these amazing things be conjured out of thin air in front of my eyes. I feel sorry for the younger people who missed it, and only get to view the end result we have today. (Of course, they will go on to see other fantastic things I will miss, but that's hopefully far into the future so I'm less concerned about that.)

(On the contrary, I feel sad when things I missed are brought up. There's a lot that happened technologically in the late '60s and through the '70s that I can only read about in history books.)


I met with my high-school friend and band buddy Thomas Bartchi in 2004 at a café in Copenhagen.

We were discussing this audio job he was applying to in Germany for a gaming company. He had an upcoming interview with them and we were brainstorming what new ways you could utilize audio engineering to write more interactive soundtracks.

The normal way was simply to change the soundtrack when you went from normal mode to high action. But we started discussing a more interactive approach I had been playing around with in the CD-ROM Lingo days, and because Thomas was an excellent composer and understood audio engineering at a very deep level, we came up with the idea of constructing a soundtrack in layers: some audio layers would always be there, while others would be added or removed to provide almost a gradient of audio intensity.

The challenge was obviously to make it feel very different gradually even though you were mostly using many of the same elements and just layering them.
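The layering idea described above can be sketched in a few lines. This is purely illustrative and not from the original post - all layer names and thresholds are invented, and real audio middleware would handle the actual crossfading - but it shows the core mechanic: a base layer is always audible, and extra layers switch on as the game's intensity value rises.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    threshold: float  # intensity at which this layer becomes audible

def active_layers(layers, intensity):
    """Return the names of layers that should be audible at this intensity."""
    return [l.name for l in layers if intensity >= l.threshold]

# Hypothetical stack: the base layer has threshold 0.0, so it always plays.
layers = [
    Layer("ambient pad", 0.0),
    Layer("percussion", 0.3),
    Layer("strings", 0.6),
    Layer("brass stabs", 0.85),
]

print(active_layers(layers, 0.1))  # ['ambient pad']
print(active_layers(layers, 0.7))  # ['ambient pad', 'percussion', 'strings']
```

In practice each layer would fade in and out over a few beats rather than hard-switching - that crossfade is what makes the intensity gradient feel gradual rather than abrupt.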

He went there and while there were other great candidates they gave him the job because of his way of thinking about music in games and the rest is history.

His successor for the next game was none other than Hans Zimmer.

He later went on to do music for Hitman.

Here is the FarCry soundtrack if you want to be taken back in time: https://soundcloud.com/j-wilhelm-139558882/far-cry-2004-main...


It's still a good multiplayer game. I used to play with my students on special occasions in 2016 (I'm a teacher at an academic high school) because FC runs on any hardware, even without a dedicated GPU.

I always told them I'd let them go 15 minutes early if someone had more kills than me in deathmatch. I only got beaten once


> because FC runs on any hardware even without dedicated GPUs

Twenty years of memes disagree wholeheartedly


You are mixing it up with Crysis?


I absolutely am! Doh!


It's fair, though - it was the original Crysis

There was a time when it took fairly impressive hardware. I think it was also one of the first popular games to be upgraded to 64-bit.


> I think this was one of the first popular 64bit games, upgrading into it

I don't think so. I remember the struggles and patches necessary to get it to run when I moved to a 64-bit machine a few years after it came out and wanted to replay it.


A trip down memory lane :) The patch for Far Cry to become 64bit:

https://www.anandtech.com/show/1677

They were technically beaten by The Chronicles of Riddick, which shipped something on disc

Looking back, this did little for performance. I suspect the memory limitations and the introduction of SMP around that time account for a lot of the warts we recall


I think I remember seeing someone run Crysis in software rendering on a 128-core AMD Epyc and get a decent frame rate.



I remember building an Athlon 64 system with a GeForce 7900GTX at that time, and I played Far Cry with everything maxed out. Half of the cost of that computer was the graphics card, and it was the absolute highest-end model that money could buy.

Recently I bought a graphics card for 600 when the prices came down - an RTX 3070, a more conservative choice since the top models now go for 2k.


Don't worry, MWLL, which runs on the Crysis Wars engine, can still make a modern PC crawl.


I never loved the original Far Cry as a player, but I did deeply appreciate it as a game designer.

I was working as a game programmer and technical designer on a big budget FPS back when the original Half-Life was released, and immediately "AI AI AI!!!!!" became a stifling buzzword and thought-terminating (ironically) slogan, heavily reorienting how people thought about shooter design and, essentially, ending boomer shooters as a thing for a good long while and ushering in the era of Halo, Call of Duty, cover-based shooters, and so on.

I happened to adore boomer shooters and have good taste for their rhythms and making them, so the transition is not one I personally enjoyed at all.

But worse in a way, Half-Life ALSO ushered in much more linearity in level design because of their awesome interactive set pieces and the particular interactive way they got across their story. Certainly that was the way its release was experienced in the studio I was in, anyway. Less sprawling, unfolding, and backtracking like in Doom (where the space unfolds over the course of a level in something like a fractal way), more following a string of pearls and hitting all the scripted events, like a haunted house ride. You didn't want the players getting lost, you didn't want them to get bored backtracking, and you didn't want them to miss any of the cool one-off scripted content you'd made for them.

(I love Half-Life, so I don't blame it for any of this - it's a vastly more interesting game than many of the games it inspired, which I think is typical of highly innovative games)

At the time, I wasn't quite yet a thoughtful enough, perceptive enough game designer to recognize how deeply in tension those two changes ended up being with each other. And so I spent a miserable year of eventual burnout trying to make "good enemy AI that players actively notice" as a programmer for a game whose levels kept getting progressively tighter, more linear, and more constrained to support the haunted house ride of scripted events.

As a point of contrast, games like Thief and Thief 2 were magnificently structured for good, cool AI that players could notice, and it was specifically because of the non-linear ways the levels were built, the slow speed of the game, the focus on overtly drawing attention to both player and enemy sense information, and the relationship between the amount of space available to players at any given point to the amount of enemies they faced, as well as the often longer length of time players engaged with any particular set of enemies... and of course, despite all these cool features, poor, poor Thief was released to store shelves just 11 or 12 days after the original Half-Life. Business advice 101 is don't release your first person game 11 or 12 days after Half-Life.

Anyway, that all leads in to my admiration for Far Cry's design. Their outdoor levels actually steered their game design in a direction that could let enemy AI breathe and be an interesting feature of the design, in turn giving players higher-level choices about when, where, and how to initiate fights. In that sense, it reminded me of where Thief had already gone previously, but in the context of a high-profile shooter. But doing that required actively relinquishing the control of the haunted house ride-style of pacing, which I think was kind of brave at the time.
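For what it's worth, the "enemy sense information" loop described in these comments is often modeled as a small alert state machine: stimuli escalate an enemy's awareness, and quiet relaxes it one step at a time. A minimal hypothetical sketch - none of this is actual Thief or Far Cry code, and all names and thresholds are invented:

```python
# Alert levels from calm to hostile; an enemy moves along this ladder.
ALERT_LEVELS = ["idle", "suspicious", "searching", "combat"]

def next_alert(current, sense_strength):
    """Escalate on strong stimuli, relax one step when nothing is sensed."""
    i = ALERT_LEVELS.index(current)
    if sense_strength > 0.8:            # direct sighting: jump straight to combat
        return "combat"
    if sense_strength > 0.4:            # partial stimulus: escalate one step
        return ALERT_LEVELS[min(i + 1, len(ALERT_LEVELS) - 1)]
    if sense_strength < 0.1:            # silence: cool down one step
        return ALERT_LEVELS[max(i - 1, 0)]
    return current                      # ambiguous stimulus: hold state

state = "idle"
for stimulus in [0.5, 0.5, 0.05, 0.9]:
    state = next_alert(state, stimulus)
# idle -> suspicious -> searching -> suspicious -> combat
```

The gradual cool-down is what gives players readable feedback: they can watch an enemy visibly relax and plan around it, which is exactly the kind of AI behavior that open levels give room to notice.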


This was the most amazing game for me at the time, but when Company of Heroes was released, CoH became my favorite. The graphics, and especially the sound. And open gameplay that rivals the GTA series.


CoH was amazing. Too bad the current iteration is dogshit.


Oh, I still didn't get around to playing Far Cry, it's not been out for thaat long. It's in the Steam library, waiting for when I have a couple hours for playing a video game.


Just reading the title unlocked memories for me. Far Cry came out in my life at the peak of local competitive Counter-Strike, 1.6 at the time... CS was everything to me at that point, and I hated the people coming to the LAN house to play Far Cry instead of being on the CS servers... I remember it looking great, though I never played it.


I felt that way about what CS did to the Quake 2 community.


At the time I installed Windows on a free partition on my desktop PC for the sole purpose of playing FarCry. It was great fun. The graphics blew me away.

I watched the movie a few years later, it was ... not as good as the game. But movie adaptations of video games almost never are, Postal being a notable exception.


Fun fact, both films were made by one of the worst directors of all time: Uwe Boll


IIRC he had/has a talent for taking advantage of German media laws. He's made a lot of derivative things like this


It was a massive tax dodge. Because of German arts funding laws investors could make money even if the film flopped.

https://www.cinemablend.com/features/Uwe-Boll-Money-Nothing-...

They closed the loophole he was using in 2006.


Ah, thank you for the reminder!


I know. I think he retired a couple of years ago.

He was a fairly competent producer, if he had gotten a better director on board, things might have worked out differently.

Postal, though, is brilliant, IMHO, because the movie captures the gross, absurd humor of the game perfectly. It's campy, which is what makes it work.


Scott Pilgrim vs the World is another good adaptation, IMO.


I loved the way this game allowed for very different tactics (go all guns blazing or go stealth).


Yup, an open world and quite smart enemies made me explore the same checkpoint dozens of times to find the most perfect approach.


FC didn't have checkpoints. That was introduced in FC2 (set in Africa).

FC2, FC3, FC4, FC5 and FC6 had checkpoints


I believe the game did autosave at "checkpoints" when certain triggers were met.


I don't think that was what the parent I replied to meant by "checkpoints" when he said

> made me explore the same checkpoint dozens of times,

FC1 checkpoints were game progression checkpoints. FC2 checkpoints were in-game road checkpoints.


I did mean the autosave checkpoints (and never played FC2).

So I'd attack, then load again at the end and try a different strategy...


Yep. You could also open the in-game console and use a command to force a checkpoint


In before, “Hey you, in the shirt!”

One thing which struck me at the time was the fierce competition between ATI and NVIDIA... squinting at side-by-side aliasing comparisons in Far Cry and Doom 3. Now it doesn't feel so competitive.

Feels bad for NVIDIA to have this same kind of ridiculous clause in the AI EULA, “ 9.12 Customer may not use the AI Products for the purpose of developing competing products or technologies or assisting a third party in such activities.” as OpenAI shamefully writes, and Anthropic. And Microsoft. And Inflection. And Google. If you can’t enforce a noncompete on your own employees, then it makes no sense to try to pull that stuff on customers. “Not for resale” is one thing, but “no competitive work based on this” is appallingly beyond. Literally only Mistral AI fixed the issue so far!

Name one great athlete who would have been equally skilled in a world with no competitors.


I wish CryEngine was more of a thing. I think we need more engines from companies that actually make games with them


Unfortunately, internally it was a mess. They had several versions of the engine floating around, one per game, that would diverge and be a pain to merge back together. Later on it got way better, but it was too late: Unity and Unreal had taken over and CE had no chance of making money.

Tying the games to the engine allowed them to move fast but at the expense of a truly general purpose engine.


FarCry is running with highest graphics settings in a WinDOS VM on my Intel Mac, and it works astonishingly well. I still immensely enjoy playing it.


I was 15, and I have an older brother. He had the game, but of course, it being on CD with copy protection, I couldn't play it. He had one of his college mates round, and I had just gotten Far Cry working with some no-CD patch I found on some 47th page of Google, on some moonspeak forum.

As my bro was heading out, I showed it to both of them as they were passing by.

I received kudos of coolness, and because I normally have music playing in the background, Gary Jules - Mad World started playing next in the playlist. Only for my brother's mate to go "Woah, I was just thinking about that song".

And that's my memory of FarCry.


My only memory of playing Far Cry was struggling to get the demo to run at all.


The water was wet in that game. Absurdly amazing graphics for the time.


[flagged]


They unironically are, gaming is no longer a small nerd niche, but rather a mainstream activity.


Not sure if the original comment is sarcasm or not, but the current state of gaming is definitely worse than 20 years ago. Companies abuse the consumer with greedy monetization practices and intrusive DRM, hype-driven marketing tactics push pre-sales and then rely on years-long updates to align the product with what was advertised, online services make games unplayable when they shut down, mobile gaming is adware that popularized microtransactions, etc. Modern gaming is a rotten industry.

There is a lot more on offer than before, but most of it is shovelware, asset flips, etc. You can find indie studios that still respect the art and produce good games, but these are the exception, not the norm. And you can basically count good AAA studios on one hand.


>>Companies abuse the consumer with greedy monetization practices and intrusive DRM

I'm sorry, have we already forgotten that 20 years ago you had crazy intrusive DRM that would only let your game run if your CD was read successfully? If you had even a scratch on the disc, goodbye game. And before that, the same crap with floppy disks, where developers were literally finding out-of-spec ways of modifying the floppies to prevent copying, and again, if your drive struggled to read it, then that was your problem.

I agree about monetisation, it's rampant and not in a good way. But I do actually think the large amount of choice counteracts that - I hate monetisation in Destiny for instance....but there are several other looter shooters to pick from so it doesn't bother me that much.

>>mobile gaming is adware that popularized microtransactions

And that one has never been good. Even at the advent of mobile gaming it was already like this, from the very start.

>>hype-driven marketing tactics pushing pre-sales

This absolutely existed 20 years ago already. Hype trains and preorders are not new by any means.

>>And you can basically count good AAA studios on one hand.

I think that's generalising too much - there are very good games coming out of AAA studios as well as incredibly poor ones. Ubisoft, for example, has something like 40k employees; some of their studios produce absolute gems and some produce... not that. It's not so simple as saying AAA is bad and indies are not (the amount of that "shovelware" coming out of indies is absolutely staggering - people complaining that their 6728th farming sim doesn't get any sales on Steam... like... there's just too much of everything).


> have we already forgotten that 20 years ago you had crazy intrusive DRM that would only let your game run if your CD read successfully, if you had even a scratch on the disc goodbye game.

C'mon, the requirement for a CD to be inserted can hardly be considered DRM. It was annoying, sure, but at least all game content was physically on the disc, unlike today.

Even things like secret code lookup tables, and all the inventive ways studios attempted to fight piracy back then, pale in comparison with DRM as we know it today, which became popular around the 00s. Users are forced to install rootkits and anti-cheat software that have total control over their system, introduce vulnerabilities, and noticeably degrade the gaming experience (e.g. [1]). This is the definition of _intrusive_, not simple CD checks.

> And before that the same crap with floppy disks, where developers were literally finding out of spec ways of modifying the floppies to prevent copying

Sure, but none of these tactics were as rampant as they are today. I would also argue that copy protection is not the same as DRM. Not being able to make backups is not great for preservation, but it doesn't affect the experience of actually using the product.

>>mobile gaming is adware that popularized microtransactions

> And that one has never been good. Even at the advent of mobile gaming it was already like this, from the very start.

Yes, microtransactions existed before, but they really took off on mobile devices.

> This absolutely existed 20 years ago already. Hype trains and preorders are not new by any means.

Again, not to the extent they do today. Practically every modern AAA game relies on hype, and most include pre-order bonuses, day-1 DLCs, etc. It wasn't as popular before since digital delivery hadn't really been perfected, so shipping updates was actually complicated and expensive. It used to be a major milestone for a game to "go gold", whereas now it barely gets mentioned. QA is often left up to consumers who pre-ordered the game, and it takes many months, if not years, for games to become stable. A game launching with minimal issues is a newsworthy event today.

I'm not sure why you're disputing that the state of modern gaming has been progressively getting worse over the years, when the signs of that are evident. As physical media is being phased out, consumers have even less control over their digital purchases. Modern gamers must accept license agreements that grant them temporary access to digital content, which may be revoked at any point. This is inarguably worse for the consumer in every sense than before.

[1]: https://www.pcgamer.com/resident-evil-village-drm-denuvo-stu...


Oh, I'm not saying it's not getting worse, I just think the image of gloom and doom is exaggerated. Maybe I'm biased, working in the industry.

And it's not just about having to insert a CD - there was a DRM system called SecuROM which was just as bad as Denuvo today.


It’s sarcasm.


This is very much true, but also has nothing to do with diversity or "Wokeness" the parent is referring to.



