Cyberpunk is one of my favourite genres, both in fiction and in gaming, and telling me a game is cyberpunk-influenced is a good way of guaranteeing my interest (well, that or cel-shading). So finding out that Itch.io, a game-dev community that I'd often heard about but never really investigated, had a whole section of cyberpunk games ready to try definitely got my attention.
Here are a few highlights, mostly from the 2014 Cyberpunk Game Jam.
We’re used to seeing the heroes, villains, and morally-ambiguous characters of cyberpunk in action, but who are they when they finish work?
Who populates the world away from the camera?
VA11 Hall-A, described as a bartender-em-up, lets us look at the personal lives behind the often-impersonal world of neon, skyscrapers, and megacorporations.
If you missed the first post, which was an overview of the different levels of optimisation a game can have, then you can find it here:
The vast majority of graphics cards are made by the duopoly of AMD and NVIDIA. As well as controlling graphics hardware, both companies have expanded into software, creating a middle layer that sits between the graphics card hardware and the game software. Both companies have a similar box of tricks, and I'll explain a little of what each offers.
While they are similar in many regards, the major difference at the moment is how much influence each company can have, both after a game is released and, more importantly, during the development of a game. Current graphics-card poster child Watch_Dogs is the game in focus today.
First things first, what is optimisation? The general definition is “the process of making a strategy maximally functional; removing deficiencies in a system or process”. In gaming terms, this means a game making good use of the resources of the system it runs on.
In consoles, optimisation is often not a big deal, due to standardised parts. Because all Xbox 360s (for example) have the same resources as each other, a 360-specific game will be developed on the exact same system it will be played on, meaning the game can be set up to work well on the hardware supplied. Of course, some games still don’t run properly even then, but that’s normally a fault of design or programming at a more basic level.
Optimisation can sometimes be an issue when a game is ported from one console to another without fully taking the differences between systems into account. Games released for the 6th and 7th generation consoles concurrently often had issues on the Wii that weren't present on other consoles, due to the Wii having technical specifications lower than the other 7th generation consoles. Games released for the 7th and 8th generation concurrently have the same issue with the Wii U. (For a look at the minor flame war about the Wii U port of Mass Effect 3, look here)
A common media stereotype is that gamers are obsessed with the graphics of their games and systems. There's an element of truth to this, but usually with good reason. This portrayal likely persists because graphical improvements are the easiest way to demonstrate that one game or system is newer or more advanced than another, and also the most obvious difference to explain when talking about consoles to a non-gamer.
I don't normally pay too much attention to a game's graphics – as long as they aren't incredibly bad or otherwise distracting, I won't think much about them. However, one way to instantly get me interested in a game is to tell me it's cel-shaded.