The word “innovation” has been lost somewhere in translation within the gaming industry. Companies flaunt the word at every opportunity, and this has caused a strange divide of opinion on what exactly the next big step in the gaming world should be. With the Nintendo Switch around the corner, and Nintendo single-handedly reinventing each new iteration of its consoles for the sake of console innovation, sometimes at the expense of gaming innovation, it’s about time we look at the importance of each practice and ask whether one holds higher importance than the other.
Console innovation has become a standard that most companies adhere to. For Microsoft and Sony, each new generation of consoles brings its own minor perks and tweaks. Sony’s PlayStation has remained largely unchanged since the debut of the original console in the mid-’90s. While features like the wireless controller, Blu-ray support, and the touchpad on the DualShock 4 are impressive, they never detract from the main objective of putting the games first. Instead, these features work hand-in-hand with the games themselves, allowing Sony to focus its efforts on more powerful hardware for the sake of pushing greater video game ideas and visions. The same can be said of Microsoft’s Xbox, which at one point was the prime example of the most powerful hardware in the industry, supporting some truly mesmerizing games on both a visual and an inventive level.
Gaming innovation rests purely on the developers’ part. Utilizing the horsepower afforded by each generation of console, developers are free to experiment with innovative visual and storytelling ideas. A great example of pushing these boundaries is Naughty Dog, whose Uncharted series and The Last of Us proved how important graphics and gameplay are in complementing one another. These games were made on the PlayStation 3, and considering its gigantic competitor at the time, they really skewed the market in Sony’s favor. The games were a means to sell the console, not the other way around. For better or worse, Nintendo boldly went against this very business practice and has continued to do so all the way into 2017’s Switch.
Where console and gaming innovation really begin to diverge is largely thanks to Nintendo. In the early 2000s, the Nintendo GameCube stood on surprisingly equal footing with the PlayStation 2 and Xbox. Not only was it a worthy competitor in terms of hardware and third-party support, but in many cases it exceeded them. With titles like Resident Evil 4 and Super Smash Bros. setting the benchmark for their respective genres, Nintendo was in a good position to compete in the next generation as well. That was, until the Nintendo Wii pushed the idea of motion-controlled gaming and appealed to a more family-oriented market. While the idea was commendable, as Nintendo had a fairly good grasp on the concept of the motion controller, the problems lay in the less-than-stellar hardware and the exit of third-party developers.
The Xbox 360 and PlayStation 3 pushed impressive hardware and high definition in an era when that was becoming the norm, while Nintendo stuck with standard definition for the Wii. Where the Wii ultimately failed to sell its “next big thing” in the motion controller, its competitors dominated by giving power to developers through the sheer hardware capabilities the Wii severely lacked. Increasingly, the motion controller seemed less a product of console innovation and more a glorified gimmick that hurt the console itself. When Microsoft and Sony responded to the motion-control trend, they brought the Kinect and the Move to market, respectively.
Needless to say, both bombed in the market and earned the title of gimmick. However, this didn’t directly affect the consoles themselves, as developers continued to produce innovative games for the high-end machines using the standard controllers, whereas the Wii could not fall back on any alternative besides the very motion controllers the mass media had deemed “gimmicky” (and although efforts were later made to create a standard controller for the Wii, the damage had already been done).
Gaming innovation, however, continued to flourish outside of Nintendo’s sphere. As console hardware improved, developers gravitated toward exploiting the potential of these technological advancements, and thus brought about modern gaming as we know it. Halo revolutionized the first-person shooter genre, Uncharted revitalized the platformer, and The Last of Us and Heavy Rain each took great leaps in video game storytelling, in their own right. The innovations that Xbox and PlayStation succeeded in were largely due to the games and exclusives taking priority, with the consoles simply becoming a canvas for developers to work on. The Nintendo Wii and Wii U, despite their already impressive catalogs of great exclusives, never received the warmest support from third-party developers, who simply preferred more powerful tech to benefit their projects.
So which takes higher importance? It’s true that without innovative console ideas, advancements in technology within the industry can never truly move forward, but I feel gaming has become too comfortable with the traditional handheld controller to really step beyond this zone. The Nintendo Switch is commendably attempting to bring more inventive ideas to the table; whether they all stick, we will see in due time, though given the track record, nobody is holding their breath. In the honest opinion of a person who enjoys gaming on every platform regardless of tech power, the games should always come first. They are what ultimately create memories for us. In ten years’ time, people will surely look back at the Wii not for its motion control, but to reminisce about what great games Super Mario Galaxy and The Legend of Zelda: Skyward Sword were.