What would have happened if Congress had decided that the United States shouldn’t fight to liberate Kuwait in 1990? Where would I be if George H. W. Bush had instead decided that an economic war against Iraq was in Kuwait’s best interests?
I think about that a lot.
Kuwait is a tiny country, roughly one twenty-fourth the size of California, and can barely be seen on a world map. More conspicuous is Iraq, its northern neighbor, which was ruled by a dictator who had assembled, over his years in power, one of the most powerful armies in the world.
The Iraqi army totaled over one million soldiers, with another half-million ready to join the fight. The Kuwaiti army numbered just below eight thousand. Below. Eight. Thousand. Let that sink in.
Have you ever played a game of Civilization? Up against Iraq, you’d probably surrender, cursing the unfair AI in protest.
In mere hours, Kuwait was under siege. If this were a Call of Duty scenario, players would be angry at the developers for crafting a fictional story wherein the good guys win despite literally impossible odds.
White House officials agreed that Kuwait must be rescued, but a division in ideology arose: one camp supported economic sanctions that would eventually force Iraq out, while the other believed a military solution was the only way. The sanctions camp argued that their approach would take two years but would prevent a war. The other camp thought that nothing short of going to war would stop Saddam. They were right.
I think a lot about what the ramifications of the other decision would have been. I thank the Lord a Republican with a spine was president then. Kuwait could have been a second Vietnam—or at least that’s what Saddam was banking on. Can you imagine if Twitter had been a thing in 1990? Liberals would have tweeted vehemently against the war, sipping their expensive teas while Kuwaitis were gunned down by the hour. Others would have screamed that war is the only answer, a repulsive thought to even write down. Everyone is right and everyone is wrong and no one knows what to believe in because the world is vast and complicated and we’re not equipped to think that far outside of novels with ambiguous protagonists. The thought of all this idiocy makes me cringe.
Kuwait fought its most vicious diplomatic battles to win the approval of Congress and the world’s nations, all to reclaim her sovereignty through military force. Kuwaitis thus earned the right to take pride in their country’s strategic international investments and assets, as well as its strong diplomatic ties, for these helped steer public opinion toward embracing Kuwait’s independence and ownership of its rightful assets. We need only look around us to see that war is gruesome and no nation comes out of it unscathed. The fact that Kuwait did, relatively speaking, is yet another miracle.
I think about the interplay between Kuwait’s insignificant size and significant wealth a lot. Would Bush and Thatcher have batted an eye if not for the millions—billions—their governments would make from striking deals with us? (No.) Would I have felt any differently if I were in their shoes? (No.) Nevertheless, it’s an immeasurable privilege to be a Kuwaiti citizen in 2019, having a relatively easy life (even if difficult in other unspeakable ways) thanks to the sacrifices of women and men far greater and braver than me. I look at some of the troubled countries in the surrounding region—at Yemen and Syria and Sudan—and shudder to think that Kuwait could have had a calamity like that dropped in its lap if not for what often seemed like divine intervention, my own spiritual skepticism aside.
What is the meaning of survival? Yikes, I don’t know. The word “survival” feels funny to write. I was but a two-year-old when Saddam invaded this tiny land. Yet a collective feeling of survivorship permeates the national consciousness. I feel lucky to be alive and to be in the position that I am today: a middle-class shift worker who runs a record label. (It’s quite the upgrade considering the dire could-be alternatives.) I am grateful despite my two-hour commute, or my nearly sixty-hour work week, or my debilitating chronic illness, or the existential crises that come with age and books, or my failing friendships, or the forced absence of love, or the universally futile search for meaning. In February, Kuwaitis feel an extra dose of patriotism and affection, but my Februarys are often spent looking inwardly. Now that tens of thousands of brave women and men have put their lives on the line for me, what do I need to do in return? How can I give back? Is art ever enough? I shout these questions into the atmosphere, but nothing echoes back. It is up to us to decide. Brave Wave is my tiny contribution to this gift of resurrection, of second coming, of being granted a megaphone and a chance to shout. I just hope it’s loud enough.
Last week I bought a long-awaited and exciting piece of hardware for my room: the 55-inch LG B6, an exceptional OLED TV that does 4K and HDR. Don’t let the similarity of the panel acronyms fool you: going from LED to OLED is a monumental leap in image quality, as significant as the jump to Retina screens in Apple devices. While the colors are richer and more vibrant on an OLED panel, it’s the black levels that are jaw-droppingly good. I never owned a Plasma TV, so I had never witnessed blacks like this before. Within a few minutes, the LED TVs across the house felt ancient in comparison.
I went from a 1080p non-HDR LED TV to a 4K HDR OLED TV. I was curious to track the source of my amazement. Am I impressed by the increased resolution (4K is quadruple the pixels!), the increased dynamic range in HDR content, or the panel technology? Am I attributing my amazement correctly? Is it the culmination of all these display technologies, or is there a major player in all of this?
What follows is by no means comprehensive or thorough, so take it with a grain of salt, but this is what I noticed in my brief time with the TV:
– From a normal viewing distance, 4K streams from Netflix and Amazon Prime Video are indistinguishable from 1080p. The Amazon Video app on the LG TV is helpful: when I pause a video, it tells me whether it’s currently displaying a 1080p or Ultra HD picture. I sit about 6 feet away from the TV, and I never – not once – noticed the drop from 4K to 1080p or vice versa. I’m sure the difference would be noticeable if I held a magnifying glass to the TV, but under normal conditions it’s largely imperceptible. Some shows on the Amazon app, like The Grand Tour, stream in 4K and HDR. HDR works whether the stream is 1080p or 4K. When the video compression is handled well (as in The Grand Tour), the colors and blacks are fantastic and perceptibly flawless regardless of resolution.
– Games running at 4K from a capable PC are noticeably sharper and crisper, but that depends on the game itself. For games that don’t have 4K textures, upscaled 1080p is still nearly as good from 6 feet away. User interface elements always look fuzzy when upscaled, however, so I mostly avoid running games at non-native resolutions because of the resulting UI. The image itself is very similar.
– Surprisingly, my 1080p Blu-ray collection looks absolutely breathtaking. 4K is quadruple the resolution, so you’d think a 1080p video would look compromised, but that’s not the case. The OLED screen adds a new dimension to my Blu-ray films, upgrading the image quality at no added cost simply by virtue of the display technology. The opening panoramas of Stanley Kubrick’s 2001: A Space Odyssey have never looked this vibrant and captivating. My brother and I gasped at every outer-space scene because of the incredible black levels. It would be interesting to compare 1080p and 4K Blu-rays, but I bet the perceptual difference at that distance is negligible.
– HDR is a nice addition, though it sometimes detracts from the established mood or setting, so it’s not always a win in my opinion. A vibrant game like Forza Horizon 3 looks stunning in HDR, but The Last of Us – while appreciably brighter, with more dynamic range – looks a bit too bright or lively at points. (I didn’t progress far enough to form a definitive opinion.) The Last Guardian handles it well: the mood and feel are largely unchanged while the image quality is noticeably improved. I don’t mean to say that HDR is bad – it’s just stylistically different and varies per game. (I don’t have an opinion on its usage in Blu-rays or streaming services yet.)
– The OLED panel is the real showstopper here. It’s far more important and significant to my content (and eyes) than 4K and HDR ever hope to be. “OLED’s clearest improvement comes from its utter lack of backlight,” explains Sam Machkovech at Ars Technica. “Instead, individual OLED pixels are made from an organic material that emits light from within whenever it’s fed electric current. If a pixel receives no current, it emits no light in the red, blue, or green color spaces. This creates the purest black.
Once you can deliver pixels with absolute-zero values for color and brightness, you enter a new realm of contrast-ratio territory. Even the littlest hint of light in the blackest part of an image changes the perceptible contrast ratio.”
What this means is that when you step into a dark environment in a game, or a dark scene in a movie, the TV doesn’t have to strain to figure out how to display blacks. There’s no color manipulation, no backlight, no fake black that looks gray. The individual pixels simply shut themselves off, and you end up with a properly pitch-black scene.
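To make the contrast-ratio point from the quote above concrete, here is a minimal sketch. The luminance figures are illustrative assumptions, not measurements of the B6: static contrast is simply peak white luminance divided by black luminance, so a true-zero black makes the ratio unbounded.

```python
def contrast_ratio(white_nits, black_nits):
    """Static contrast ratio: peak white luminance over black luminance."""
    if black_nits == 0:
        return float("inf")  # OLED pixel fully off: no light emitted at all
    return white_nits / black_nits

# Illustrative (assumed) numbers: an LED panel leaks a little backlight
# even on "black" frames, while an OLED pixel can switch off entirely.
print(contrast_ratio(400, 0.05))  # LED-like black floor: 8000.0 (8000:1)
print(contrast_ratio(400, 0.0))   # OLED-like true black: inf
```

However small the leaked backlight, it caps the ratio at a finite number; only switching the pixel off escapes that ceiling, which is why even "the littlest hint of light" changes the perceptible contrast.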
I tested a few selected favorites, and the results were always stunning, with the image looking perceptually superior at any sitting distance. For anyone who has never owned a Plasma TV, the true blacks are a wonder to behold in games and movies alike. Scenes that previously looked washed out and unclear suddenly come to life, better than ever.
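The earlier observation that 1080p and 4K streams are hard to tell apart from six feet can be sanity-checked with a back-of-the-envelope calculation. Assuming a 55-inch 16:9 panel and the common rule of thumb that 20/20 vision resolves roughly 60 pixels per degree of visual angle, a quick sketch:

```python
import math

def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of visual angle."""
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # screen width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_in / 2 / distance_in))
    return h_pixels / fov_deg

# 55-inch TV viewed from 6 feet (72 inches)
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: ~{pixels_per_degree(55, px, 72):.0f} pixels/degree")
```

This comes out to roughly 52 pixels per degree for 1080p and about 104 for 4K. With 1080p already sitting near the ~60 ppd acuity threshold at that distance, most of the extra 4K detail lands beyond what the eye resolves, which matches what I saw on the Amazon app.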
Given the choice, I’d pick a non-HDR 1080p OLED TV over any top-tier 4K LED/QLED TV in a heartbeat. 4K and HDR are nice to have, but at this time they’re merely checklist items. I’d go so far as to say you’re not missing much here. At least I don’t feel that I am. The vibrant colors and true blacks of this OLED display make for a transformative experience, one that feels as significant as the jump from SD to HD. I’d invest in that above all else.
Last week, I was invited to speak at Buma Music in Motion in Amsterdam, an event centered on music across the different forms of visual art: film, TV, and video games. The sole gaming panel featured writer, filmmaker, and Canada Research Chair in Interactive Audio at the Games Institute Karen Collins; Senior Music Supervisor at Sony Computer Entertainment Duncan James; and yours truly, representing my music label Brave Wave Productions. Titled “Video Games Music: Stuck in a Rut?,” the one-hour panel focused on the usage of music in video games and its evolution throughout the years.
It’s been said that artists work best under constraint. Does the absolute freedom offered in our modern times, then, hinder the progress, or perhaps the inventiveness, of contemporary composers? This is a point of contention amongst game music enthusiasts, especially with the prolific employment of the orchestral Hollywood sound in triple-A games often cited by the community as a sign of the homogeneity of modern game soundtracks. In his book I AM ERROR (MIT Press, 2015), writer and musician Nathan Altice chronicles the NES and Famicom both as game machines and as cultural artifacts, and in the seventh chapter we get a rigorous dissection of the console’s audio processing unit and the arduous task of creating music for it. “[It’s] not like composing on a piano. It cannot be played with a keyboard; it has no inherent understanding of notes, scales, or tempo. Instead, composing [for the Famicom] is identical to programming: audio data is encoded, stored, and interpreted in the same format as code, byte by byte in hexadecimal numerals.” Altice argues that these technical restrictions, archaic in nature and seemingly counterproductive, are the underlying catalyst for that era’s best sounds.
This is a regular discussion that I have with the pioneering Famicom-era composers I work with at Brave Wave. Composers such as Keiji Yamagishi (of Ninja Gaiden) and Manami Matsumae (of Mega Man) had to experiment with technique in relentless pursuit of great sound, often under unrealistic deadlines. Their experiments were the kindling for ideas that became standard practice, such as emulating percussion with the pesky noise channel, and recording drum sounds with the 1-bit sampling channel. The complex nature of the Famicom’s audio processing unit forced composers to go above and beyond to produce sounds and melodies that complemented the visuals and were pleasing to the ear. They were not only in charge of composing the music, but also of arranging it themselves for the incredibly limited hardware. The imagination of these cornered musicians gave birth to some of the best and most effective works of video game music, from the jazzy sounds of Super Mario Bros. to the drama-heavy tunes of Mega Man and its sequels. In recent times, a new wave of acolytes has risen high thanks to those limitations, embracing the medium and keeping it alive by driving it to new limits — from the Game Boy-composed bangers of Chipzel and Danimal Cannon to Shovel Knight’s enthralling VRC6 soundtrack by Jake Kaufman.
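Altice’s point that Famicom composition “is identical to programming” can be illustrated with a small sketch. On the console’s 2A03 chip, a pulse-channel pitch is just an 11-bit timer value derived from the publicly documented relation f = CPU / (16 × (t + 1)); the conversion below follows that formula, though the table layout is an assumption for illustration, not any particular sound driver’s actual data format.

```python
# Sketch: on the Famicom/NES, a "note" is ultimately an 11-bit timer value
# written to the APU's pulse channel. Frequency relation (NESdev docs):
#   f = CPU_CLOCK / (16 * (timer + 1))
CPU_CLOCK = 1_789_773  # NTSC 2A03 clock, in Hz

def note_to_timer(freq_hz):
    """Convert a pitch in Hz to the nearest APU pulse-channel timer value."""
    return round(CPU_CLOCK / (16 * freq_hz)) - 1

# A few pitches as the raw values a composer would encode, byte by byte,
# into the sound engine's data tables.
for name, f in [("A4", 440.0), ("C5", 523.25), ("E5", 659.25)]:
    t = note_to_timer(f)
    print(f"{name}: timer={t} (0x{t:03X})")
```

A440, for instance, maps to timer value 253 (0x0FD). There is no note name, scale, or tempo anywhere in the hardware: the composer’s melody survives only as these hexadecimal numerals, which is exactly the constraint Altice describes.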
While I firmly believe that the limitations of the NES era played a pivotal role in emphasizing the importance of the video game composer, I see modern trends as a natural progression for game soundtracks. It’s easy to think that the contemporary composer, with so many tools at their disposal, simply follows Hollywood trends and is no longer tempted to challenge the status quo. But at the same time, new types of games and forms of play are emerging to include diverse experiences, thanks in part to the rising importance of independent designers and composers. In the soundtrack to Fez, where one would expect chiptune music in accordance with the game’s looks and style, Disasterpeace laid down an intriguing juxtaposition with a score rich in ambient soundscapes and electronic fizz. Instead of drawing solely from the pixelated visuals, he worked to create music that matches the feeling of the game and its art direction rather than just its design. Eirik Suhrke followed suit with his quirky soundtrack to Spelunky, where the music for each world plays randomly from a rotation of world-specific tracks, embracing the game’s roguelike design. Another example is Kozilek’s shifting score to Luftrausers, a game whose music changes with the player’s custom plane design: swapping the front or rear part alters the music produced.
Such applications are found in both the independent and triple-A spaces, but we’ve come to expect more innovation from the former, thanks to the overwhelming freedom indie designers enjoy. Where the goal of Famicom-era music was to create catchy melodies within strict technical constraints, many modern composers choose to focus on genres that were previously difficult to implement (e.g. minimal and ambient music) and on new forms of play that require a different approach to the music.
There’s a simple answer to the question posed by the Buma panel, one proclaimed by Final Fantasy composer Nobuo Uematsu himself: look beyond triple-A games for the thought-provoking and exciting scores. There’s no argument to be made for the superiority of any given era’s music, as it all boils down to each era’s designs and limitations requiring different methods and applications. Acknowledging and embracing this allows us to advance the medium in ways that go beyond singling out surface elements like style and melody.