It’s a funny thing. Over the past few weeks I’ve read an increasing number of blogs and posts positing that the future of cinema is creatively challenged. It’s as if only now folks are coming to realize that the movie industry has been straddling a great divide – at one end the tiny, independently produced character-driven stories, at the other the air-fuel-bomb-inspired Hollywood tent-pole action showcases.
Meanwhile, the handful of really talented writing and creative types have been quietly migrating to that bastard cousin of ‘important’ media – television.
The thing is, I think that the watershed moment really occurred several years ago – specifically when “The Wire” became the single most respected bit of dramatic character-driven writing to ever find an audience. Yeah – “The Wire” is the creation that television needed to finally put a metaphorical stake into the heart of Hollywood.
Well, let’s face it, “The Wire” was emphatically NOT the first TV show to present well-written drama, complex metaphors underlying its stories, or anything else that it did so well. It was simply the first show to package all that shit into a single box and get noticed by a larger audience (and the first one where the writers managed to convey a very complex bit of business while the characters never uttered a single word that wasn’t a variant on the f-bomb). The award for first-past-the-post with ridiculously clever plotting that dragged viewers back week after week goes to “Twin Peaks”. Flawed as it was, with the writers making it up as they went, it made for compelling television. The award for “morality tale well told” goes back to the sixties and “Star Trek”, which was the first TV series to truly grapple with socially relevant issues in a palatable format. Some might argue “The Twilight Zone” should hold that mantle, and as much as I’d like to agree that Rod Serling was a tremendous inspiration, it was “Star Trek” that really grabbed hold of the audience’s cerebral cortex and squeezed mercilessly.
However, it wasn’t until the twenty-first century that we really witnessed intelligent writing forced off the feature-length cruise liners and left to swim for TV’s shore. Let’s face it – Hollywood has abandoned character for spectacle. If there’s a choice to be made between an intimate moment of dialogue or a big-freaking-explosion, we know which way that axe is going to fall. Boom, baby, boom.
Meanwhile, television, lacking the insane budgets of theatrical features, has quietly become the haven of folks interested in telling actual stories. Sure, TV writers are not averse to blowing up the occasional building, but they have the security of knowing that the audience is ultimately more hooked on their series because of the characters and the slowly developing arcs that now stretch out over dozens (and potentially hundreds) of hours of media. An exploding building is an indulgence – a bit of fun. The bit of dialogue where one of the main characters reveals that they feel responsible for the heretofore-unrevealed death of their sibling – that’s the real hook.
You see, in the old days, TV was dominated by a singular principle… You did your 22 or 44 minutes of programming, and at the end of any episode the characters needed to be in exactly the same place as where they started. The golden rule of the TV series was that “nothing must change”. At the end of every episode of “Gilligan’s Island” the island-dwellers remained just as stranded as they were at the start. Every episode of “Star Trek” concluded with the crew still on their infinitely unending ‘five year mission’.
But sometime during the nineties television changed. Perhaps one of the instigators was “The X-Files”, where the show became more about the overall progression of the characters’ individual quests – folks tuned in not just to watch the ‘monster of the week’, but to find out if Mulder would uncover some tiny hint about the grand mystery he was pursuing. Heck – maybe it’s even more essential than that, since the crowd watching “Moonlighting” during the eighties were really just waiting to see if the sexual tension between the leads would ever be resolved (and look what happened when it finally was…) We could even make a case for “Ren & Stimpy”, which utterly eschewed environmental consistency and simply dumped the characters into whatever random environment facilitated the story – a choice which was entirely foreign and frightening to the broadcaster at the time (and resulted in the creative evisceration of the program and its premature demise).
At some point during that era, television went from being an episodic medium where the initial state of the characters was locked and unchanging to a medium where the continuing development of the characters became the primary hook that held the audience rapt. How astoundingly different that is from the most successful series of the sixties and seventies – imagine if “All in the Family” had begun with Archie Bunker as a hotheaded, bigoted asshole and then transformed him OVER THE COURSE OF YEARS into a sympathetic, reasonable advocate for racial tolerance. If you’d even suggested that intent in the seventies you’d have been laughed out of a network pitch session. Nowadays? That’s actually a pretty compelling premise (though you’d have to start with Archie as a Klansman who slowly turns on his ‘associates’ and risks life and limb – so drama more than comedy).
I suppose what got me thinking about this was ultimately the discovery that Frank Darabont, the guy responsible for writing one of the finest dramatic screenplays ever crafted (go re-watch “The Shawshank Redemption” immediately if you didn’t flash to it at the mention of Darabont), is now producing a TV series about a post-apocalyptic-zombie-takeover which is focused primarily on how the ‘characters deal with the day-to-day difficulties of survival’. Man, can you say drama-heaven? Yeah, I knew you could. The best way to build tension between central characters is to put external pressure on them, and the end-of-the-freaking-world is about as intense as it gets. Give them a goal, force them to work (unwillingly and with opposing intentions) together, and you’ve got a great stage on which to play. I just wish I’d thought of it first. Then again, I still have “Warped”, and an insane AI that is determined to kill everyone at a moment’s notice, so perhaps I’m not doing so badly (yes, I’ll post it in the screenplay section Real Soon Now).
What I’m really getting at here is that theatrical motion pictures are destined to become spectacle rather than compelling entertainment. For human beings, real bottom-line entertainment is about learning – we spend our lives hungry to further our understanding of the people around us, to improve the ‘models’ we hold in our brains that inform our decisions. We are compelled to understand marginal personalities because we don’t encounter them daily and do not have readily available responses to them. We are fascinated by the Dexters and Stackhouses because we don’t have expertise in coping with those interactions otherwise. And so we watch.
“Quality” entertainment is really just a prophylactic that conceals learning and processing – we all seek a better understanding of the world around us (which, to a great degree, explains why “The Social Network” has been so successful – everyone is pretty much baffled by the Facebook phenomenon). Good entertainment is really nothing more than cleverly disguised social education. The other sort of entertainment – spectacle – is what Hollywood has become expert in providing. It’s a hell of a lot of fun, but it’s ultimately unsatisfying and fails to provide the illumination we crave.
Warner Bros. has just announced that they are *NOT* initially releasing the next Harry Potter film in 3D. While they still intend to do a 3D version, it’ll be released at a later date (this translates as “in theaters never” to those who may lack an understanding of studio-speak). This is a major reversal in that they’ve been actively promoting 3D releases for the past few years – though one must keep in mind that they’re the studio responsible for the abysmal “Clash of the Titans” 3D release — methinks they may have learned their lesson.
What does this really mean? Well, 3D has become a value-add in the past few years. But audiences are rapidly coming to the conclusion that it doesn’t enhance the theatrical experience beyond the novelty value. Films which exploit 3D for shock value are cool. Films which push 3D to its practical limits are cool. Anything else is kinda lame. So if you’re not making “Piranha” or “Avatar” then you need to treat 3D as an “extra” rather than a key selling point.
In fact, releasing a film in “3D” is probably a kiss-of-death if the production doesn’t fall into the aforementioned categories – in order to be acceptable the show must showcase 3D as a key component of the viewing experience. Just “tacking it on” is going to be perceived as an insult to viewers and a good reason to NOT participate in a theatrical viewing. I think this is already happening, but it’s damn hard to interpret box-office receipts given the deliberately confusing numbers spewed out by studios. Six months from now, however, it’s likely to be considered common knowledge. I suppose we can only hope.
I’m going to stick with the assessment I’ve promoted for some time now – that 3D is neither essential nor game-changing in the theatrical world. The rise of widely distributed, independent low-budget productions in 2-freaking-D has far more impact on the entertainment industry than adding that extra dimension.
Meh – that’s where I’ll leave my sentiments for now. /rant ends.
OK, so I’m being prolific today. That’s likely because I’m working on the fourth or fifth script outline that I’ve come up with in the past two months, and none of them are coming together. So I’m scribbling by way of avoiding the “I can’t come up with a working plot” monster.
However, I just watched an awesome documentary – “Nollywood Babylon” – about feature film production in Nigeria. Turns out that those industrious recyclers and pirates are also the third largest producers of “feature length movies” in the world after India and the United States. In 2008 some 2,500 movies were made. Now, to be fair, this is a bit deceptive because there is virtually no other visual entertainment produced in Nigeria, and these movies are mostly shot on video gear that’s been obsolete here for a decade – we’re talking 8mm consumer-grade standard-def analog tape. The production quality is terrible, and the copious visual effects are laughable (the still-image of someone’s happily-seated pet dog exploding out of a “witch” is beyond comedic).
What really caught me off guard, however, was the sincerity of the people involved. They’re really trying to tell their own stories – stuff that can’t be imported because it simply doesn’t exist outside their culture. They’re doing it on a per-project budget that rarely exceeds $15,000. And, at least in the case of the movie that the documentary crew primarily observed, they’re having a damn good time being passionate about their subject matter.
They’re not making art. Hell, the lowliest Roger Corman picture was shot better than any of these movies. But they’re telling stories that resonate with the audience and sell. Astoundingly, in Lagos (a city with fourteen million residents) there are only three cinemas, and those only show imported product. The only television is 100% state controlled and focused on disseminating propaganda. So the trade in indigenously produced product is limited to CD-video sold in open markets and screened either at home or in retail shops.
Consequently it made me contemplate Canadian cinema and wonder, “why the hell doesn’t anyone even try?” I suppose the answer is evident – we don’t have a strong cultural identity (other than the Quebecois), and we’re forever doomed to compare anything we do to the stuff that comes out of Hollywood. Canadians automatically limit the scope of their creative vision to fit within what they perceive as the limit of the financial environment (you’ll never get a show off the ground unless it can be made with a Telefilm budget). While that is technically an accurate statement, it’s pretty damn tragic.
Canadians are good at making entertainment. People forget that James Francis Cameron (yes, that really is his middle name) was born in Kapuskasing and raised in Chippawa, Ontario. By the time his family relocated to LA (he was 17) he’d reportedly already imagined “The Abyss” and who knows what else. William Shatner… Canadian. Jim Carrey… Canadian. A significant portion of “American” filmmakers, actors and technicians… Canadian. And every one of them had to move to the US before they could make a decent living. As did I. But I couldn’t stand it.
All right – this is turning into a rant, rather than what I’d intended, which was a positive comment about an interesting show…
Which was financed by the NFB.
Gotta love that little irony.
So along with all the other “MasterChef” variants out there, the Australian licensors have unleashed “Junior MasterChef AU”, in which the contestants are all 8-12 years old.
OK, I’m not a parent, and don’t have any day-to-day contact with kids, so my initial reaction was “hmm – maybe a couple of steps above grilled cheese and s’mores”, which was about where I was at that age (I wasn’t allowed to go near a stove until I was about twelve, and wasn’t allowed to touch a power tool until I was fourteen, so my personal experience is probably somewhat skewed). I didn’t really start cooking creatively until I was living on my own, dagnabbit…
So, it came as an absolute shock to watch a bunch of *children* produce dishes that are, in every way, equal if not superior to the dishes that have been produced by the adult MasterChef contestants I’ve seen. Steamed snapper with sweetened soy sauce and julienne veg by a ten-year-old… Perfectly prepared Thai fishcakes with sides by a NINE-YEAR-OLD.
Sure these kids aren’t going to be working “the line” in a professional kitchen any time soon, but they’re knocking out dishes that you’d happily pay for in a restaurant. While I was watching this I felt, in the words of Anthony Bourdain, like abandoning pretty much everything and just signing up for one of those “Drive the Big Rigs” courses. Yeah… Hauling produce cross-country is about all I’m good for if this is any indication.
But seriously – I’m just astounded, and I mean COMPLETELY astounded, watching these youngsters. They don’t *sound* the way I’ve imagined kids should sound – they’re just like adults in every respect except for size and corrective-dental-work. Maybe it’s because they’ve spent too much time watching cooking shows. Perhaps they’re parroting back the kinds of reactions they’ve seen from the adult contestants they obviously admire (when a ten-year-old says they’re “floored” because a well-known food critic has entered the room, it gives me pause – I didn’t even know that food critics existed at that age).
Then I wonder if perhaps it’s just fundamentally true that we can’t relate to younger generations from a personal perspective because their world is so wholly different from our own. You so often hear clichés like “they grow up so fast…” Perhaps the more accurate statement is “They know so much more than we did at that age.”
There are two things these kids lack – a desire to make their cooking needlessly complex, and the cynicism that abounds in adults. I don’t see that as a negative. It makes me feel old watching individuals who haven’t been broken by the world around them yet, and it also makes me feel a bit melancholy that I never had the opportunity to have children of my own.
Then again… with my luck I’d have ended up with a brilliant entrepreneurial twelve-year-old busily cooking meth in the basement.
If you’ve ever experienced a bad case of writer’s block then you know how frustrating and debilitating it can be. In my case it generally spasms through me when I reach a point in the story and realize it’s gone off the rails – usually because I’ve been unconsciously (up to that point) relying on some essential trope or clichéd character.
In August I found myself wandering into the second act of my Fallout Shelter script when the whole mess just stopped making sense. I knew what needed to happen, but the whole thing suddenly became too convenient — I’d painted my characters into a corner where any action they took would either undermine prior decisions or be utterly non-dramatic.
I fiddled with it for about a week before realizing that I wasn’t actually getting anything written. Typing and deleting does not count as writing – I might as well have been batting out “All work and no play makes Nick a dull boy.”
What I needed to do was write. Anything. And I realized suddenly that it had to be something entirely for me, something with no commercial value whatsoever. As I stood in front of my DVD stockpile, eyes wandering, I ended up pondering the nearly adjacent copies of 2009’s Star Trek and the original 1977 Star Wars and I suddenly had an awful, horrible thought.
“What if J.J. Abrams rebooted Star Wars?”
And then I had an even more terrifying possibility spring to mind…
“Heck, what if Michael Bay rebooted it?”
The whole thing just rolled out in front of me. Rewrite Star Wars in the style of Transformers. And I wouldn’t stop there. I’d mercilessly strip away all of the naive charm, the childhood wonder, everything that made Star Wars what it is, reducing it to an explosion-filled summer tent-pole movie. I’d turn Obi-Wan Kenobi into a drunken has-been! Hell, I even thought about making it a musical, but that would have been too much work.
It took just three days, and by the end it was everything I’d imagined it would be and worse. And, it’s probably exactly the movie that’d get made if Star Wars went into production now, and not thirty-three years ago.
Like Frankenstein before me, I had created a monster, built upon the skeleton of a classic motion picture, stitched together with bits of flesh sliced from twenty-first-century blockbusters.
But that monster ripped through my writer’s block. It didn’t solve the problems I was having with Fallout Shelter – it allowed me to see that those problems were so fundamental that the entire story needed to be told differently. I’d lost sight of the original premise and gotten so wound up in intricacies that I had written myself not just into a corner, but into a maze.
Done with that.
As an afterthought, I sent copies of the script to a few of my friends, suggesting that I’d found it on the net and was concerned that it might be legitimate. Those who read some of it responded with suitable outrage.
My sincere apologies for that, but those of you who did peruse the script would never have been honest about how infuriating it is if you’d known it was me.
Anyway, for the personal amusement of those who weren’t on the mailing list, I now present Star Wars in a way that I hope never comes to be.
I watch cooking shows. Let’s be really clear here – I watch *a lot* of cooking shows. This is not a surprise to anyone who knows me – I’ve been obsessed with cooking for a very long time, and I take it pretty damn seriously — how many people do you know who buy twenty-pound bags of onions just to practice knife skills – the French Onion soup that results is just an inadvertent consequence of that activity (though it’s damn tasty).
For many years I’ve been a fan of the BBC show “MasterChef”. The format is fairly straightforward – amateur cooks from around Britain compete for a handful of “on air” places and then progress through a series of challenges. The winner is dubbed “MasterChef” and these individuals generally go on to professional culinary careers. It’s a typically British program – very sedate and laid back. The contestants are chosen prior to the main production, and their participation is based entirely on the quality of the food they produce. Most of them lack any sort of “typical” television charisma, so the charm of the show comes from watching really regular folks cook astoundingly good food with virtually no experience. Honestly, the program screams Home-Brew-BBC throughout, and I can’t imagine it being interesting to anyone outside the UK other than someone as food-crazy as I am.
But that turned out to not be entirely true. MasterChef developed a dedicated following in… Australia. Aware of this, an Australian production company acquired the rights to the “MasterChef” IP and went about re-inventing the property in a rather dramatic fashion. Instead of a compressed broadcast schedule that consisted entirely of highlights from the contest, the AU production locked the contestants in a “Top Chef”-like house, limiting their outside contact for the duration of the competition. Then they staged their challenges on a daily basis over a three-month period. And they recorded absolutely everything. The result? A single season of “MasterChef AU” is seventy-plus (yes, that’s a seven followed by a zero) episodes. The show is broadcast daily over a period of fourteen to sixteen weeks. It’s an insane amount of content.
A year after the first series of MasterChef AU aired, a New Zealand broadcaster caved in and created a local version. Lacking the resources of the AU production, MasterChef NZ duplicated the format of its AU parent, but condensed the program to a once-a-week experience. While this reduced audience involvement with the various participants, it didn’t eliminate it, and the NZ program proved to be a huge success with the local audience.
The NZ program, compacted to a “manageable”, dramatic weekly format, consequently ended up on the North American radar. Casting was a no-brainer given the success of the AU program. The judges consist of a distant, silent chef who reserves judgment, an enthusiastic and experienced chef who supports the contestants, and a hyper-critical food master. Inevitably Gordon Ramsay was asked to participate (heck, he was likely involved from minute one).
Consequently, we now have “MasterChef US”. I don’t think it’s a horrible thing. Hell, any show that celebrates the skill of utterly unknown cooks is a good thing – it reminds us that anyone can cook a great meal – though it’s important to keep in mind that being able to cook does not make anyone great… Just that a great cook can emerge from anywhere (yes, I’m quoting “Ratatouille” here, but it’s appropriate).
What fascinates me is that a US audience is now watching a program which is based on an NZ format, which was derived from an AU show, which had its genesis in a BBC program. Damn! What does this mean for “original content”? You can work your ass off developing an amazing idea, but that hour of primetime broadcasting is probably not going to get allocated to your masterpiece. It’s going to go to a show that a handful of assholes like me made a sufficiently loud stink about. And there’s no guarantee the show will be any good. Hell, history suggests that it’ll probably stink. And “MasterChef US” is certainly not a masterpiece. It’s not terrible, but if you’ve got access to the Australian program then it’s little more than a poor imitation.
I wish the US producers had the opportunity to create a show modeled more closely on the AU version of the program, rather than getting themselves stuck into the standard North American weekly format. A daily show would be innovative within the US marketplace, and would likely find a huge audience.
I guess it’ll happen when it happens. And that will likely be sooner rather than later. After all, they’re going to have to replace those failing soap operas eventually.
It took me just twenty years to figure out how to bake a good loaf of bread reliably. Oh I had the occasional success, a few perfect products here and there, but being able to consistently produce something that absolutely beat out store-bought items had eluded me. Then, a few months ago, I started getting experimental.
I’d always followed traditional recipes for bread. You know the ones – bloom yeast in sugar water, add to dry stuff, mix, knead, proof, knock down, proof some more, bake. Sometimes it worked. Sometimes it didn’t. And then one day, while contemplating sourdough, I had a thought. Why bloom the yeast with sugar? Why not just bloom it with some of the flour and treat it like a loose starter? Worked like a charm, and the details are below. I’ve done it this way ever since and haven’t had a single problem.
I discovered later that my clever idea has been around for, oh, a few thousand years and is referred to as the “sponge” method. What surprised me is that, although I am by no means a rampant baker, I do spend a fair amount of time reading about this sort of stuff, and I’d never heard of it.
To any experienced bakers reading this please keep in mind that these instructions were originally written for a friend who had given up on baking thanks to her own repeated failure and subsequent oven-phobia. This was an effort to inspire her to take another crack at it.
At any rate, here’s my recipe. You can optionally switch out one or two cups of white flour for something more interesting like rye or spelt.
This will make one “average” loaf (in your standard loaf pan) or you can form it into whatever shape(s) you desire – I like doing single rustic round loaves generally.
1 cup of plain old all purpose flour – for a more airy crumb use bread flour
1 tbsp of dry yeast (the old fashioned dry kind, not instant, not fresh, the cheap stuff)
1 and ½ cups of very warm water (just at the point where it’s unpleasant but not painful to stick your finger in it)
Mix the flour and yeast in a bowl (preferably the bowl from a stand mixer if you’ve got one with a dough hook, but you don’t need it)
Add the very warm water and mix until you’ve got a smooth paste (I use a whisk for this)
Leave the mix in a warm place, uncovered, for at least one hour (the longer you leave it, the more the finished loaf will lean towards a sourdough flavor – I think the longest I’ve let it go is about four hours, any longer and you should probably cover with a damp kitchen towel).
2 cups of plain old all purpose flour
1 tbsp of kosher salt
Mix flour and salt together in a bowl and then add to the starter. Combine thoroughly (if you have a stand mixer, use the dough hook and let it mix for about ten minutes) until you have a dough that’s slightly denser than stiff porridge – it should be sticky, but not gloopy. In the mixer it will tend to partially coat the inside of the bowl.
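For the ratio-minded, here’s a quick sketch of the baker’s percentages implied by the quantities above. The cup-to-gram conversions are rough assumptions on my part (flours vary, and nobody scoops a cup the same way twice), so treat the numbers as ballpark only:

```python
# Rough baker's-percentage math for the recipe above.
# Cup-to-gram conversions are approximations (assumed values):
FLOUR_G_PER_CUP = 125   # all-purpose flour, roughly
WATER_G_PER_CUP = 237   # one US cup of water

flour_g = (1 + 2) * FLOUR_G_PER_CUP   # 1 cup in the starter + 2 cups in the dough
water_g = 1.5 * WATER_G_PER_CUP       # all the water goes into the starter

# Baker's percentage: water expressed as a fraction of total flour weight.
hydration = water_g / flour_g * 100
print(f"flour: {flour_g} g, water: {water_g:.0f} g, hydration: {hydration:.0f}%")
# roughly 95% hydration -- which is why this dough is so sticky
```

A typical sandwich loaf sits somewhere around 60–65% hydration, so by that yardstick this is a very wet dough – which squares with the “sticky, but not gloopy” description, and with the no-extra-flour kneading that follows.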
Now, take this sticky mass and dump it onto a clean work surface. DO NOT TOSS FLOUR ON THE SURFACE! Yes, I know this is counter to pretty much every kneading process you’ve ever seen, but just trust me on this…
You are going to need a pastry scraper – you know, one of those flat squares of metal or plastic. The dough is going to want to stick to the work surface. Your goal is to knead the dough with the help of the scraper, but not add flour if at all possible. If the dough isn’t sticking to the surface when you start, then it’s too stiff and you need to get more water into it.
Now you knead the dough. It will take several minutes (dough kneading is sort of a Zen thing, and experience determines speed). The goal in kneading is to keep folding the dough onto itself and trapping as much air as possible inside while you do so.
As you knead, moisture will be absorbed by the flour, and the dough will become less sticky. If after several minutes the dough is still determined to stick to the work surface while you knead, then add a very tiny amount of flour (about ½ tsp). Keep kneading and adding tiny amounts of flour until the dough stops sticking while you knead (you know you’ve got the balance perfect when you can continually knead the dough without using the pastry scraper, but the dough glues itself to the work surface if you leave it sitting for more than ten seconds).
Now, take the dough, form it into a ball, give it a little sprinkle of water (I run my hand under the tap and then just smear it across the ball) and cover it with a bowl at least four times larger than the ball (yes, leave the dough on the work surface and just cover it with a bowl – none of this namby-pamby “oiled bowl covered with damp cloth” garbage). If you don’t have a really big bowl then go ahead and do the old-fashioned oil-in-bowl method, but either use really good oil that will add some flavor, or a completely neutral vegetable oil. Using oil will change the color and texture of the finished crust, usually making it both browner and thicker.
Now leave it alone for about an hour. It will rise. After an hour, punch it back down to its original size, form it into whatever you want (and in or on whatever pan you’re going to bake it in) and leave it alone for another 90 minutes or so. It should be covered (and a bit more water smeared on it) but don’t cover it with anything that might come in contact with it while it rises. This is why I like those simple round loaves – I just toss the dough ball on some parchment on a sheet pan and re-cover with my big-assed bowl.
Preheat your oven as hot as you can – usually around 500 degrees.
Very gently transfer the risen dough to the oven, and as soon as you close the door reduce the temperature to between 410 and 425 degrees (ovens are notoriously inconsistent, so your mileage may vary).
Leave it alone to bake in solitude. It’s done when the crust is “golden brown and delicious ™”, and it sounds hollow when you tap it.
You’ll probably have to do the recipe a few times before you figure out a reliable time/temperature for your preferred loaf style. Also, depending on the heat distribution of your oven, you may find that you get a big air-pocket in the top of the loaf (this can also happen if the skin of the loaf gets too dry during the second rise).
If that happens, then slit the top of the dough with a very sharp knife right before baking (yes, it appears there is a reason bakers do that as it turns out – it’s not just cosmetic).
Hopefully this will propel you towards bread self-sufficiency and a desire to experiment further, and without having to spend twenty years getting there.
I was reading something earlier today in which someone made yet another inaccurate use of the Schrödinger’s Cat analogy.
The cat. The damned cat. Why does everyone fixate on the cat?
Schrödinger did a terrible thing when he imagined that scenario – he made it seem like the question he was positing could be framed in macro-world terms. And a whole lot of people have since latched on to it and think that they have an understanding of quantum physics as a result.
The cat is a lie. Even more so than the cake, but that’s another issue entirely.
What Schrödinger was trying to communicate was a much more complex and confusing notion by way of the cat, and the cat just doesn’t do it justice. We all know (er, well, anybody who might be interested in reading this at any rate) the story of the cat. You put it in a box with a vial of poison and a trigger that can be tripped by sensing subatomic activity. There’s a 50/50 chance that the activity will occur, and thus the cat can die without warning.
However, as per Schrödinger, if you haven’t observed the subatomic activity (or the consequences of it) then it hasn’t actually occurred. Thus, until you open the box the cat is neither alive nor dead – it’s in an unspecified state because it has not been observed. The moment that the box is opened, observation takes place and the state of the cat becomes fixed. It’s either going to be alive or dead.
The analogy is nonsense, and that’s pretty much what Schrödinger intended it to be. He was presenting an allegory that was supposed to illustrate how absurd the then-new notions of quantum mechanics were, and in effect was trying to undermine them. Unfortunately (for him, though fortunately for us) he instead invented a popular physics koan. For scientists, the whole point of the cat is to illustrate how wonky the sub-atomic regions of reality really are.
It’s a thought experiment, but it’s as close to a metaphysical experience as you’re going to get from physics unless you devote a decade or two to its study. The cat actually made sense to a great many people, even though the understanding was fundamentally misguided. The notion that the cat could be both alive and dead was actually comprehensible to many people who didn’t have a lengthy education in physics (and to whom the details of quantum mechanics would cause exploding-head-syndrome). “Yes,” they said, “I get it. The cat is neither alive nor dead until I observe it!” But they took it literally.
That’s not how it works.
If you actually performed the experiment as described, then the cat would live or die regardless of observation. Probabilities always collapse into a singular outcome… well, unless you subscribe to the Many Worlds notion – in that case each possible outcome will generate an entirely new copy of the universe – and while I’m not about to dismiss the possibility of that entirely, it seems improbable given the energy requirement of duplicating the entirety of space-time whenever anyone makes a decision about what they’re going to have for dinner.
What makes quantum physics so disturbing is that it suggests that nothing is real until it is “observed”, yet plenty of stuff became “real” long before there was a sentient creature available to make observations. So while quantum probability may be subject to change as a consequence of observation, it does not change the fact that probabilities will collapse into a result state regardless of observation.
What that means is that for most things, the likelihood that a quantum event will result in the observed outcome of a particular particle in a particular place is so near to 100% as to be not worth arguing. An atom of iron in a red blood cell in your body is going to exist there upon observation because it *must* exist there regardless of probability. Hunt long enough and you might find a missing molecule where you’d otherwise expect to find one, but it’s pretty damn unlikely.
However, while the probability of that occurring may be 99.9999% it doesn’t actually guarantee that the particular confluence of sub-atomic particles that *should* be found at a given location will actually be there. Until we look for it, or until it *needs* to be there in order for some subsequent reaction to take place.
But now we’re entering an arena of near metaphysical science that has no clear ground on which laymen can stand – the venue occupied by super-string theory, many worlds theory, and so forth.
Frankly, it’s not as if I understand any of it myself – it’s just that I’m really tired of hearing about that damn cat.
– – – – –
I had a passionate five-year love affair with Battlestar Galactica. From my trepidatious first date with the mini-series, through the inevitable ups and downs of our relationship, to that final moment when the whole thing just quietly fizzled out. It ended on my birthday.
The conclusion of a long-running show with complex interweaving plotlines must be terrifying to the people responsible for shepherding it to its inevitable end. Most shows never see the end coming. A quick bullet to the brain ends them while nobody is looking. Occasionally, producers will see the writing on the wall – random rescheduling and preempted airings, low ratings, and the like – and will make an effort to wrap up as many loose ends as possible so that the fans won’t be left hanging. When “Terminator: The Sarah Connor Chronicles” got the axe at the end of its second season, the show-runners made a heroic effort to end the series gracefully. Ever optimistic, they still left the door cracked open just enough to allow them to continue if the opportunity ever arose – and if “Terminator: Salvation” had been a blockbuster hit, it might well have gotten a green light, but the film tanked and the TV series was officially deceased after the film’s opening weekend.
So, when producers have the foreknowledge that the next season of their show will be the last, you’d think they’d scheme and plot at the height of their ability. Battlestar had plenty of things going on, much of it operating on a human level, some of it verging into mystical territory. There were plenty of questions that needed answers, and some questions that were best left as mysteries.
Sci Fi (now going by the horrific re-brand Syfy so that they can justify airing even more non-genre reality programming) threw a spanner into the works early on by choosing to drag out the final season for an additional year, taking a ten month break between the first and second half, but that shouldn’t have had a huge impact on the production. All twenty episodes were filmed on a normal schedule, and Sci Fi just shelved the back half until they felt ready to say goodbye to their flagship cash-cow show.
The first half of the season was filled with strong episodes and led to the discovery of a devastated Earth in the cliffhanger finale. Since I (and I suspect many others) had guessed that this would be the ultimate revelation at the *end* of the series, it seemed like a huge twist and a portent of even darker days to come.
And things got pretty dark. Dualla dead by her own hand, Zarek and Gaeta hauled in front of a firing squad. There was some pretty amazing stuff going on.
But at the same time I felt a growing sense of trepidation. As each episode aired and the conclusion drew closer, the show seemed unable to confront any of the big issues directly. We veered wildly as the story became focused on Hera, and spent a lot of time bouncing between ever more human Cylons and expository nonsense pouring out of the Final Five. Oh, and the valiant Galactica, which had survived direct nuclear strikes, hurtling through atmosphere, and countless Cylon assaults, had suddenly worn out.
I still had hope, right up to episode nineteen…
With two shows to go, the second-to-last episode spent its precious forty-four minutes in flashbacks that told us absolutely nothing about the situation at hand. It was a reminiscence, an indulgence that allowed the writers to present us viewers with some back stories of the central characters. Under any other circumstances I would applaud an episode like this – season three’s “Unfinished Business” (the boxing episode) was a highlight of the series for me.
But with the fate of the human race hanging in the balance, and precious little screen time left to tell that story, it wasn’t the right time for character building. In all fairness, I didn’t know at the time that the episode had been made as “filler” when the network had shifted gears and told the producers that they’d be allowed to air a two hour conclusion. What had been episode 19 suddenly became the first half of episode 20, and the vacant slot in the schedule had to be filled with something that could be shot quickly and cheaply.
With a combination of both expectation and fear I started watching the final show. I knew there were just too many things left unfinished to be dealt with properly in a mere ninety minutes. And, of course, I was right.
It all goes downhill pretty damn quickly. Skip past the flashbacks to the attack on the Cylon base – which is floating at the edge of a black hole (gee, the huge Cylon civilization seems pretty paltry). Galactica, fragile though she might be, rams nose first into the impregnable fortress. A moment later, Anders and the rest of the Five put the entire base to sleep. Wow! What an incredible solution! Ron Moore is a genius for coming up with that one — except he did it back in 1990 when the “Star Trek” episode “The Best of Both Worlds” aired. The only difference? The Cylon base doesn’t explode, suggesting that Ron has learned to restrain his god-like powers.
Within moments, Hera is rescued and Athena shoots Boomer (or maybe it was Boomer shooting Athena… they all look alike to me), and everyone runs back to Galactica. Then Hera wanders off in the middle of a firefight and is saved from marauding Cylons by Baltar. It is revealed in this moment of crisis that the mysterious opera house door we’ve been seeing for the past few years is actually… [ponderous drum roll]… the entrance to Galactica’s control room. Wow. That’s underwhelming.
Cavil has somehow managed to bypass all the defenses, reaching the CIC before most of the others, and catches hold of Hera before anyone can stop him. Human-hating Cavil quickly agrees to a cease fire and the Five agree to give him resurrection – except that in order to burn a DVD for him the Five have to link minds, which reveals Tory’s responsibility for the Chief’s wife’s death. The Chief is so enraged that he immediately kills Tory, and the DVD burner craps out.
While nobody could stop the Chief, the folks in the CIC are easily able to dispatch the remaining Cylons in the room, though Cavil inexplicably eats a bullet rather than put up a fight. That done, the hand of god intervenes and causes some nuclear missiles to spontaneously fire from a nearly destroyed Raptor, and the entire Cylon base (which is, may I remind you, sitting at the edge of a black freaking hole) starts to pop like a balloon in an oven.
Adama orders a jump, and Starbuck taps in the notes from the mysterious music that everyone’s been fixating on for the past year or two. And suddenly the Galactica is safe and sound above a nice little retirement community.
When everyone else arrives, Adama declares that all the ships will be destroyed and that the entirety of their civilization will go native – no soap, no medicine, a life of grueling manual labor, early death – and everyone agrees (perhaps some of them find hairy native pre-humans attractive).
Adama takes the last Raptor for himself (at least he’ll have somewhere to go when it starts raining) while most everyone else wanders away half-heartedly. Except for Starbuck, who vanishes into whatever higher plane of existence she was resurrected from.
Flash forward one hundred and fifty thousand years as two gods in the guise of Six and Baltar roam around present day Earth making snarky comments.
Curtains down. Lights up. That’s all she wrote, folks.
So what, in the end, did I think of the show? It had a few touching moments, lots of explosions, and did an enormous disservice to a series that has told some of the most compelling stories I’ve watched on TV. If there was a singular error on the part of the producers it was in trying to pack all the big answers into a single show, rather than more effectively spreading them through the entire stretch of those last ten episodes.
A lesser evil is that they tried to be clever, to not present the fans with any of the more obvious potential answers. Let’s face it, there was an enormous amount of conjecture floating around the internet about the directions the story could have taken, and a few were pretty dang smart. The writers had obviously considered making Starbuck’s father the missing member of the Final Five, and had laid a few breadcrumbs pointing in that direction, but instead chose to bring back Ellen (which makes everything Tigh went through after poisoning her seem sort of trivial) because that would be something “completely unexpected”. Sure it was unexpected, but it was unexpected because it made no sense (and led to the frankly ghastly scenes of Cavil ranting on about defective manufacturing).
In all, the show deserved a better end, perhaps a much gentler one at that. I’d have been happy to forgo the big space battle in exchange for a verbal fight – imagine what the show might have been like if the Humans and Cylons had actually tried to negotiate a true peace with words instead of cannons? Impassioned argument was always the hallmark of the show’s strongest scenes – need I remind anyone of Lee’s amazing monologue in the season three finale – and there was a tragic lack of it at the end of the series.
No, I can’t say I was happy with Galactica’s swan song – it was like an awkward last dinner with someone you’re breaking up with. At least I can look back at the series and mutter softly to myself “We’ll always have Paris…” and think about the good times.
– – – – –
I wrote this as a private rant after seeing the movie, and one of the recipients was a long-time friend who was, at the time, managing editor of “Rue Morgue Magazine”. She published it as a capsule review, though I’ve never actually seen the issue.
– – – – –
In 1997 Paul Verhoeven turned Robert Heinlein’s science fiction masterpiece Starship Troopers into a big-budget action picture. Along with amazing CGI bug battles, the film also managed to convey an anti-Fascist message cleverly and humorously. The film was only vaguely successful, and it was seven years before an ultra-cheap sequel got made and instantly forgotten. How cheap? The bugs looked like people.
And now there’s Starship Troopers 3: Marauder.
Casper (“What happened to my career?”) Van Dien reprises his role as John Rico from the 1997 original after wisely skipping the previous installment. Joining him is a cast of cannon fodder including Amanda (“I took my clothes off in a Ken Russell film”) Donohoe, who plays a scheming admiral, and Jolene (“Star Trek typecast me as a frigid bitch and now I inject collagen in my lips to get jobs”) Blalock as the hard-assed starship captain.
Unlike Starship Troopers 2, this is not a no-budget movie. Some hard cash got spent on this sucker. Sure there are plenty of tragically bad visuals, most of which feature poorly animated bugs and giant killer robots. But some of the footage is slick enough that you’re momentarily fooled into thinking this is an actual Hollywood feature. Exploding heads? Awesome!
Like the first film, Marauder appears to have some amusing social commentary to offer along with its cheese, primarily a lightly painted subtext that seems to be taking the piss out of conservative Christianity.
Then it all goes horribly wrong.
Without warning, right at the climax, the movie careens into an obscene display of rampant American Fundamentalism, transforming all the “amusing commentary” into serious statement. Lead characters fall to their knees praying for salvation and are miraculously rescued. Cue halo of shining stars around requisite Virgin Mary surrogate. Blaylock gets religion, and gets it bad. The forces of the Federation declare “God is on our side”! Cue montage of crucifix shaped barrels blasting bugs on the fiery battlefield. Dante would weep with joy.
Some might suggest that this is in the vein of Verhoeven’s original, which did such a great job of making Fascism look loony. But that’s not how it comes across, and if it was their intent then the filmmakers have failed miserably. Van Dien is apparently a “true believer”, and one wonders how much of his soul Ed Neumeier sold just to get his director’s credit. Then again, maybe it’s his shtick too.
Frankly, flagrant fundamentalist propaganda makes me ill, but this direct-to-the-DVD-discount-bin picture also ends up embracing and endorsing the very fascism that Verhoeven tried so hard to diminish. The apparent moral of this story is that unthinking obedience is “what God wants.”
Now that’s scary.