
Thursday, December 15, 2016

Conclusions

There's a limited amount of time in each day, and with a busy schedule, it's unfortunately not possible to watch every TV episode or movie, to listen to every song, or to read every book that touches a topic relevant to the Anthropocene. When human impact stretches from climate change to animal extinction, to artificial intelligence, body augmentation, and virtual reality, there are almost too many sources to look at, and never enough time to view them all.

What I learned from this project was that my initial beliefs were correct. The media we consume seems to directly challenge our notion of what the limits of our own power are. The stories we tell have to be negative, it seems, or else there's no story. No great challenge for our protagonists to overcome, or moral crisis for them to face. That's not to say that there aren't positive stories out there--I'm sure there are--but perhaps they aren't as easily accessible, because we as a society don't want to read stories where everything is okay from start to finish. It's uplifting to read a real story about a drug that can help treat cancer patients, but in a fictional story, we want there to be a struggle that's going to keep us invested.

When it comes to the bigger topics, like genetic engineering and artificial intelligence, I think I've learned that we have a unique fear of ourselves. Humans are not ready to embrace the kind of power we have over the world. Perhaps it's because many of us believe that we are not the most powerful beings in our own Universe. A different project would focus on religious influences on climate change denial, for example, or on how religion can shape our moral positions on things like artificial intelligence or virtual reality. Maybe another part of it is the fear of responsibility that many of us have. When something goes wrong, be it a dangerous dinosaur breakout or an ape revolution, we don't want to be the ones to figure out how to fix it, because it's challenging. We're afraid of being wrong, too. That might be the easiest answer to this question. We're afraid of being catastrophically wrong, but may not admit when it's a possibility.

The stories we tell through movies and television, I think, are representative of a larger group consensus on technology and mankind's place in the world. We're afraid of our own power, and are resigned to telling stories of our own failures rather than stories of our success. The Anthropocene is scary stuff, I'll admit--it's a lot to take in. But it's here and happening right now, whether we like it or not, so it's probably for the best if we learn to deal with it and start getting optimistic about how our ingenuity can be used.

Rise of the Planet of the Apes

Rise of the Planet of the Apes (2011)

Thought we were done exploring the consequences of technology? Of course not! There's still more to talk about here. 

Rise of the Planet of the Apes is the start of a cinematic revival of one of the earliest movie franchises, built around the original "Planet of the Apes" films that began in 1968. "Rise" is the origin story (since no cinematic universe is complete without one these days). It follows the animal testing of a regenerative drug for brain tissue, which scientists believe will be a cure for Alzheimer's disease. Once tested on chimpanzees, however, a strange side effect is observed: a chimpanzee subjected to the drug develops incredible intelligence.

The caretakers struggle with raising the chimpanzee, not because of any difficulty he causes (in many ways Caesar is better behaved than a human child) but because the chimp's own intelligence leads him to ask hard questions. Caesar knows that he is not human, and wonders if he's nothing more than a pet. He's driven to love and care for his adoptive parents, but they soon have to take him to an ape refuge, where he can live among his own kind.

Caesar leads a revolt against the abusive refuge workers and exposes all of the apes there to the same drug he was given, creating an army of super-intelligent apes. The apes rise up against their oppressors, proving to be formidable fighters. The end of the film reveals that a plague is spreading among the human population, causing them to die out--soon the apes shall rule, and we'll be set up for a remake of the original Charlton Heston film after another four films confusingly named for the order of the story. (Side note: why did the first two films in this new series have to be named "Rise" and "Dawn"? I can never remember which is supposed to come first. The dawn comes before the rise if we're talking about the Sun, right? The third movie, coming out soon, is called War for the Planet of the Apes, so at least it stands separate enough...rant over.)

Once again, we're faced with the unpredictable consequences of our own technological progress. A miracle drug thought to be extremely beneficial to humans causes a moral crisis when it grants apes superhuman intelligence. Are these apes to be treated as more than animals now, or will we lead ourselves to all-out war? Based on the title of the upcoming movie, it seems the latter is the case, but I can tell you that I'm pretty excited to find out!

Black Mirror (Part 4)

White Christmas

I was very foolish to think that the Christmas episode of this show would be any less dark or depressing than the other episodes. "White Christmas" has an incredibly crushing storyline--in fact, it manages to squeeze three crushing stories into one episode! All at the cost of just twenty extra minutes of runtime. There's a lot to cover here, and I'm not entirely sure whether I should cover it all, but I'm going to try.

For starters, here's your promised video introduction, which doesn't do a good job of explaining the plot at all. Two men are stuck together in a remote, snowy outpost. Matt (Jon Hamm) and Potter (Rafe Spall) share stories with one another about how they managed to land themselves isolated at this wintry cabin in the middle of nowhere.

Matt's story begins with what got him into trouble--helping an awkward young man get a date. Matt uses the neural implant that everyone receives to see through the young man's eyes, and over an earpiece he talks him through a party in order to land a girl. However, his plan backfires when the girl he targets turns out to be a homicidal schizophrenic. She sees the young man apparently talking to voices in his head and believes that he understands her pain. She takes his romantic advice literally, taking a chance at a fresh start by poisoning them both. Now complicit in a killing, Matt rushes to get rid of the evidence, but is discovered by his wife.

Matt then tells us about the terrible experience of being blocked. The neural implants act, in a way, as a social media system: you can "block" real people, and it erases them from your life. Their figure is reduced to static, their voice muffled, and any video or picture of them is similarly blocked.

This, however, was not Matt's day job. His day job, as he explains to Potter, was to help program personal assistants. The personal assistant is based on a "cookie": an artificial intelligence that is inserted into your brain for a short period of time to learn your thought patterns, preferences, and the like. The computer becomes you, and then is removed. Using software, Matt is able to break down the cookie (the brunette woman in the white room shown in the trailer is this cookie). He can simulate the passage of time, forcing the cookie to spend three weeks, then six months, in solitary confinement. Without anything to do, the cookie goes mad and begs to be put to work, now gladly operating as a personal assistant without resistance. There's a tricky moral issue here: the cookie believes itself to be a real person, despite being just lines of code. Should we feel sympathy for this coded woman, forced to serve a real, biological woman's needs?

After this, Matt finally gets Potter to open up. He was in love with a woman, Beth. He tells us that they were together for years, and that her father never liked him but they stayed together all the same. We see her start to grow distant for reasons unclear to us, and Potter finally discovers that she's pregnant. She's not happy with the pregnancy, however, and in an argument blocks Potter completely, enraging him. She never unblocks him, and it drives him mad. He tries to forget, assuming she will soon get an abortion or miscarry because of her drinking, but when he sees her blurred form on the street, still very much pregnant, he tries to confront her about it. This confrontation turns physical enough to be deemed assault, and the block now comes with a programmed restraining order.

Potter realizes he can still see Beth at her father's each year for Christmas. He wants to see his child desperately, but discovers that the block transfers to the child as well. He hovers around for several years each Christmas, sometimes leaving gifts for the child he soon discovers is a girl. Beth dies in a train crash, removing the block. Potter sees this as an opportunity to finally see his daughter. But when he goes to talk to the little girl, he discovers that she is not his--she looks nothing like him. He confronts Beth's father, and in an argument, throws his gift at him, killing the old man. The little girl is left alone, and dies in the cold when she goes for help. This tragic event leads Potter to the outpost, or so we think.

As it turns out, Potter (the version we've seen) has been a cookie this entire time, involuntarily taken from the real Potter in order to get a confession. Matt's story was still very much real, and the police used his skill to get the confession in exchange for being released from jail. The only consequence is that Matt is now blocked by everyone as a registered sex offender, meaning he can interact with nobody in the real world. Potter sits in a jail cell in the meantime, but the police decide to punish his cookie with a thousand years of imprisonment in the snowy outpost.

There are lots of questions posed by this episode. The first is the more human aspect: if we can block people in real life, is it morally correct to be able to unilaterally shut someone out of your life? We see how Matt's wife does it, and how Beth manages to drive Potter slightly mad by cutting him out of her life and forcefully cutting her out of his. Does Matt's freedom at the end of the episode have any meaning if he can't interact with anybody at all?

The cookies raise another interesting point. Once again, at what point is a person a person? Is it morally reprehensible to force a computer to do something, if that computer believes it's a real person? Is anything accomplished by torturing Potter's cookie instead of the real man? Should we feel bad for the computer, which has done nothing wrong? Was justice served, or was Potter simply the victim of two different technological advancements?

When we have the ability to use technology for these purposes, we'll be forced to ask ourselves what the consequences of its use are. The stories that Black Mirror tells are overwhelmingly negative, but in a fascinating way that keeps you coming back for more. The creators seem keen on forcing introspection through these stories, and they succeed fantastically.

Black Mirror (Part 3)

Playtest

I'm a bit more thankful for this episode now than when I watched it, and that's because of the video shown above. The featurette for "Playtest" gives a little insight into the creators' thought process, especially for this unique episode from Season 3.

"Playtest" follows Cooper, an American traveling across the world to escape his humdrum life at home. When he lands in England, he meets a girl, Sonja, who gets him to open up. Cooper isn't just traveling; he's running away from home. He still lives with his mother, having spent the past several years taking care of his aging father, who suffered from Alzheimer's disease. Cooper never connected well with his mother, so when his father passed away, he had to get out of the house and rediscover himself. We learn that his mother has called him consistently throughout his journey, worried about her son. He ignores her, deciding that he'll face her in person when he returns home.

Cooper runs out of money, leaving him with no way to get back home. He tells Sonja that he's been using an app to find odd jobs everywhere he's traveled in order to scrape together cash for each leg of his journey. He sees an ad to become a playtester for a video game company in the area, offering enough money to get him home. Sonja tells him that if he can get some footage of their latest project, she (as a tech journalist) would be able to get him even more money.

The CEO of the game company is a bit of a horror aficionado, and the latest in virtual reality technology has allowed him to create the ultimate horror experience. In a brief moment alone, Cooper uses his phone to capture a picture of the technology that's been installed in the back of his neck, and he fears he's been caught when his phone rings. The true test, following the brief experience with a virtual gopher shown in the video above, is to spend time in an old mansion and face virtually created fears in a game. The device implanted in the back of his neck allows the game's creators to see what makes Cooper tick, and to use it to scare him.

The first steps are relatively harmless. Lights flicker in the house, and the portrait above the mantel changes. A spider scuttles across the floor as we learn Cooper is an arachnophobe. He confronts his high school bully, and then a hideous conglomeration of his high school bully and a giant spider. Cooper is consistently reminded that his hallucinations aren't real, but the fears he's being forced to face grow more and more real. Sonja bursts in to warn him--but is it really Sonja? She attacks him, and he swears that he can feel pain as she does so. He begs to be let out, only to discover that he's being tricked into submission, and that the company is trying to see how far fear can push a person.

I'll leave the resolution a mystery for this post--you're going to want to see it. Well, "want" is a strong word, but I can tell you that it is easily the most intense episode of this show that I have seen. "Playtest" shows us that we can blur the line between reality and virtual reality a little too easily, and it makes us question what reality even is. If our brains can be fooled into thinking things are real, like giant spiders and crazed attackers, then how can we trust that anything at all is real? How do I know I'm even typing this blog post right now?

...okay, that's a bit of a stretch, but just writing about this episode is going to keep me up tonight as I'm trying to sleep. The Anthropocene's forcing me to think about what I know is real, and I'm not sure that I like it. Humans have control over their own realities now. Virtual reality and augmented reality are becoming more and more popular (just look at the phenomenon of Pokemon Go). Even some physical conditions such as colorblindness make us think about what our reality is compared to someone whose eyes are wired differently. Technology can provide us with a great escape, yes, but at what cost?

Black Mirror (Part 2)

Be Right Back

Black Mirror's second season premiere seemed, solely from its description, like it was going to be the saddest episode I would watch. Unfortunately for me (and for you, if you choose to watch this show), I was very, very wrong.

The trailer for the episode, shown above, should clue you in to the basic premise. Martha and Ash are longtime lovers, but when Ash dies suddenly and unexpectedly, Martha has difficulty coping. I don't think it's hard for anyone to sympathize with her. Eventually, every one of us has to experience the loss of a loved one, and I don't suppose for a minute that any of us wouldn't cherish the opportunity to speak to or see someone we were close to for just a little while longer--for just one last time.

Martha originally rejects the technology, which takes a person's online presence, via text messages, videos, and social media posts, and constructs a software recreation of their personality for the grieving party to speak to. It feels like a major violation of privacy, but then again--do the dead have privacy? This and other questions will be addressed at the end of this post. 

Martha eventually caves in and signs herself up for the program. Her connection to Ash is reestablished, and she feels whole again. The computer does a good enough job of impersonating him, and with a little input from Martha it begins to learn. Tragically, Martha learns that she is pregnant, and must learn how to deal with expecting Ash's child without Ash in her life. She throws herself more into the program. She establishes a voice connection, using phone messages and videos to teach the computer program Ash's voice and mannerisms. "You sound just like him," she says.

She communicates with robo-Ash through her cell phone, which has now taken on a position as important as the one Ash once held. When she drops her phone at the doctor's office, she has an emotional breakdown, having lost him a second time (and this time being responsible for Ash's "death"). She's able to reconnect shortly thereafter with a new phone and her laptop, but the software tells her of something else she can try...

The company that provides this grief software also creates blank androids onto which you can upload personalities. The android operates in the same way as the text messaging and voice calling, only now the recreation has a physical embodiment as well. It's starting to feel like we're taking this too far. Shouldn't she be coping with her loss instead of finding ways to reinstate Ash in her life? Then again, who am I to judge? It's not my place, or anyone else's, to tell another person how they should grieve, right?

Martha begins to struggle with the android. It has its limitations. It's based on Ash's social media, not Ash himself, so it misses many of his quirks, or behaviors that even Martha can't tell it how to reenact. It has a limited range and must stay within 25 meters of her--when she tells it to leave the house, robo-Ash can't even get out the front gate. He doesn't need food, water, or sleep, and must be taught to act more human for her.

The disturbing part of this technology, which the episode doesn't force us to confront directly, is that the software encouraged Martha to keep upgrading it. This software, which comes from a private company, was taking advantage of a grieving woman in order to sell product.

Moreover, Martha's belief in this technology is keeping her from truly moving past her grief and accepting the death of her loved one. And it raises questions about our current use of social media, as well. Is what we post online an accurate representation of who we are as people? In Ash's case, it was almost good enough, but not quite. Am I nothing more than a collection of photos and Facebook statuses? I shudder at the thought.

When we have the power to bring people to life through software, we are once more forced to ask what makes someone human (just like Blade Runner, yet again). What is that intangible quality that made Ash different from his software recreation? Who knows, but it certainly isn't easy to find out...

Black Mirror (Part 1)

The Entire History of You 

Let me preface this with a warning that I've been providing to just about everyone I've spoken to over the last few weeks: I'm going to tell you to watch Black Mirror, and then I'm going to tell you to stop watching it and never watch it again. Black Mirror is a British television show which began running in 2011. You can think of it as a contemporary Twilight Zone. Each episode stands by itself and serves to mess with your mind as best it can. 

The series taps into our uneasiness around technology, something this project has explored with other stories told on film. Every episode introduces a technology that ranges from the improbable to the quite possible, and everywhere in between. Regardless of feasibility, when you stop to think about an individual episode of Black Mirror, you consider its technology as something you might want to use yourself someday. The episode introduces you to this technology, lulls you into complacency thinking the technology is cool or might be useful, and then proceeds to hit you over the head with why it's not and make you feel bad for ever entertaining the idea.

Hyperbole aside, Black Mirror's writers do an incredible job of connecting you with their characters in a short, roughly hour-long episode. And through the advanced technologies they employ, you get to see yourself in these characters as well. This will be the first of four entries about Black Mirror, and I had to limit myself to that, lest this project become solely focused on the series.

"The Entire History of You" is the series' first season finale. You can watch the trailer, which gives a brief overview of the premise, here:

The technology employed in this episode allows you to rewatch memories. Our main character, Liam, uses it to play over a job interview he has at the beginning of the episode. Other characters talk about how they can show other people their memories. Want to share pictures of your vacation with your family without lugging around a camera? Just project your memories onto the television and you can show them everything.

The memory device has some practical applications to it as well. Liam is told to rewind his previous 48 hours of memories when he goes through airport security, for example.

And while the ability to rewind and review our memories seems enticing, Black Mirror decides to show us why we're wrong. It breeds paranoia. Even in the initial conversation about his job interview, Liam goes back and analyzes the interviewers' mannerisms, listening to their words over and over again, convincing himself that they were less interested in him than he would have thought had he not dwelled on those memories.

Perhaps, without this device, Liam could still be self-conscious about his interview skills and constantly replay them in his head (to the best of his organic ability). But this machine sits, implanted in his head, tacitly begging him to use it. His dinner party guests try to make a game of sitting around and watching Liam's interview so that he can receive feedback.

Liam finds himself watching his wife interact with their guest, Jonas. He replays their interactions over and over again, convinced that he's seeing something unusual in how they behave around one another. He sees a light in her eyes and a brightness in her smile when she looks at Jonas that disappears when she looks back at him. He starts to wonder about Jonas's relationship with his wife.

He digs deeper, accusing her of lying about her real relationship with Jonas. Jonas's comment over dinner bugs him--that he sometimes goes back and rewatches certain sexual experiences with past partners when he needs a pick-me-up, or even when he's with other women. After uncovering that Jonas dated his wife, Liam's paranoia drives him to confront Jonas over the memories he's certain Jonas has been rewatching.

Black Mirror forces us to consider that this memory technology is not the convenience it seems to be. When your memories aren't secret, nothing is. Liam had the ability to find the truth behind his suspicions, but we're left wondering whether it was all worth it in the end. He's certainly not satisfied to learn the truth. The memory device ruins his life, when he could have continued on in blissful ignorance.

This kind of personal augmentation has Anthropocene written all over it. We can change ourselves, and how we function as biological creatures, with the tiniest of machines. Is technology solely a good thing? What are the moral implications of it all? More Black Mirror to come....

Dinosaurs Galore!

Life uh...Finds a Way...


Is it okay to include two movies in one post? Because that's what we're doing here. The Jurassic Park series is one of the most well-known movie franchises in the world. The original film is widely touted as a classic, and the recent reboot, 2015's Jurassic World, grossed 1.67 billion dollars. You'll forgive me for assuming that you all know the story, but here's a brief overview of the plot of Jurassic Park (and subsequently World as well, but we'll get to that).

Scientists discover a way to bring dinosaurs back to life by replicating DNA found inside an ancient piece of amber. They set up a large facility on a tropical island to clone the dinosaurs, and establish it as a park for visitors to come and see. What could go wrong?

Well, as it turns out, everything can go wrong. That's Murphy's Law, I suppose. When the electric fences go down, the dinosaurs wreak havoc on the defenseless guests. 

Jeff Goldblum's character, Ian Malcolm, has a few words to say on the inevitable failure of the park. In the video to the right, he explains his work with chaos theory. Ultimately, he says, nothing can be predicted, and tiny variations throughout the environment can create wildly different outcomes, despite our predictions.

He has much the same to say in the video at the beginning of this post (while it would have been more appropriate to put it in the middle here, the title opportunity was too good to pass up). The scientists claim that the park is completely safe because they've genetically altered every dinosaur to be female. Malcolm claims the contrary: just because they've made these alterations does not mean the alterations are permanent. Eventually, he says, their technique will stop working, or the dinosaurs will correct themselves, and then everything will go south, and quickly.

Unfortunately for Dr. Malcolm, his suspicions are confirmed when the dinosaurs begin to run amok. Tiny variations allowed for park security to fail and for lives to be at risk. 

I find that Jurassic Park raises questions similar to GATTACA's, as far as the Anthropocene is concerned--which you wouldn't expect, considering the difference in tone between the two movies. Both center on the technology of genetic engineering, and Jurassic Park introduces just a dash of cloning as well. Though it's not human cloning, like in Never Let Me Go, the combination of technologies in Jurassic Park is not super outlandish.

The minuscule levels of genetic modification the scientists at Jurassic Park perform are not too different from what we've begun to do to ourselves. And the idea of cloning dinosaurs (or other megafauna, such as woolly mammoths or saber-toothed tigers) is one that we've toyed with for some time. A project for a different time would explore how Jurassic Park inspired these wants and wishes, despite the dangerous outcomes of the story.

Jurassic Park raises an interesting question: just because we have the ability to do something, does that mean we should do it? Does science obligate us to see a technology's abilities through, no matter the potential cost? We can't predict the future, so is it safe to mess with things beyond our control?

The Anthropocene implies that humans have been thrust into a role of great power over the Earth. But we are not omniscient beings. We have our faults, and our stories tell these faults very well. We cannot predict the future. We often assume too much of our own capabilities, and oftentimes that means that our best intentions come crashing down on our heads.

Jurassic Park's 2015 reboot, Jurassic World, tells much the same tale. It takes the genetic engineering bent of Jurassic Park and cranks the dial to 11. The primary antagonist (if that's how you'd like to refer to it) is the Indominus rex, a human-created breed of dinosaur with all the biggest, baddest, best parts of every cool dinosaur you saw in an illustrated encyclopedia. Much like in the movie that started it all, the human desire to create an amazing attraction and make tons of money backfires spectacularly.

The Jurassic movies make us wonder if we can trust ourselves with the science behind these creations. If it were possible, should we bring back dinosaurs? The movies are telling me not to do it, and they make it hard to believe that we'd fare much better in the real world than the characters on screen.

The Anthropocene is all about changing the world around us. And that means reviving long dead portions of it too. The dinosaurs in Jurassic Park and World are human creations that change the world in unforeseen ways. These stories are telling me that humans are not capable of controlling these kinds of creations.